[WEB SECURITY] Crawl Form based authenticated website

ruby glow rubyglow.prideindia at gmail.com
Wed Jul 25 08:02:09 EDT 2012


Hi,

Thank you all for the suggested solutions.

I cannot use proxy tools or anything similar, as I am writing a web-based
application that can only rely on a program running in the background.

I tried wget with the --load-cookies option earlier, but it fails in many
cases: the cookie value is not picked up correctly, and sometimes there
are hidden input fields that are unknown to the general user. So instead
of this, I tried:

wget --no-cookies --header="Cookie: name=value"

Here I expect the end user to have a valid session cookie and pass it to
my application, which frames the above command in the background and
runs it.
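For framing the wget command from the user-supplied cookie, a small helper could look like the sketch below. Everything here is illustrative (the function name, URL, cookie value, and choice of spidering switches are assumptions, not part of the original setup); note that --no-cookies is the current spelling of wget's cookie-disabling switch:

```python
# Hypothetical helper: builds the argv list for the background wget crawl,
# given the session cookie supplied by the end user.
import shlex

def build_wget_cmd(url, session_cookie):
    """Return the argv list for a cookie-authenticated wget crawl."""
    return [
        "wget",
        "--no-cookies",                       # don't let wget manage cookies itself
        "--header", f"Cookie: {session_cookie}",
        "--recursive", "--no-parent",         # basic spidering switches (illustrative)
        url,
    ]

cmd = build_wget_cmd("https://target.example/app", "PHPSESSID=abc123")
print(shlex.join(cmd))  # shell-quoted form, if you hand it to a shell
```

Passing the command as an argv list (rather than one shell string) avoids quoting problems when the cookie value contains special characters.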

But my application's main objective is to simplify the task for the end
user. I don't want him to open a cookie manager, extract the valid
cookie, pass it to my app, and so on. Sometimes he may miss other headers
such as Connection: keep-alive, and some apps generate a unique session
ID each time.

All of these would create problems for my app in the end.

So I decided to capture the request and response for a specific URL and
use them to frame the wget command mentioned above.

So now, my challenge is to:
1. have a proxy server running in the background;
2. open a browser window for the end user to visit the target website;
3. allow the end user to log in and do all his processing;
4. while the end user performs the above, have my proxy capture all of
these requests and responses in the background.

I am not yet clear on how I would do this, but I am thinking of
utilizing iframes for it.
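Steps 1 and 4 above can be sketched with Python's standard library. This is a minimal capture-only stub, not a real proxy: it records the method, path, and Cookie header of each request it sees and replies 200 (a real implementation would forward the request to the target site and relay the response back to the browser). All names and the port choice are illustrative.

```python
# Minimal capture stub for the "proxy in the background" idea.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

captured = []  # (method, path, cookie) tuples seen by the proxy

class CaptureHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Record the session cookie the browser sends through the proxy.
        captured.append((self.command, self.path, self.headers.get("Cookie")))
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"captured")

    def log_message(self, fmt, *args):
        pass  # keep the background server quiet

def start_proxy(port=0):
    """Start the capture server on localhost; port=0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), CaptureHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

The browser would then be pointed at 127.0.0.1:port as its HTTP proxy; after the user logs in, the captured Cookie header can be reused to frame the wget command.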

Please suggest a good way to do this.

Thanks to Chintan and Paulus.

Regards
Ruby


On 7/25/12, Chintan Dave <davechintan at gmail.com> wrote:
> Hi Ruby,
>
> I agree, this is a challenge with most web 2.0 apps. This is mainly because
> crawling web 2.0 apps is not easy.
>
> Following are two approaches, off the top of my head, that you can use.
>
> My 2 cents:
> ==========
>
> 1. Continue with the wget approach, using the --http-user=<<username>> and
> --http-password=<<password>> switches in your script. This will help you
> authenticate, and you can use grep or an equivalent to extract the cookie
> value. In a normal web app this should work; I am not sure how successful
> it will be in web 2.0 based apps, though. However, if this approach fails,
> then I think the challenge is mainly one of crawling.
>
> 2. In case of failure, I believe you might want to crawl the app
> efficiently first, in order to ensure that you cover all URLs. To my
> understanding, the Gecko or WebKit engines could be used. I think Gecko
> does a fairly good job and is used by Firefox; WebKit, on the other hand,
> is used by Chrome.
>
> As Paul mentioned, the best approach may be manually logging in and
> feeding in the cookie. However, that may not work in your case, as I am
> guessing you are developing a CLI-based tool. If it is a GUI, you can
> probably let the user invoke the browser and log in, and your program can
> take it forward after successful authentication - an approach most
> scanners follow these days :)
>
> I think some good research on how to use these engines should help you.
> Let me know if this works for you.
>
> Thanks,
> Chintan
>
> On Wed, Jul 25, 2012 at 4:10 PM, Paulus Junior Lazuardi <
> paulusjrlz at gmail.com> wrote:
>
>> Hi,
>>
>> Apache JMeter perhaps?
>>
>> Junior
>>
>> On Tue, Jul 24, 2012 at 6:04 PM, Paul Johnston <
>> paul.johnston at pentest.co.uk> wrote:
>>
>>> Hi Ruby,
>>>
>>> The usual approach is to manually login to the site, then extract the
>>> cookie from your browser's cookie store. You can then pass this to wget
>>> using the --load-cookies option.
>>>
>>> Some software does try to automate this, and some even does it
>>> reasonably reliably, but login processes vary so much in the wild that
>>> it is difficult to do.
>>>
>>> Paul
>>>
>>>
>>> > I have used wget to perform spidering, and it was working fine, but
>>> > now I am stuck at crawling a website that requires authentication. The
>>> > tester has the login credentials, but the issue is how to pass these
>>> > credentials to wget. I have seen that wget has a way of creating
>>> > cookies and using them for further spidering, but that does not work
>>> > for all kinds of web applications. Sometimes it is able to store the
>>> > cookie values, but sometimes it fails. Most of the time, I am seeing
>>> > that it fails when the application is web-services based.
>>>
>>> --
>>> Pentest - When a tick in the box is not enough
>>>
>>> Paul Johnston - IT Security Consultant / Tiger SST
>>> Pentest Limited - ISO 9001 (cert 16055) / ISO 27001 (cert 558982)
>>>
>>> Office: +44 (0) 161 233 0100
>>> Mobile: +44 (0) 7817 219 072
>>>
>>> Email policy: http://www.pentest.co.uk/legal.shtml#emailpolicy
>>> Registered Number: 4217114 England & Wales
>>> Registered Office: 26a The Downs, Altrincham, Cheshire, WA14 2PU, UK
>>>
>>>
>>> _______________________________________________
>>> The Web Security Mailing List
>>>
>>> WebSecurity RSS Feed
>>> http://www.webappsec.org/rss/websecurity.rss
>>>
>>> Join WASC on LinkedIn http://www.linkedin.com/e/gis/83336/4B20E4374DBA
>>>
>>> WASC on Twitter
>>> http://twitter.com/wascupdates
>>>
>>> websecurity at lists.webappsec.org
>>>
>>> http://lists.webappsec.org/mailman/listinfo/websecurity_lists.webappsec.org
>>>
>>
>>
>>
>> --
>> Look and smile to the world and let the world smile to you
>>
>>
>>
>
>
> --
> Regards,
> Chintan Dave,
>
> LinkedIn: http://in.linkedin.com/in/chintandave
> Blog:http://www.chintandave.com
>



