[WEB SECURITY] Crawl Form based authenticated website
davechintan at gmail.com
Wed Jul 25 07:16:33 EDT 2012
I agree, this is a challenge with most web 2.0 apps. This is mainly because
crawling web 2.0 apps is not easy.
Following are two approaches I can think of off the top of my head.
My 2 cents:
1. Continue with the wget approach, using the --http-user=<username> and
--http-password=<password> switches in your script. This will handle the
authentication, and you can use grep or an equivalent tool to extract the
cookie value. In a normal web app this should work; I am not sure how
successful it will be in web 2.0 based apps, though. If this approach fails,
then I think the challenge is mainly the crawling itself.
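A minimal sketch of approach 1 (the host, credentials, and cookie name are
all placeholders; the wget call itself is commented out since it needs a
live target):

```shell
#!/bin/sh
# Hypothetical invocation: send the credentials via --http-user /
# --http-password and capture any Set-Cookie responses into a
# Netscape-format cookie jar. (Commented out: needs a live host.)
# wget --http-user=alice --http-password=secret \
#      --save-cookies=cookies.txt --keep-session-cookies \
#      -r -l 2 https://app.example.com/

# The "grep or equivalent" step: pull a cookie value out of the jar.
# Sample cookies.txt in the Netscape format wget writes:
cat > cookies.txt <<'EOF'
# Netscape HTTP Cookie File
app.example.com	FALSE	/	TRUE	0	JSESSIONID	abc123def456
EOF

# The value is the 7th whitespace-separated field of the matching line:
awk '$6 == "JSESSIONID" { print $7 }' cookies.txt
```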
2. In case of failure, I believe you might want to crawl the app
efficiently first, in order to ensure that you cover all URLs. To my
understanding, the Gecko or WebKit engines could be used. I think Gecko does
a fairly good job and is used by Firefox; WebKit, on the other hand, is
used by Safari and Chrome.
As mentioned by Paul, the best approach may be to log in manually and feed
the cookie to the crawler. That may not work in your case, as I am guessing
you are developing a CLI-based tool. If it is a GUI, you could probably let
the user invoke the browser and log in, and your program can take it forward
after successful authentication - an approach most scanners follow these
days :)
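A sketch of that manual-login variant (the cookie name SESSIONID and value
deadbeef are placeholders standing in for whatever you copy from the
browser's cookie store; the wget calls are commented out since they need a
live host):

```shell
#!/bin/sh
# Write the browser's session cookie into a Netscape-format file
# that wget's --load-cookies option understands.
cat > session-cookies.txt <<'EOF'
# Netscape HTTP Cookie File
app.example.com	FALSE	/	TRUE	0	SESSIONID	deadbeef
EOF

# Crawl as the already-authenticated user:
# wget --load-cookies=session-cookies.txt -r -l 2 https://app.example.com/

# Or skip the cookie file entirely and pass the header directly:
# wget --header='Cookie: SESSIONID=deadbeef' -r -l 2 https://app.example.com/
```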
I think a good research on how to use these engines should help you.
Let me know if this works for you.
On Wed, Jul 25, 2012 at 4:10 PM, Paulus Junior Lazuardi <
paulusjrlz at gmail.com> wrote:
> Apache JMeter perhaps?
> On Tue, Jul 24, 2012 at 6:04 PM, Paul Johnston <
> paul.johnston at pentest.co.uk> wrote:
>> Hi Ruby,
>> The usual approach is to manually login to the site, then extract the
>> cookie from your browser's cookie store. You can then pass this to wget
>> using the --load-cookies option.
>> Some software does try to automate this, and some even does it
>> reasonably reliably, but login processes vary so much in the wild that it
>> is difficult to do.
>> > I have used wget to perform spidering, and it was working fine, but now
>> > I am stuck crawling a website that requires authentication. The tester
>> > has the login credentials, but how to pass these credentials to
>> > wget is the issue here. I have seen that wget has a way of creating
>> > cookies and using them for further spidering, but that is not working
>> > for all kinds of web applications. Sometimes it is able to store the
>> > cookie values, but sometimes it fails. Most of the time, I am seeing
>> > that it fails when the application is webservices-based.
>> Pentest - When a tick in the box is not enough
>> Paul Johnston - IT Security Consultant / Tiger SST
>> Pentest Limited - ISO 9001 (cert 16055) / ISO 27001 (cert 558982)
>> Office: +44 (0) 161 233 0100
>> Mobile: +44 (0) 7817 219 072
>> Email policy: http://www.pentest.co.uk/legal.shtml#emailpolicy
>> Registered Number: 4217114 England & Wales
>> Registered Office: 26a The Downs, Altrincham, Cheshire, WA14 2PU, UK
>> The Web Security Mailing List
>> WebSecurity RSS Feed
>> Join WASC on LinkedIn http://www.linkedin.com/e/gis/83336/4B20E4374DBA
>> WASC on Twitter
>> websecurity at lists.webappsec.org
> Look and smile to the world and let the world smile to you