[WEB SECURITY] Crawl Form based authenticated website
Paulus Junior Lazuardi
paulusjrlz at gmail.com
Wed Jul 25 06:40:14 EDT 2012
Apache JMeter perhaps?
On Tue, Jul 24, 2012 at 6:04 PM, Paul Johnston
<paul.johnston at pentest.co.uk> wrote:
> Hi Ruby,
> The usual approach is to manually login to the site, then extract the
> cookie from your browser's cookie store. You can then pass this to wget
> using the --load-cookies option.
> Some software does try to automate this, and some even does it
> reasonably reliably, but login processes vary so much in the wild that
> it is difficult to do in general.
> > I have used wget to perform spidering, and it was working fine, but
> > now I am stuck at crawling a website that requires authentication.
> > The tester has the login credentials, but how do I pass these
> > credentials to wget? I have seen that wget has a way of storing
> > cookies and using them for further spidering, but that does not work
> > for all kinds of web applications. Sometimes it is able to store the
> > cookie values, but sometimes it fails. Most of the time, I see it
> > fail when the application is web-services based.
> Pentest - When a tick in the box is not enough
> Paul Johnston - IT Security Consultant / Tiger SST
> Pentest Limited - ISO 9001 (cert 16055) / ISO 27001 (cert 558982)
> Office: +44 (0) 161 233 0100
> Mobile: +44 (0) 7817 219 072
> Email policy: http://www.pentest.co.uk/legal.shtml#emailpolicy
> Registered Number: 4217114 England & Wales
> Registered Office: 26a The Downs, Altrincham, Cheshire, WA14 2PU, UK
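Paul's manual approach can be sketched like this: wget reads cookies from a Netscape-format cookies.txt (seven tab-separated fields per line). The cookie name "PHPSESSID", its value, and the example.com domain below are placeholders; copy the real name and value from your browser's cookie store after logging in.

```shell
# Build a cookies.txt in the Netscape format wget expects.
# Fields: domain, subdomain flag, path, secure flag, expiry, name, value.
# PHPSESSID and its value are placeholders -- use the real session
# cookie taken from your browser after a manual login.
printf '# Netscape HTTP Cookie File\n' > cookies.txt
printf 'www.example.com\tFALSE\t/\tTRUE\t2147483647\tPHPSESSID\t0123456789abcdef\n' >> cookies.txt

# Spider the site with the session cookie attached to every request
# (shown commented out; it needs network access and a real target):
# wget --load-cookies cookies.txt --recursive --no-parent https://www.example.com/
```

Note the far-future expiry timestamp: session cookies exported from a browser often lack one, and an expired entry is silently ignored.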
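The automated login Paul mentions can also be attempted with wget alone, assuming a plain HTML form that POSTs username and password fields. The URL and field names below are placeholders; inspect the real login form first, and note this cannot handle JavaScript-driven logins or forms that require a CSRF token, which is exactly why it only works "reasonably reliably".

```shell
# Placeholders -- substitute the real login URL and form field names.
LOGIN_URL='https://www.example.com/login'
POST_DATA='username=tester&password=secret'

# Step 1: submit the credentials and capture the session cookie.
# --keep-session-cookies matters: session cookies carry no expiry and
# would otherwise be dropped from the saved file.
# wget --save-cookies cookies.txt --keep-session-cookies \
#      --post-data "$POST_DATA" -O /dev/null "$LOGIN_URL"

# Step 2: spider with the captured session.
# wget --load-cookies cookies.txt --recursive --no-parent "$LOGIN_URL"
```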
Look and smile to the world and let the world smile to you