[WEB SECURITY] Crawl form-based authenticated website

Paul Johnston paul.johnston at pentest.co.uk
Tue Jul 24 07:04:11 EDT 2012


Hi Ruby,

The usual approach is to log in to the site manually, then extract the
session cookie from your browser's cookie store. You can then pass this
to wget using the --load-cookies option.
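
For example, assuming the session cookie is called PHPSESSID (the name
varies by application) and the target is www.example.com, a cookies.txt
in the Netscape format that wget expects would look like this:

  # Netscape HTTP Cookie File
  www.example.com  FALSE  /  FALSE  2147483647  PHPSESSID  abcdef1234567890

The fields (domain, subdomain flag, path, secure flag, expiry, name,
value) must be separated by tabs; 2147483647 is just a far-future
expiry. You can then spider the authenticated area with something like:

  wget --load-cookies cookies.txt --mirror --no-parent \
       https://www.example.com/app/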

Some software does try to automate this step, and some even does it
reasonably reliably, but login processes vary so much in the wild that
it is difficult to do in general.
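
If the login is a plain HTML form POST, you can sometimes script the
whole thing with wget itself. The sketch below makes assumptions: the
login URL and the field names (username, password) would have to be
taken from the actual login page, and it will fail wherever the form
uses a CSRF token or JavaScript - which is exactly why automated tools
struggle:

  # Step 1: POST the credentials and save the session cookie
  wget --save-cookies cookies.txt --keep-session-cookies \
       --post-data 'username=tester&password=secret' \
       -O /dev/null https://www.example.com/login

  # Step 2: spider using the saved cookie
  wget --load-cookies cookies.txt --mirror --no-parent \
       https://www.example.com/app/

Note --keep-session-cookies: most logins issue session cookies, which
wget would otherwise discard when writing cookies.txt.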

Paul


> I have used wget to perform spidering, and it was working fine, but
> now I am stuck at crawling a website that requires authentication.
> The tester has the login credentials, but how do I pass these
> credentials to wget? I have seen that wget has a way of creating
> cookies and using them for further spidering, but that's not working
> for all kinds of web applications. Sometimes it is able to store the
> cookie values, but sometimes it fails. Most of the time, I see it
> fail when the application is web-services based.

-- 
Pentest - When a tick in the box is not enough

Paul Johnston - IT Security Consultant / Tiger SST
Pentest Limited - ISO 9001 (cert 16055) / ISO 27001 (cert 558982)

Office: +44 (0) 161 233 0100
Mobile: +44 (0) 7817 219 072

Email policy: http://www.pentest.co.uk/legal.shtml#emailpolicy
Registered Number: 4217114 England & Wales
Registered Office: 26a The Downs, Altrincham, Cheshire, WA14 2PU, UK




