[WEB SECURITY] which is the best web application vulnerability scanner

Andre Gironda andreg at gmail.com
Wed May 4 14:26:50 EDT 2011


On Wed, May 4, 2011 at 10:40 AM,  <neza0x at gmail.com> wrote:
> "Use As many as possible scanners" mmmmm??? Technically could be, but in the real Corporate world, you only have some days to test and validate, so, more scanners, more time to run/validate and multiple different reports too generate. Without any consistency.

This is not true at all. There are many ways to aggregate/consolidate
scans and data from scans. Burp Suite Professional, along with Buby,
provides this capability through their combined API via the "send to
[program]" interface. It's certainly possible to generate a single
report -- and the Dradis Framework provides hooks into HTML, MS Word,
PDF generation, Mediawiki, et al. This would only serve to make the
data MORE consistent and valuable.
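
For what it's worth, here is a minimal sketch of that consolidation
step in Python, assuming each scanner's findings have been exported to
XML. The element and attribute names ("finding", "class", "url",
"parameter") are hypothetical placeholders -- each product uses its own
report schema and would need its own parser:

# Minimal sketch: normalize findings from several scanner XML exports into
# one de-duplicated list, keyed on (vulnerability class, URL, parameter).
# Element/attribute names are hypothetical placeholders, not any vendor's
# actual report schema.
import xml.etree.ElementTree as ET
from collections import defaultdict

def load_findings(report_path, scanner_name):
    """Parse one (hypothetical) normalized XML export into finding tuples."""
    findings = []
    root = ET.parse(report_path).getroot()
    for node in root.iter("finding"):            # placeholder element name
        key = (node.get("class"),                # e.g. "XSS", "SQLi"
               node.get("url"),
               node.get("parameter"))
        findings.append((key, scanner_name))
    return findings

def consolidate(reports):
    """Merge findings from all scanners; remember which scanner(s) saw each."""
    merged = defaultdict(set)
    for path, scanner in reports:
        for key, name in load_findings(path, scanner):
            merged[key].add(name)
    return merged

if __name__ == "__main__":
    reports = [("webinspect.xml", "WebInspect"),
               ("appscan.xml", "AppScan"),
               ("acunetix.xml", "Acunetix")]
    for (vuln, url, param), scanners in sorted(consolidate(reports).items()):
        print(f"{vuln:10} {url} [{param}] -- seen by {', '.join(sorted(scanners))}")

Feed the merged list into Dradis (or whatever reporting pipeline you
use) and the "three scanners, three inconsistent reports" problem goes
away.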

> What if you find 10 vulns in WebInspect, 5 in AppScan, and 3 in Acunetix?
> When the developer says, "please re-scan!" Crap!!!

You obviously need to work better with app developers if they are
directing this language towards you.

> My recommendation is to use a scanner that allows macro creation so that you can reach sections that the scanner itself cannot.
> Think of a process with 4 steps where the first one is to enter an invoice number or credit card; the scanner will never guess it and therefore will not be able to reach the next steps/sections/URLs/parameters.
> What about gotcha-enabled apps??
>
> WebInspect supports macros; Acunetix (at least in the version I used years ago) only supports login macros.

The manual modes in WebInspect and Netsparker mimic what is possible
with the Burp Suite Professional scanner (including, and especially, the
passive scanner). However, free tools such as W3AF, Casaba Watcher,
and Watobo also do this. You could always use a browser macro
language/framework (e.g. iMacros, Selenium IDE, Selenium-RC/WebDriver,
Geb, Watir/WatiN/Watij, etc.) to drive the browser through a passive,
local web proxy. This may give better results depending on the target
app. It even works well with Tamper Data, since that add-on maintains a
list of URIs, parameters, and headers, in addition to timestamps (all
exportable as XML). I'm sure you will find that this methodology is
better than using an app scanner. It's not as scalable today, but it
is more viable in the long term. Have you seen the hardware
requirements for WI/AMP, though? Talk about scalability issues...
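
A minimal sketch of that proxy-driven approach, using the Python
Selenium WebDriver bindings and assuming the passive proxy is listening
on 127.0.0.1:8080; the target URL, form field names, and invoice number
are made-up placeholders:

# Minimal sketch: drive Firefox through a local passive proxy (e.g. Burp or
# another interception proxy on 127.0.0.1:8080) and walk a multi-step flow
# that a blind crawler would never reach, such as entering a known invoice
# number. The URL and element names below are placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By

PROXY_HOST, PROXY_PORT = "127.0.0.1", 8080

options = webdriver.FirefoxOptions()
options.set_preference("network.proxy.type", 1)           # manual proxy config
options.set_preference("network.proxy.http", PROXY_HOST)
options.set_preference("network.proxy.http_port", PROXY_PORT)
options.set_preference("network.proxy.ssl", PROXY_HOST)
options.set_preference("network.proxy.ssl_port", PROXY_PORT)
options.accept_insecure_certs = True                      # proxy's CA cert

driver = webdriver.Firefox(options=options)
try:
    # Step 1: seed the workflow with data the scanner could never guess.
    driver.get("https://target.example/invoices")          # placeholder URL
    driver.find_element(By.NAME, "invoice_number").send_keys("INV-2011-0042")
    driver.find_element(By.NAME, "submit").click()

    # Steps 2-4: keep clicking through; every request/response is recorded
    # by the proxy, where the passive checks (and later active scans) run.
    driver.find_element(By.LINK_TEXT, "Review").click()
    driver.find_element(By.LINK_TEXT, "Confirm").click()
finally:
    driver.quit()

The same idea works with Watir, Geb, or iMacros; the only requirement
is that the browser's traffic passes through the proxy so the passive
checks see every step of the workflow.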



