[WEB SECURITY] which is the best web application vulnerability scanner

Andre Gironda andreg at gmail.com
Wed May 4 03:52:18 EDT 2011

On Tue, May 3, 2011 at 12:06 PM, Arian J. Evans
<arian.evans at anachronic.com> wrote:
> By the way - love the humor on WASC today. I love that post
> about how the real answer is that we should all just write our
> own scanners and use tamper data, funniest thing I've read in
> months!

There appears to be some confusion about what I recommended to the
list, so I'm going to take the time to clarify it here. It wasn't
meant to be humorous at all -- or to answer anything except the
original author's question.

> My personal recommendation is to learn the concepts in Tamper Data and
> to build on webappsec knowledge in order to write your own scanner(s).

I didn't say "replace web app scanners with Tamper Data", nor did I
imply that notion. I said to learn the concepts of webappsec using
Tamper Data -- i.e. in a lab setting, on your own hours/time, and at
your own learning pace.
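To make the concept concrete, here is a minimal sketch of what a tool
like Tamper Data teaches: intercepting a request's query string and
editing a parameter before it goes out. The URL parameters and the
injection payload below are made up for illustration; only the
standard library is used.

```python
from urllib.parse import urlencode, parse_qs

# The query string an intercepting proxy would let you pause and edit
original = "q=widgets&page=1"
params = parse_qs(original)

# Tamper with one parameter in flight -- a naive injection probe,
# purely illustrative (parameter names and payload are hypothetical)
params["q"] = ["widgets'--"]

tampered = urlencode(params, doseq=True)
print(tampered)  # -> q=widgets%27--&page=1
```

Doing this by hand, one parameter at a time, is exactly the lab
exercise that builds the intuition a scanner later automates.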

> The ones that you build for yourself will always be "the best",
> because you're the customer (and you know yourself and your testing
> capabilities, especially test case design and test case organization
> along with time management and other principles).

The idea that application developers should be capable of writing
their own test code, including a full test harness, is not a new
concept.
Developers have been applying these principles for over 40 years. Full
knowledge access is a general requirement for testing, which involves
a "whole system": source code, along with the build/compile
information, and the runtime. The capability to create code-aware
instrumentation available at runtime has been discussed in the
literature in early versions of Knuth's The Art of Computer
Programming -- long before the web (and most compilers) ever existed.
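In that spirit, a data-driven test harness can be very small. The
function under test and the case names below are invented for the
example, not taken from any particular framework:

```python
def is_safe_length(value, limit=64):
    """Example unit under test: enforce an input-length bound."""
    return len(value) <= limit

# Test cases as data: (name, input, expected result)
cases = [
    ("short input", "abc", True),
    ("at boundary", "a" * 64, True),
    ("overlong",    "a" * 65, False),
]

# The harness itself: run every case, collect pass/fail by name
outcomes = {name: is_safe_length(data) == expected
            for name, data, expected in cases}
assert all(outcomes.values()), outcomes
```

The point is that test-case design and organization live in the data
table, while the runner stays trivial -- the same shape a bespoke
scanner takes at larger scale.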

Crawlers are sometimes a good way to enhance test knowledge, but they
are not good at running test cases against a bespoke app. App scanners
usually have a crawling component, but they also have results tables,
typically data-driven, implemented on top of an HTTP state machine
and/or HTML/JavaScript/etc. parsing engine(s). The latter is typically
known as a protocol (HTTP) or application (DOM and plugins) driver.
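The anatomy just described -- a crawl frontier, a data-driven payload
table, and a results table -- can be sketched in a few lines. The
fetch step is stubbed out with a fake responder; a real scanner would
plug in an HTTP or DOM driver there, and the payloads are simplistic
placeholders:

```python
from collections import deque

payloads = {                      # data-driven test-case table
    "xss":  "<script>alert(1)</script>",
    "sqli": "' OR '1'='1",
}

def scan(seed_urls, fetch):
    frontier = deque(seed_urls)   # crawler component (no link
    seen = set()                  # extraction in this sketch)
    results = []                  # results table
    while frontier:
        url = frontier.popleft()
        if url in seen:
            continue
        seen.add(url)
        for name, payload in payloads.items():
            body = fetch(url, payload)   # protocol driver (stubbed)
            if payload in body:          # naive reflection check
                results.append((url, name))
    return results

# Usage with a fake fetch that reflects its input (no network needed):
findings = scan(["http://example.test/form"],
                lambda url, payload: "echo: " + payload)
print(findings)
```

Everything app-specific -- payloads, crawl rules, what counts as a
finding -- lives in data you control, which is why a scanner you build
yourself fits a bespoke app better than a generic one.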

While a quality front-end to a protocol driver can be found in Burp
Suite Professional -- it lacks the capabilities of a full browser
(Fiddler, OTOH, can hook into the browser's network traffic at a lower
interface, in order to exist outside of the browser). This doesn't
make Burp Suite Professional useless by any means, but it does require
the same level of effort in terms of context provisioning that Fiddler
(or any tool) would. Typically, this is effected programmatically
using Burp or Fiddler extensions/plugins. This is where developing
your own scanner comes into play -- it is especially useful for large
and/or bespoke applications. I don't know anyone outside of the
scanner vendors (including SaaS scanner vendors) who doesn't promote
this method -- and I believe the vendors avoid promoting it only
because they want to sell more products.

Fortunately, people who don't want to build a scanner and do not have
large or bespoke apps can use W3AF, as suggested by SAFECode guidance.
There are additional benefits to using W3AF, but these typically
require more than a point-and-shoot configuration (note: it helps to
have basic HTML, Ajax, Python, and Ruby knowledge -- as well as some
basic Linux or Windows command-line skills). If you really want a
point-and-shoot solution, then Arachni will probably be a good fit.

More information about the websecurity mailing list