[WEB SECURITY] 2009 Top 25 Programming Errors

Micah Tapman m.tapman at questconsultantsllc.com
Thu Jan 15 18:21:30 EST 2009


I'll dissent from this list's most vocal constituency by saying that I like
the CWE work, find it insightful (especially in terms of root-cause
identification), and think it is most likely a terrific step in the right
direction from a big-picture perspective.

I think this is exactly the type of language (with minor tweaks) that
resonates with semi-technical managers and even executives at large
organizations. I am particularly intrigued by the "Structure" terms under
Risky Resource Management, which I think are an interesting way to discuss
very esoteric ideas like cross-site scripting. Certainly there are some
pretty complex terms, such as Race Condition, but I'm not sure what would be
better and still accurate.
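
(For what it's worth, a race condition fits in a few lines of Python; this
toy sketch is my own, not anything from the CWE text. Two threads can both
pass the balance check before either one deducts:)

    import threading

    balance = 100  # shared state, deliberately not protected by a lock

    def withdraw(amount):
        global balance
        # Check-then-act without atomicity: both threads can pass the
        # check before either one updates the balance, overdrawing it.
        if balance >= amount:
            balance = balance - amount

    threads = [threading.Thread(target=withdraw, args=(100,))
               for _ in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(balance)  # usually 0, but an unlucky interleaving prints -100

Even this tiny example resists a one-word plain-English name, which is
exactly the terminology problem above.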

An example of why I like this list might be helpful here...

Take CWE-20, Improper Input Validation, and assume we're talking to an
executive at a bank about a situation where an application allows a user to
enter data but never checks that data, or doesn't check it well enough. Our
approach is to use an analogy based on client loan applications, which are
native terrain for a bank executive: "Imagine a person applies for a loan
by filling out a form; however, your loan officer never checks the person's
name against a driver's license or other form of legal identification. You
could end up issuing a loan to someone using a false name and never be able
to track that person down." Hopefully, this little example conveys the
message that all input should be verified before the organization considers
it trusted. In my experience it's these little stories that can make or
break a briefing to executives. Our objective, talking with the executive,
is to gain support for an initiative to make sure all applications under
the bank's control use proper input validation mechanisms.
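
To make this concrete for a technical reader, here is a minimal sketch of
what such a check can look like in Python. The "applicant name" field and
its allow-list pattern are hypothetical choices of mine, not anything
prescribed by CWE-20 itself:

    import re

    # Hypothetical allow-list for an "applicant name" field: letters,
    # spaces, hyphens, and apostrophes, with a sane length bound.
    NAME_PATTERN = re.compile(r"^[A-Za-z][A-Za-z '\-]{0,98}$")

    def validate_applicant_name(raw):
        """Reject anything outside the allow-list; CWE-20 is precisely
        the weakness of accepting input nobody ever checks like this."""
        name = raw.strip()
        if not NAME_PATTERN.match(name):
            raise ValueError("invalid applicant name")
        return name

    # Untrusted input becomes trusted only after validation succeeds.
    try:
        print("accepted:", validate_applicant_name("Jane O'Brien"))
    except ValueError:
        print("application rejected: name failed validation")

The allow-list style (define what is acceptable, reject everything else) is
generally what "proper input validation mechanisms" cashes out to in
practice.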

I will agree with some of the people writing before me on this list that
it's likely to cause some changes, and potentially troublesome ones, like
mandates by legislative bodies to "follow" these guidelines...but hey,
that's not a problem caused by the list; it's a problem caused by uneducated
and misguided lawmakers, and we all know that in the absence of a list
they'll invent something.

Micah

-----Original Message-----
From: Steven M. Christey [mailto:coley at linus.mitre.org] 
Sent: Thursday, January 15, 2009 4:00 PM
To: Arian J. Evans
Cc: Steven M. Christey; websecurity at webappsec.org
Subject: Re: [WEB SECURITY] 2009 Top 25 Programming Errors


On Thu, 15 Jan 2009, Arian J. Evans wrote:

> I believe the language and the misguided "remediation cost" sections
> of the Top 25 do just that.

The remediation costs were an attempt to help people sub-prioritize within
the list - which things could they knock off right now.  The potential
costs are at least defined, limited as they may be.  I can't find them in
the OWASP Top Ten or the WASC Classification.

> I do not see this current Top 25 document being data-centric at all.
> What data was used in the creation process? Or was it simply the biased
> sample of a Mitre-mailing-list democracy? (not criticizing you here,
> just asking)

The Top 25 FAQ covers this (not to mention other concerns already
mentioned on this list):

 Why don't you use hard statistics to back up your claims?

  The appropriate statistics simply aren't publicly available. The
  publicly available statistics are either too high-level or not
  comprehensive enough. And none of them are comprehensive across all
  software types and environments.

  For example, in the CVE vulnerability trends report for 2006, 25% of all
  reported CVEs either had insufficient information or could not be
  characterized using CVE's 37 flaw-type categories. That same report only
  covers publicly reported vulnerabilities, so the types of issues may be
  distinct from what security consultants find (and they're often
  prohibited from disclosing their findings).

  Finally, some of the Top 25 entries capture the primary weaknesses (root
  causes) of vulnerabilities - such as CWE-20, CWE-116, and CWE-73.
  However, these root causes are rarely reported or tracked.

In other words - we could have used CVE stats just as was done for the
2007 OWASP Top Ten, but we don't think they're adequate for covering the
actual mistakes that programmers make.  I'm fairly sure that the 2009
OWASP Top Ten effort is going to try alternate tactics.
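
(As a concrete aside on why root causes go unreported, here is a hedged
Python sketch of my own around CWE-73, external control of file name or
path; the directory name and the containment check are hypothetical, not
taken from the Top 25. An advisory would typically report the symptom -
directory traversal - while CWE-73 names the weakness underneath it:)

    import os

    BASE_DIR = "/var/app/reports"  # hypothetical document root

    def open_report(user_supplied_name):
        # Root cause (CWE-73): the file name comes straight from the
        # user, so "../../etc/passwd" yields the reported symptom
        # (directory traversal) rather than a CWE-73 report.
        path = os.path.realpath(os.path.join(BASE_DIR,
                                             user_supplied_name))
        # Mitigation: canonicalize, then confirm the result stays
        # under BASE_DIR before opening it.
        if not path.startswith(BASE_DIR + os.sep):
            raise PermissionError("path escapes the report directory")
        return open(path, "rb")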

The list of contributors is here:

  http://cwe.mitre.org/top25/contributors.html

I would say that it is fairly diverse.

> The unique kind of thing you are addressing with this new standard is
> that this effort is dabbling in business case. Previous SANS lists
> did not. This is a different kind of issue than server configs and IIS
> patches.

Agree.  I view it as a strength of the Top 25 that it even attempts to
define a threat model of the skilled, determined attacker (Appendix B),
which helps with prioritization.  The OWASP Top Ten and WASC don't seem to
do this - there's some threat that's implicit in the selection of which
issues are important.

> My motive on this list was to push a "competing" standard. Via WASC or
> OWASP.

I would be likely to support something that is more "complementary" than
"competing" (the 2009 OWASP Top Ten is coming up and I fully intend to
support that).

> CWE for better or for worse draws a very academic crowd

"Pure" CWE does, I agree.  The Top 25 less so (at least that's the
intention).

> I want something simple and effective that is controlled by people in
> the actual hands-on web security community.

It should be noted that the Top 25 is intended to cover far more than web
apps, which may be where some of the disconnect comes from.

> I was very frustrated by the lack of taxonomy and the confusion between
> uses of Risk, Threat, Attack, Weakness, and Vulnerability, and your
> efforts here are long overdue.

BTW, I still don't have a clear mental model for all of these either :-/

> I think that this thing needs to be approached and treated as a minimum
> standard of due care (in testing, building, measuring) because that is
> how it's going to be used.

We are trying to promote it as such, but that's not the message that's
being heard, as you and others are pointing out.

> Security people shoot themselves in the foot by telling business owners
> and developers that they have failed, and that it won't cost much to
> fix.

I see your point here and agree it's a larger problem.  I don't think
developers are receiving it this way, however.

> I do not want to use your document and language for communicating with
> my clients, but the current initiative guarantees that we all will be
> using it as a standard.

A thought-provoking comment.  But, frankly, I have faith in the consulting
community to educate their customers about why just caring about the Top
25 is naive.

- Steve

