wasc-satec@lists.webappsec.org

WASC Static Analysis Tool Evaluation Criteria


Criteria Second Draft - Deadline for voting - December 17th

Sherif Koussa
Sun, Nov 27, 2011 1:21 AM

Hi All,

The second draft of the criteria is out. It integrates all the comments
received on the first draft. Please send your comments by December 17th.
If you don't have any comments, please reply with "No Comments" in the
body of the email. If we get enough "No Comments" responses, we will lock
the criteria for now and start fleshing out the contents.

The second draft can be found here:
http://projects.webappsec.org/w/page/42093482/Static%20Analysis%20Tool%20Evaluation%20Criteria%20Working

Looking forward to your feedback.

Regards,
Sherif

jm
Sun, Nov 27, 2011 8:23 PM
  3. Tool Coverage:
    Suggest making "Coverage of Industry Standard Vulnerability Categories
    (OWASP Top 10, SANS Top 25, etc.)" a subgroup of "Support for
    benchmarking."

2.7 Industry Standards Aided Analysis (support for benchmarking)
Comment: vendors will make a claim, but the quality of coverage may be
left to the user to establish. Facilitation?

  5. Reporting Capabilities
    Support for benchmark reporting - Is the intent that reporting against
    the 2011 CWE/SANS Top 25 Most Dangerous Software Errors, OWASP Top 10,
    etc. be captured in template-based support?

What about SCAP content support?

_jm

On 11/26/11, Sherif Koussa <sherif.koussa@gmail.com> wrote:

Hi All,

The second draft of the criteria is out. It integrates all the comments
received on the first draft. Please send your comments by December 17th.
If you don't have any comments, please reply with "No Comments" in the
body of the email. If we get enough "No Comments" responses, we will lock
the criteria for now and start fleshing out the contents.

The second draft can be found here:
http://projects.webappsec.org/w/page/42093482/Static%20Analysis%20Tool%20Evaluation%20Criteria%20Working

Looking forward to your feedback.

Regards,
Sherif

Alen Zukich
Mon, Nov 28, 2011 2:12 PM

Hi All,

My comments:

  • Not sure I understand 2.6?

  • I know there has been some discussion on this before, but I don't know where it left off.  What about performance?  Namely analysis time.  I don't think it is important to have vendors give numbers, as that isn't useful, but some very important enterprise-level features users look for are things like incremental analysis, distributed analysis, and multi-core support.

  • What about analysis type?  You are looking at a very different tool if the level of analysis is restricted to semantic or syntactic analysis.  However, I'm not sure users will know what that means.

  • If there is one item that I always have to cover in any static analysis tool evaluation, it is licensing.  Do you license by users, LOC, or something else?  Thoughts on this?

alen

From: wasc-satec-bounces@lists.webappsec.org [mailto:wasc-satec-bounces@lists.webappsec.org] On Behalf Of Sherif Koussa
Sent: November-26-11 8:22 PM
To: wasc-satec@lists.webappsec.org
Subject: [WASC-SATEC] Criteria Second Draft - Deadline for voting - December 17th

Hi All,

The second draft of the criteria is out. It integrates all the comments received on the first draft. Please send your comments by December 17th. If you don't have any comments, please reply with "No Comments" in the body of the email. If we get enough "No Comments" responses, we will lock the criteria for now and start fleshing out the contents.

The second draft can be found here: http://projects.webappsec.org/w/page/42093482/Static%20Analysis%20Tool%20Evaluation%20Criteria%20Working

Looking forward to your feedback.

Regards,
Sherif

Sherif Koussa
Fri, Dec 2, 2011 7:12 PM

JM,

Please find my replies below

Regards,
Sherif
On Sun, Nov 27, 2011 at 3:23 PM, jm <sysvar0@gmail.com> wrote:

  3. Tool Coverage:
    Suggest making "Coverage of Industry Standard Vulnerability Categories
    (OWASP Top 10, SANS Top 25, etc.)" a subgroup of "Support for
    benchmarking."

Are you suggesting changing "Industry Standard Aided Analysis" to
"Coverage of Industry Vulnerability Categories", or do you mean
introducing a new sub-category?

2.7 Industry Standards Aided Analysis (support for benchmarking)
Comment: vendors will make a claim, but the quality of coverage may be
left to the user to establish. Facilitation?

Absolutely. I look at this project mainly as informative to SCA clients
rather than as a benchmark for the tools. I like to think of it as: here
are the important things you should be looking at; ask the vendor these
questions, and then make your own judgement according to your own
environment.

  5. Reporting Capabilities
    Support for benchmark reporting - Is the intent that reporting against
    the 2011 CWE/SANS Top 25 Most Dangerous Software Errors, OWASP Top 10,
    etc. be captured in template-based support?

If the tool scans according to a certain set of criteria (2011 CWE/SANS
Top 25, OWASP Top 10, etc.), then the report should be able to reflect
only that, no?
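
As a purely illustrative sketch of that idea, assuming findings are tagged
with CWE IDs (the category set and finding layout below are assumptions,
not any particular tool's format):

    # Sketch: filter scan findings down to a chosen benchmark.
    OWASP_TOP10_2010_CWES = {79, 89, 287, 352, 601}   # partial set, for illustration

    def benchmark_report(findings, benchmark_cwes):
        """Keep only the findings whose CWE falls inside the benchmark."""
        return [f for f in findings if f["cwe"] in benchmark_cwes]

    findings = [
        {"cwe": 89,  "file": "login.jsp", "line": 42},   # SQL injection: in scope
        {"cwe": 476, "file": "util.c", "line": 10},      # NULL deref: filtered out
    ]
    print(benchmark_report(findings, OWASP_TOP10_2010_CWES))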

What about SCAP content support?

SCAP is not covered in this phase, unless most people think there is value
in adding it?

_jm

On 11/26/11, Sherif Koussa <sherif.koussa@gmail.com> wrote:

Hi All,

The second draft of the criteria is out. It integrates all the comments
received on the first draft. Please send your comments by December 17th.
If you don't have any comments, please reply with "No Comments" in the
body of the email. If we get enough "No Comments" responses, we will lock
the criteria for now and start fleshing out the contents.

The second draft can be found here:
http://projects.webappsec.org/w/page/42093482/Static%20Analysis%20Tool%20Evaluation%20Criteria%20Working

Looking forward to your feedback.

Regards,
Sherif

Sherif Koussa
Fri, Dec 2, 2011 7:21 PM

Please find my replies below.

On Mon, Nov 28, 2011 at 9:12 AM, Alen Zukich <alen.zukich@klocwork.com> wrote:

Hi All,

My comments:


  • Not sure I understand 2.6?

So, for example, let's say I have non-typical file extensions in my
application; it is necessary to be able to configure the tool to
understand those extensions. For instance, say I have JSPFs (Java Server
Pages Fragments) in my app. These are just JSP files, so I would want to
configure the tool to treat them as such. Now, this is a trivial example;
some applications define their own extensions, and it is important for the
tool to understand what these files are, otherwise they will just be
ignored.
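
A minimal sketch of the kind of extension-to-parser override this implies,
with hypothetical names rather than any vendor's real configuration
interface:

    import os

    # Hypothetical override table: non-typical extensions -> known parsers.
    EXTENSION_OVERRIDES = {
        ".jspf": "jsp",   # JSP fragments are really just JSP
    }

    def parser_for(path, builtin_map, overrides=EXTENSION_OVERRIDES):
        """Pick a parser for a file; user overrides beat the built-in map.

        A file matching neither map is what gets silently ignored, which
        is exactly the failure mode this criterion guards against.
        """
        ext = os.path.splitext(path)[1].lower()
        return overrides.get(ext) or builtin_map.get(ext)

    print(parser_for("header.jspf", {".jsp": "jsp", ".java": "java"}))  # -> jsp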



  • I know there has been some discussion on this before, but I don't know
    where it left off.  What about performance?  Namely analysis time.  I don't
    think it is important to have vendors give numbers, as that isn't useful,
    but some very important enterprise-level features users look for are things
    like incremental analysis, distributed analysis, and multi-core support.

So distributed analysis and multi-core support are supposed to be covered
in 1.3; it might not show that yet, but it will once it is fleshed out. As
for analysis time, it really depends on the combination of tool +
application + environment. This is something that is left for the buyer to
test for themselves.
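
For a sense of what incremental analysis amounts to in practice, here is a
minimal sketch, assuming content hashes decide which files need
re-scanning (the state-file name and layout are illustrative):

    import hashlib, json, os

    STATE_FILE = ".scan_state.json"   # hypothetical per-file hash cache

    def changed_files(paths, state_file=STATE_FILE):
        """Return only the files whose content changed since the last run."""
        old = {}
        if os.path.exists(state_file):
            with open(state_file) as f:
                old = json.load(f)
        new, dirty = {}, []
        for p in paths:
            with open(p, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            new[p] = digest
            if old.get(p) != digest:
                dirty.append(p)           # only these need re-analysis
        with open(state_file, "w") as f:
            json.dump(new, f)
        return dirty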



  • What about analysis type?  You are looking at a very different tool if
    the level of analysis is restricted to semantic or syntactic analysis.
    However, I'm not sure users will know what that means.

From the user's perspective, they just want two things: coverage and
accuracy. How the tool achieves them, they don't really care.



  • If there is one item that I always have to cover in any static
    analysis tool evaluation, it is licensing.  Do you license by users,
    LOC, or something else?  Thoughts on this?

That's an important one, I think we need to add this.



alen


From: wasc-satec-bounces@lists.webappsec.org [mailto:wasc-satec-bounces@lists.webappsec.org] On Behalf Of Sherif Koussa
Sent: November-26-11 8:22 PM
To: wasc-satec@lists.webappsec.org
Subject: [WASC-SATEC] Criteria Second Draft - Deadline for voting - December 17th


Hi All,


The second draft of the criteria is out. It integrates all the comments
received on the first draft. Please send your comments by December 17th.
If you don't have any comments, please reply with "No Comments" in the
body of the email. If we get enough "No Comments" responses, we will lock
the criteria for now and start fleshing out the contents.


The second draft can be found here:
http://projects.webappsec.org/w/page/42093482/Static%20Analysis%20Tool%20Evaluation%20Criteria%20Working



Looking forward to your feedback.


Regards,
Sherif

Sherif Koussa
Mon, Dec 12, 2011 5:49 PM

All,

Five more days remain for voting on the second round. Please take a few
moments to look at the updated version of the categories and
sub-categories. Your opinion really does matter.

http://projects.webappsec.org/w/page/42093482/Static%20Analysis%20Tool%20Evaluation%20Criteria%20Working

Regards,
Sherif

On Sat, Nov 26, 2011 at 8:21 PM, Sherif Koussa <sherif.koussa@gmail.com> wrote:

Hi All,

The second draft of the criteria is out. It integrates all the comments
received on the first draft. Please send your comments by December 17th.
If you don't have any comments, please reply with "No Comments" in the
body of the email. If we get enough "No Comments" responses, we will lock
the criteria for now and start fleshing out the contents.

The second draft can be found here:
http://projects.webappsec.org/w/page/42093482/Static%20Analysis%20Tool%20Evaluation%20Criteria%20Working

Looking forward to your feedback.

Regards,
Sherif

Henri Salo
Tue, Dec 13, 2011 7:36 PM

On Mon, Dec 12, 2011 at 12:49:48PM -0500, Sherif Koussa wrote:

All,

Five more days remain for voting on the second round. Please take a few
moments to look at the updated version of the categories and
sub-categories. Your opinion really does matter.

http://projects.webappsec.org/w/page/42093482/Static%20Analysis%20Tool%20Evaluation%20Criteria%20Working

Regards,
Sherif

I think these categories are fine; nothing to add. Is it absolutely impossible to change them once writing of the criteria is under way? Please contact me if you need any kind of help with this project. I am more than happy to help.

- Henri Salo
Sherif Koussa
Tue, Dec 13, 2011 8:03 PM

Changes after locking down the criteria would be very minimal, but not
impossible.

The next step after locking down the criteria would be to flesh them out
and maybe create assisting tools (like WASSEC's Excel sheet). After this
is done, and if there is a need, we can probably revisit the criteria
based on feedback and open them up again for changes.

Sherif

On Tue, Dec 13, 2011 at 2:36 PM, Henri Salo <henri@nerv.fi> wrote:

On Mon, Dec 12, 2011 at 12:49:48PM -0500, Sherif Koussa wrote:

All,

Five more days remain for voting on the second round. Please take a few
moments to look at the updated version of the categories and
sub-categories. Your opinion really does matter.

http://projects.webappsec.org/w/page/42093482/Static%20Analysis%20Tool%20Evaluation%20Criteria%20Working

Regards,
Sherif

I think these categories are fine; nothing to add. Is it absolutely
impossible to change them once writing of the criteria is under way?
Please contact me if you need any kind of help with this project. I am
more than happy to help.

- Henri Salo

wasc-satec mailing list
wasc-satec@lists.webappsec.org
http://lists.webappsec.org/mailman/listinfo/wasc-satec_lists.webappsec.org

gueb
Fri, Dec 16, 2011 3:30 PM

My comments (sorry for the English):

  1. Platform support
    [BG] Split 1.3 into (a) ability to support multi-core, (b) ability to
    chain machines, and (c) ability to manage a schedule for build-server
    scans (to avoid five projects all choosing 2 AM)
    [BG] Infrastructure requirements for deployment (e.g., a large number
    of servers)

  2. Technology Support
    [BG] Support for refactoring of code (will the tool lose all the
    false-positive markings? See the sketch after this list.)
    [BG] Does the tool need the code to be compiled before scanning?
    [BG] Analysis type -> flow analysis, needs compiled code, etc.

  3. Scan, Command and Control Support
    [BG] Can the tool connect to a source repository to get the source
    before scanning?
    [BG] Does marking a false positive require a full rescan of the code?

  5. Reporting Capabilities
    [BG] Ability to customize all text in the report (custom sample)
    [BG] Does marking a false positive dynamically update the reports, or
    is a rescan required?
    [BG] Is a rescan required if we would like to show a top 3 instead of
    a top 10?
    [BG] Does the report include a snippet of the vulnerable code?

  6. Triage and Remediation Support
    [BG] Can the tool show the vulnerable code by double-clicking on a
    vulnerability (e.g., by uploading the code to the server after a scan
    for centralized viewing)?
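
On the refactoring point in item 2: one way a tool can keep false-positive
markings across code churn is to key each finding to a location-independent
fingerprint instead of a file/line pair. A minimal sketch with hypothetical
names, not any specific tool's scheme:

    import hashlib

    def finding_fingerprint(rule_id, function_name, snippet):
        """Key a finding by what it is, not where it is.

        File path and line number are deliberately excluded, so the ID
        survives file moves and line shifts; whitespace in the snippet is
        normalized so reformatting does not break the match either.
        """
        normalized = " ".join(snippet.split())
        material = "|".join([rule_id, function_name, normalized])
        return hashlib.sha256(material.encode()).hexdigest()

    # Mark one finding as a false positive, keyed by its fingerprint.
    suppressed = {finding_fingerprint("SQLI-01", "getUser", "stmt.execute(query)")}

    # After a refactoring that moved the file and reindented the line,
    # the rescanned finding still matches the stored marking.
    rescan = finding_fingerprint("SQLI-01", "getUser", "  stmt.execute(query)  ")
    print(rescan in suppressed)   # -> True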
