wasc-satec@lists.webappsec.org

WASC Static Analysis Tool Evaluation Criteria


SATEC Draft is Ready

Sherif Koussa
Wed, Nov 28, 2012 1:15 AM

Hi Romain,

Thanks for the detailed comments. Please find my replies inline.

Regards,
Sherif

On Fri, Nov 23, 2012 at 2:44 PM, Romain Gaucher <romain@webappsec.org> wrote:

Everyone,
I had a quick look at the SATEC and commented on some sections. Sorry to
attach the comments like this, but emails just aren't great for that (*RG:*
marks the start of my comments, and I tried to use rich text to set them
apart from the rest). I left only the parts of the SATEC I commented on.

As general feedback, the SATEC is really oriented toward a very small
subset of the available static analysis tools and mostly talks about web
applications. I believe the effort should be made to make this SATEC more
generic.

Sherif: I believe it is natural for the SATEC to be slightly oriented
towards web applications since we are only after security-focused SCA
tools. Are you suggesting the SATEC be more generic within security-oriented
SCA tools, or generic in the context of static code analysis in general?

Also, an area that's not touched upon is the usability of the tool:
interface it provides, workflow, integration w/ bugs management systems,
etc.

Sherif: In terms of usability, I believe we agreed to avoid any relative or
hard-to-assess criteria; if my memory is accurate, this was your
suggestion :)
In terms of bug management, I believe this is mentioned in 7.1.

In addition, it would be good to provide links to other entities that have
published evaluation criteria for static analysis tools. I find it kinda sad
that NIST SAMATE is not mentioned even once; I'm really wondering if the
authors looked at these specs (
http://samate.nist.gov/index.php/Source_Code_Security_Analysis.html)
before writing this document. NIST also gets test suites from another
government entity (not sure if I can say who) to test for coverage. It
might be interesting to point to this too.

Sherif: We should certainly make a reference when it makes sense, but my
question is in what context we should refer to NIST. For example, should we
refer to it under "Similar projects"? Do you see a specific location
in the document where we should have referenced NIST's work?

If it's unknown to some, I'm working for a tool vendor and you could
therefore consider my take on this tainted :), but I tried to stay factual.

Also, there are several typos in the document; copy/paste into Word should
fix this.

Cheers,
Romain

1. Platform Support:

*Static code analysis tools represent a significant investment by
software organizations looking to automate parts of their software security
testing and quality assurance processes. Not only do they represent a
monetary investment, but they also demand time and effort from staff members
to set up, operate, and maintain the tool, in addition to checking and
acting upon the results it produces. Understanding the ideal
deployment environment for the tool will maximize the return on investment
and avoid unplanned hardware purchase costs. The following factors are
essential to understanding the tool's capabilities, and hence to ensuring
proper utilization of the tool and maximum return on investment (ROI). *

1.2 Scalability Support:

Vendors provide various deployment options for their tools. A clear
description of the different deployment options must be provided by the
vendor to maximize the tool's usage. In addition, the vendor must specify
the optimal operating conditions. At a minimum the vendor must specify:

- *The type of deployment: server-side vs. client-side, as this might
incur hardware purchases.*
- *The ability to chain multiple machines together to increase scan speed.*
- *The ability to run multiple scans simultaneously.*

*RG: Why isn't parallelism mentioned? To achieve speed, if you can have
many threads running the different checks it's much faster. Hence the
advantages of having multi-core/cpu machines. There are multiple things to
be considered:

  • Ability to analyze multiple applications on one or multiple machines
  • Ability to speed up the analysis of one application*

Sherif: Makes sense, I summed all these in one bullet point: "Speed",
please review and let me know if this covers it properly.
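To make Romain's parallelism point concrete, here is a hypothetical sketch (none of these function names come from any real tool): independent checks run concurrently over the files of one application, which is how a tool can use a multi-core machine to speed up a single scan.

```python
# Hypothetical sketch: parallelising one application's analysis by running
# an independent per-file check in a worker pool. Real tools typically use
# processes rather than threads; threads keep the example self-contained.
from concurrent.futures import ThreadPoolExecutor

def run_checks(source):
    """Toy 'checker': flag lines that appear to build SQL by concatenation."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        if "SELECT" in line and "+" in line:
            findings.append((lineno, "possible SQL string concatenation"))
    return findings

def analyze(files):
    # Each file is checked in its own worker; results are merged per file.
    with ThreadPoolExecutor() as pool:
        results = pool.map(run_checks, files.values())
    return dict(zip(files.keys(), results))
```

The same shape scales out: the other half of Romain's bullet (analyzing multiple applications) is just this loop run once per application, on one or several machines.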

2. Technology Support:

Most organizations leverage more than one programming language within
their application portfolios. In addition, more software frameworks are
becoming mature enough for development teams to leverage and use across the
board, along with scores of third-party libraries and technologies, both
server- and client-side. Once these technologies, frameworks and libraries
are integrated into an application, they become part of it and the
application inherits any vulnerability within these components. It is vital
for the static code analysis tool to be able to understand and analyse not
only the application, but also the libraries, frameworks and technologies
supporting the application.

RG: There is a big misconception here, I believe. You don't want to
analyze the framework, but how the application uses the framework. It would
be ridiculous to scan the frameworks every time an application uses them,
but if the static analysis does not understand the important frameworks
(control, data, and view) then it will miss most of the behavior of the
application. This is fine for some quality analysis, but security checks
are usually more global and require such understanding.

Sherif: I don't think it was the intention at all to analyze the
framework. The main point this section is trying to make is whether the
application leverages the framework in a secure manner, and the other point
is for the tool not to go blank when faced with the several layers of
abstraction that frameworks often offer. However, I see your point that
there are a couple of lines that could be interpreted as asking the tool to
analyze the framework. I changed this section a bit, please review.

2.1 Standard Languages Support:

Most of the tools support more than one programming language. However,
an organization looking to purchase a static code analysis tool should make
an inventory of all the programming languages used inside the organization,
as well as in any third-party applications that will be scanned. After
shortlisting all the programming languages, an organization should compare
the list against the tool’s supported list of programming languages.

*RG: Languages and versions. If you use C++11 a lot in one app, make sure
that the frontend of the analysis tool will understand it. Also,
applications such as web apps use several languages in the same app (SQL,
Java, JavaScript and HTML is a very simple stack, for example); does the
tool understand all of these languages, and is it able to track the behavior
of the program when it passes data or calls into another language? Example:
stored procedures. Is it understood where the data is actually coming from?
*

Sherif: I am not sure I understand the point you are trying to make here.
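One hypothetical reading of Romain's cross-language point, sketched in Python rather than the Java/SQL stack he mentions (all names here are illustrative): data collected in the host language ends up spliced into SQL text, and an analyzer that only models one language cannot see the sink.

```python
# Hypothetical sketch: a cross-language boundary. An analyzer that treats
# the SQL as an opaque string misses that 'name' is parsed as SQL below.
import sqlite3

def find_user_unsafe(conn, name):
    # Tainted 'name' is concatenated into SQL text: seeing the injection
    # requires understanding the embedded SQL, not just the Python.
    return conn.execute(
        "SELECT id FROM users WHERE name = '%s'" % name).fetchall()

def find_user_safe(conn, name):
    # Parameterized query: data travels out-of-band and is never parsed
    # as SQL, so no taint reaches an SQL-interpretation sink.
    return conn.execute(
        "SELECT id FROM users WHERE name = ?", (name,)).fetchall()
```

The same boundary problem appears with stored procedures, JavaScript generated by server templates, and HTML: the tool has to know where data leaves one language and is interpreted by another.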

*2.2 Frameworks Support:
*

Once an application is built on top of a framework, the application
inherits any vulnerability in that framework. In addition, depending on how
the application leverages a framework or a library, it can add new attack
vectors. It is very important for the tool to be able to trace
tainted data through the framework as well as the custom modules built on
top of it. At large, frameworks and libraries can be classified into three
types:

- *Server-side Frameworks: which are the frameworks/libraries that
reside on the server, e.g. Spring, Struts, Rails, .NET etc.*
- *Mobile Frameworks: which are the frameworks that are used on mobile
devices, e.g. Android, iOS, Windows Mobile etc.*
- *Client-side Frameworks: which are the frameworks/libraries that
reside on browsers, e.g. JQuery, Prototype, etc.*

The tool should understand the relationship between the application and
the frameworks/libraries. Ideally, the tool would also be able to follow
tainted data between different frameworks/libraries.
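What "trace tainted data through the framework" can look like, as a minimal hypothetical sketch (a toy router, not any real framework): the taint source is hidden behind the framework's dispatch plumbing, so a tool that does not model the framework never sees that the handler's argument is attacker-controlled.

```python
# Hypothetical sketch: a toy web 'framework' whose routing hides the taint
# source. Without modelling dispatch(), an analyzer cannot see that 'q'
# originates from the request, and misses the reflected XSS below.
ROUTES = {}

def route(path):
    def register(handler):
        ROUTES[path] = handler      # framework wiring: path -> handler
        return handler
    return register

def dispatch(path, params):
    # Framework plumbing: request parameters flow into handler arguments.
    return ROUTES[path](**params)

@route("/search")
def search(q):
    # 'q' is attacker-controlled; echoing it unescaped is a reflected XSS.
    return "<h1>Results for %s</h1>" % q
```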

RG: There is a lot to be said on frameworks. I don't especially like the
separation between server-side, mobile, and client-side. From a static
analysis point of view, that doesn't matter so much; those are all
programs. Frameworks have interesting properties and different features.
Some will manage the data and database (ORM, etc.), some will be
responsible for the flow in the application (Spring MVC, Struts, .NET MVC),
and some will render the view (Jasper Reports, ASP, FreeMarker, ASP.NET
pages, Jinja, etc.). This is to me the important part of the framework:
understanding what the framework is doing to the application.

Support for frameworks should be tested and well defined by the tool
vendor: does it understand configuration files, etc.? What features of the
framework doesn't it understand?

Sherif: I think your classification makes sense. Would you be able to
author this part, or write some bullet points that we could take and flesh
out further?

*2.3 Industry Standards Aided Analysis:
*

The tool should be able to provide analysis tailored towards one
of the industry-standard weakness classifications, e.g. OWASP Top 10,
CWE/SANS Top 25, WASC Threat Classification, etc. This is a desirable
feature for many reasons. For example, for an organization that has just
started its application security program, a full standard scan might prove
overwhelming, especially with an extensive portfolio of applications.
Focusing on a specific industry standard in this case would be a good place
to start for that particular organization.

  • RG: OWASP and WASC aren't weakness classifications. I would prefer to
    see the emphasis on CWE, since this is the only real weakness
    classification out there.

Sherif: The main point here is that organizations are trying to set a goal
and reach it. Our target audience would know OWASP and WASC but may not
necessarily know CWE. I don't mind adding CWE for sure, but I would argue
that OWASP and WASC provide organizations with goals to reach (e.g. "We
cover OWASP Top 10, SANS Top 25", etc.)

*In terms of "aided analysis", I believe it's more important to talk about
the ability to enable or disable some checks based on what the
security/development team needs. *

Sherif: please explain

3. Scan, Command and Control Support

The scan, command and control features of static code analysis tools have a
significant influence on the user’s ability to get the most out of the
tool. This affects both the speed and effectiveness of processing findings
and remediating them.

3.3 Customization:

The tool usually comes with a set of signatures, which the tool follows
to uncover the different weaknesses in the source
code. Static code analysis tools should offer a way to extend these
signatures in order to customize the tool's capabilities for detecting new
weaknesses, alter the way the tool detects weaknesses, or stop the tool
from detecting a specific pattern. The tool should allow users to:

- *Add/delete/modify core signatures: core
signatures are the signatures that come bundled with the tool. False
positives are one of the inherent flaws of static code analysis tools in
general. One way to minimize this problem is to optimize the tool’s
core signatures, e.g. mark a certain source as safe input.*
- *Author custom signatures: this feature is
invaluable for maximizing the tool’s benefits. For example, a
custom signature might be needed to “educate” the tool about the existence of
a custom cleansing module, so that it starts flagging lines that do not use
that module or stops flagging lines that do use it.*
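The "custom cleansing module" case can be sketched as a toy custom signature (all names here, `render` and `clean`, are hypothetical, and real tools express this over an AST or data-flow graph rather than raw text):

```python
# Hypothetical sketch of a custom signature: teach a toy checker that
# clean() is a trusted sanitizer, so data wrapped in it stops being flagged.
import re

SINK = re.compile(r"render\((.*)\)")   # toy sink: anything passed to render()

def check_line(line, sanitizers=("clean",)):
    """Flag render(...) calls whose argument is not wrapped in a known sanitizer."""
    m = SINK.search(line)
    if not m:
        return None
    arg = m.group(1)
    if any(arg.startswith(s + "(") for s in sanitizers):
        return None                    # custom cleansing module recognised
    return "unsanitized data reaches render()"
```

Adding a new sanitizer is then just extending the `sanitizers` tuple, which is the "educate the tool" step the bullet above describes.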

RG: Can we make this a bit more generic? Signatures or rules are just
one way of accomplishing customization. I can think of few directions:

- Ability to enable/disable/modify the understanding of frameworks:
either create custom rules, checkers, or generic frameworks definition
(this construct means this stuff)

- Ability to create new checkers, detect new/customized types of issues

- Ability to override the core knowledge of the tool

*- Ability to override the core remediation advices *

Sherif: well, we gotta agree on one term: either "rule", "checker" or
"signature". Whichever of the three we use, it will still be biased towards
one tool, correct? Wikipedia uses "rules" in the article here:
http://en.wikipedia.org/wiki/List_of_tools_for_static_code_analysis. I
would vote for the term that our target audience would most likely understand.

3.4 Scan configuration capabilities: this includes:

- *Ability to schedule scans: scheduled scans are often a mandatory
feature. Scans are often scheduled after nightly builds; other times
they are scheduled for when CPU usage is at its minimum. Therefore, it is
important for the user to be able to schedule the scan to run at a
particular time.*
- *Ability to view real-time status of running scans: some scans can
take hours to finish, so it is beneficial and desirable for a user to be
able to see the scan’s progress and the weaknesses found thus far.*
- *Ability to save configurations and re-use them as configuration
templates: often a significant amount of time and effort is involved in
optimally configuring a static code analyser for a particular application.
A tool should provide the user with the ability to save a scan's
configuration so that it can be re-used for later scans.*
- *Ability to run multiple scans simultaneously: organizations that
have many applications to scan will find the ability to run simultaneous
scans to be a desirable feature.*
- *Ability to support multiple users: this is important for
organizations which are planning to roll out the tool to developers, or
organizations which are planning to scan large applications
that require more than one engineer to assess at the same time.*
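The configuration-template bullet can be sketched as follows; the field names (`rules`, `threads`, `source_root`) are purely illustrative and not any tool's actual schema:

```python
# Hypothetical sketch: persisting a tuned scan configuration as a JSON
# template so later scans of the same application can reuse it, with
# per-run overrides (e.g. pointing the template at a new build).
import json

def save_template(path, config):
    with open(path, "w") as f:
        json.dump(config, f, indent=2)

def load_template(path, **overrides):
    with open(path) as f:
        config = json.load(f)
    config.update(overrides)       # re-use the tuning, change only what differs
    return config
```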

*RG: How about the ability to support new compilers?  *

Sherif: Please explain.

3.5 Testing Capabilities:

Scanning an application for weaknesses is the single most important
functionality of the tool. It is essential for the tool to be able to
understand, accurately identify and report the following attacks and
security weaknesses.

- *Abuse of Functionality*
- *Application Misconfiguration*
- *Auto-complete Not Disabled on Password Parameters *
- *Buffer Overflow*
- *Credential/Session Prediction*
- *Cross-site Scripting*
- *Cross-site Request Forgery*
- *Denial of Service*
- *Insecure Cryptography *
- *Format String*
- *HTTP Response Splitting*
- *Improper Input Handling*
- *Improper Output Encoding*
- *Information Leakage*
- *Insufficient Authentication*
- *Insufficient Authorization*
- *Insufficient Session Expiration*
- *Integer Overflows*
- *LDAP Injection*
- *Mail Command Injection*
- *Null Byte Injection*
- *OS Command Injection*
- *Path Traversal*
- *Remote File Inclusion*
- *Session Fixation*
- *SQL Injection*
- *URL Redirection Abuse*
- *XPATH Injection*
- *XML External Entities*
- *XML Entity Expansion*
- *XQuery Injection*

RG: Okay for webapps, but what about the rest? Also, some are very
generic… "information leakage": what does it mean to "accurately identify and
report" this? Note that this is an unsolvable problem with
static analysis techniques. Also, a static analysis tool cannot report
"attacks" since it doesn't have enough information about the runtime.

Generally, the testing capability should be a very large section and the
focus should be "how well are these covered?". Several open-source
tools have a large testing capability but will generate tons of FPs. The
accuracy is important, and there is no real way to test for it but to
actually use the tool on one of your applications and see what it finds.

Sherif: Can you list some of the weaknesses that were missed here?
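Romain's coverage-versus-accuracy distinction can be illustrated with a hypothetical pair of checkers for one entry in the list above (OS Command Injection); both function names and the taint set are invented for the example:

```python
# Hypothetical sketch: two toy checkers for OS Command Injection.
# The naive one 'covers' the weakness but floods the user with FPs;
# the taint-aware one only fires when attacker data reaches the sink.
def naive_os_command_check(line):
    # Flags every os.system call, tainted or not: constant commands
    # become false positives.
    return "os.system(" in line

def taint_aware_check(line, tainted_vars):
    # Only flag when a known-tainted variable appears in the call.
    if "os.system(" not in line:
        return False
    return any(v in line for v in tainted_vars)
```

This is why "does the tool have a check for X?" says little on its own; the useful question is how precisely the check separates true findings from noise on a real application.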

*4. Product Signature Update  *

Product signatures are what the static code analysis tool uses to identify
security weaknesses. When making a choice of a static analysis tool, one
should take the following into consideration:

RG: Can we move away from "signature"? I mean this is really biased
towards some tools and some kind of analysis. If you take findbugs/clang
they don't use signatures but checkers. We can talk about
core-knowledge/checks/checkers as I believe this is more generic.

Sherif: please see above.

6. Triage and Remediation Support

A crucial factor in a static code analysis tool is the support provided
for the triage process and the accuracy and effectiveness of the remediation
advice. This is vital to the speed with which findings are assessed and
remediated by the development team.

RG: This section is talking about formats of files and findings, but not
about triage and remediation support. Triage support means: can I say that
this is a FP? Remediation support means: Does the tool provide remediation,
are they accurate or generic, can they be customized?

Sherif: Modified.

*6.1 Finding Meta-Data: *

  • Finding meta-data is the information provided together with a finding.
    At a minimum, the tool should provide the following with each finding:

    • Finding severity: the severity of the finding, with a way to change
      it if required.
    • Summary: an explanation of the finding and the risk it poses if
      exploited.
    • Location: the code file and the line of code where the finding is
      located.
    • Taint analysis: the flow of the tainted data until it reaches the
      finding's cited location.
    • Recommendation advice: customized recommendation advice with
      details pertaining to the current finding, ideally with code examples
      written in the application’s programming language.

*RG: s/recommendation/remediation. Taint analysis is only one type of
analysis, how about the rest? It's all about evidence such as
flow-evidence, and conditions why the checker/tool thought it was an issue.
There is no standard format to report these defects, but the tool should
report as much information as it can on the defect. *

Sherif: Modified it a bit, please review.
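Putting the minimum meta-data list above into one record looks roughly like this hypothetical sketch; the field names are illustrative and not any tool's export schema:

```python
# Hypothetical sketch of a finding record carrying the minimum meta-data
# discussed above, plus the triage operation Romain asks about (re-rating
# a finding, e.g. marking it a false positive).
from dataclasses import dataclass, field

@dataclass
class Finding:
    severity: str                 # adjustable by the reviewer during triage
    summary: str                  # what the weakness is and its risk if exploited
    file: str                     # location: source file...
    line: int                     # ...and line number
    trace: list = field(default_factory=list)   # evidence, e.g. taint-flow steps
    remediation: str = ""         # advice, ideally with code in the app's language

    def triage(self, new_severity):
        # Triage support: let the reviewer re-rate (or dismiss) a finding.
        self.severity = new_severity
        return self
```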

6.2 Assessment File Management:

Assessment file management saves triage time immensely when scanning
larger applications or when a rescan is performed on an application. At a
minimum the tool should provide the following:

- *The ability to merge two assessment files*
- *The ability to diff two assessment files*
- *The ability to build incrementally on the application’s previous assessment file.*

*RG: This is also specific to some tools. Not all tools generate
"assessment files", so this is mostly irrelevant.  *

Sherif: Modified. Please review.
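Whatever file format a given tool uses, the merge/diff capabilities above reduce to set operations over a stable finding identity; a hypothetical sketch (the `(file, line, check)` key is an assumption for illustration):

```python
# Hypothetical sketch: merging and diffing two assessments as set
# operations over a finding identity of (file, line, check id).
def finding_key(f):
    return (f["file"], f["line"], f["check"])

def merge(old, new):
    # Later assessment wins on conflicts; union of everything else.
    seen = {finding_key(f): f for f in old}
    seen.update({finding_key(f): f for f in new})
    return list(seen.values())

def diff(old, new):
    old_keys = {finding_key(f) for f in old}
    new_keys = {finding_key(f) for f in new}
    return {
        "introduced": [f for f in new if finding_key(f) not in old_keys],
        "fixed": [f for f in old if finding_key(f) not in new_keys],
    }
```

On a rescan, only the "introduced" bucket needs fresh triage, which is where the time saving on large applications comes from.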

*7. Enterprise Level Support *

When making a choice on a static analysis tool in the Enterprise, an
important consideration to make is support for integration into various
systems at the Enterprise level. These systems include bug tracking
systems, systems for reporting on the risk posture of various applications,
and systems that mine the data for evaluating trending patterns.

7.2  Data Mining Capabilities Reports:

It is an important goal of any security team to be able to understand
the security trends of an organization’s applications. To meet this goal,
static analysis tools should provide the user with the ability to mine the
vulnerability data, present trends and build intelligence from it.

*RG: Shouldn't we talk more about the ability to define customized mining
capabilities and trends generation? *

Sherif: Agreed. I would think this would be a candidate for SATEC 2.0

Romain

On Wed, Nov 21, 2012 at 8:58 PM, Philippe Arteau <
philippe.arteau@gmail.com> wrote:

In index A, "A list of the frameworks and libraries used in the
organization" is mentioned. Does it refer to an external document?

I would suggest giving categories of frameworks/libraries, with examples
for different languages. This would give a precise guideline to
readers. Support for the frameworks/APIs used is a crucial part.

--
Philippe Arteau



Hi Romain, Thanks for the detailed comments. Please find my replies inline. Regards, Sherif On Fri, Nov 23, 2012 at 2:44 PM, Romain Gaucher <romain@webappsec.org>wrote: > Everyone, > I had a quick look at the SATEC and commented on some sections. Sorry to > attach the comments like this, but emails aren't just great for that (*RG: > * is the start of my comment and I tried to use rich text stuff to make > it more separated from the text). I left only the parts of the SATEC I > commented on. > A general feedback is that the SATEC is really oriented toward a very > small subset of the static analysis tools available and mostly talk about > web application. I believe the effort should be taken to make this SATEC > more generic. > Sherif: I believe it is natural for the SATEC to be slightly oriented towards web applications since we are only after security natured SCA tools. Are you suggesting the SATEC to be more generic within the security oriented SCA tools or generic in the context of static code analysis in general? > Also, an area that's not touched upon is the usability of the tool: > interface it provides, workflow, integration w/ bugs management systems, > etc. > Sherif: In terms of usability, I believe we agreed to avoid any relative or hard to assess criteria, if my memory is accurate, I think this was your suggestion :) In terms of bug management, I believe this is mentioned in 7.1 > In addition, it would be good to provide links about other entities who > did evaluation criteria for static analysis tools. I find it kinda sad that > NIST SAMATE is not even mentioned once, I'm really wondering if the authors > looked at these specs ( > http://samate.nist.gov/index.php/Source_Code_Security_Analysis.html) > before writing this document. NIST also gets test suites from another > government entity (not sure if I can say who) to test for coverage. This > might be interesting to point it too. 
> Sherif: We should make reference for sure when it makes sense, but my question is in what context should we refer to NIST, for example should we refer to the document as "Similar projects"? Do you see a specific location in the document where we should have made a reference to NIST's? > If it's unknown to some, I'm working for a tool vendor and you could > therefore consider my take on this tainted :), but I tried to stay factual. > > Also, there are several typos in the document; copy/paste into Word should > fix this. > > Cheers, > Romain > > * > * > > *1. Platform Support:* > > *Static code analysis tools are represent a significant investment by > software organizations looking to automate parts of their software security > testing and quality assurance processes. Not only do they represent a > monetary investment, but they also demand time and effort by staff members > to setup, operate, and maintain the tool. This, in addition to checking and > acting upon the results produced by the tool. Understanding the ideal > deployment environment for the tool will maximize the return on investment > and will avoid unplanned hardware purchase cost. The following factors are > essential to understanding the tool's capabilities and hence ensuring > proper utilization of the tool which will reflect positively on > tool utilization and ensuring maximum return on investment (ROI). * > > *1.2 Scalability Support:* > > *Vendors provide various deployment options for their tools. Clear > description of the different deployment options must be provided by the > vendor to maximize the tool's usage. In addition, the vendor must specify > the optimal operating conditions. At a minimum the vendor must specify:* > > - *The type of deployment: server-side vs client-side as this might > incur hardware purchase.* > - *Ability to multi-chain several machines to achieve more scan speed.* > - *Ability to run simultaneous scans at the same time.* > > *RG: Why isn't parallelism mentioned? 
To achieve speed, if you can have > many threads running the different checks it's much faster. Hence the > advantages of having multi-core/cpu machines. There are multiple things to > be considered: > - Ability to analyze multiple applications on one or multiple machines > - Ability to speed up the analysis of one application* > Sherif: Makes sense, I summed all these in one bullet point: "Speed", please review and let me know if this covers it properly. > *2. Technology Support:* > > *Most organizations leverage more than one programming language within > their applications portfolio. In addition, more software frameworks are > becoming mature enough for development teams to leverage and use across the > board. In addition, to a score of 3rd party libraries, technologies, both > server and client side. Once these technologies, frameworks and libraries > are integrated into an application, they become part of it and the > application inherits any vulnerability within these components. It is vital > for the static code analysis tool to be able to understand and analyse, not > only the application, but the libraries, frameworks and technologies > supporting the application.* > > *RG: There is a big misconception here I believe. You don't want to > analyze the framework, but how the applications use the framework. It would > be ridiculous to scan the frameworks every time an application use them, > but if the static analysis do not understand the important frameworks > (control, data, and view) then it will miss most of the behavior of the > application. This is fine for some quality analysis, but security checks > are usually more global and require such understanding.* > Sherif: I don't think it was the intention at all to want to analyze the framework. 
The main point this section is trying to make is whether the tool leverages the framework in secure manner, and the other point is for the tool not to blank with the several layers of abstractions the frameworks often offers. However, I see your point that there are a couple of lines where it could be interpreted as we might be asking the tool to analyze the framework. I changed this section a bit, please review. > *2.1 Standard Languages Support:* > > *Most of the tools support more than one programming language. However, > an organization looking to purchase a static code analysis tool should make > an inventory of all the programming languages used inside the organizations > as well as third party applications that will be scanned as well. After > shortlisting all the programming languages, an organization should compare > the list against the tool’s supported list of programming languages.* > > *RG: Languages and versions. If you use C++11 a lot in one app, make sure > that the frontend of the analysis tool will understand it. Also, > applications such as web apps use several languages in the same app (SQL, > Java, JavaScript and HTML is a very simple stack for example); does the > tool understand all of these languages and is able to track the behavior of > the program when it passes data or call into another language? Example: > stored procedures. Is it understood where the data is actually coming from? > * > Sherif: I am not sure I understand the point you are trying to make here. > *2.2 Frameworks Support: > * > > *Once an application is built on a top of a framework, the application > inherits any vulnerability in that framework. In addition, depending on how > the application leverages a framework or a library, it can add new attack > vectors. It is very important for the tool to be able to be able to trace > tainted data through the framework as well as the custom modules built on > top of it. 
At large, frameworks and libraries can be classified to two > types:* > > - *Server-side Frameworks: which are the frameworks/libraries that > reside on the server, e.g. Spring, Struts, Rails, .NET etc.* > - *Mobile Frameworks: which are the frameworks that are used on mobile > devices, e.g. Android, iOS, Windows Mobile etc.* > - *Client-side Frameworks: which are the frameworks/libraries that > reside on browsers, e.g. JQuery, Prototype, etc.* > > *The tool should understand the relationship between the application and > the frameworks/libraries. Ideally, the tool would also be able to follow > tainted data between different frameworks/libraries.* > > *RG: There is a lot to be said on framework. I don't especially like the > separation between server-side, mobile, and client-side. For a static > analysis point of view, that doesn't matter so much, those are all > programs. Frameworks have interesting properties and different features. > Some will manage the data and database (ORM, etc.), some will be > responsible for the flow in the application (spring mvc, struts, .net mvc), > and some will render the view (jasper reports, asp, freemarker, asp.netpages, jinja, etc.). This is to me the important part of the framework, > understand what the framework is doing to the application.* > > *The support of framework should be tested and well defined by the tool > vendor: does it understand configuration files, etc.? What feature of the > framework doesn't it understand?* > Sherif: I think your classification makes sense. Would you be able to author this part? or write some bullet points that we could take and flesh out more? > *2.3 Industry Standards Aided Analysis: > * > > *The tool should be able to provide analysis that is tailored towards one > of the industry standard weaknesses classification, e.g. OWASP Top 10, > CWE/SANS Top 25, WASC Threat Classification, etc. This becomes a desirable > feature for many reasons. 
For example, an organization that just started > its application security program, a full standard scan might prove > overwhelming, especially with an extensive portfolio of applications. > Focusing on a specific industry standard in this case would be a good place > to start for that particular organization.* > > * RG: OWASP and WASC aren't weaknesses classifications. I would prefer to > see the emphasis on CWE since this is the real only one weaknesses > classification out there.* > Sherif: The main point here is that organizations are trying to set a goal and reach it. Our target audience would know OWASP and WASC but maybe not necessarily know CWE. I don't mind adding CWE for sure, but I would argue that OWASP and WASC provide organizations a goals to reach (e.g. We cover OWASP Top 10, SANS Top 25..etc) *In term of "aided analysis", I believe it's more important to talk about > the ability to enable or disable some checks based on what the > security/development team need. * > Sherif: please explain > *3. Scan, Command and Control Support* > > *The scan, command and control of static code analysis tools have a > significant influence on the user’s ability to make the best out of the > tool. This affects both the speed and effectiveness of processing findings > and remediating them.* > > *3.3 Customization:* > > *The tool usually comes with a set of signatures, this set is usually > followed by the tool to uncover the different weaknesses in the sourse > code. Static code analysis should offer a way to extend these signatures in > order to customize the tool's capabilities of detecting new weaknesses, > alter the way the tool detect weaknesses or stop the tool from detecting a > specific pattern. The tool should allow users to:* > > - *Users should be able to add/delete/modify core signatures: Core > signatures are the signatures that come bundled with the application. False > positives is one of the inherit flaws in static code analysis tools in > general. 
One way to minimize this problem is to optimize the tool’s > core signatures, e.g. mark a certain source as safe input.* > - *Users should be able to author custom signatures: This feature is > almost invaluable to maximize the tool’s benefits. For example, a > custom signature might be needed to “educate” the tool of the existence of > a custom cleansing module so to start flagging lines that do not use that > module or stop flagging lines that do use it.* > > *RG: Can we make this a bit more generic? Signatures or rules are just > one way of accomplishing customization. I can think of few directions:* > > *- Ability to enable/disable/modify the understanding of frameworks: > either create custom rules, checkers, or generic frameworks definition > (this construct means this stuff)* > > *- Ability to create new checkers, detect new/customized types of issues* > > *- Ability to override the core knowledge of the tool* > > *- Ability to override the core remediation advices * > Sherif: well, we gotta agree on one term, either "rule", "checker" or "signature". If we used any of the three, it will still be biased towards one tool, correct? Wikipedia is using "rules" in the article here http://en.wikipedia.org/wiki/List_of_tools_for_static_code_analysis. I would vote for the term that our target audience would probably understand. > *3.4 Scan configuration capabilities: this includes:* > > - *Ability to schedule scans: scheduled scan are often a mandatory > features. Scans are often scheduled after nightly builds, some other times > they are scheduled when the CPU usage as at its minimum. 
> Therefore, it is important for the user to be able to schedule the scan
> to run at a particular time.
> - Ability to view the real-time status of running scans: some scans can
> take hours to finish, so it is beneficial and desirable for a user to be
> able to see a scan's progress and the weaknesses found thus far.
> - Ability to save configurations and re-use them as configuration
> templates: often a significant amount of time and effort is involved in
> optimally configuring a static code analyser for a particular
> application. A tool should provide the user with the ability to save a
> scan's configuration so that it can be re-used for later scans.
> - Ability to run multiple scans simultaneously: organizations that have
> many applications to scan will find the ability to run simultaneous
> scans a desirable feature.
> - Ability to support multiple users: this is important for organizations
> which are planning to roll out the tool to developers, or which plan to
> scan large applications that require more than one engineer to assess at
> the same time.
>
> RG: How about the ability to support new compilers?

Sherif: Please explain.

> 3.5 Testing Capabilities:
>
> Scanning an application for weaknesses is the single most important
> functionality of the tool.
> It is essential for the tool to be able to understand, accurately
> identify and report the following attacks and security weaknesses:
>
> - Abuse of Functionality
> - Application Misconfiguration
> - Auto-complete Not Disabled on Password Parameters
> - Buffer Overflow
> - Credential/Session Prediction
> - Cross-site Scripting
> - Cross-site Request Forgery
> - Denial of Service
> - Insecure Cryptography
> - Format String
> - HTTP Response Splitting
> - Improper Input Handling
> - Improper Output Encoding
> - Information Leakage
> - Insufficient Authentication
> - Insufficient Authorization
> - Insufficient Session Expiration
> - Integer Overflows
> - LDAP Injection
> - Mail Command Injection
> - Null Byte Injection
> - OS Command Injection
> - Path Traversal
> - Remote File Inclusion
> - Session Fixation
> - SQL Injection
> - URL Redirection Abuse
> - XPATH Injection
> - XML External Entities
> - XML Entity Expansion
> - XQuery Injection
>
> RG: Okay for webapps, but what about the rest? Also, some are very
> generic: "information leakage", for instance; what does it mean to
> "accurately identify and report" this? Note that this is not a solvable
> problem with static analysis techniques. Also, a static analysis tool
> cannot report "attacks", since it doesn't have enough information about
> the runtime.
>
> Generally, the testing capability should be a very large section and the
> focus should be "how well are these covered?". Several open-source tools
> have large testing capability but will generate tons of FPs. Accuracy is
> important, and there is no real way to test for it other than to
> actually use the tool on one of your applications and see what it finds.

Sherif: Can you list some of the weaknesses that were missed here?

> 4. Product Signature Update
>
> Product signatures are what the static code analysis tool uses to
> identify security weaknesses.
> When making a choice of a static analysis tool, one should take the
> following into consideration:
>
> RG: Can we move away from "signature"? This is really biased towards
> some tools and some kinds of analysis. If you take FindBugs or Clang,
> they don't use signatures but checkers. We can talk about core
> knowledge/checks/checkers, as I believe this is more generic.

Sherif: Please see above.

> 6. Triage and Remediation Support
>
> A crucial factor in a static code analysis tool is the support provided
> for the triage process and the accuracy and effectiveness of the
> remediation advice. This is vital to the speed with which a finding is
> assessed and remediated by the development team.
>
> RG: This section talks about formats of files and findings, but not
> about triage and remediation support. Triage support means: can I say
> that this is a FP? Remediation support means: does the tool provide
> remediation advice, is it accurate or generic, can it be customized?

Sherif: Modified.

> 6.1 Finding Meta-Data:
>
> The information provided together with a finding; at a minimum the tool
> should provide the following with each finding:
>
> - Finding severity: the severity of the finding, with a way to change it
> if required.
> - Summary: an explanation of the finding and the risk it poses on
> exploit.
> - Location: the code file and the line of code where the finding is
> located.
> - Taint analysis: the flow of the tainted data until it reaches the
> cited finding location.
> - Recommendation advice: customized recommendation advice with details
> pertaining to the current finding, ideally with code examples written in
> the application's programming language.
>
> RG: s/recommendation/remediation. Taint analysis is only one type of
> analysis; how about the rest? It's all about evidence, such as flow
> evidence and the conditions for why the checker/tool thought it was an
> issue.
> There is no standard format for reporting these defects, but the tool
> should report as much information as it can about the defect.

Sherif: Modified it a bit, please review.

> 6.2 Assessment File Management:
>
> Assessment file management saves triage time immensely when scanning
> larger applications or when a rescan is performed on an application. At
> a minimum the tool should provide the following:
>
> - The ability to merge two assessment files
> - The ability to diff two assessment files
> - The ability to increment on the application's previous assessment file
>
> RG: This is also specific to some tools. Not all tools generate
> "assessment files", so this is mostly irrelevant.

Sherif: Modified. Please review.

> 7. Enterprise Level Support
>
> When choosing a static analysis tool in the enterprise, an important
> consideration is support for integration with various systems at the
> enterprise level. These systems include bug tracking systems, systems
> for reporting on the risk posture of various applications, and systems
> that mine the data to evaluate trending patterns.
>
> 7.2 Data Mining Capabilities Reports:
>
> It is an important goal of any security team to be able to understand
> the security trends of an organization's applications. To meet this
> goal, static analysis tools should provide the user with the ability to
> mine the vulnerability data, present trends and build intelligence from
> it.
>
> RG: Shouldn't we talk more about the ability to define customized mining
> capabilities and trend generation?

Sherif: Agreed. I would think this would be a candidate for SATEC 2.0.

> Romain
>
> On Wed, Nov 21, 2012 at 8:58 PM, Philippe Arteau
> <philippe.arteau@gmail.com> wrote:
>
>> In index A, "A list of the frameworks and libraries used in the
>> organization." is mentioned. Does it refer to an external document?
>> I would suggest giving categories of frameworks/libraries and examples
>> for different languages. This would give a precise guideline to
>> readers. Support for the frameworks/APIs used is a crucial part.
>>
>> --
>> Philippe Arteau
>>
>> _______________________________________________
>> wasc-satec mailing list
>> wasc-satec@lists.webappsec.org
>> http://lists.webappsec.org/mailman/listinfo/wasc-satec_lists.webappsec.org
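An aside on the "author custom signatures" criterion in 3.3 above (educating the tool about a custom cleansing module): the idea can be illustrated with a toy, tool-agnostic sketch. Everything here is hypothetical for illustration only (the names `CORE_SINKS`, `find_findings`, `my_custom_clean`, etc. are invented and do not correspond to any real tool's rule format):

```python
# Toy checker with a set of core "sink" signatures and a user-extensible
# list of known sanitizers. Illustrative only; real static analysis tools
# work on parsed code and data flow, not raw string matching.

CORE_SINKS = {"execute_sql"}       # calls treated as dangerous sinks
CORE_SANITIZERS = {"escape_sql"}   # calls that neutralize tainted data

def find_findings(lines, extra_sanitizers=frozenset()):
    """Flag line numbers that reach a sink without a known sanitizer."""
    sanitizers = CORE_SANITIZERS | set(extra_sanitizers)
    findings = []
    for lineno, line in enumerate(lines, start=1):
        if any(sink in line for sink in CORE_SINKS):
            if not any(s in line for s in sanitizers):
                findings.append(lineno)
    return findings

code = [
    'execute_sql(raw_input)',                  # truly unsafe
    'execute_sql(my_custom_clean(raw_input))'  # uses an in-house sanitizer
]

# Out of the box, both lines are flagged (the second is a false positive):
print(find_findings(code))                       # -> [1, 2]
# After registering the in-house cleansing module as a custom signature:
print(find_findings(code, {"my_custom_clean"}))  # -> [1]
```

The second call shows the scenario the draft describes: adding one custom signature for an in-house sanitizer eliminates the false positive while the genuine finding remains.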
SK
Sherif Koussa
Wed, Nov 28, 2012 1:20 AM

We did.

I believe we have Alen Zuckich (Klocwork), Ory Segal (IBM), and James
McGovern (HP). Not sure whether Ory or James are working with the SCA tool
teams, though. Abraham Kang (HP) also replied with some comments.

Contacted Veracode, but no answer.

Sherif

On Fri, Nov 23, 2012 at 2:45 PM, Romain Gaucher <romain@webappsec.org> wrote:

> Btw, have we tried to reach out to tool vendors/makers to take their input
> on this document? I think it's fairly important, and I'm not sure who's
> working for who here...
