wasc-satec@lists.webappsec.org

WASC Static Analysis Tool Evaluation Criteria


Last Push - Industry Feedback Received

Sherif Koussa
Tue, Mar 5, 2013 1:13 AM

Hi All,

In case you were wondering where the project stands at the moment: we have
been soliciting feedback from vendors and other industry veterans. It took
longer than expected, but a few responses have come in. Below is a list of
the vendors/companies invited to comment and which of them actually
provided feedback.

Company                  Replied
Fortify                  Yes
Ounce
Whitehat
Veracode                 Yes
Coverity
Armorize
Checkmarx                Yes
Aspect
GDS
SecurityInnovation
Denim                    Yes

Please follow this link to view the changes made:
http://projects.webappsec.org/w/page-revisions/compare/41188978/Static%20Analysis%20Tool%20Evaluation%20Criteria?rev2=1362024025&rev1=1357313068

Please review and comment. That would be the last step before launching the
project.

Regards,
Sherif

Sherif Koussa
Mon, Mar 18, 2013 1:14 AM

All,

Feedback on this last push will remain open until Friday. Please provide
your feedback before Friday, March 22nd, 2013.

Regards,
Sherif

On Mon, Mar 4, 2013 at 8:13 PM, Sherif Koussa <sherif.koussa@gmail.com> wrote:

Hi All,

In case you were wondering where the project stands at the moment: we have
been soliciting feedback from vendors and other industry veterans. It took
longer than expected, but a few responses have come in. Below is a list of
the vendors/companies invited to comment and which of them actually
provided feedback.

Company                  Replied
Fortify                  Yes
Ounce
Whitehat
Veracode                 Yes
Coverity
Armorize
Checkmarx                Yes
Aspect
GDS
SecurityInnovation
Denim                    Yes

Please follow this link to view the changes made:
http://projects.webappsec.org/w/page-revisions/compare/41188978/Static%20Analysis%20Tool%20Evaluation%20Criteria?rev2=1362024025&rev1=1357313068

Please review and comment. That would be the last step before launching
the project.

Regards,
Sherif

Robert A.
Mon, Mar 18, 2013 4:40 PM

Remember, if you don't speak now, your feedback will not be incorporated
until the next release (whenever that is).

If you think something is wrong, confusing, or missing, now's your chance
to speak up!

Regards,
Robert Auger
WASC Co-Founder

On Sun, 17 Mar 2013, Sherif Koussa wrote:

All,

Feedback on this last push will remain open until Friday. Please provide
your feedback before Friday, March 22nd, 2013.

Regards,
Sherif

On Mon, Mar 4, 2013 at 8:13 PM, Sherif Koussa <sherif.koussa@gmail.com> wrote:

Hi All,

In case you were wondering where the project stands at the moment: we have
been soliciting feedback from vendors and other industry veterans. It took
longer than expected, but a few responses have come in. Below is a list of
the vendors/companies invited to comment and which of them actually
provided feedback.

Company                  Replied
Fortify                  Yes
Ounce
Whitehat
Veracode                 Yes
Coverity
Armorize
Checkmarx                Yes
Aspect
GDS
SecurityInnovation
Denim                    Yes

Please follow this link to view the changes made:
http://projects.webappsec.org/w/page-revisions/compare/41188978/Static%20Analysis%20Tool%20Evaluation%20Criteria?rev2=1362024025&rev1=1357313068

Please review and comment. That would be the last step before launching
the project.

Regards,
Sherif

McGovern, James
Mon, Mar 18, 2013 5:16 PM

Several last minute thoughts:

Under 3.6 (Testing Capabilities), do we want to enumerate all possible tests? If so, let's add: MQ Injection, JMS Injection, etc. If not, is there a better source that we should refer to? The list doesn't seem to include injection possibilities of, say, Java code running on mainframes, specialized hardware, etc. Taking this a step further, should we enumerate all possible sinks?
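
For illustration, a minimal Java sketch (class and parameter names are made
up) of the kind of JMS injection pattern this point would cover: untrusted
input concatenated into a JMS message selector.

import javax.jms.JMSException;
import javax.jms.MessageConsumer;
import javax.jms.Queue;
import javax.jms.Session;

public class OrderConsumerFactory {
    // 'region' arrives straight from an HTTP request parameter (taint source).
    public MessageConsumer consumerFor(Session session, Queue queue, String region)
            throws JMSException {
        // VULNERABLE: the message selector is built by string concatenation,
        // so a value like "x' OR 1=1" alters the selector logic (JMS injection).
        String selector = "region = '" + region + "'";
        return session.createConsumer(queue, selector);
        // Safer: validate 'region' against an allow-list before building the selector.
    }
}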

The grammar needs fixing for 6.

We should define integration into an enterprise risk management system more deeply. Is this about GRC, risk registers, or other permutations?

We mention the importance of tracking from source to sink, but what about provenance? Checking whether it is possible to reach a vulnerability source from a vulnerability sink?

We need to include something about bad logging, anti-automation, etc.

Sherif Koussa
Tue, Mar 19, 2013 12:37 AM

James,

Please find my thoughts below.

Regards,
Sherif

On Mon, Mar 18, 2013 at 1:16 PM, McGovern, James <james.mcgovern@hp.com> wrote:

Several last minute thoughts:

Under 3.6 (Testing Capabilities), do we want to enumerate all possible
tests? If so, let's add: MQ Injection, JMS Injection, etc. If not, is
there a better source that we should refer to? The list doesn't seem to
include injection possibilities of, say, Java code running on mainframes,
specialized hardware, etc. Taking this a step further, should we enumerate
all possible sinks?

Sherif: I don't think the goal is to enumerate all possible tests; the
current weaknesses are drawn from multiple sources, including the OWASP Top
10, the SANS Top 25, and the WASC Threat Classification.

The grammar needs fixing for 6.

Sherif: Fixed 6. Please review.

We should define integration into an enterprise risk management system
more deeply. Is this about GRC, risk registers, or other permutations?

Sherif: Can you suggest alternative text, please?

We mention the importance of tracking from source to sink, but what about
provenance? Checking whether it is possible to reach a vulnerability source
from a vulnerability sink?

Sherif: Not sure I understand. How is reaching a vulnerability source from
a vulnerability sink different from tracking from source to sink?

We need to include something about bad logging, anti-automation, etc.

Sherif: Can you elaborate, please?

McGovern, James
Tue, Mar 19, 2013 1:22 AM

I wrote it incorrectly. There are times when you need to trace from a sink and chain backwards to the source. For example, you may care about fixing issues with certain sinks over others: a payment gateway reached via MQ Series as a sink may be more important than, say, an LDAP sink that is seeing SQL attacks.
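
For illustration, a minimal Java sketch (class and field names are made up)
of that situation: the same untrusted input reaches two sinks of very
different business value, so tracing backwards from the high-value sink is
what an evaluator may care about first.

import javax.jms.JMSException;
import javax.jms.MessageProducer;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.naming.NamingException;
import javax.naming.directory.DirContext;
import javax.naming.directory.SearchControls;

public class CheckoutService {
    private final Session jmsSession;
    private final MessageProducer paymentGatewayQueue; // producer for the payment gateway queue
    private final DirContext directory;                // corporate LDAP directory

    CheckoutService(Session jmsSession, MessageProducer paymentGatewayQueue, DirContext directory) {
        this.jmsSession = jmsSession;
        this.paymentGatewayQueue = paymentGatewayQueue;
        this.directory = directory;
    }

    // 'accountId' is attacker-controlled (the taint source).
    public void checkout(String accountId) throws JMSException, NamingException {
        // High-value sink: a message sent to the payment gateway queue. Tracing
        // backwards from this sink answers "can untrusted data reach payments?"
        TextMessage charge = jmsSession.createTextMessage("<charge account='" + accountId + "'/>");
        paymentGatewayQueue.send(charge);

        // Lower-value sink: an LDAP search filter built from the same input
        // (classic LDAP injection, but arguably less urgent to fix first).
        directory.search("ou=accounts", "(uid=" + accountId + ")", new SearchControls());
    }
}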

For risk management, I think we should state operational risk and then break the categories down into: risk registers (about aggregation), defect tracking, analytics, and GRC (value chain).

For logging, you can statically look for certain patterns related to misconfiguration, code using multiple loggers improperly, etc. This is important if defenders are to have a chance after breakers have succeeded.
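
For illustration, a minimal Java sketch (names are made up) of logging
patterns a static analyzer could plausibly flag: sensitive data written to
the log, unsanitized input enabling log forging, and an ad-hoc second
logger that bypasses the configured one.

import java.util.logging.Logger;

public class LoginController {
    private static final Logger LOG = Logger.getLogger(LoginController.class.getName());

    public boolean login(String user, String password) {
        boolean ok = authenticate(user, password);

        // Sensitive data in the log: the password should never be logged.
        LOG.info("Login attempt for " + user + " with password " + password);

        // Log forging: 'user' may contain CR/LF characters that inject fake entries.
        if (!ok) {
            LOG.warning("Failed login for " + user);
        }

        // A second, unmanaged "logger" that bypasses the configured handlers.
        System.out.println("login result=" + ok);
        return ok;
    }

    private boolean authenticate(String user, String password) {
        return false; // placeholder for the real credential check
    }
}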

For anti-automation, you can add criteria such as awareness of certain libraries (e.g., CAPTCHA libraries) and their proper usage.
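
For illustration, a minimal Java sketch of the kind of anti-automation
misuse a library-aware checker could flag; the CaptchaClient interface here
is hypothetical, standing in for whichever CAPTCHA library the tool knows
about. The handler trusts a client-supplied flag instead of verifying the
CAPTCHA response server-side.

public class CommentController {

    // Hypothetical client for a CAPTCHA provider's server-side verification API.
    interface CaptchaClient {
        boolean verify(String captchaResponse);
    }

    private final CaptchaClient captchaClient;

    CommentController(CaptchaClient captchaClient) {
        this.captchaClient = captchaClient;
    }

    public void postComment(String body, String captchaResponse, boolean captchaPassedFlag) {
        // WEAK: trusts a boolean flag the client submitted along with the form.
        if (!captchaPassedFlag) {
            throw new IllegalStateException("CAPTCHA not solved");
        }
        // Proper usage would verify server-side, e.g.:
        //   if (!captchaClient.verify(captchaResponse)) { /* reject the request */ }
        save(body);
    }

    private void save(String body) {
        // persist the comment
    }
}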

From: Sherif Koussa [mailto:sherif.koussa@gmail.com]
Sent: Monday, March 18, 2013 8:37 PM
To: McGovern, James
Cc: Robert A.; wasc-satec@lists.webappsec.org
Subject: Re: [WASC-SATEC] Last Push - Industry Feedback Received

James,

Please find my thoughts below.

Regards,
Sherif

On Mon, Mar 18, 2013 at 1:16 PM, McGovern, James <james.mcgovern@hp.com> wrote:
Several last minute thoughts:

Under 3.6 (Testing Capabilities), do we want to enumerate all possible tests? If so, let's add: MQ Injection, JMS Injection, etc. If not, is there a better source that we should refer to? The list doesn't seem to include injection possibilities of, say, Java code running on mainframes, specialized hardware, etc. Taking this a step further, should we enumerate all possible sinks?

Sherif: I don't think the goal is to enumerate all possible tests; the current weaknesses are drawn from multiple sources, including the OWASP Top 10, the SANS Top 25, and the WASC Threat Classification.

The grammar needs fixing for 6.

Sherif: Fixed 6. Please review.

We should define integration into an enterprise risk management system more deeply. Is this about GRC, risk registers, or other permutations?

Sherif: Can you suggest alternative text, please?

We mention the importance of tracking from source to sink, but what about provenance? Checking whether it is possible to reach a vulnerability source from a vulnerability sink?

Sherif: Not sure I understand. How is reaching a vulnerability source from a vulnerability sink different from tracking from source to sink?

We need to include something about bad logging, anti-automation, etc.

Sherif: Can you elaborate, please?

Sherif Koussa
Thu, Mar 21, 2013 2:21 AM

On Mon, Mar 18, 2013 at 9:22 PM, McGovern, James <james.mcgovern@hp.com> wrote:

I wrote it incorrectly. There are times when you need to trace from a
sink and chain backwards to the source. For example, you may care about
fixing issues with certain sinks over others: a payment gateway reached via
MQ Series as a sink may be more important than, say, an LDAP sink that is
seeing SQL attacks.

Sherif: Changed the text to indicate this.



For risk management, I think we should state operational risk and then
break the categories down into: risk registers (about aggregation), defect
tracking, analytics, and GRC (value chain).

Sherif: OK, I see your point. I am not sure, though, that we need to break
down the operational risk categories. Do you think the tool's ability to
integrate with each of the systems mentioned is critical? I think the
evaluator, who probably has one risk management tool, will ask the
question: how does your tool integrate with my tool XYZ?



For logging, you can statically look for certain patterns related to
misconfiguration, code using multiple loggers improperly, etc. This is
important if defenders are to have a chance after breakers have succeeded.

Sherif: Added Insufficient/Insecure Logging to the list of weaknesses.



For anti-automation, you can add criteria such as awareness of certain
libraries (e.g., CAPTCHA libraries) and their proper usage.

Sherif: The usage of CAPTCHAs and similar controls really depends on the
application's logic. I see the merit of pointing out the correct usage of
CAPTCHAs, but in my view the tool should focus on uncovering weaknesses and
vulnerabilities and providing better results. To me, this is like citing a
client-side validation script and pointing out that equivalent server-side
code should exist: nice, but a bit annoying when you have 25,000 findings
to go through :)

Sherif



From: Sherif Koussa [mailto:sherif.koussa@gmail.com]
Sent: Monday, March 18, 2013 8:37 PM
To: McGovern, James
Cc: Robert A.; wasc-satec@lists.webappsec.org

Subject: Re: [WASC-SATEC] Last Push - Industry Feedback Received

James,

Please find my thoughts below.

Regards,

Sherif

On Mon, Mar 18, 2013 at 1:16 PM, McGovern, James <james.mcgovern@hp.com> wrote:

Several last minute thoughts:

Under 3.6 (Testing Capabilities), do we want to enumerate all possible
tests? If so, let's add: MQ Injection, JMS Injection, etc. If not, is
there a better source that we should refer to? The list doesn't seem to
include injection possibilities of, say, Java code running on mainframes,
specialized hardware, etc. Taking this a step further, should we enumerate
all possible sinks?

Sherif: I don't think the goal is to enumerate all possible tests; the
current weaknesses are drawn from multiple sources, including the OWASP Top
10, the SANS Top 25, and the WASC Threat Classification.

The grammar needs fixing for 6.

Sherif: Fixed 6. Please review.

We should define integration into an enterprise risk management system
more deeply. Is this about GRC, risk registers, or other permutations?

Sherif: Can you suggest alternative text, please?

We mention the importance of tracking from source to sink, but what about
provenance? Checking whether it is possible to reach a vulnerability source
from a vulnerability sink?

Sherif: Not sure I understand. How is reaching a vulnerability source from
a vulnerability sink different from tracking from source to sink?

We need to include something about bad logging, anti-automation, etc.

Sherif: Can you elaborate, please?

