wasc-satec@lists.webappsec.org

WASC Static Analysis Tool Evaluation Criteria


Re: [WASC-SATEC] Sub-Categories Voting Now Open - Deadline Extended Until Sept 12th

Sherif Koussa
Mon, Aug 22, 2011 7:11 PM

It looks like enough people asked for an extension of the deadline for
voting on sub-categories, so I am extending it to Sept 12th.

Regards,
Sherif

On Mon, Aug 22, 2011 at 3:01 PM, Alec Shcherbakov <alec.shcherbakov@astechconsulting.com> wrote:

I agree with Daniel – we definitely need more time, as this step requires
quite a lot of thinking.

How about moving the ETA to September 10th?




  • Alec

From: wasc-satec-bounces@lists.webappsec.org [mailto:wasc-satec-bounces@lists.webappsec.org] On Behalf Of Daniel Medianero
Sent: Monday, August 22, 2011 3:32 AM
To: wasc-satec@lists.webappsec.org
Subject: [WASC-SATEC] Sub-Categories Voting Now Open


Hi,



First of all, I think we should extend the time to vote; August 26 is
too early.


1. Tool Setup and Installation

ADD: Types of installation (appliance, software, cloud, web service, etc.)



2. Performing a Scan

2.1 Time required to perform a scan. - EDIT: I think these are two
different concepts:

      1. Time for the user to execute an analysis (how the code is
uploaded, whether analyses can be scheduled, etc.)

      2. Time for the tool to perform an analysis.

2.2 Number of steps required to perform a scan - EDIT: Instead of the
number of steps, I think we should consider the types of methodology
available for carrying out an analysis (from the typical code audit
through to integration possibilities in the development cycle, automated
analysis, etc.); a sketch of such an automated run follows.
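As an illustration only, a minimal sketch of an automated, timed analysis
run such as a CI job might perform. The scanner command and its flags are
hypothetical placeholders, not any real tool's interface:

    # Minimal sketch: invoke a scanner automatically (e.g. from a CI job
    # or a schedule) and time the analysis itself, separately from any
    # manual user steps. "example-scanner" and its flags are hypothetical.
    import subprocess
    import time

    start = time.monotonic()
    result = subprocess.run(
        ["example-scanner", "--project", "src/", "--format", "xml"],
        capture_output=True, text=True,
    )
    elapsed = time.monotonic() - start
    print(f"analysis took {elapsed:.1f}s, exit code {result.returncode}")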


4. Detection Accuracy

EDIT all subpoints:

There is no absolute measure that can help in this regard. It would be
desirable to take into account projects such as SATE
(http://samate.nist.gov/SATE.html), or to include a set of public test
cases and see how the tools behave against them; a scoring sketch follows.
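For instance, given a benchmark with known flaw locations, precision and
recall could be computed along these lines (a minimal sketch; all file,
line, and CWE data here is invented for illustration):

    # Score a tool's findings against a labeled benchmark (SATE-style).
    def score(findings, ground_truth):
        # findings and ground_truth are sets of (file, line, cwe) tuples.
        tp = len(findings & ground_truth)   # correctly reported flaws
        fp = len(findings - ground_truth)   # false positives
        fn = len(ground_truth - findings)   # missed flaws
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        return precision, recall

    # Hypothetical example: 3 reported findings, 4 known flaws.
    reported = {("a.c", 10, "CWE-89"), ("a.c", 42, "CWE-79"),
                ("b.c", 7, "CWE-22")}
    known = {("a.c", 10, "CWE-89"), ("a.c", 42, "CWE-79"),
             ("b.c", 99, "CWE-78"), ("c.c", 5, "CWE-120")}
    p, r = score(reported, known)
    print(f"precision={p:.2f} recall={r:.2f}")  # precision=0.67 recall=0.50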


5. Triage and Remediation Process

5.1 Average time to triage a finding - REMOVE: It is not possible to
measure this point in absolute terms, because it depends on each customer,
on the users involved in the process, and on external conditions
(infrastructure, development team organization, etc.)


5.5 Ability to merge assessments - REMOVE: It makes little sense: if two
code analyses come from different stages of development, their results
should not be mixed, and if they are of the same code, the vulnerabilities
should be detected in both analyses.


6. UI Simplicity and Intuitiveness

ADD: Ability for several users to work on the same code/project.

ADD: Possible hierarchies among tool users (roles, permissions, etc.)


7. Product Update Process

7.1 Frequency of signature update - REMOVE: I believe the frequency by
itself is irrelevant as a data point; it cannot be measured the way
anti-virus updates are, because it does not follow the same pattern.



8. Product Maturity and Scalability

8.1 Peak memory usage - EDIT: This should live in section "1. Tool Setup
and Installation"; I think there should be a section like "hardware or
software requirements". A measurement sketch follows.
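To show what such a measurement might look like, a minimal sketch that
captures the peak resident memory of a scan run as a child process (the
scanner command is a hypothetical placeholder):

    import resource
    import subprocess

    # Run the (hypothetical) scanner as a child process.
    subprocess.run(["example-scanner", "--project", "src/"], check=True)

    # ru_maxrss is the peak resident set size across finished child
    # processes (kilobytes on Linux, bytes on macOS).
    peak = resource.getrusage(resource.RUSAGE_CHILDREN).ru_maxrss
    print(f"peak child RSS: {peak}")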


8.2 Number of scans done before a crash or serious degradation in
performance - EDIT: I think this should be something like "number of
simultaneous scans"; see the sketch below.
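A minimal sketch of such a concurrency check, launching several analyses
in parallel and recording which complete cleanly (again, the scanner
command is a hypothetical placeholder):

    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    def run_scan(project):
        # Each worker runs one scan; the exit code signals success.
        proc = subprocess.run(["example-scanner", "--project", project])
        return project, proc.returncode

    projects = [f"project-{i}" for i in range(8)]  # 8 concurrent scans
    with ThreadPoolExecutor(max_workers=len(projects)) as pool:
        for name, rc in pool.map(run_scan, projects):
            print(name, "ok" if rc == 0 else f"failed (exit {rc})")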


8.4 What languages does the tool support? - REMOVE: This is the same as
3.1.


10. Reporting Capabilities

10.1 Quality of reports - REMOVE: It is too difficult to measure this in
absolute terms.



Kind regards!


--

"Intellectuals solve problems; geniuses prevent them."
Albert Einstein

Daniel Medianero

Security Consultant

dmedianero@buguroo.com






T.: +34 91 781 61 60

M.: +34 638 56 29 15
F.:  +34 91 578 38 79
Plz. Del Marqués de Salamanca, 3-4
28006 Madrid

www.buguroo.com



_______________________________________________
wasc-satec mailing list
wasc-satec@lists.webappsec.org
http://lists.webappsec.org/mailman/listinfo/wasc-satec_lists.webappsec.org

Sherif Koussa
Mon, Sep 5, 2011 2:50 AM

Hi All,

This is a reminder that the deadline for voting on sub-categories is a
week away. If you haven't voted yet, please take the time to do so.

Regards,
Sherif

On Mon, Aug 22, 2011 at 3:11 PM, Sherif Koussa <sherif.koussa@gmail.com> wrote:

It looks like enough people asked for an extension of the deadline for
voting on sub-categories, so I am extending it to Sept 12th.

Regards,
Sherif

