OWASP Benchmark Project


The OWASP Benchmark for Security Automation (OWASP Benchmark) is a free and open test suite designed to evaluate the speed, coverage, and accuracy of automated software vulnerability detection tools and services (henceforth simply referred to as 'tools'). Without the ability to measure these tools, it is difficult to understand their strengths and weaknesses or to compare them with one another. Each version of the OWASP Benchmark contains thousands of fully runnable and exploitable test cases, each of which maps to the appropriate CWE number for its vulnerability.

You can use the OWASP Benchmark with Static Application Security Testing (SAST) tools, Dynamic Application Security Testing (DAST) tools such as OWASP ZAP, and Interactive Application Security Testing (IAST) tools. The current version of the Benchmark is implemented in Java; future versions may expand to include other languages.

The OWASP Benchmark and Hdiv

Accuracy score

Hdiv Detection (IAST) scored 100%, the result of a 100% true positive rate minus a 0% false positive rate.
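The score above follows the standard Benchmark formula: true positive rate minus false positive rate. A minimal sketch of that computation (the counts in the example are illustrative, not actual Benchmark data):

```java
// Sketch of the OWASP Benchmark scoring formula: score = TPR - FPR.
public class BenchmarkScore {

    // tp/fn: real vulnerabilities found/missed; fp/tn: safe cases flagged/passed.
    static double score(int tp, int fn, int fp, int tn) {
        double tpr = (double) tp / (tp + fn);   // true positive rate (sensitivity)
        double fpr = (double) fp / (fp + tn);   // false positive rate (false alarms)
        return tpr - fpr;
    }

    public static void main(String[] args) {
        // A perfect result: every real vulnerability found (TPR = 1.0)
        // and no false alarms (FPR = 0.0), i.e. a score of 100%.
        System.out.println(score(100, 0, 0, 100) * 100 + "%");
    }
}
```

A tool that reports everything as vulnerable would reach a 100% true positive rate but also a 100% false positive rate, scoring 0% under this formula, which is why the false-positive penalty matters.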


OWASP Benchmark Scorecard


How to run the analysis of the OWASP Benchmark

If you wish to regenerate our Benchmark results, please proceed as follows.


Java, Maven, and Git must be installed in your environment.


  1. Download the OWASP Benchmark Project from GitHub:

        $ git clone https://github.com/OWASP/Benchmark.git
        $ cd Benchmark

  2. Edit the pom.xml file and add the Hdiv agent settings to the cargo-maven2-plugin configuration properties.
  3. Save the file.
  4. Launch the Benchmark application and wait until it starts:

        $ ./runBenchmark.sh

  5. In another terminal, run the Crawler and wait until it completes:

        $ ./runCrawler.sh

  6. An Hdiv report file (hdivAgentLog.hlg) will be generated.
  7. Move the report to the ./results/ directory:

        $ mv /Path-to-Hdiv-license-folder/hdivAgentLog.hlg ./results/

  8. Create the scorecards. The scorecard generation step computes a Benchmark scorecard for every results file in the /results directory and writes the generated scorecards to the /scorecard directory.
  9. Check out the results.
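Step 2 above leaves the exact pom.xml properties to your Hdiv distribution. As a sketch, attaching a Java agent to the container started by cargo-maven2-plugin typically uses the plugin's cargo.jvmargs property; the agent jar name and path below are assumptions, not the actual Hdiv values:

```xml
<plugin>
  <groupId>org.codehaus.cargo</groupId>
  <artifactId>cargo-maven2-plugin</artifactId>
  <configuration>
    <properties>
      <!-- Hypothetical agent path: substitute the jar shipped with your Hdiv license. -->
      <cargo.jvmargs>-javaagent:/Path-to-Hdiv-license-folder/hdiv-agent.jar</cargo.jvmargs>
    </properties>
  </configuration>
</plugin>
```

This passes the -javaagent flag to the JVM that Cargo launches, which is how IAST agents generally instrument the application under test.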