
norm-findings

A lean CLI tool for normalizing security scanner findings based on DefectDojo parsers.

This project provides a standalone Python package and a minimal Docker image to convert findings from 200+ security scanners into a normalized format.

Open Source Attribution

This project is based on the excellent work of the DefectDojo community. We leverage their parser logic while providing a lean, dependency-minimized execution environment. See the NOTICE file for more details.

Installation

The default installation includes the core CLI and all parser dependencies, providing full functionality out-of-the-box.

Standard (Core + Parsers)

pip install .

Optional: Server Support

If you need the REST API server, install the server extra:

pip install ".[server]"

Optional: Development

For running tests or contributing:

pip install ".[dev]"

Running Tests

Unit Tests

Verify the core installation and stubs:

pytest tests/test_cli.py

E2E Parser Verification (Development only)

To verify all 200+ parsers against real DefectDojo sample data:

  1. Ensure the development dependencies are installed (pip install ".[dev]").
  2. Run the updater to fetch sample data:
    python -m norm_findings.updater
    
  3. Run the E2E tests:
    pytest tests/test_e2e.py
    

Usage

norm-findings supports 200+ security scanners and accepts parser names in a case-insensitive manner, with or without the "Parser" suffix.
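
The matching rule described above (case-insensitive, optional "Parser" suffix) can be sketched as follows. `normalize_parser_name` is an illustrative helper written for this example, not the package's actual internal function:

```python
def normalize_parser_name(name: str) -> str:
    """Lower-case a parser name and drop an optional 'Parser' suffix.

    Illustrative sketch of the case-insensitive matching described
    above; the real resolution logic in norm-findings may differ.
    """
    key = name.strip().lower()
    if key.endswith("parser"):
        # "TrivyParser" -> "trivy", "trivy_parser" -> "trivy"
        key = key[: -len("parser")].rstrip("_")
    return key

# All of these resolve to the same key, "trivy":
for alias in ("trivy", "TRIVY", "Trivy", "TrivyParser"):
    print(normalize_parser_name(alias))
```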

CLI

The CLI supports all parsers generically: just specify the parser name (case-insensitive):

# Basic usage - parser names are case-insensitive
norm-findings convert --parser trivy --input-file trivy.json --output-file findings.json

# Works with any case
norm-findings convert --parser TRIVY --input-file trivy.json --output-file findings.json
norm-findings convert --parser Trivy --input-file trivy.json --output-file findings.json

# Works with or without "Parser" suffix
norm-findings convert --parser TrivyParser --input-file trivy.json --output-file findings.json
norm-findings convert --parser anchore_grype --input-file grype.json --output-file findings.json

# With custom test label
norm-findings convert --parser bandit --input-file bandit.json --output-file findings.json --test "My App v1.0"

# Print findings to console
norm-findings convert --parser semgrep --input-file semgrep.json --output-file findings.json --print

Available parsers (200+): trivy, anchore_grype, bandit, semgrep, snyk, checkmarx, fortify, zap, burp, aqua, blackduck, sonarqube, veracode, and many more. See parser_mapping.json for the complete list.
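
If you need the parser list programmatically, parser_mapping.json can be loaded directly. The schema assumed here (a JSON object keyed by parser name) is an assumption for illustration; adjust if the actual file differs:

```python
import json
from pathlib import Path


def list_parsers(mapping_path):
    """Return sorted parser names from a parser_mapping.json file.

    Assumes the file is a JSON object whose keys are parser names.
    """
    with open(mapping_path, encoding="utf-8") as f:
        mapping = json.load(f)
    return sorted(mapping)


# Example with a tiny stand-in mapping file:
sample = Path("parser_mapping_sample.json")
sample.write_text(json.dumps({"trivy": "norm_findings.parsers.trivy",
                              "bandit": "norm_findings.parsers.bandit"}))
print(list_parsers(sample))  # ['bandit', 'trivy']
```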

Docker

# Case-insensitive parser names work in Docker too
docker run -v $(pwd):/dojo -it ghcr.io/scribe-security/norm-findings:latest \
  convert --parser trivy --input-file /dojo/trivy.json --output-file /dojo/findings.json

docker run -v $(pwd):/dojo -it ghcr.io/scribe-security/norm-findings:latest \
  convert --parser ANCHORE_GRYPE --input-file /dojo/grype.json --output-file /dojo/findings.json

Using as a Library (API)

The recommended way to use norm-findings as a library is through the high-level API, which handles parser resolution automatically:

from norm_findings.api import parse_findings

# Parse findings - parser name is case-insensitive
findings = parse_findings(
    parser_name="trivy",  # Case-insensitive: "trivy", "TRIVY", "Trivy" all work
    input_file="trivy.json",
    test_label="My Application v1.0",
    output_file="findings.json"  # Optional: writes findings to file
)

# Process findings
for finding in findings:
    print(f"{finding.severity}: {finding.title}")
    if finding.description:
        print(f"  Description: {finding.description}")

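A common follow-up is to summarize findings by severity. The sketch below uses `collections.Counter` over stand-in objects; any object with a `severity` attribute, such as the findings returned by `parse_findings`, will work the same way:

```python
from collections import Counter
from types import SimpleNamespace


def severity_summary(findings):
    """Count findings per severity level."""
    return Counter(f.severity for f in findings)


# Stand-in findings; in practice, pass the list from parse_findings().
findings = [
    SimpleNamespace(severity="High", title="SQL injection"),
    SimpleNamespace(severity="High", title="XSS"),
    SimpleNamespace(severity="Low", title="Verbose banner"),
]
for severity, count in severity_summary(findings).most_common():
    print(f"{severity}: {count}")  # High: 2, then Low: 1
```
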
More API Examples:

# Example 1: Parse without writing to file
findings = parse_findings(
    parser_name="anchore_grype",
    input_file="grype-results.json",
    test_label="Production Scan"
)
print(f"Found {len(findings)} vulnerabilities")

# Example 2: Case-insensitive parser names
findings = parse_findings(
    parser_name="BANDIT",  # Upper case works too
    input_file="bandit-output.json",
    test_label="Security Audit"
)

# Example 3: With or without "Parser" suffix
findings = parse_findings(
    parser_name="SemgrepParser",  # "Parser" suffix is optional
    input_file="semgrep.json",
    test_label="Code Analysis"
)

# Example 4: Error handling
try:
    findings = parse_findings(
        parser_name="my_scanner",
        input_file="results.json"
    )
except ValueError as e:
    print(f"Parser error: {e}")
    # Error message includes list of available parsers

Advanced: Direct Parser Usage

If you need more control, you can instantiate parsers directly:

from norm_findings.parsers.trivy.parser import TrivyParser
from norm_findings.stubs.models import Test

parser = TrivyParser()
test = Test(product="My Application")

with open("trivy.json", "rb") as f:
    findings = parser.get_findings(f, test)

for finding in findings:
    print(f"Found: {finding.title} ({finding.severity})")

Legacy Version

The original monkey-patched version of this tool is preserved in the legacy-monkeypatch branch and tagged as v1.x-legacy.

To use the legacy version:

git checkout v1.x-legacy

Automatic Updates

norm-findings includes a built-in updater that fetches the latest parsers and tests from DefectDojo:

python -m norm_findings.updater

Development

Workflow

  1. Branching: Create a new branch for your feature or bugfix from main.
  2. Syncing Parsers: Run the updater to ensure you have the latest DefectDojo parsers:
    python -m norm_findings.updater
    
  3. Testing: Always run the test suite before pushing:
    pytest tests/test_cli.py
    pytest tests/test_e2e.py --ignore norm_findings/stubs/models.py
    
  4. Pushing: Push your branch to GitHub and open a Pull Request.

Versioning

norm-findings uses setuptools-scm for automatic versioning.

  • The version is automatically derived from the most recent Git tag.
  • When working on local uncommitted changes, the version will include a .dev suffix and the current timestamp.
  • The version is written to norm_findings/_version.py during the build process.
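
For reference, a minimal setuptools-scm setup of this shape in pyproject.toml would look like the following (an illustrative sketch; the project's actual configuration may differ):

```toml
[build-system]
requires = ["setuptools>=64", "setuptools-scm>=8"]
build-backend = "setuptools.build_meta"

[project]
name = "norm-findings"
dynamic = ["version"]

[tool.setuptools_scm]
# Write the derived version into the package at build time.
version_file = "norm_findings/_version.py"
```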

Releasing

Releases are automated via GitHub Actions and are triggered by pushing a version tag:

  1. Create a tag: Create a semantic version tag starting with v (e.g., v1.1.0):
    git tag -a v1.1.0 -m "Release version 1.1.0"
    
  2. Push the tag:
    git push origin v1.1.0
    
  3. Automated Pipeline: The build workflow will automatically:
    • Run all tests.
    • Build the Python wheel and source distribution.
    • Publish to PyPI.
    • Build and push the Docker image to GHCR (tagged with the version and latest).

Automatic Parser Updates

A daily GitHub Action runs the updater (python -m norm_findings.updater). If new parsers or updates are detected in DefectDojo:

  1. A new branch auto-update-parsers is created.
  2. A Pull Request is opened with a summary of the changes.
  3. Maintainers can review and merge the PR to keep norm-findings up-to-date.
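
A scheduled workflow of the kind described might look like this. This is an illustrative sketch, not the repository's actual workflow file; peter-evans/create-pull-request is one common action for opening the PR:

```yaml
name: update-parsers
on:
  schedule:
    - cron: "0 4 * * *"   # once a day
  workflow_dispatch: {}

jobs:
  update:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install ".[dev]"
      - run: python -m norm_findings.updater
      - uses: peter-evans/create-pull-request@v6
        with:
          branch: auto-update-parsers
          title: "Update DefectDojo parsers"
```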
