dd-import
A utility to (re-)import findings and language data into DefectDojo
Findings and languages can be imported into DefectDojo via its API. To make automated build and deploy pipelines easier to implement, `dd-import` provides some convenience functions:
- Product types, products, engagements and tests are created if they do not exist yet. This avoids manual preparation in DefectDojo or complicated steps within the pipeline.
- Product types, products, engagements and tests are referenced by name, which makes pipelines more readable than using IDs.
- Build information for `build_id`, `commit_hash` and `branch_tag` can be updated when uploading findings.
- No need to deal with `curl` and its syntax within the pipeline, which keeps pipelines shorter and more readable.
- All parameters are provided via environment variables, which works well with pipeline definitions like GitHub Actions or GitLab CI.
User guide
Installation and commands
Python
`dd-import` can be installed with pip. Only Python 3.8 and later is supported.
```bash
pip install dd-import
```
The command `dd-reimport-findings` re-imports findings into DefectDojo. Even though the name suggests otherwise, you do not need to do an initial import first.

The command `dd-import-languages` imports language data that has been gathered with the tool cloc, see Languages and lines of code for more details.
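A minimal language import with the pip-installed commands could look like the sketch below; the URL, API key and names are placeholders, not values prescribed by `dd-import`:

```bash
# Count lines of code with cloc and write the result as JSON
cloc src --json --out cloc.json

# Placeholders: point these at your DefectDojo instance and product
export DD_URL="https://defectdojo.example.com"
export DD_API_KEY="$MY_DEFECTDOJO_API_KEY"   # keep the key in a secret store
export DD_PRODUCT_TYPE_NAME="Showcase"
export DD_PRODUCT_NAME="DefectDojo Importer"
export DD_FILE_NAME="cloc.json"

dd-import-languages
```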
Docker
Docker images can be found at https://hub.docker.com/r/maibornwolff/dd-import.
A re-import of findings can be started with

```bash
docker run --rm maibornwolff/dd-import:latest dd-reimport-findings.sh
```

Importing language data can be started with

```bash
docker run --rm maibornwolff/dd-import:latest dd-import-languages.sh
```
Please note that you have to set the environment variables as described below and mount a folder containing the file with the scan results when running the docker container. `/usr/local/dd-import` is the working directory of the docker image; all commands are located in the `/usr/local/dd-import/bin` folder.
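Putting this together, a dockerized re-import of findings could look like the following sketch; the `reports` mount point and all values are assumptions to adapt to your setup:

```bash
# Mount the folder with the scan results below the image's working directory
# (/usr/local/dd-import) so DD_FILE_NAME can reference it with a relative path.
docker run --rm \
  -e DD_URL="https://defectdojo.example.com" \
  -e DD_API_KEY="$MY_DEFECTDOJO_API_KEY" \
  -e DD_PRODUCT_TYPE_NAME="Showcase" \
  -e DD_PRODUCT_NAME="DefectDojo Importer" \
  -e DD_ENGAGEMENT_NAME="GitLab" \
  -e DD_TEST_NAME="Trivy" \
  -e DD_TEST_TYPE_NAME="Trivy Scan" \
  -e DD_FILE_NAME="reports/trivy.json" \
  -v "$(pwd)/reports:/usr/local/dd-import/reports" \
  maibornwolff/dd-import:latest dd-reimport-findings.sh
```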
Parameters
All parameters need to be provided as environment variables:
Parameter | Re-import findings | Import languages | Remark |
---|---|---|---|
DD_URL | Mandatory | Mandatory | Base URL of the DefectDojo instance |
DD_API_KEY | Mandatory | Mandatory | Shall be defined as a secret, e.g. a protected variable in GitLab or an encrypted secret in GitHub |
DD_PRODUCT_TYPE_NAME | Mandatory | Mandatory | If a product type with this name does not exist, it will be created |
DD_PRODUCT_NAME | Mandatory | Mandatory | If a product with this name does not exist, it will be created |
DD_ENGAGEMENT_NAME | Mandatory | - | If an engagement with this name does not exist for the given product, it will be created |
DD_ENGAGEMENT_TARGET_START | Optional | - | Format: YYYY-MM-DD, default: today. The target start date for a newly created engagement. |
DD_ENGAGEMENT_TARGET_END | Optional | - | Format: YYYY-MM-DD, default: 2999-12-31. The target end date for a newly created engagement. |
DD_TEST_NAME | Mandatory | - | If a test with this name does not exist for the given engagement, it will be created |
DD_TEST_TYPE_NAME | Mandatory | - | From DefectDojo's list of test types, e.g. Trivy Scan |
DD_FILE_NAME | Optional | Mandatory | Path of the file with the scan results or the cloc output |
DD_ACTIVE | Optional | - | Default: true |
DD_VERIFIED | Optional | - | Default: true |
DD_MINIMUM_SEVERITY | Optional | - | |
DD_GROUP_BY | Optional | - | Group findings by file path, component name or component name + version |
DD_PUSH_TO_JIRA | Optional | - | Default: false |
DD_CLOSE_OLD_FINDINGS | Optional | - | Default: true |
DD_CLOSE_OLD_FINDINGS_PRODUCT_SCOPE | Optional | - | Default: false |
DD_DO_NOT_REACTIVATE | Optional | - | Default: false |
DD_VERSION | Optional | - | |
DD_ENDPOINT_ID | Optional | - | |
DD_SERVICE | Optional | - | |
DD_BUILD_ID | Optional | - | |
DD_COMMIT_HASH | Optional | - | |
DD_BRANCH_TAG | Optional | - | |
DD_API_SCAN_CONFIGURATION_ID | Optional | - | ID of the API scan configuration for API-based parsers, e.g. SonarQube |
DD_SOURCE_CODE_MANAGEMENT_URI | Optional | - | |
DD_SSL_VERIFY | Optional | Optional | Disable SSL verification by setting to false or 0. Default: true |
DD_EXTRA_HEADER_1 | Optional | Optional | Key of an extra header, e.g. if one is needed for authentication at WAFs or similar |
DD_EXTRA_HEADER_1_VALUE | Optional | Optional | The corresponding value for the first extra header key |
DD_EXTRA_HEADER_2 | Optional | Optional | Key of a second extra header, e.g. if one is needed for authentication at WAFs or similar |
DD_EXTRA_HEADER_2_VALUE | Optional | Optional | The corresponding value for the second extra header key |
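With the parameters above, a findings re-import step boils down to exporting the relevant variables and calling one script. The sketch below uses placeholder values, plus GitLab CI's predefined variables for the optional build information; adapt both to your environment:

```bash
# Placeholders: mandatory parameters for a findings re-import
export DD_URL="https://defectdojo.example.com"
export DD_API_KEY="$MY_DEFECTDOJO_API_KEY"   # keep the key in a protected/encrypted secret
export DD_PRODUCT_TYPE_NAME="Showcase"
export DD_PRODUCT_NAME="DefectDojo Importer"
export DD_ENGAGEMENT_NAME="GitLab"
export DD_TEST_NAME="Trivy"
export DD_TEST_TYPE_NAME="Trivy Scan"
export DD_FILE_NAME="trivy.json"

# Optional build information, here taken from GitLab CI's predefined variables
export DD_BUILD_ID="$CI_PIPELINE_ID"
export DD_COMMIT_HASH="$CI_COMMIT_SHA"
export DD_BRANCH_TAG="$CI_COMMIT_REF_NAME"

dd-reimport-findings.sh
```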
Usage
This snippet from a GitLab CI pipeline serves as an example of how `dd-import` can be integrated to upload data during build and deploy, using the docker image:
```yaml
variables:
  DD_PRODUCT_TYPE_NAME: "Showcase"
  DD_PRODUCT_NAME: "DefectDojo Importer"
  DD_ENGAGEMENT_NAME: "GitLab"
  ...

trivy:
  stage: test
  tags:
    - build
  variables:
    GIT_STRATEGY: none
  before_script:
    - export TRIVY_VERSION=$(wget -qO - "https://api.github.com/repos/aquasecurity/trivy/releases/latest" | grep '"tag_name":' | sed -E 's/.*"v([^"]+)".*/\1/')
    - echo $TRIVY_VERSION
    - wget --no-verbose https://github.com/aquasecurity/trivy/releases/download/v${TRIVY_VERSION}/trivy_${TRIVY_VERSION}_Linux-64bit.tar.gz -O - | tar -zxvf -
  allow_failure: true
  script:
    - ./trivy --exit-code 0 --no-progress -f json -o trivy.json maibornwolff/dd-import:latest
  artifacts:
    paths:
      - trivy.json
    when: always
    expire_in: 1 day

cloc:
  stage: test
  image: node:16
  tags:
    - build
  before_script:
    - npm install -g cloc
  script:
    - cloc src --json -out cloc.json
  artifacts:
    paths:
      - cloc.json
    when: always
    expire_in: 1 day

upload_trivy:
  stage: upload
  image: maibornwolff/dd-import:latest
  needs:
    - job: trivy
      artifacts: true
  variables:
    GIT_STRATEGY: none
    DD_TEST_NAME: "Trivy"
    DD_TEST_TYPE_NAME: "Trivy Scan"
    DD_FILE_NAME: "trivy.json"
  script:
    - dd-reimport-findings.sh

upload-cloc:
  image: maibornwolff/dd-import:latest
  needs:
    - job: cloc
      artifacts: true
  stage: upload
  tags:
    - build
  variables:
    DD_FILE_NAME: "cloc.json"
  script:
    - dd-import-languages.sh
```
- variables - Definition of some environment variables that will be used for several uploads. `DD_URL` and `DD_API_KEY` are not defined here because they are protected variables of the GitLab project.
- trivy - Example of a vulnerability scan with trivy. Output will be stored in JSON format (`trivy.json`).
- cloc - Example of how to calculate the lines of code with cloc. Output will be stored in JSON format (`cloc.json`).
- upload_trivy - This step will be executed after the `trivy` step, gets its output file and sets some variables specific to this step. Then the script to re-import the findings from this scan is executed.
- upload-cloc - This step will be executed after the `cloc` step, gets its output file and sets some variables specific to this step. Then the script to import the language data is executed.
Another example, showing how to use `dd-import` within a GitHub Action, can be found in dd-import_example.yml.
Developer guide
Testing
- `./bin/runUnitTests.sh` - Runs the unit tests and reports the test coverage.
- `./bin/runDockerUnitTests.sh` - First creates the docker image and then starts a docker container in which the unit tests are executed.
License
Licensed under the 3-Clause BSD License