
To regenerate pytest fixtures and test methods dynamically from OpenAPI specs and CloudVector APIShark events

Project description

NOTE: This repo is now maintained at https://github.com/imperva/impact

Impact

Python Runtime and Package Installation

It is assumed that Python 3.8 and pip3 are installed, and that the impact package is installed via pip3. The python3 command can sometimes just be "python" if your default Python installation is version 3 or above; run "python --version" to find out. If you are running Python 3 or above by default, simply substitute "python" for the "python3" commands in the examples provided in the remainder of this document.
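For example, you can confirm the prerequisites with:

    python3 --version
    pip3 --version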

To ensure impact is available and up to date, run:

pip3 install -U impactrun

Test Directory

Create a test directory where the spec files and config file can be placed. Please feel free to rename the test directory; however, the subdirectory structure is important for the test run. All generated files will be put under the test directory.

Config

The file cv_config.yaml is used to specify the authentication API endpoint and the credentials used to obtain the access token, which is then used to fuzz the other APIs.

cv_config.yaml also contains information such as the URL of your test application (the API endpoint), the list of fuzz attacks to try, and so on, all of which can be customized for your environment. The same file contains all of the custom variables you may need to change; the current values are provided as examples.
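As a rough, purely illustrative sketch of the kind of settings described above (the key names below are hypothetical and may not match the actual template shipped with impactrun, so treat the provided cv_config.yaml as the authoritative reference):

    # hypothetical keys, for illustration only
    auth_url: https://myapp.example.com/api/login    # authentication API endpoint
    auth_username: testuser                          # credentials used to obtain the access token
    auth_password: secret
    target_url: https://myapp.example.com            # API endpoint of the application under test
    attacks:                                         # list of fuzz attacks to try
      - sql_injection
      - xss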

In the test directory, create a folder called 'specs' and place all the API specs (JSON version only) there.
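Before a run, the test directory might look something like this (the directory and spec file names are examples):

    mytestdir/
        cv_config.yaml
        specs/
            petstore.json
            inventory.json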

After the test is complete (details in the sections below), the summary-<timestamp>.html file will contain pointers to all the test results. In addition, a file called fordev-apis.csv is generated. This is a high-level summary for consumption by a dev team: it highlights just the API endpoints that seem to "fail" the test, i.e. respond positively instead of rejecting the fuzz calls. Please feel free to import this CSV report into a spreadsheet.

The test results are stored in

results
    results/perapi
    results/perattack

Tests can run for a long time, so one can adjust the specs and the collection of attacks in cv_config.yaml to stage each run. Test results of different runs will not overwrite each other. You can regenerate the test report after the test run.

Generate fuzzing tests for all the specs

With a given impact version and a set of specs, you only need to run this once.

impactrun --generate-tests
impactrun --g

A successful fuzzallspecs run will generate as output a list of spec title names (taken from each spec's title) that can be used to update the runall.py list for test control (see 4. Control test later).

Running Tests

To start the tests, execute the command below:

impactrun --execute-tests
impactrun --e

The impactrun command above also takes a "--regen-report" argument. Regenerating the report tells it not to run the long test, but just to run ImpactCore.generate_fuzz_report to generate the report again (it copies the saved report.json from the results directory).
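For example (assuming a report.json from a previous run is already saved under results; the exact flag combination may vary in your version):

    impactrun --execute-tests --regen-report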

It creates a summary-<timestamp>.html in the test directory. It contains tables allowing convenient access to all the reports.

Results are saved in a directory called results

results
    results/perapi
    results/perattack

After the test is finished, you can find subdirectories named after the specs under each of these results directories. They contain the .html report files pointed to by the summary.

Under the perapi directory there are files named after the API name (chopped from the long "for_fuzzing.py" file name in the test directory). The report.json of each test run is saved as <apiname>-report.json.
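Putting this together, the layout after a run might look roughly like this (placeholders and names are illustrative):

    mytestdir/
        summary-<timestamp>.html
        fordev-apis.csv
        results/
            perapi/
                <spec name>/
                    <apiname>-report.json
                    (per-API .html reports)
            perattack/
                <spec name>/
                    (per-attack .html reports)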

Download files

Download the file for your platform.

Source Distribution

impactrun-2.1.0.tar.gz (7.7 MB)

Uploaded Source

Built Distribution

impactrun-2.1.0-py3-none-any.whl (7.8 MB)

Uploaded Python 3

File details

Details for the file impactrun-2.1.0.tar.gz.

File metadata

  • Download URL: impactrun-2.1.0.tar.gz
  • Size: 7.7 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.11

File hashes

Hashes for impactrun-2.1.0.tar.gz
Algorithm Hash digest
SHA256 ede7823efc7863a8eb124987ce95795f9a736954da0b4ec4737a7a11b7ef2b49
MD5 aa654aa72aa984b8f9f0dbaf26c2d45d
BLAKE2b-256 6374f1ded3b3e3c8d4491ac72121d42b2d98e2ccc46e8d886cfad5a00552ce0d


File details

Details for the file impactrun-2.1.0-py3-none-any.whl.

File metadata

  • Download URL: impactrun-2.1.0-py3-none-any.whl
  • Size: 7.8 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.11

File hashes

Hashes for impactrun-2.1.0-py3-none-any.whl
Algorithm Hash digest
SHA256 18058cdfd9cea4910f2212b2db5d8ca816474f301ed072a20c74885004ceecbd
MD5 e36098e5ea942007f5a3c8211b8639fb
BLAKE2b-256 c65f2d865d2adfcd34f9829d0b6074713834aa5fc59a42d229955c74bdd37595

