AI Testing Swarm

AI Testing Swarm is a mutation-driven API testing framework (with optional OpenAPI + OpenAI augmentation) built on top of pytest.

It generates a large set of deterministic negative, edge-case, and security test cases for an API request, executes them (optionally in parallel, with retries and throttling), and produces a JSON report with summaries.

Notes:

  • UI testing is not the focus of the current releases.
  • OpenAI features are optional and disabled by default.

Installation

pip install ai-testing-swarm

CLI entrypoint:

ai-test --help

Quick start (cURL input)

Create request.json:

{
  "curl": "curl --location https://postman-echo.com/post --header \"Content-Type: application/json\" --data \"{\\\"hello\\\":\\\"world\\\",\\\"count\\\":1}\""
}

Run:

ai-test --input request.json

A JSON report is written under:

  • ./ai_swarm_reports/<METHOD>_<endpoint>/<METHOD>_<endpoint>_<timestamp>.json

Reports include:

  • per-test results
  • summary counts by status code / failure type
  • optional AI summary (if enabled)
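A report can be inspected with a few lines of Python. The directory layout and the `summary` keys follow this README; the file contents below are fabricated for illustration, and picking the newest file by lexical sort assumes the timestamped filenames shown above:

```python
import json
import tempfile
from pathlib import Path

def latest_report(report_dir: str) -> dict:
    """Return the newest *.json report in a report directory
    (timestamped filenames sort lexically)."""
    paths = sorted(Path(report_dir).glob("*.json"))
    if not paths:
        raise FileNotFoundError(f"no reports under {report_dir}")
    return json.loads(paths[-1].read_text())

# Demo against a throwaway directory with a fabricated report;
# real reports live under ./ai_swarm_reports/<METHOD>_<endpoint>/.
with tempfile.TemporaryDirectory() as d:
    fake = {"summary": {"counts_by_status_code": {"200": 10, "500": 2}}}
    Path(d, "POST_post_20240101-000000.json").write_text(json.dumps(fake))
    report = latest_report(d)

print(report["summary"]["counts_by_status_code"])
```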

Input formats

1) Raw cURL

{ "curl": "curl ..." }

2) Normalized request

{
  "method": "POST",
  "url": "https://example.com/api/login",
  "headers": {"content-type": "application/json"},
  "params": {"a": "b"},
  "body": {"username": "u", "password": "p"}
}
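The normalized format is convenient to generate programmatically. A minimal sketch that writes a `request.json` suitable for `ai-test --input request.json` (field names as documented above):

```python
import json

# Build a normalized request and write it to request.json.
request = {
    "method": "POST",
    "url": "https://example.com/api/login",
    "headers": {"content-type": "application/json"},
    "params": {"a": "b"},
    "body": {"username": "u", "password": "p"},
}

with open("request.json", "w") as f:
    json.dump(request, f, indent=2)
```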

3) OpenAPI-driven (optional)

{
  "openapi": "./openapi.json",
  "path": "/pets",
  "method": "get",
  "headers": {"accept": "application/json"},
  "path_params": {"petId": "123"},
  "query_params": {"limit": 10},
  "body": null
}
  • OpenAPI JSON works by default.
  • OpenAPI YAML requires PyYAML installed.
  • Base URL is read from spec.servers[0].url.
    • Override with AI_SWARM_OPENAPI_BASE_URL if your spec doesn’t include servers.
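The base-URL resolution order can be sketched as follows. This is an illustrative helper, not the package's internal code: it prefers the `AI_SWARM_OPENAPI_BASE_URL` override, then falls back to `spec["servers"][0]["url"]`:

```python
import os

def resolve_base_url(spec: dict) -> str:
    """Resolve the base URL: env override first, then servers[0].url."""
    override = os.environ.get("AI_SWARM_OPENAPI_BASE_URL")
    if override:
        return override
    servers = spec.get("servers") or []
    if not servers:
        raise ValueError("spec has no servers[]; set AI_SWARM_OPENAPI_BASE_URL")
    return servers[0]["url"]

spec = {"servers": [{"url": "https://petstore.example.com/v1"}]}
print(resolve_base_url(spec))
```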

What test cases are generated?

The swarm always includes:

  • happy_path (baseline)

Then generates broad coverage across:

  • Method misuse: same path with wrong HTTP methods (GET/PUT/PATCH/DELETE etc.)
  • Headers: missing/invalid Content-Type, accept variations, and other header tampering
  • Auth (if Authorization header exists): missing/invalid token tests
  • Body/query mutations (per field):
    • missing / null / empty / whitespace
    • type probes (int/bool/float/array/object)
    • boundary inputs (very long strings, huge ints, negative values)
    • unicode + special character payloads
  • Security payload probes (per field): SQLi/XSS/path traversal/log4j patterns
  • Whole-body mutations: null body, empty object, extra unexpected field

Output is deterministic unless OpenAI augmentation is enabled.
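The per-field mutations above can be sketched in a few lines. This is illustrative only: the real generator is internal to the package, and the function name and case labels here are hypothetical:

```python
def field_mutations(body: dict, field: str):
    """Generate (case_name, payload) pairs mutating one field:
    missing / null / empty / whitespace, type probes, boundaries."""
    cases = [("missing_" + field,
              {k: v for k, v in body.items() if k != field})]
    probes = [
        ("null", None), ("empty", ""), ("whitespace", "   "),
        ("int_probe", 0), ("bool_probe", True),
        ("long_string", "A" * 10_000), ("huge_int", 2**63),
    ]
    for name, value in probes:
        mutated = dict(body)
        mutated[field] = value
        cases.append((f"{name}_{field}", mutated))
    return cases

for case_name, payload in field_mutations({"hello": "world", "count": 1}, "count"):
    print(case_name)
```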


Safety mode (recommended for CI/demos)

Mutation testing can be noisy and may accidentally stress a real environment. To force safe demo runs only against public test hosts:

ai-test --input request.json --public-only

Or via env:

export AI_SWARM_PUBLIC_ONLY=1

Allowed hosts in public-only mode:

  • httpbin.org
  • postman-echo.com
  • reqres.in
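The guard behind public-only mode amounts to a host allowlist check, which can be sketched as below. The helper name is hypothetical; the CLI enforces this internally when `--public-only` or `AI_SWARM_PUBLIC_ONLY=1` is set:

```python
from urllib.parse import urlparse

# Allowlist matching the hosts documented above.
ALLOWED_HOSTS = {"httpbin.org", "postman-echo.com", "reqres.in"}

def assert_public_host(url: str) -> None:
    """Raise if the target host is not a public test host."""
    host = urlparse(url).hostname or ""
    if host not in ALLOWED_HOSTS:
        raise RuntimeError(f"public-only mode: {host!r} is not an allowed host")

assert_public_host("https://postman-echo.com/post")  # passes silently
```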

Performance features

Parallel execution

  • Enabled by default via thread pool.
  • Control with:
    • AI_SWARM_WORKERS (default: 5)

Retry + backoff (flaky endpoints)

  • Retries on transient errors and status codes (408/429/5xx etc.)
  • Control with:
    • AI_SWARM_RETRY_COUNT (default: 1)
    • AI_SWARM_RETRY_BACKOFF_MS (default: 250)
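The retry behavior can be sketched in the spirit of `AI_SWARM_RETRY_COUNT` / `AI_SWARM_RETRY_BACKOFF_MS`. This is a simplified illustration, not the package's loop: `send` is any callable returning a status code, and the backoff schedule here (linear) is an assumption:

```python
import time

# Status codes treated as retryable, per the list above.
RETRYABLE_STATUSES = {408, 429, 500, 502, 503, 504}

def send_with_retry(send, retry_count=1, backoff_ms=250):
    """Call send(), retrying on retryable statuses with linear backoff."""
    attempt = 0
    while True:
        status = send()
        if status not in RETRYABLE_STATUSES or attempt >= retry_count:
            return status
        time.sleep((backoff_ms / 1000.0) * (attempt + 1))
        attempt += 1

# Fails once with 503, then succeeds:
responses = iter([503, 200])
print(send_with_retry(lambda: next(responses), backoff_ms=1))  # prints 200
```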

Throttling (RPS)

  • Global throttle to avoid hammering a target:
    • AI_SWARM_RPS (default: 0 = disabled)
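A global RPS throttle boils down to spacing calls at least `1/rps` seconds apart. A minimal sketch (illustrative only; the package's internal throttle may differ, and thread-safety is omitted for brevity):

```python
import time

class Throttle:
    """Space calls at least 1/rps seconds apart; rps=0 disables it,
    matching the documented default."""

    def __init__(self, rps: float):
        self.interval = 1.0 / rps if rps > 0 else 0.0
        self.last = 0.0

    def wait(self) -> None:
        if self.interval == 0.0:
            return  # throttling disabled
        now = time.monotonic()
        delay = self.last + self.interval - now
        if delay > 0:
            time.sleep(delay)
        self.last = time.monotonic()

throttle = Throttle(rps=10)  # space calls ~100 ms apart
start = time.monotonic()
for _ in range(3):
    throttle.wait()
elapsed = time.monotonic() - start
```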

Max test cap

  • Avoids accidental DoS / CI timeouts:
    • AI_SWARM_MAX_TESTS (default: 80)

Reporting

Reports include:

  • summary.counts_by_failure_type
  • summary.counts_by_status_code
  • summary.slow_tests (based on SLA)

SLA threshold:

  • AI_SWARM_SLA_MS (default: 2000)

Security:

  • Sensitive headers are redacted in the report (Authorization/Cookie/api tokens etc.)
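Redaction of this kind can be sketched as a case-insensitive header filter. The exact header list the package uses is an assumption here:

```python
# Hypothetical sensitive-header set; the package's actual list may differ.
SENSITIVE = {"authorization", "cookie", "set-cookie", "x-api-key"}

def redact_headers(headers: dict) -> dict:
    """Replace sensitive header values before they reach the report."""
    return {
        k: ("***REDACTED***" if k.lower() in SENSITIVE else v)
        for k, v in headers.items()
    }

print(redact_headers({"Authorization": "Bearer abc123",
                      "Accept": "application/json"}))
```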

Optional OpenAI augmentation (advanced)

A) Generate additional test cases (planner augmentation)

Enable:

export AI_SWARM_USE_OPENAI=1
export OPENAI_API_KEY=... 
export AI_SWARM_MAX_AI_TESTS=30

B) Human-readable AI summary in report

Enable:

export AI_SWARM_USE_OPENAI=1
export AI_SWARM_AI_SUMMARY=1
export OPENAI_API_KEY=...

Model selection:

  • AI_SWARM_OPENAI_MODEL (default: gpt-4.1-mini)

CLI help

ai-test --help

Release decisions

The swarm produces a release decision:

  • APPROVE_RELEASE
  • APPROVE_RELEASE_WITH_RISKS
  • REJECT_RELEASE

The decision is derived from deterministic rules (not an LLM).
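A deterministic rule of this shape could look like the sketch below. The actual rules are internal to the package; the failure-type names and thresholds here are made up for illustration:

```python
def release_decision(counts_by_failure_type: dict) -> str:
    """Map failure counts to one of the three documented decisions."""
    critical = counts_by_failure_type.get("security", 0)
    failures = sum(counts_by_failure_type.values())
    if critical > 0:
        return "REJECT_RELEASE"
    if failures > 0:
        return "APPROVE_RELEASE_WITH_RISKS"
    return "APPROVE_RELEASE"

print(release_decision({"security": 0, "timeout": 2}))
```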


License

MIT
