# apisnap

AI-powered API test case generator. Point it at any codebase, URL, or GitHub repo and get instant runnable tests.
## What is apisnap?
apisnap is a CLI tool that automatically generates API test cases using AI. It discovers API endpoints from your project and generates comprehensive test cases in your preferred framework.
## Key Features
- Auto-discovery: Scans your codebase to find API endpoints automatically
- Multi-format support: Outputs tests in pytest, unittest, jest, mocha, vitest, and more
- GitHub-as-database support: Special support for repos that use GitHub as a serverless database
- AI-powered: Uses Cerebras AI to generate intelligent, comprehensive test cases
## The GitHub-as-Database Pattern

apisnap has special support for the increasingly popular "GitHub-as-database" serverless API pattern, which works like this:

1. A GitHub Actions workflow runs on a cron schedule
2. The workflow fetches data from an external API
3. The data is committed as JSON files to the repository
4. Those JSON files are served via GitHub Pages or Cloudflare Pages

This creates a completely free, serverless, zero-maintenance JSON API, and apisnap can generate tests for it automatically!
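Consuming such an API needs nothing beyond an HTTP GET, since each "endpoint" is just a static JSON file. A minimal Python sketch (the URL in the comment is a hypothetical example):

```python
# Fetch one endpoint of a GitHub-as-database API: it is simply a static
# JSON file served by GitHub Pages or Cloudflare Pages.
import json
from urllib.request import urlopen

def fetch_json(url: str):
    """GET a JSON document and parse it into Python objects."""
    with urlopen(url) as resp:
        return json.load(resp)

# Example (hypothetical URL):
# chords = fetch_json("https://user.github.io/guitar-chords-repo/data/chords.json")
```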
## Installation

```shell
# Recommended: run via uvx (no install needed)
uvx apisnap scan --url https://github.com/user/repo

# Install as a dev dependency with uv
uv add apisnap --dev

# Install with pip
pip install apisnap
```
## Quick Start

1. Configure your API key:

   ```shell
   apisnap config --api-key sk-your-key-here
   ```

2. Scan a GitHub repo (GitHub-as-database pattern):

   ```shell
   apisnap scan --url https://github.com/user/guitar-chords-repo
   ```

3. Scan a local project:

   ```shell
   apisnap scan ./src
   ```

4. Scan an OpenAPI URL:

   ```shell
   apisnap scan --url https://api.example.com/openapi.json
   ```

5. Preview routes without generating tests:

   ```shell
   apisnap scan --url https://github.com/user/repo --dry-run
   ```
## Diagrams

### How apisnap works — system overview

```mermaid
flowchart TB
    subgraph Inputs[Input Modes]
        M1[Local Code Scan]
        M2[OpenAPI URL]
        M3[JSON URL]
        M4[Deployed URL]
        M5[GitHub Repo]
    end
    M1 --> Detector{Auto-Detector}
    M2 --> Detector
    M3 --> Detector
    M4 --> Detector
    M5 --> Detector
    Detector --> Manifest[Route Manifest]
    Manifest --> AI["AI Engine (Cerebras)"]
    AI --> Writers
    subgraph Writers[Test Writers]
        W1[pytest]
        W2[unittest]
        W3[jest]
        W4[mocha]
        W5[vitest]
        W6[restassured]
        W7[rspec]
        W8[httpx]
    end
    Writers --> Output[Output Files]
```
### The GitHub-as-database serverless API pattern

```mermaid
flowchart LR
    External["External API<br/>weather, prices, sports"] -->|HTTP Request| Workflow["GitHub Actions: cron"]
    Workflow -->|Fetch & Transform| Commit[Commit JSON files]
    Commit --> Repo["GitHub Repository<br/>data/chords.json<br/>data/prices.json<br/>public/items.json"]
    Repo -->|Serve| Pages
    subgraph Pages[Hosting Options]
        direction TB
        GH["GitHub Pages<br/>user.github.io/repo"]
        CF["Cloudflare Pages<br/>repo.pages.dev"]
        Custom["Custom Domain<br/>api.example.com"]
    end
    GH --> URLs[Public JSON API URLs]
    CF --> URLs
    Custom --> URLs
```
### How apisnap scans a GitHub-as-database repo

```mermaid
flowchart TB
    Input["Input: GitHub Repo URL"] --> Step1["Step 1: Fetch Repo Tree<br/>API: /repos/{owner}/{repo}/git/trees"]
    Step1 --> Step2["Step 2: Scan Generator Scripts<br/>.github/workflows/*.yml<br/>scripts/*.py"]
    Step2 --> Step2a["workflow: cron 0 */6<br/>output: data/*.json"]
    Step2 --> Step2b["scripts: fetch_chords.py<br/>from external API"]
    Step2a --> Step3["Step 3: Find JSON Data Files<br/>data/*.json, public/*.json"]
    Step2b --> Step3
    Step3 --> Step4["Step 4: Infer Schema<br/>Analyze JSON structure"]
    Step4 --> Step5["Step 5: Detect Public URL<br/>CNAME, wrangler.toml, docs/"]
    Step5 --> Step6["Step 6: Build RouteManifest<br/>method, path, public_url"]
    Step6 --> Output["RouteManifest → AI → Tests"]
```
### apisnap CLI modes and commands

```mermaid
flowchart TB
    CLI[apisnap] --> Config
    CLI --> Scan
    CLI --> List
    CLI --> Version
    subgraph Config[apisnap config]
        C1[First-time setup]
        C2[Prompts for Cerebras API key]
        C3["Stores at ~/.apisnap/config.toml"]
        C1 --> C2 --> C3
    end
    subgraph Scan[apisnap scan OPTIONS]
        S1[Main command]
        S2["--url: Remote URL"]
        S3["--format: pytest/jest/mocha"]
        S4["--output: Output directory"]
        S5["--dry-run: Show routes only"]
        S6["--verbose: Detailed progress"]
        S1 --> S2 & S3 & S4 & S5 & S6
    end
    subgraph List[apisnap list]
        L1[Show discovered routes]
    end
    subgraph Version[apisnap version]
        V1[Show version information]
    end
```
## CLI Commands Reference

### apisnap config

First-time setup. Prompts for and stores your Cerebras API key. Config is stored at `~/.apisnap/config.toml`.

| Option | Description |
|---|---|
| `--api-key TEXT` | Set Cerebras API key |
| `--show` | Show current configuration |
| `--format TEXT` | Default test format |
| `--output-dir TEXT` | Default output directory |
### apisnap scan [PATH] [OPTIONS]

The main command; supports all input modes.

| Option | Description |
|---|---|
| `PATH` | Path to scan (default: current directory) |
| `--url TEXT` | Remote URL (GitHub repo, OpenAPI JSON, deployed URL) |
| `--format TEXT` | Test framework [pytest\|jest\|mocha\|vitest\|...] |
| `--output TEXT` | Output directory [default: ./tests] |
| `--framework TEXT` | Force framework detection |
| `--mode TEXT` | Force discovery mode [source\|openapi\|json\|...] |
| `--dry-run` | Show routes without generating tests |
| `--base-url TEXT` | Base URL for test requests |
| `--verbose` | Detailed progress |
| `--no-ai` | Print manifest as JSON, skip test generation |
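The `--no-ai` manifest can be post-processed with ordinary JSON tooling. A sketch (the field names mirror the RouteManifest/Route structure documented in this README, but the exact serialization and sample payload are assumptions):

```python
# Filter routes out of a manifest printed by `apisnap scan --no-ai`.
# The sample JSON below is hand-made for illustration.
import json

manifest_json = """
{
  "base_url": "https://api.example.com",
  "routes": [
    {"method": "GET",  "path": "/items", "auth_required": false},
    {"method": "POST", "path": "/items", "auth_required": true}
  ]
}
"""

manifest = json.loads(manifest_json)
public_routes = [r["path"] for r in manifest["routes"] if not r["auth_required"]]
print(public_routes)  # ['/items']
```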
### apisnap list [PATH]

Show discovered routes in a pretty-printed table.

### apisnap version

Show version information.
### AI test generation pipeline

```mermaid
flowchart TB
    Input[RouteManifest] --> Check{"Confidence check: confidence >= 0.8?"}
    Check -->|Yes| Pass1["Pass 1: Schema Refinement"]
    Check -->|No| LowConf[Use with lower confidence]
    Pass1 --> Pass2["Pass 2: Test Generation"]
    LowConf --> Pass2
    Pass2 --> BuildPrompt["Build prompt with: Method, Path, URL, Auth, Schema, Framework"]
    BuildPrompt --> Prompt["Prompt: generate tests for happy path, auth failure, schema validation"]
    Prompt --> AI["Cerebras AI API<br/>Model: qwen-3-235b-a22b"]
    AI --> Writer["Framework Writer<br/>pytest, unittest, jest, mocha, vitest"]
    Writer --> Output["Test Files: test_api_*.py"]
```
### Internal route manifest structure

```mermaid
classDiagram
    class RouteManifest {
        +list routes
        +str base_url
        +str framework
        +str project_name
        +str source_mode
        +str detected_at
    }
    class Route {
        +str method
        +str path
        +list params
        +dict body_schema
        +dict response_schema
        +bool auth_required
        +str summary
        +str source
        +float confidence
        +str refresh_schedule
        +str public_url
    }
    class Param {
        +str name
        +str location
        +str type
        +bool required
        +str description
    }
    RouteManifest --> Route : routes
    Route --> Param : params
```
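The diagram above translates to roughly the following Python dataclasses (a hand-written mirror for illustration, not the actual apisnap source):

```python
# Rough dataclass mirror of the RouteManifest structure. Defaults are
# assumptions; the real implementation may differ.
from dataclasses import dataclass, field

@dataclass
class Param:
    name: str
    location: str            # e.g. "query", "path", "header"
    type: str
    required: bool = False
    description: str = ""

@dataclass
class Route:
    method: str
    path: str
    params: list = field(default_factory=list)
    body_schema: dict = field(default_factory=dict)
    response_schema: dict = field(default_factory=dict)
    auth_required: bool = False
    confidence: float = 1.0
    public_url: str = ""

@dataclass
class RouteManifest:
    routes: list = field(default_factory=list)
    base_url: str = ""
    framework: str = ""
    project_name: str = ""
```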
## Supported Frameworks

### Source Code Frameworks

| Framework | Detection Method | Example File |
|---|---|---|
| FastAPI | `@app.get`, `@router.post`, `APIRouter` | `main.py` |
| Flask | `Flask(`, `@app.route` | `app.py` |
| Django DRF | `settings.py`, `@api_view` | `views.py` |
| Express | `app.get`, `router.post`, `express` | `index.js` |
| Spring Boot | `@RestController`, `@GetMapping` | `Controller.java` |
| Gin | `gin.New()`, `r.GET`, `r.POST` | `main.go` |
| Rails | `config/routes.rb`, `resources :users` | `routes.rb` |
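The detection signatures in the table can be approximated with simple pattern matching. This is a simplified stand-in for the real scanners, shown for three of the frameworks:

```python
# Simplified framework detection using the signature strings from the table.
# The actual apisnap scanners are more sophisticated; this is a sketch.
import re

SIGNATURES = {
    "fastapi": [r"@app\.get", r"@router\.post", r"APIRouter"],
    "flask": [r"Flask\(", r"@app\.route"],
    "express": [r"app\.get", r"router\.post", r"require\(['\"]express"],
}

def detect_framework(source):
    """Return the first framework whose signature appears in the source."""
    for framework, patterns in SIGNATURES.items():
        if any(re.search(p, source) for p in patterns):
            return framework
    return None

print(detect_framework("app = Flask(__name__)"))  # flask
```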
### Test Output Formats

| Format | Install Command | Language |
|---|---|---|
| pytest | `pip install requests pytest` | Python |
| unittest | Built-in | Python |
| httpx | `pip install httpx pytest-httpx` | Python |
| jest | `npm install jest axios` | JavaScript |
| mocha | `npm install mocha axios` | JavaScript |
| vitest | `npm install vitest axios` | TypeScript |
| restassured | Maven dependency | Java |
| rspec | `gem install rspec` | Ruby |
## GitHub-as-Database Repos — Detailed Guide

### What is this pattern?

The GitHub-as-database pattern is a way to create free, serverless JSON APIs using GitHub repositories. It's popular for:

- Public data that updates periodically (weather, prices, sports scores)
- Static datasets that need occasional updates
- Personal projects and side projects
- Prototyping and MVPs

### How it works

1. Create a GitHub repository
2. Add a GitHub Actions workflow that runs on a cron schedule
3. Have the workflow fetch data from an external API
4. Commit the data as JSON files to the repo
5. Serve the files via GitHub Pages or Cloudflare Pages
### Example Workflow

```yaml
name: Update Data

on:
  schedule:
    - cron: '0 */6 * * *'  # Every 6 hours

jobs:
  update:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Fetch data
        run: python fetch_data.py
      - name: Commit changes
        uses: stefanzweifel/git-auto-commit-action@v5
        with:
          commit_message: "Update data"
          file_pattern: "data/*.json"
```
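The `fetch_data.py` the workflow calls is project-specific, not part of apisnap. A typical version might look like this (the upstream URL and output path are illustrative):

```python
# Illustrative fetch_data.py for the workflow above: pull JSON from an
# external API and write it into data/ for the auto-commit step.
import json
import pathlib
from urllib.request import urlopen

SOURCE_URL = "https://api.example.com/chords"   # hypothetical upstream API
OUT_PATH = pathlib.Path("data/chords.json")

def fetch(url):
    with urlopen(url) as resp:
        return json.load(resp)

def write_json(data, path):
    path.parent.mkdir(parents=True, exist_ok=True)
    # Stable formatting keeps diffs (and commits) small when data is unchanged.
    path.write_text(json.dumps(data, indent=2, sort_keys=True) + "\n")

# entry point: write_json(fetch(SOURCE_URL), OUT_PATH)
```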
### What apisnap detects

When scanning a GitHub repo, apisnap:

- Finds workflow files: extracts cron schedules, fetch URLs, and output paths
- Finds JSON data files: in `/data/`, `/public/`, `/api/`, etc.
- Infers schemas: analyzes JSON structure to infer types
- Detects the public URL: GitHub Pages, Cloudflare Pages, or CNAME
- Generates tests: validates the URL, schema, types, CORS, and caching
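The schema-inference step can be sketched in a few lines: walk the JSON value and replace leaves with their type names. The real inference is more thorough; this is a simplified illustration:

```python
# Simplified JSON schema inference: map each value to its type name,
# recursing through dicts and sampling the first element of lists.
def infer_schema(value):
    if isinstance(value, dict):
        return {k: infer_schema(v) for k, v in value.items()}
    if isinstance(value, list):
        return [infer_schema(value[0])] if value else []
    return type(value).__name__

print(infer_schema({"name": "Am", "frets": [0, 2], "barre": False}))
# {'name': 'str', 'frets': ['int'], 'barre': 'bool'}
```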
## Configuration Reference

Config file location: `~/.apisnap/config.toml`

```toml
[cerebras]
api_key = "sk-xxxx"                        # Your Cerebras API key
model = "qwen-3-235b-a22b-instruct-2507"   # AI model to use

[defaults]
output_dir = "./tests"   # Default output directory
format = "pytest"        # Default test framework
```
### Setting up the API key

```shell
# Interactive prompt
apisnap config

# Non-interactive
apisnap config --api-key sk-your-key-here

# Show current config
apisnap config --show
```
## Contributing

Contributions are welcome! Here's how to get started:

1. Clone the repository:

   ```shell
   git clone https://github.com/chirag127/apisnap.git
   cd apisnap
   ```

2. Install dependencies:

   ```shell
   pip install -e ".[dev]"
   ```

3. Run tests:

   ```shell
   pytest tests/ -v
   ```

4. Add a new scanner:
   - Create `src/apisnap/scanner/source/newframework_scanner.py`
   - Inherit from `BaseScanner`
   - Implement the `scan()` and `can_handle()` methods

5. Add a new writer:
   - Create `src/apisnap/writers/newformat_writer.py`
   - Inherit from `BaseWriter`
   - Implement the `write()` and `write_file()` methods
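A scanner plugin along those lines can be sketched as follows. `BaseScanner`'s real interface isn't shown in this README, so a stand-in ABC with the documented `scan()`/`can_handle()` methods is defined here:

```python
# Sketch of a scanner plugin. BaseScanner below is a stand-in ABC, not the
# actual apisnap base class; only the method names come from this README.
from abc import ABC, abstractmethod

class BaseScanner(ABC):
    @abstractmethod
    def can_handle(self, path: str) -> bool: ...

    @abstractmethod
    def scan(self, path: str) -> list: ...

class NewFrameworkScanner(BaseScanner):
    def can_handle(self, path: str) -> bool:
        # Claim files this scanner understands.
        return path.endswith(".py")

    def scan(self, path: str) -> list:
        # A real scanner would parse the source and extract routes here.
        return [{"method": "GET", "path": "/example"}]

scanner = NewFrameworkScanner()
print(scanner.can_handle("app.py"))  # True
```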
## Publishing to PyPI

This package uses GitHub Actions to publish to PyPI.

### Setup

1. Create a PyPI account at https://pypi.org/
2. Configure trusted publishing on PyPI:
   - Go to Project Settings → Publishing
   - Add a new publisher
   - Connect it to GitHub
3. Configure the GitHub environment:
   - Go to repo Settings → Environments
   - Create a `pypi` environment
   - Add the trusted publisher

### Publish a new version

```shell
# Update version in pyproject.toml
git commit -m "Bump version"
git tag v0.1.0
git push --tags
```

The GitHub Actions workflow will automatically publish to PyPI when you push a version tag.
## License

MIT License - see LICENSE for details.