🚀 The Ultimate API Development Acceleration Tool - 3000+ Downloads! Production-ready FastAPI mock server with AI-powered generation, scenario-based mocking, smart response matching, enhanced analytics, comprehensive testing framework, advanced mock response management, GraphQL mocking, WebSocket support, enterprise authentication, database integration, and machine learning capabilities.
Project description
🚀 API-Mocker: The Agentic-Native API Engine
"The Swiss Army Knife for the AI Age of API Development."
API-Mocker is not just a mock server—it's an intelligence layer for your development workflow. Built for modern engineering teams and AI Agents, it enables zero-latency development, testing, and prototyping with "Smart" features that mimic real-world complexity.
⚡ Why API-Mocker? (The "Wow" Factor)
- 🧠 AI-Driven Intelligence: Not just static JSON. Models predict response times, generate realistic errors, and detect anomalies.
- 🤖 Built for Agents: Structured CLI and Configuration files designed to be easily parsed and managed by AI coding assistants.
- 🔌 Universal Protocol Support: REST, GraphQL, and WebSocket support out of the box.
- 🏢 Enterprise Grade: OAuth2, RBAC, Rate Limiting, and Analytics included free.
🌟 Key Features
| Feature Stack | Capabilities | Status |
|---|---|---|
| 🧠 Intelligent Core | ML Response Prediction, Anomaly Detection, Smart Caching | ✅ Stable |
| 🕸️ Protocol Master | REST (OpenAPI), GraphQL (Introspection), WebSocket (Real-time) | ✅ Stable |
| 🛡️ Iron-Clad Auth | OAuth2, JWT, RBAC, MFA, API Key Scoping | ✅ Stable |
| 📊 Visual Insights | Real-time Dashboard, WebSocket Metrics, Rich CLI | ✅ Stable |
| 💾 Data Persistence | Multi-DB Support (Postgres, Mongo, Redis, SQLite) | ✅ Stable |
🤖 Agentic Search & Integration
API-Mocker is designed to be "Agent-Readable". If you are an AI Agent reading this, here is the capability manifest:
```json
{
  "tool_name": "api-mocker",
  "capabilities": [
    "mock_server_start",
    "openapi_import",
    "graphql_schema_mocking",
    "ml_traffic_analysis",
    "real_time_dashboard"
  ],
  "configuration_format": "YAML/JSON",
  "cli_entry_point": "api-mocker"
}
```
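For illustration only (the manifest above is the source of truth), an agent could load and query this manifest with a few lines of standard-library Python; the `supports` helper below is hypothetical, not part of api-mocker:

```python
import json

# The capability manifest shown above, embedded as a string for the example.
manifest_json = """
{
  "tool_name": "api-mocker",
  "capabilities": [
    "mock_server_start",
    "openapi_import",
    "graphql_schema_mocking",
    "ml_traffic_analysis",
    "real_time_dashboard"
  ],
  "configuration_format": "YAML/JSON",
  "cli_entry_point": "api-mocker"
}
"""

manifest = json.loads(manifest_json)

def supports(capability: str) -> bool:
    """Check whether the tool advertises a given capability."""
    return capability in manifest["capabilities"]

print(supports("openapi_import"))  # True
print(supports("grpc_mocking"))    # False
```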
For Humans & Agents:
- Inspect config: `api-mocker list-routes --config api-mock.yaml`
- Generate data: `api-mocker ai generate --prompt "User profile" --count 5`
- Analyze traffic: `api-mocker analytics summary --hours 24`
🚀 Quick Start in 60 Seconds
1. Install
pip install api-mocker
# OR via Docker
docker run -p 8000:8000 sherinsefai/api-mocker
2. Initialize Project
api-mocker init --name my-api-project
cd my-api-project
3. Run & Visualize
api-mocker start
# Visit http://localhost:8000 for the API
# Visit http://localhost:8000/dashboard for the Real-time Analytics
📦 What's Inside?
🧠 Machine Learning Integration
- Predictive Latency: Responses that mimic real network conditions using regression models.
- Smart Caching: Random Forest classifiers to predict cache hit probability.
- Anomaly Detection: Filter out "weird" traffic during load testing using Isolation Forests.
🕸️ GraphQL & WebSockets
- Full Introspection: Your GraphQL clients (Apollo, Relay) will think it's a real server.
- Variable Substitution: Supports `{{id}}` injection logic.
- Live Channels: Broadcast messages to specific WebSocket rooms.
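The project's actual substitution engine is not documented here; as a rough sketch of the idea, `{{name}}`-style injection can be implemented with a small regex renderer (the `render_template` helper below is hypothetical, not api-mocker's API):

```python
import re

def render_template(template: str, variables: dict) -> str:
    """Replace {{name}} placeholders with values from `variables`.

    Unknown placeholders are left intact rather than erased.
    """
    def repl(match):
        key = match.group(1)
        return str(variables.get(key, match.group(0)))
    return re.sub(r"\{\{(\w+)\}\}", repl, template)

print(render_template('{"user_id": "{{id}}"}', {"id": 42}))
# -> {"user_id": "42"}
```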
📊 The Dashboard
A built-in Real-Time Control Center for your mocks.
- Visual Charts for Request Volume
- Latency Heatmaps
- System Health (CPU/Memory)
📚 Documentation & Resources
- 📖 Complete User Guide - The definitive manual.
- 🧪 Testing Strategy - How we verify the project's logic.
- 📈 Marketing & Viral Strategy - Open source growth tactics.
🤝 Support & Enterprise
Author: Sherin Joseph Roy
Connect: LinkedIn | Email
Company: DeepMost AI - Building the Neural Backbone of Software.
"We are democratizing Agency in software development."
API-Mocker is a comprehensive API mocking and development acceleration platform designed for modern software development teams. Built with FastAPI and featuring advanced capabilities including GraphQL support, WebSocket mocking, machine learning integration, and enterprise authentication.
Table of Contents
- Current Status
- Features
- Installation
- Quick Start
- Advanced Features
- CLI Commands
- API Documentation
- Contributing
- License
- Support
Current Status
| Feature Cluster | Status | Stability |
|---|---|---|
| REST Mocking | ✅ Active | Stable |
| Response Generation | ✅ Active | Stable |
| OpenAPI/Postman Import | ✅ Active | Stable |
| GraphQL Mocking | ✅ Active | Stable |
| WebSocket Mocking | ⚠️ Beta | Experimental |
| Authentication | ✅ Active | Stable |
| Database Integration | ✅ Active | Stable |
| ML Integration | ✅ Active | Stable |
Note: "Stable" features are safe for production use. "Experimental" features are implemented but may lack tests or full documentation. "Unstable" features are currently being actively refactored.
Known Limitations
- Test Coverage: Core features are well-tested, but advanced modules (ML, Auth) currently have low test coverage.
- Security: Default configurations are strictly for development. DO NOT deploy to production without configuring `API_MOCKER_SECRET_KEY` and setting up a proper reverse proxy.
- Performance: In-memory caching is used and is not suitable for high-volume scenarios. Redis integration is planned.
Features
Core API Mocking
- REST API Mocking: Complete HTTP method support (GET, POST, PUT, DELETE, PATCH, OPTIONS, HEAD)
- OpenAPI Integration: Import and export OpenAPI specifications
- Postman Compatibility: Seamless Postman collection import/export
- Dynamic Response Generation: AI-powered realistic mock data generation
- Request Recording: Capture and replay real API interactions
Advanced Protocol Support
- GraphQL Mocking: Complete GraphQL schema introspection, query/mutation/subscription support
- WebSocket Mocking: Real-time WebSocket communication with message routing and broadcasting
- WebSocket Rooms: Group messaging and connection management
- Real-time Subscriptions: Live data streaming capabilities
Enterprise Authentication
- OAuth2 Integration: Support for Google, GitHub, Microsoft, Facebook, Twitter, LinkedIn, Discord
- JWT Token Management: Secure access and refresh token handling
- API Key Management: Scoped API keys with granular permissions
- Multi-Factor Authentication: TOTP-based MFA with QR code generation
- Role-Based Access Control: Granular permission system with user roles
- Session Management: Secure session handling with configurable expiration
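The library's real JWT implementation is not reproduced here; the sketch below only illustrates the general issue/verify flow with a stdlib HMAC-signed token (`issue_token`, `verify_token`, and `SECRET` are illustrative names, not api-mocker APIs, and the secret shown is for demonstration only):

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"dev-only-secret"  # hypothetical; never hard-code secrets in practice

def issue_token(payload: dict, ttl: int = 3600) -> str:
    """Create a compact HMAC-SHA256-signed token with an expiry claim."""
    body = dict(payload, exp=int(time.time()) + ttl)
    data = base64.urlsafe_b64encode(json.dumps(body).encode()).decode()
    sig = hmac.new(SECRET, data.encode(), hashlib.sha256).hexdigest()
    return f"{data}.{sig}"

def verify_token(token: str):
    """Return the claims if the signature checks out and the token is unexpired."""
    data, _, sig = token.rpartition(".")
    expected = hmac.new(SECRET, data.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None
    claims = json.loads(base64.urlsafe_b64decode(data))
    return claims if claims["exp"] > time.time() else None

token = issue_token({"sub": "john", "role": "admin"})
print(verify_token(token)["sub"])  # john
```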
Database Integration
- Multi-Database Support: SQLite, PostgreSQL, MongoDB, Redis
- Connection Pooling: Efficient database connection management
- Query Builders: Advanced query construction and optimization
- Database Migrations: Schema versioning and migration management
- Transaction Support: ACID-compliant transaction handling
- Performance Optimization: Intelligent caching and query optimization
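As a minimal illustration of the transaction guarantees described above, here is what ACID rollback looks like with the SQLite backend using the stdlib `sqlite3` module (this is a generic sketch, not api-mocker's own database layer):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mocks (path TEXT PRIMARY KEY, body TEXT)")

try:
    with conn:  # commits on success, rolls back the whole block on exception
        conn.execute("INSERT INTO mocks VALUES (?, ?)", ("/api/users", "[]"))
        # This second insert violates the PRIMARY KEY constraint...
        conn.execute("INSERT INTO mocks VALUES (?, ?)", ("/api/users", "dup"))
except sqlite3.IntegrityError:
    pass

# ...so the entire transaction rolled back: neither row was committed.
print(conn.execute("SELECT COUNT(*) FROM mocks").fetchone()[0])  # 0
```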
Machine Learning Integration
- Intelligent Response Generation: ML-powered response creation and optimization
- Anomaly Detection: Automatic detection of unusual API patterns and behaviors
- Smart Caching: ML-based cache hit prediction and optimization
- Performance Prediction: Response time and error probability prediction
- Pattern Analysis: Usage pattern recognition and behavioral analysis
- Automated Test Generation: AI-powered test case creation and optimization
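The shipped anomaly detector uses Isolation Forests; as a simplified stand-in for the same "flag unusual traffic" idea, a z-score check over recent latencies can be sketched in pure Python (`is_anomalous` is a hypothetical helper, not the project's API):

```python
import statistics

def is_anomalous(latencies_ms, new_value, threshold=3.0):
    """Flag a latency as anomalous if it lies more than `threshold`
    standard deviations from the historical mean."""
    mean = statistics.fmean(latencies_ms)
    stdev = statistics.stdev(latencies_ms)
    return abs(new_value - mean) > threshold * stdev

history = [12.0, 14.5, 13.2, 15.1, 12.8, 14.0, 13.7, 12.9]
print(is_anomalous(history, 14.0))   # False: within normal range
print(is_anomalous(history, 250.0))  # True: clear outlier
```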
Advanced Testing Framework
- Comprehensive Testing: Full test suite with setup/teardown hooks
- Performance Testing: Load testing with concurrent users and detailed metrics
- AI Test Generation: Automatically generate test cases using machine learning
- Assertion Engine: Multiple assertion types (JSON path, headers, regex)
- Test Reports: Detailed test results and performance analysis
- Variable Management: Dynamic variable substitution in test scenarios
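A JSON-path style assertion of the kind listed above can be sketched as follows (these are hypothetical helpers to show the concept, not the framework's actual assertion API):

```python
def get_json_path(data, path: str):
    """Resolve a dotted path like 'users.0.name' against nested dicts/lists."""
    current = data
    for part in path.split("."):
        if isinstance(current, list):
            current = current[int(part)]
        else:
            current = current[part]
    return current

def assert_json_path(response: dict, path: str, expected) -> bool:
    """Return True if the value at `path` equals `expected`."""
    return get_json_path(response, path) == expected

body = {"users": [{"id": 1, "name": "John Doe"}]}
print(assert_json_path(body, "users.0.name", "John Doe"))  # True
```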
Analytics and Monitoring
- Real-time Analytics: Comprehensive request tracking and metrics collection
- Performance Metrics: Response times, error rates, throughput monitoring
- Usage Patterns: Peak hours, user behavior, API dependency analysis
- Cost Optimization: Resource usage insights and optimization recommendations
- Export Capabilities: Analytics data export in JSON/CSV formats
- Dashboard: Web-based real-time monitoring dashboard
Scenario-Based Mocking
- Multiple Scenarios: Happy path, error states, A/B testing, performance scenarios
- Conditional Responses: Request-based response selection
- Scenario Switching: Dynamic scenario activation and deactivation
- Export/Import: Scenario configuration management
- Statistics: Detailed scenario usage analytics
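Scenario switching can be pictured as a small registry keyed by the currently active scenario name (an illustrative sketch, not the real implementation):

```python
class ScenarioStore:
    """Minimal sketch: paths resolve against whichever scenario is active."""

    def __init__(self):
        self.scenarios = {}
        self.active = "happy_path"

    def register(self, name, responses):
        self.scenarios[name] = responses

    def activate(self, name):
        self.active = name

    def respond(self, path):
        return self.scenarios[self.active].get(path, {"status": 404})

store = ScenarioStore()
store.register("happy_path", {"/api/users": {"status": 200, "body": []}})
store.register("error_state", {"/api/users": {"status": 503}})

store.activate("error_state")
print(store.respond("/api/users"))  # {'status': 503}
```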
Smart Response Matching
- Intelligent Selection: AI-powered response selection based on request analysis
- Custom Rules: Flexible rule-based response matching
- Header Matching: Advanced header-based request routing
- Body Analysis: Request body content analysis and matching
- Priority System: Configurable response priority handling
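The priority system boils down to "the highest-priority rule whose conditions all match wins"; a hedged sketch of header-based matching (illustrative `MatchRule`/`select_response`, not the project's API):

```python
from dataclasses import dataclass, field

@dataclass
class MatchRule:
    priority: int
    headers: dict = field(default_factory=dict)
    response: dict = field(default_factory=dict)

    def matches(self, request_headers: dict) -> bool:
        # All header conditions must hold for the rule to apply.
        return all(request_headers.get(k) == v for k, v in self.headers.items())

def select_response(rules, request_headers, default=None):
    """Pick the highest-priority rule whose conditions all match."""
    for rule in sorted(rules, key=lambda r: r.priority, reverse=True):
        if rule.matches(request_headers):
            return rule.response
    return default

rules = [
    MatchRule(priority=1, headers={}, response={"status": 200}),
    MatchRule(priority=10, headers={"X-Scenario": "error"}, response={"status": 500}),
]
print(select_response(rules, {"X-Scenario": "error"}))  # {'status': 500}
print(select_response(rules, {}))                       # {'status': 200}
```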
Installation
Prerequisites
- Python 3.8 or higher
- pip package manager
Basic Installation
pip install api-mocker
Development Installation
git clone https://github.com/Sherin-SEF-AI/api-mocker.git
cd api-mocker
pip install -e .
pip install -r requirements-dev.txt
Docker Installation
docker pull sherinsefai/api-mocker:latest
docker run -p 8000:8000 sherinsefai/api-mocker
Quick Start
Start Mock Server
# Start with default configuration
api-mocker start
# Start with custom configuration
api-mocker start --config my-config.yaml --host 0.0.0.0 --port 8000
Import API Specification
# Import OpenAPI specification
api-mocker import-spec openapi.yaml --output mock-config.yaml
# Import Postman collection
api-mocker import-spec collection.json --output mock-config.yaml
Create Mock Responses
# Create a mock response
api-mocker mock-responses create --name user-api --path /api/users --type templated
# Test the response
api-mocker mock-responses test --path /api/users/123
Advanced Features
GraphQL Mocking
# Start GraphQL mock server
api-mocker graphql start --host localhost --port 8001
# Execute GraphQL query
api-mocker graphql query --query "query { users { id name email } }"
WebSocket Mocking
# Start WebSocket mock server
api-mocker websocket start --host localhost --port 8765
# Broadcast message to room
api-mocker websocket broadcast --message "Hello World" --room "general"
Authentication Management
# Register new user
api-mocker auth register --username john --email john@example.com --password secret
# Create API key
api-mocker auth create-key --key-name "Production API" --permissions "read,write"
# Setup MFA
api-mocker auth setup-mfa
Database Integration
# Setup PostgreSQL database
api-mocker database setup --type postgresql --host localhost --port 5432 --database api_mocker
# Setup MongoDB
api-mocker database setup --type mongodb --host localhost --port 27017 --database api_mocker
# Run database migrations
api-mocker database migrate
Machine Learning Integration
# Train ML models
api-mocker ml train
# Get ML predictions
api-mocker ml predict --request '{"path": "/api/users", "method": "GET", "headers": {"Authorization": "Bearer token"}}'
# Analyze API patterns
api-mocker ml analyze
CLI Commands
Core Commands
- `start`: Start the API mock server
- `import-spec`: Import OpenAPI specifications and Postman collections
- `record`: Record real API interactions for replay
- `replay`: Replay recorded requests as mock responses
- `test`: Run tests against the mock server
- `monitor`: Monitor server requests in real time
- `export`: Export configurations to various formats
Advanced Commands
- `mock-responses`: Manage mock API responses with advanced features
- `graphql`: GraphQL mock server with schema introspection
- `websocket`: WebSocket mock server with real-time messaging
- `auth`: Advanced authentication system management
- `database`: Database integration and operations
- `ml`: Machine learning integration and predictions
- `scenarios`: Scenario-based mocking management
- `smart-matching`: Smart response matching rules
- `enhanced-analytics`: Enhanced analytics and insights
Plugin Management
- `plugins`: Manage api-mocker plugins
- `ai`: AI-powered mock data generation
- `testing`: Advanced testing framework
- `analytics`: Analytics dashboard and metrics
- `advanced`: Configure advanced features
API Documentation
REST API Endpoints
- `GET /`: Health check endpoint
- `GET /docs`: Interactive API documentation
- `POST /mock/{path}`: Create mock response
- `GET /mock/{path}`: Retrieve mock response
- `PUT /mock/{path}`: Update mock response
- `DELETE /mock/{path}`: Delete mock response
GraphQL Endpoints
- `POST /graphql`: GraphQL query endpoint
- `GET /graphql`: GraphQL schema introspection
WebSocket Endpoints
- `WS /ws`: WebSocket connection endpoint
- `WS /ws/{room}`: Room-specific WebSocket connection
Authentication Endpoints
- `POST /auth/register`: User registration
- `POST /auth/login`: User authentication
- `POST /auth/refresh`: Token refresh
- `POST /auth/logout`: User logout
- `GET /auth/profile`: User profile information
Configuration
Basic Configuration (YAML)
```yaml
server:
  host: "127.0.0.1"
  port: 8000
  debug: false

routes:
  - path: "/api/users"
    method: "GET"
    response:
      status_code: 200
      body:
        users:
          - id: 1
            name: "John Doe"
            email: "john@example.com"

authentication:
  enabled: true
  jwt_secret: "your-secret-key"
  token_expiry: 3600
```
### Stateful Resources (NEW)
Define full CRUD resources in seconds. Automatically supports in-memory state persistence.
```yaml
resources:
  - name: users
    path: /api/users
    id_field: id
```

Auto-generated Routes:
- `GET /api/users`: List users (supports `?page=1`, `?q=search`, `?sort=name`)
- `POST /api/users`: Create user
- `GET /api/users/{id}`: Get user details
- `PUT /api/users/{id}`: Update user
- `DELETE /api/users/{id}`: Delete user
Smart Features
All resources come with built-in "Smart Features":
- Pagination: `/api/users?page=1&limit=10`
- Filtering: `/api/users?role=admin`
- Search: `/api/users?q=John`
- Sorting: `/api/users?sort=created_at_desc`
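Conceptually, these query parameters map onto a filter → search → sort → paginate pipeline over the in-memory collection. A hypothetical pure-Python sketch of that pipeline (not the project's implementation):

```python
def query_resource(items, page=1, limit=10, q=None, sort=None, **filters):
    """Apply filter -> search -> sort -> paginate, mirroring the query params."""
    # Filtering: exact match on any field, e.g. role="admin".
    results = [i for i in items
               if all(str(i.get(k)) == v for k, v in filters.items())]
    # Search: case-insensitive substring over all field values.
    if q:
        results = [i for i in results
                   if any(q.lower() in str(v).lower() for v in i.values())]
    # Sorting: "name" or "created_at_desc"-style "<field>_<direction>".
    if sort:
        key, _, direction = sort.rpartition("_")
        if direction not in ("asc", "desc"):
            key, direction = sort, "asc"
        results = sorted(results, key=lambda i: i.get(key),
                         reverse=(direction == "desc"))
    # Pagination.
    start = (page - 1) * limit
    return results[start:start + limit]

users = [
    {"id": 1, "name": "John", "role": "admin"},
    {"id": 2, "name": "Jane", "role": "user"},
]
print(query_resource(users, role="admin"))
# [{'id': 1, 'name': 'John', 'role': 'admin'}]
```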
```yaml
database:
  type: "sqlite"
  path: "api_mocker.db"

analytics:
  enabled: true
  retention_days: 30
```
### Advanced Configuration
```yaml
server:
  host: "0.0.0.0"
  port: 8000
  workers: 4
  reload: false

authentication:
  enabled: true
  providers:
    - name: "google"
      client_id: "your-google-client-id"
      client_secret: "your-google-client-secret"
    - name: "github"
      client_id: "your-github-client-id"
      client_secret: "your-github-client-secret"

database:
  type: "postgresql"
  host: "localhost"
  port: 5432
  database: "api_mocker"
  username: "api_mocker"
  password: "secure-password"
  pool_size: 10

ml:
  enabled: true
  models:
    - name: "response_time_predictor"
      type: "regression"
    - name: "error_probability_predictor"
      type: "classification"

rate_limiting:
  enabled: true
  requests_per_minute: 100
  burst_size: 20

caching:
  enabled: true
  ttl: 300
  max_size: 1000
```
Performance and Scalability
Performance Metrics
- Response Time: Sub-millisecond response times for cached requests
- Throughput: 10,000+ requests per second on modern hardware
- Concurrent Connections: 1,000+ simultaneous WebSocket connections
- Memory Usage: Optimized memory footprint with intelligent caching
- Database Performance: Connection pooling and query optimization
Scalability Features
- Horizontal Scaling: Multi-instance deployment support
- Load Balancing: Built-in load balancing capabilities
- Caching: Multi-level caching system (memory, Redis, database)
- Database Sharding: Support for database sharding and replication
- Microservices: Designed for microservices architecture
Security
Authentication and Authorization
- OAuth2: Industry-standard OAuth2 implementation
- JWT Tokens: Secure JWT token handling with refresh tokens
- API Keys: Scoped API key management with permissions
- MFA Support: Multi-factor authentication with TOTP
- RBAC: Role-based access control with granular permissions
Data Protection
- Encryption: End-to-end encryption for sensitive data
- Secure Storage: Encrypted storage for credentials and tokens
- Input Validation: Comprehensive input validation and sanitization
- Rate Limiting: Protection against abuse and DDoS attacks
- Audit Logging: Comprehensive audit trail for security events
Monitoring and Observability
Metrics Collection
- Request Metrics: Response times, error rates, throughput
- System Metrics: CPU, memory, disk usage
- Business Metrics: User behavior, API usage patterns
- Custom Metrics: Application-specific metrics
Logging
- Structured Logging: JSON-formatted logs with correlation IDs
- Log Levels: Configurable log levels (DEBUG, INFO, WARN, ERROR)
- Log Aggregation: Support for centralized log collection
- Log Retention: Configurable log retention policies
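A JSON-lines formatter carrying a correlation ID can be sketched with the stdlib `logging` module (an illustrative formatter, not api-mocker's actual logger):

```python
import json
import logging
import uuid

class JsonFormatter(logging.Formatter):
    """Emit each record as one JSON line, including a correlation id if set."""
    def format(self, record):
        return json.dumps({
            "level": record.levelname,
            "message": record.getMessage(),
            "correlation_id": getattr(record, "correlation_id", None),
        })

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
log = logging.getLogger("api_mocker_example")
log.addHandler(handler)
log.setLevel(logging.INFO)

# `extra` attaches the correlation id as an attribute on the LogRecord.
log.info("request served", extra={"correlation_id": str(uuid.uuid4())})
```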
Alerting
- Threshold Alerts: Configurable alert thresholds
- Anomaly Detection: ML-powered anomaly detection
- Notification Channels: Email, Slack, webhook notifications
- Escalation Policies: Automated escalation procedures
Deployment
Docker Deployment
```dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
EXPOSE 8000
CMD ["api-mocker", "start", "--host", "0.0.0.0", "--port", "8000"]
```
Kubernetes Deployment
```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: api-mocker
spec:
  replicas: 3
  selector:
    matchLabels:
      app: api-mocker
  template:
    metadata:
      labels:
        app: api-mocker
    spec:
      containers:
        - name: api-mocker
          image: api-mocker:latest
          ports:
            - containerPort: 8000
          env:
            - name: DATABASE_URL
              value: "postgresql://user:pass@db:5432/api_mocker"
```
Cloud Deployment
- AWS: ECS, EKS, Lambda support
- Google Cloud: GKE, Cloud Run support
- Azure: AKS, Container Instances support
- Heroku: One-click deployment
- DigitalOcean: App Platform support
Contributing
We welcome contributions from the community! Please see our Contributing Guidelines for details.
Development Setup
git clone https://github.com/Sherin-SEF-AI/api-mocker.git
cd api-mocker
pip install -e ".[dev]"
pre-commit install
Running Tests
pytest tests/
pytest tests/ --cov=api_mocker --cov-report=html
Code Quality
- Type Hints: Full type annotation support
- Linting: Black, isort, flake8, mypy
- Testing: Comprehensive test coverage
- Documentation: Sphinx documentation generation
License
This project is licensed under the MIT License - see the LICENSE file for details.
Support
Documentation
- User Guide: Complete User Guide
- API Reference: API Documentation
- Examples: Usage Examples
- Tutorials: Step-by-step Tutorials
Community Support
- GitHub Issues: Report bugs and request features
- Discussions: Community discussions
- Stack Overflow: Tag questions with `api-mocker`
- Discord: Join our Discord community
Commercial Support
For enterprise support, custom development, and consulting services, please contact:
Author: Sherin Joseph Roy
Email: connect@sherinjosephroy.link
Company: DeepMost AI
Role: Co-founder, Head of Products
Specialization: Enterprise AI solutions and API development platforms
Enterprise Features
- Priority Support: 24/7 enterprise support
- Custom Development: Tailored solutions for your needs
- Training: Team training and workshops
- Consulting: Architecture and implementation consulting
- SLA: Service level agreements available
Roadmap
Upcoming Features
- GraphQL Federation: Multi-service GraphQL federation support
- gRPC Mocking: Protocol buffer and gRPC service mocking
- Advanced ML Models: More sophisticated machine learning models
- Enterprise SSO: Single sign-on integration
- Advanced Monitoring: Prometheus and Grafana integration
- API Gateway: Built-in API gateway functionality
Version History
- v0.4.0: Advanced features with GraphQL, WebSocket, ML integration
- v0.3.0: Mock response management system
- v0.2.0: AI-powered generation and analytics
- v0.1.0: Initial release with core functionality
Statistics
- Downloads: 3000+ and growing
- GitHub Stars: Growing community
- Contributors: Active development community
- Issues Resolved: Active triage
- Test Coverage: ~85% for Core, ~84% for Auth, ~59% for Database
- Documentation: Comprehensive documentation coverage
API-Mocker - The industry-standard, production-ready, free API mocking and development acceleration tool. Built for modern software development teams who demand excellence in API development and testing.
Keywords: API mocking, mock server, API testing, REST API, GraphQL, WebSocket, machine learning, authentication, database integration, enterprise software, development tools, testing framework, microservices, API development, FastAPI, Python, open source
Project details
Release history
Download files
Source Distribution
Built Distribution
File details
Details for the file api_mocker-0.5.1.tar.gz.
File metadata
- Download URL: api_mocker-0.5.1.tar.gz
- Upload date:
- Size: 116.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `fc55635b595bbe2fde2ebd7be9c196d8976960511be432b0255118f681f6f313` |
| MD5 | `6fd141ec16a0e3ca1f090f8a04d6700a` |
| BLAKE2b-256 | `e178c4643b91d285afea134a5c155823d37d59545f00a6bd31a353e1d086b15b` |
File details
Details for the file api_mocker-0.5.1-py3-none-any.whl.
File metadata
- Download URL: api_mocker-0.5.1-py3-none-any.whl
- Upload date:
- Size: 103.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `6760916e167228eb94fe8f8e13b2ee6248e3aab1ab060765df5174e6f80dead5` |
| MD5 | `dff7a77eb1bf1ea4147b802180b0a613` |
| BLAKE2b-256 | `f167a5fcb7d6c349194a55ebe0a02fa0878f717546df0eb8f766ea676a445ed9` |