# nzrApi Framework

🤖 Modern Async Python Framework for AI APIs with Native MCP Support

## ✨ What is nzrApi?
nzrApi is a powerful, production-ready Python framework specifically designed for building AI-powered APIs. It combines the best of modern web frameworks with specialized features for AI model integration, making it the perfect choice for developers who want to build scalable AI services with minimal complexity.
## 🎯 Key Features
- 🤖 Native AI Model Integration - First-class support for multiple AI providers and custom models
- 🔄 Model Context Protocol (MCP) - Built-in MCP implementation for seamless n8n integration
- ⚡ High Performance - Async/await throughout with ASGI compliance
- 📊 Context Management - Persistent conversation contexts with automatic cleanup
- 🛡️ Production Ready - Rate limiting, authentication, monitoring, and error handling
- 🗄️ Database Integration - SQLAlchemy async with automatic migrations
- 🎨 DRF-Inspired Serializers - Familiar, powerful data validation and transformation
- 🚀 Auto-Generation - CLI tools for rapid project scaffolding
- 🐳 Cloud Native - Docker support with production configurations
## 🚀 Quick Start

### Installation

```bash
pip install nzrapi
```
### Create Your First AI API

```bash
# Create a new project
nzrapi new my-ai-api

# Navigate to the project
cd my-ai-api

# Run the development server
nzrapi run --reload
```

Your AI API is now running at http://localhost:8000! 🎉
### Hello World Example

```python
from nzrapi import NzrApiApp, Router

app = NzrApiApp(title="My AI API")
router = Router()

@router.post("/chat")
async def chat(request):
    data = await request.json()
    # Use built-in AI model
    model = request.app.ai_registry.get_model("default")
    result = await model.predict({"message": data["message"]})
    return {"response": result["response"]}

app.include_router(router)

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="0.0.0.0", port=8000)
```
## 🤖 AI Model Integration

nzrApi makes it incredibly easy to work with AI models:

```python
from nzrapi.ai.models import AIModel

class MyCustomModel(AIModel):
    async def load_model(self):
        # Load your model (PyTorch, HuggingFace, OpenAI, etc.)
        self.model = load_my_model()
        self.is_loaded = True

    async def predict(self, payload, context=None):
        # Make predictions with optional context
        result = self.model.generate(payload["prompt"])
        return {"response": result}

# Register and use
app.ai_registry.register_model_class("custom", MyCustomModel)
await app.ai_registry.add_model("my_model", "custom", config={...})
```
### Supported AI Providers
- ✅ OpenAI (GPT-3.5, GPT-4, etc.)
- ✅ Anthropic (Claude models)
- ✅ HuggingFace (Transformers, Inference API)
- ✅ Custom Models (PyTorch, TensorFlow, etc.)
- ✅ Mock Models (for development and testing)
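Whatever the provider, a model plugs in through the same `load_model`/`predict` contract shown above. A minimal stdlib-only sketch of that contract (no nzrApi import; `EchoModel` and its canned reply are hypothetical, purely for illustration):

```python
import asyncio

class EchoModel:
    """Hypothetical stand-in that mimics the AIModel load/predict contract."""

    def __init__(self):
        self.is_loaded = False

    async def load_model(self):
        # A real model would load weights or open an API client here
        self.is_loaded = True

    async def predict(self, payload, context=None):
        # Echo the prompt back, noting how many prior turns the context held
        turns = len(context) if context else 0
        return {"response": f"echo: {payload['prompt']}", "prior_turns": turns}

async def main():
    model = EchoModel()
    await model.load_model()
    result = await model.predict({"prompt": "hello"})
    print(result["response"])  # echo: hello

asyncio.run(main())
```

Because every provider exposes the same two coroutines, swapping a mock for a real backend is a registry change, not a code change.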
## 🔄 Model Context Protocol (MCP)

nzrApi implements the Model Context Protocol for stateful AI interactions:

```python
# MCP-compliant endpoint
@router.post("/mcp/{model_name}/predict")
async def mcp_predict(request, model_name: str):
    # Automatic context management
    mcp_request = MCPRequest(**(await request.json()))

    # Retrieve conversation context
    context = await get_context(mcp_request.context_id)

    # Make prediction with context
    model = request.app.ai_registry.get_model(model_name)
    result = await model.predict(mcp_request.payload, context)

    # Return MCP-compliant response
    return MCPResponse(
        request_id=mcp_request.request_id,
        context_id=mcp_request.context_id,
        result=result
    )
```
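The envelope the endpoint above relies on can be sketched with plain dataclasses to make the field roles explicit (`request_id` correlates request and response, `context_id` ties turns into one conversation). This is an illustration of the protocol shape, not nzrApi's actual `MCPRequest`/`MCPResponse` classes:

```python
import uuid
from dataclasses import dataclass, field
from typing import Any, Optional

@dataclass
class MCPRequest:
    payload: dict
    context_id: Optional[str] = None  # ties turns into one conversation
    request_id: str = field(default_factory=lambda: str(uuid.uuid4()))

@dataclass
class MCPResponse:
    request_id: str  # echoes the request id for correlation
    context_id: Optional[str]
    result: Any

# In-memory context store keyed by context_id (a real server persists this)
contexts = {}

def handle(req: MCPRequest, reply: str) -> MCPResponse:
    history = contexts.setdefault(req.context_id or req.request_id, [])
    history.append(req.payload)  # each turn is appended to the conversation
    return MCPResponse(
        request_id=req.request_id,
        context_id=req.context_id,
        result={"response": reply},
    )
```

Repeated calls with the same `context_id` accumulate history, which is what lets the model see earlier turns.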
## 🎨 Powerful Serializers

nzrApi provides robust data validation:

```python
from starlette.responses import JSONResponse

from nzrapi.serializers import BaseSerializer, CharField, FloatField

class ChatRequestSerializer(BaseSerializer):
    message = CharField(max_length=1000)
    user_id = CharField(required=False)
    temperature = FloatField(min_value=0.0, max_value=2.0, default=0.7)

    def validate(self, data):
        # Custom validation logic
        return data

# Use in endpoints
@router.post("/chat")
async def chat(request):
    data = await request.json()
    serializer = ChatRequestSerializer(data=data)
    if serializer.is_valid():
        validated_data = serializer.validated_data
        # Process with confidence...
    else:
        return JSONResponse(serializer.errors, status_code=422)
```
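Conceptually, `is_valid()` runs each declared field against the incoming data and collects per-field errors. A simplified stdlib-only stand-in (not nzrApi's implementation; class and method names here are illustrative):

```python
class CharField:
    def __init__(self, max_length=None, required=True):
        self.max_length = max_length
        self.required = required

    def run(self, name, value, errors):
        if value is None:
            if self.required:
                errors[name] = "This field is required."
            return None
        if self.max_length is not None and len(value) > self.max_length:
            errors[name] = f"Longer than {self.max_length} characters."
        return value

class FloatField:
    def __init__(self, min_value=None, max_value=None, default=None):
        self.min_value, self.max_value, self.default = min_value, max_value, default

    def run(self, name, value, errors):
        if value is None:
            return self.default  # missing optional field falls back to default
        value = float(value)
        if self.min_value is not None and value < self.min_value:
            errors[name] = f"Must be >= {self.min_value}."
        if self.max_value is not None and value > self.max_value:
            errors[name] = f"Must be <= {self.max_value}."
        return value

def validate(fields, data):
    """Run every field against `data`; return (validated_data, errors)."""
    validated, errors = {}, {}
    for name, f in fields.items():
        validated[name] = f.run(name, data.get(name), errors)
    return validated, errors

fields = {
    "message": CharField(max_length=1000),
    "temperature": FloatField(min_value=0.0, max_value=2.0, default=0.7),
}
```

An empty `errors` dict is the equivalent of `is_valid()` returning `True`; otherwise the dict maps field names to messages, which is exactly what gets returned with status 422 above.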
## 🗄️ Database Integration

Built-in async database support with SQLAlchemy:

```python
from datetime import datetime

from sqlalchemy import Column, DateTime, Integer, String, Text

from nzrapi.db import Base

class ConversationHistory(Base):
    __tablename__ = "conversations"

    id = Column(Integer, primary_key=True)
    user_id = Column(String(255), index=True)
    message = Column(Text)
    response = Column(Text)
    created_at = Column(DateTime, default=datetime.utcnow)

# Use in endpoints
@router.post("/chat")
async def chat(request):
    async with request.app.get_db_session() as session:
        # Save conversation
        conversation = ConversationHistory(
            user_id=user_id,
            message=message,
            response=response
        )
        session.add(conversation)
        await session.commit()
```
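The ORM model above maps to an ordinary table. To make the schema concrete, here is the same conversation log sketched with stdlib `sqlite3` (column names follow the model; the literal SQL is an illustration of what the ORM would emit, not its exact output):

```python
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE conversations (
        id INTEGER PRIMARY KEY,
        user_id TEXT,
        message TEXT,
        response TEXT,
        created_at TEXT
    )
""")
# index=True on the model corresponds to a secondary index on user_id
conn.execute("CREATE INDEX ix_conversations_user_id ON conversations (user_id)")

def save_conversation(user_id, message, response):
    """Insert one chat turn, timestamped in UTC."""
    conn.execute(
        "INSERT INTO conversations (user_id, message, response, created_at) "
        "VALUES (?, ?, ?, ?)",
        (user_id, message, response, datetime.now(timezone.utc).isoformat()),
    )
    conn.commit()

save_conversation("u1", "hi", "hello")
```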
## 🛡️ Production Features

### Rate Limiting

```python
from nzrapi.middleware import RateLimitMiddleware

app.add_middleware(
    RateLimitMiddleware,
    calls_per_minute=60,
    calls_per_hour=1000
)
```
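A per-minute cap like `calls_per_minute=60` is typically enforced by tracking recent request timestamps per client. A minimal sliding-window sketch (stdlib only; this is a conceptual stand-in, not nzrApi's middleware):

```python
import time
from collections import defaultdict, deque

class SlidingWindowLimiter:
    """Allow at most `limit` calls per `window` seconds, per client key."""

    def __init__(self, limit, window=60.0):
        self.limit = limit
        self.window = window
        self.calls = defaultdict(deque)  # key -> timestamps of recent calls

    def allow(self, key, now=None):
        now = time.monotonic() if now is None else now
        q = self.calls[key]
        # Drop timestamps that fell out of the window
        while q and now - q[0] >= self.window:
            q.popleft()
        if len(q) >= self.limit:
            return False  # over the per-window budget: respond with HTTP 429
        q.append(now)
        return True
```

The middleware would call `allow()` with something like the client IP or API key before dispatching each request.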
### Authentication

```python
from nzrapi.middleware import AuthenticationMiddleware

app.add_middleware(
    AuthenticationMiddleware,
    secret_key="your-secret-key"
)
```
### CORS for n8n

```python
from starlette.middleware.cors import CORSMiddleware

app.add_middleware(
    CORSMiddleware,
    allow_origins=["https://app.n8n.cloud"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"]
)
```
## 🔧 CLI Tools

nzrApi includes powerful CLI tools for development:

```bash
# Create new project
nzrapi new my-project --template mcp-server

# Run development server
nzrapi run --reload --port 8000

# Database migrations
nzrapi migrate -m "Add user table"
nzrapi migrate --upgrade

# Model management
nzrapi models --list
nzrapi models --add openai_gpt4 --type openai

# Project info
nzrapi info
```
## 🌐 n8n Integration

Perfect for n8n workflows with built-in MCP support:

```json
{
  "nodes": [{
    "name": "AI Chat",
    "type": "n8n-nodes-base.httpRequest",
    "parameters": {
      "url": "http://your-api.com/api/v1/mcp/gpt4/predict",
      "method": "POST",
      "body": {
        "context_id": "{{ $json.session_id }}",
        "payload": {
          "message": "{{ $json.user_input }}"
        }
      }
    }
  }]
}
```
## 📊 Monitoring & Observability

Built-in monitoring capabilities:

```text
# Health checks
GET /health
GET /api/v1/models/{name}/health

# Metrics
GET /metrics
GET /api/v1/stats

# Usage analytics
GET /api/v1/usage/models
GET /api/v1/conversations/{context_id}
```
## 🐳 Docker Deployment

Production-ready containers:

```dockerfile
FROM python:3.11-slim
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
EXPOSE 8000
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]
```

```bash
# Build and run
docker build -t my-ai-api .
docker run -p 8000:8000 my-ai-api

# Or use docker-compose
docker-compose up -d
```
## 📚 Examples
Check out our comprehensive examples:
- Basic API - Simple AI API with chat functionality
- Advanced Chatbot - Full-featured chatbot with personality
- n8n Integration - Complete n8n workflow examples
- Custom Models - Implementing your own AI models
## 🤝 Contributing

We welcome contributions! Please see our Contributing Guide for details.

```bash
# Development setup
git clone https://github.com/nzrapi/nzrapi.git
cd nzrapi
pip install -e ".[dev]"

# Run tests
pytest

# Run linting
black .
isort .
flake8
```
## 📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
## 🙏 Acknowledgments
- Built on the excellent Starlette foundations
- Designed for seamless n8n integration
- Community-driven development
## 🔗 Links
- Homepage: https://nzrapi.dev
- Documentation: https://nzrapi.readthedocs.io
- PyPI: https://pypi.org/project/nzrapi/
- GitHub: https://github.com/nzrapi/nzrapi
- Discord: https://discord.gg/nzrapi
Built with ❤️ for the AI community

**nzrApi Framework** - Making AI APIs Simple and Powerful