# TNSA API Python Client
A powerful, OpenAI-compatible Python SDK for TNSA NGen3 Pro and Lite Models with MCP (Machine Control Protocol) integration.
## Features
- 🚀 OpenAI-Compatible API - Familiar interface for easy migration
- ⚡ Async & Sync Support - Both synchronous and asynchronous clients
- 🌊 Streaming Responses - Real-time token streaming for interactive applications
- 🔧 Comprehensive Error Handling - Robust error handling with retry logic
- 📊 Usage Tracking - Built-in token counting and cost estimation
- 💬 Conversation Management - Automatic chat history and context management
- 🔒 Secure Authentication - API key management with environment variable support
- 📝 Type Safety - Full type hints for better IDE support
- 🎯 Framework Integration - Works seamlessly with FastAPI, Django, and more
- 🤖 MCP Client - Integrated Machine Control Protocol client for tool and service integration
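The usage-tracking feature can be sketched as a small cost estimator. The field names (`prompt_tokens`, `completion_tokens`) follow the OpenAI-compatible convention the SDK advertises, and the per-token prices below are made-up placeholders, not actual TNSA pricing:

```python
# Hypothetical cost estimation from an OpenAI-style usage object.
# Prices are illustrative placeholders, not real TNSA rates.
def estimate_cost(prompt_tokens: int, completion_tokens: int,
                  prompt_price_per_1k: float = 0.001,
                  completion_price_per_1k: float = 0.002) -> float:
    """Return the estimated cost in dollars for one completion."""
    return (prompt_tokens / 1000) * prompt_price_per_1k + \
           (completion_tokens / 1000) * completion_price_per_1k

# Example: 500 prompt tokens + 200 completion tokens
cost = estimate_cost(500, 200)
print(f"${cost:.4f}")  # → $0.0009
```

In an OpenAI-compatible response, these counts would typically come from `response.usage`.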
## Installation

```bash
pip install tnsa-api
```
## Quick Start

### Basic Usage
```python
from tnsa_api_v2 import TNSA

# Initialize the client
client = TNSA(api_key="your-api-key")

# List available models
models = client.models.list()
print("Available models:", [model.id for model in models])

# Create a chat completion
response = client.chat.completions.create(
    model="NGen3.9-Pro",
    messages=[
        {"role": "user", "content": "Hello, how are you?"}
    ]
)

print(response.choices[0].message.content)
```
### MCP Client Usage
The TNSA SDK includes a client for the Machine Control Protocol (MCP), enabling integration with various tools and services.
```python
import asyncio
from tnsa_api_v2 import MCPClient

async def run_mcp_example():
    async with MCPClient(
        server_url="https://mcp.example.com",
        api_key="your-mcp-api-key"  # Optional
    ) as client:
        # List available tools
        tools = await client.list_tools()
        print("Available tools:", tools)

        # Call a tool
        if tools:
            result = await client.call_tool(
                tool_name=tools[0]["name"],
                params={"param1": "value1"}
            )
            print("Tool result:", result)

# Run the example
asyncio.run(run_mcp_example())
```
For more details, see the MCP Client Documentation.
### Streaming Example
```python
# Streaming responses
stream = client.chat.completions.create(
    model="NGen3.9-Lite",
    messages=[{"role": "user", "content": "Tell me a story"}],
    stream=True
)

for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
```
### Async Usage
```python
import asyncio
from tnsa_api_v2 import AsyncTNSA

async def main():
    client = AsyncTNSA(api_key="your-api-key")
    response = await client.chat.completions.acreate(
        model="NGen3.9-Pro",
        messages=[{"role": "user", "content": "Hello!"}]
    )
    print(response.choices[0].message.content)

asyncio.run(main())
```
## Configuration
The client can be configured using environment variables, configuration files, or direct parameters:
### Environment Variables

```bash
export TNSA_API_KEY="your-api-key"
export TNSA_BASE_URL="https://api.tnsaai.com"
export TNSA_TIMEOUT=30.0
```
### Configuration File (config.yaml)

```yaml
api_key: "your-api-key"
base_url: "https://api.tnsaai.com"
timeout: 30.0
max_retries: 3
default_model: "NGen3.9-Pro"
```
### Direct Parameters

```python
client = TNSA(
    api_key="your-api-key",
    base_url="https://api.tnsaai.com",
    timeout=30.0,
    max_retries=3
)
```
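When several sources supply the same setting, SDKs in this style usually resolve them in a fixed order: explicit parameter first, then the `TNSA_*` environment variable, then a default. The exact precedence used by this SDK is an assumption here; the sketch below only illustrates the pattern:

```python
import os

# Sketch of a typical configuration-resolution order:
# explicit parameter > TNSA_* environment variable > default.
# The SDK's actual precedence is assumed, not confirmed.
def resolve_setting(name: str, explicit=None, default=None):
    if explicit is not None:
        return explicit
    env_value = os.environ.get(f"TNSA_{name.upper()}")
    if env_value is not None:
        return env_value
    return default

os.environ["TNSA_TIMEOUT"] = "30.0"
print(resolve_setting("timeout", default=10.0))  # env var wins over the default
print(resolve_setting("timeout", explicit=5.0))  # explicit parameter wins over the env var
```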
## Available Models
- NGen3.9-Pro - High-performance model for complex tasks
- NGen3.9-Lite - Fast, efficient model for general use
- NGen3-7B-0625 - Specialized model variant
- Farmvaidya-Bot - Agricultural domain-specific model
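Based on the descriptions above (Lite for speed, Pro for complex tasks), model selection can be reduced to a tiny helper. The helper itself is illustrative, not part of the SDK:

```python
# Illustrative helper: choose between the two general-purpose models
# listed above based on whether latency or capability matters more.
def pick_model(prefer_speed: bool) -> str:
    return "NGen3.9-Lite" if prefer_speed else "NGen3.9-Pro"

print(pick_model(prefer_speed=True))   # NGen3.9-Lite
print(pick_model(prefer_speed=False))  # NGen3.9-Pro
```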
## Error Handling
```python
from tnsa_api_v2 import TNSAError, RateLimitError, AuthenticationError

try:
    response = client.chat.completions.create(
        model="NGen3.9-Pro",
        messages=[{"role": "user", "content": "Hello!"}]
    )
except AuthenticationError:
    print("Invalid API key")
except RateLimitError:
    print("Rate limit exceeded")
except TNSAError as e:
    print(f"API error: {e}")
```
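The retry logic driven by `max_retries` can be approximated by an exponential-backoff loop like the one below. This is a standalone sketch, not the SDK's actual implementation, and `RateLimitError` here is a local placeholder class standing in for the SDK's exception:

```python
import time
import random

# Local placeholder for the SDK's RateLimitError.
class RateLimitError(Exception):
    pass

def with_retries(fn, max_retries=3, base_delay=0.5):
    """Call fn, retrying on RateLimitError with exponential backoff + jitter."""
    for attempt in range(max_retries + 1):
        try:
            return fn()
        except RateLimitError:
            if attempt == max_retries:
                raise  # out of retries: propagate the error
            delay = base_delay * (2 ** attempt) + random.uniform(0, 0.1)
            time.sleep(delay)

# Example: a call that fails twice, then succeeds on the third attempt.
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RateLimitError()
    return "ok"

print(with_retries(flaky, base_delay=0.01))  # ok
```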
## Support
- 📧 Email: info@tnsaai.com
- 🐛 Issues: GitHub Issues
- 📖 Documentation: https://docs.tnsaai.com
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## File details: tnsa_api-1.1.0.tar.gz

File metadata:

- Size: 31.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.5

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | 4554e07be60fec6930455728274fe14672ef48c053d002d598bb3b6d2c854e3d |
| MD5 | e5a62a87549feb939f80c6a6a0371cc4 |
| BLAKE2b-256 | 1bc39cd6416ad955cef9e615f8884605f3ac067fbe8ca14d2ee796991c9da9e4 |
## File details: tnsa_api-1.1.0-py3-none-any.whl

File metadata:

- Size: 39.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.5

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | cfb9c974d041d48f35a0f992e1177eef14fb693371bf747764e8f9021ca392a2 |
| MD5 | 11623cc76db2c98102a9230df030134c |
| BLAKE2b-256 | f3f83307f8aefca4ea5024d5c10d4a10a9f6f8ae8f5c3badb1dc9b8399029d4d |