NeoAPI SDK
The official Python SDK for integrating neoapi.ai LLM Analytics with your LLM pipelines. Track, analyze, and optimize your Language Model outputs with real-time analytics.
Installation
```bash
pip install neoapi-sdk
```
Quick Start Guide
First, set your API key as an environment variable:
```bash
export NEOAPI_API_KEY="your-api-key"
```
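If you prefer to fail fast when the key is missing, a small helper can check the environment before a client is created. This is a minimal sketch, assuming the client reads `NEOAPI_API_KEY` when no `api_key` argument is passed (as described below); `require_api_key` is a hypothetical helper, not part of the SDK:

```python
import os

def require_api_key() -> str:
    """Return the NeoAPI key from the environment, raising early if absent."""
    key = os.getenv("NEOAPI_API_KEY")  # same variable the SDK is documented to use
    if not key:
        raise RuntimeError("Set NEOAPI_API_KEY before creating a NeoAPI client")
    return key
```

Calling this once at startup surfaces a missing key immediately, rather than at the first tracked LLM call.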
Basic Usage
```python
from neoapi import NeoApiClientSync, track_llm_output

# The context manager handles the client lifecycle automatically
with NeoApiClientSync() as client:

    # Decorate your LLM function to track its outputs
    @track_llm_output(client=client)
    def get_llm_response(prompt: str) -> str:
        # Your LLM logic here
        return "AI generated response"

    # Use your function normally
    response = get_llm_response("What is machine learning?")
```
Async Support
```python
import asyncio
from neoapi import NeoApiClientAsync, track_llm_output

async def main():
    async with NeoApiClientAsync() as client:

        @track_llm_output(
            client=client,
            project="chatbot",
            need_analysis_response=True  # Get analytics feedback
        )
        async def get_llm_response(prompt: str) -> str:
            # Your async LLM logic here
            await asyncio.sleep(0.1)  # Simulated API call
            return "Async AI response"

        response = await get_llm_response("Explain async programming")

# Run your async code
asyncio.run(main())
```
OpenAI Integration Example
```python
from openai import OpenAI
from neoapi import NeoApiClientSync, track_llm_output

def chat_with_gpt():
    openai_client = OpenAI()  # Uses the OPENAI_API_KEY env variable

    with NeoApiClientSync() as neo_client:

        @track_llm_output(
            client=neo_client,
            project="gpt4_chat",
            need_analysis_response=True,  # Get quality metrics
            format_json_output=True       # Pretty-print analytics
        )
        def ask_gpt(prompt: str) -> str:
            response = openai_client.chat.completions.create(
                messages=[{"role": "user", "content": prompt}],
                model="gpt-4o-mini"
            )
            return response.choices[0].message.content

        # Use the tracked function
        response = ask_gpt("What are the key principles of clean code?")
        print(response)  # Analytics will be logged automatically
```
Key Features
- 🔄 Automatic Tracking: Decorator-based output monitoring
- ⚡ Async Support: Built for high-performance async applications
- 🔍 Real-time Analytics: Get immediate feedback on output quality
- 🛠 Flexible Integration: Works with any LLM provider
- 🔧 Configurable: Extensive customization options
- 🔐 Secure: Environment-based configuration
Configuration Options
Environment Variables
```bash
# Required
export NEOAPI_API_KEY="your-api-key"
```
Client Configuration
```python
client = NeoApiClientSync(
    # Basic settings
    api_key="your-api-key",  # Optional if the env var is set
    check_frequency=1,       # Process every Nth output

    # Performance tuning
    batch_size=10,           # Outputs per batch
    flush_interval=5.0,      # Seconds between flushes
    max_retries=3,           # Retry attempts on failure

    # Advanced options
    api_url="custom-url",    # Optional API endpoint
    max_batch_size=100,      # Maximum batch size
)
```
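To make the sampling effect of `check_frequency` concrete, here is a rough illustration assuming the documented "process every Nth output" semantics; `analyzed_calls` is a hypothetical helper for the example, not SDK code:

```python
def analyzed_calls(total_calls: int, check_frequency: int) -> list[int]:
    # Assumed semantics: with check_frequency=N, every Nth tracked output
    # is sent for analysis; the rest are skipped.
    return [i for i in range(1, total_calls + 1) if i % check_frequency == 0]

# With check_frequency=5, only calls 5 and 10 out of 10 would be analyzed.
analyzed_calls(10, 5)
```

Raising `check_frequency` trades analytics coverage for lower overhead on high-volume pipelines.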
Decorator Options
```python
@track_llm_output(
    client=client,
    # Organization
    project="my_project",         # Project identifier
    group="experiment_a",         # Subgroup within the project
    analysis_slug="v1.2",         # Version or analysis identifier

    # Analytics
    need_analysis_response=True,  # Get quality metrics
    format_json_output=True,      # Pretty-print analytics

    # Custom data
    metadata={                    # Additional tracking info
        "model": "gpt-4",
        "temperature": 0.7,
        "user_id": "user123"
    },
    save_text=True                # Store the output text
)
```
Best Practices

- Use context managers: they handle the client lifecycle automatically.

  ```python
  with NeoApiClientSync() as client:
      ...  # Your code here
  ```

- Group related outputs: use the project and group parameters.

  ```python
  @track_llm_output(client=client, project="chatbot", group="user_support")
  ```

- Add relevant metadata: include context for better analysis.

  ```python
  @track_llm_output(
      client=client,
      metadata={"user_type": "premium", "session_id": "abc123"}
  )
  ```
License
Apache License 2.0 - See LICENSE for details