# HelpingAI Python SDK

The official Python library for the HelpingAI API - Advanced AI with Emotional Intelligence
## Features
- OpenAI-Compatible API: Drop-in replacement with familiar interface
- Emotional Intelligence: Advanced AI models with emotional understanding
- Streaming Support: Real-time response streaming
- Comprehensive Error Handling: Detailed error types and retry mechanisms
- Type Safety: Full type hints and IDE support
- Flexible Configuration: Environment variables and direct initialization
## Installation

```bash
pip install HelpingAI
```
## Authentication
Get your API key from the HelpingAI Dashboard.
### Environment Variable (Recommended)

```bash
export HAI_API_KEY='your-api-key'
```
### Direct Initialization

```python
from HelpingAI import HAI

hai = HAI(api_key='your-api-key')
```
## Quick Start
```python
from HelpingAI import HAI

# Initialize client
hai = HAI()

# Create a chat completion
response = hai.chat.completions.create(
    model="Helpingai3-raw",
    messages=[
        {"role": "system", "content": "You are an expert in emotional intelligence."},
        {"role": "user", "content": "What makes a good leader?"}
    ]
)

print(response.choices[0].message.content)
```
## Streaming Responses
```python
# Stream responses in real-time
for chunk in hai.chat.completions.create(
    model="Helpingai3-raw",
    messages=[{"role": "user", "content": "Tell me about empathy"}],
    stream=True
):
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
```
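If you need the complete text after the stream finishes, the deltas can be accumulated as they arrive. The helper below is an illustrative sketch, not part of the SDK; it assumes chunks expose the OpenAI-style `choices[0].delta.content` shape shown above (demonstrated here with mock chunk objects):

```python
from types import SimpleNamespace

def collect_stream(chunks):
    """Join the text deltas from a stream of chat-completion chunks."""
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta
        if delta.content:  # some chunks carry no text (e.g. role-only deltas)
            parts.append(delta.content)
    return "".join(parts)

# Mock chunks mimicking the shape used in the streaming loop above
def fake_chunk(text):
    return SimpleNamespace(choices=[SimpleNamespace(delta=SimpleNamespace(content=text))])

stream = [fake_chunk("Empathy "), fake_chunk(None), fake_chunk("matters.")]
print(collect_stream(stream))  # Empathy matters.
```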
## Advanced Configuration

### Parameter Control
```python
response = hai.chat.completions.create(
    model="Dhanishtha-2.0-preview",
    messages=[{"role": "user", "content": "Write a story about empathy"}],
    temperature=0.7,        # Controls randomness (0-1)
    max_tokens=500,         # Maximum length of response
    top_p=0.9,              # Nucleus sampling parameter
    frequency_penalty=0.3,  # Reduces repetition
    presence_penalty=0.3,   # Encourages new topics
    hide_think=True         # Filter out reasoning blocks
)
```
### Client Configuration

```python
hai = HAI(
    api_key="your-api-key",
    base_url="https://api.helpingai.co/v1",  # Custom base URL
    timeout=30.0,                            # Request timeout
    organization="your-org-id"               # Organization ID
)
```
## Error Handling
```python
from HelpingAI import HAI, HAIError, RateLimitError, InvalidRequestError
import time

hai = HAI()

def make_completion_with_retry(messages, max_retries=3):
    for attempt in range(max_retries):
        try:
            return hai.chat.completions.create(
                model="Helpingai3-raw",
                messages=messages
            )
        except RateLimitError as e:
            if attempt == max_retries - 1:
                raise
            time.sleep(e.retry_after or 1)
        except InvalidRequestError as e:
            print(f"Invalid request: {e}")
            raise
        except HAIError as e:
            print(f"API error: {e}")
            raise
```
## Available Models

### Helpingai3-raw
- Advanced Emotional Intelligence: Enhanced emotional understanding and contextual awareness
- Training Data: 15M emotional dialogues, 3M therapeutic exchanges, 250K cultural conversations, 1M crisis response scenarios
- Best For: AI companionship, emotional support, therapy guidance, personalized learning
### Dhanishtha-2.0-preview
- World's First Intermediate Thinking Model: Multi-phase reasoning with self-correction capabilities
- Unique Features: `<think>...</think>` blocks for transparent reasoning, structured emotional reasoning (SER)
- Best For: Complex problem-solving, analytical tasks, educational content, reasoning-heavy applications
```python
# List all available models
models = hai.models.list()
for model in models:
    print(f"Model: {model.id} - {model.description}")

# Get specific model info
model = hai.models.retrieve("Helpingai3-raw")
print(f"Model: {model.name}")

# Use Dhanishtha-2.0 for complex reasoning
response = hai.chat.completions.create(
    model="Dhanishtha-2.0-preview",
    messages=[{"role": "user", "content": "Solve this step by step: What's 15% of 240?"}],
    hide_think=False  # Show reasoning process
)
```
## Documentation
Comprehensive documentation is available:
- Getting Started Guide - Installation and basic usage
- API Reference - Complete API documentation
- Examples - Code examples and use cases
- FAQ - Frequently asked questions
## Project Structure

```
HelpingAI-python/
├── HelpingAI/           # Main package
│   ├── __init__.py      # Package initialization
│   ├── client.py        # Main HAI client
│   ├── models.py        # Model management
│   ├── base_models.py   # Data models
│   ├── error.py         # Exception classes
│   └── version.py       # Version information
├── docs/                # Documentation
├── tests/               # Test suite
├── setup.py             # Package configuration
└── README.md            # This file
```
## Requirements

- Python: 3.7-3.14
- Dependencies:
  - `requests` - HTTP client
  - `typing_extensions` - Type hints support
## Contributing
We welcome contributions! Please see our Contributing Guide for details.
1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests if applicable
5. Submit a pull request
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Support & Community
- Issues: GitHub Issues
- Documentation: HelpingAI Docs
- Dashboard: HelpingAI Dashboard
- Email: varun@helpingai.co
## What's New in v1.1.0
- Extended Python Support: Now supports Python 3.7-3.14
- Updated Models: Support for latest models (Helpingai3-raw, Dhanishtha-2.0-preview)
- Dhanishtha-2.0 Integration: World's first intermediate thinking model with multi-phase reasoning
- HelpingAI3 Support: Enhanced emotional intelligence with advanced contextual awareness
- Improved Model Management: Better fallback handling and detailed model descriptions
- OpenAI-Compatible Interface: Familiar API design
- Enhanced Error Handling: Comprehensive exception types
- Streaming Support: Real-time response streaming
- Advanced Filtering: Hide reasoning blocks with the `hide_think` parameter
Built with ❤️ by the HelpingAI Team

*Empowering AI with Emotional Intelligence*