
HelpingAI Python SDK

The official Python library for the HelpingAI API - Advanced AI with Emotional Intelligence


🚀 Features

  • OpenAI-Compatible API: Drop-in replacement with familiar interface
  • Emotional Intelligence: Advanced AI models with emotional understanding
  • Streaming Support: Real-time response streaming
  • Comprehensive Error Handling: Detailed error types and retry mechanisms
  • Type Safety: Full type hints and IDE support
  • Flexible Configuration: Environment variables and direct initialization

📦 Installation

pip install HelpingAI

🔑 Authentication

Get your API key from the HelpingAI Dashboard.

Environment Variable (Recommended)

export HAI_API_KEY='your-api-key'

Direct Initialization

from HelpingAI import HAI

hai = HAI(api_key='your-api-key')

🎯 Quick Start

from HelpingAI import HAI

# Initialize client
hai = HAI()

# Create a chat completion
response = hai.chat.completions.create(
    model="Helpingai3-raw",
    messages=[
        {"role": "system", "content": "You are an expert in emotional intelligence."},
        {"role": "user", "content": "What makes a good leader?"}
    ]
)

print(response.choices[0].message.content)

🌊 Streaming Responses

# Stream responses in real-time
for chunk in hai.chat.completions.create(
    model="Helpingai3-raw",
    messages=[{"role": "user", "content": "Tell me about empathy"}],
    stream=True
):
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
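
If you also need the complete reply, the streamed deltas can be accumulated as they arrive. The sketch below uses stand-in chunk objects that mirror the shape shown above (chunk.choices[0].delta.content); the real SDK yields its own objects with the same attributes:

```python
# Accumulate streamed deltas into the complete reply.
# Delta/Choice/Chunk are stand-ins mirroring the shape used above
# (chunk.choices[0].delta.content); the real SDK yields its own objects.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Delta:
    content: Optional[str]

@dataclass
class Choice:
    delta: Delta

@dataclass
class Chunk:
    choices: List[Choice]

def collect_stream(chunks) -> str:
    """Join the non-empty content deltas of a chunk stream."""
    parts = []
    for chunk in chunks:
        piece = chunk.choices[0].delta.content
        if piece:  # some chunks may carry no text
            parts.append(piece)
    return "".join(parts)

fake_stream = [
    Chunk([Choice(Delta("Empathy "))]),
    Chunk([Choice(Delta(None))]),
    Chunk([Choice(Delta("matters."))]),
]
print(collect_stream(fake_stream))  # Empathy matters.
```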

โš™๏ธ Advanced Configuration

Parameter Control

response = hai.chat.completions.create(
    model="Dhanishtha-2.0-preview",
    messages=[{"role": "user", "content": "Write a story about empathy"}],
    temperature=0.7,        # Controls randomness (0-1)
    max_tokens=500,        # Maximum length of response
    top_p=0.9,            # Nucleus sampling parameter
    frequency_penalty=0.3, # Reduces repetition
    presence_penalty=0.3,  # Encourages new topics
    hide_think=True       # Filter out reasoning blocks
)
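
As a rough intuition for temperature: it rescales the model's token logits before sampling, so values well below 1 concentrate probability on the most likely tokens, while values near 1 leave the distribution broader. A toy illustration of that effect (not the server's actual sampler):

```python
import math

def softmax_with_temperature(logits, temperature):
    """Toy illustration: divide logits by temperature, then softmax.
    Lower temperature -> sharper, more deterministic distribution."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]
sharp = softmax_with_temperature(logits, 0.3)  # concentrated on the top token
flat = softmax_with_temperature(logits, 1.0)   # probability spread out more
print(f"T=0.3: {sharp[0]:.3f}  T=1.0: {flat[0]:.3f}")
```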

Client Configuration

hai = HAI(
    api_key="your-api-key",
    base_url="https://api.helpingai.co/v1",  # Custom base URL
    timeout=30.0,                            # Request timeout
    organization="your-org-id"               # Organization ID
)

๐Ÿ›ก๏ธ Error Handling

from HelpingAI import HAI, HAIError, RateLimitError, InvalidRequestError
import time

hai = HAI()  # assumes HAI_API_KEY is set in the environment

def make_completion_with_retry(messages, max_retries=3):
    for attempt in range(max_retries):
        try:
            return hai.chat.completions.create(
                model="Helpingai3-raw",
                messages=messages
            )
        except RateLimitError as e:
            if attempt == max_retries - 1:
                raise
            time.sleep(e.retry_after or 1)
        except InvalidRequestError as e:
            print(f"Invalid request: {str(e)}")
            raise
        except HAIError as e:
            print(f"API error: {str(e)}")
            raise
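
The helper above sleeps for a flat retry_after between attempts; a common refinement is exponential backoff with jitter, so repeated retries spread out instead of hammering the API in lockstep. A sketch of the delay schedule alone (the API call itself is unchanged; base, cap, and jitter are illustrative values, not SDK parameters):

```python
import random

def backoff_delays(max_retries=3, base=1.0, cap=30.0, jitter=0.1):
    """Seconds to sleep before each retry: base * 2^attempt, capped at
    `cap`, plus up to `jitter * delay` of random noise so concurrent
    clients do not all retry at the same moment."""
    delays = []
    for attempt in range(max_retries):
        delay = min(cap, base * (2 ** attempt))
        delays.append(delay + random.uniform(0, jitter * delay))
    return delays

print(backoff_delays(jitter=0.0))  # [1.0, 2.0, 4.0]
```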

🤖 Available Models

Helpingai3-raw

  • Advanced Emotional Intelligence: Enhanced emotional understanding and contextual awareness
  • Training Data: 15M emotional dialogues, 3M therapeutic exchanges, 250K cultural conversations, 1M crisis response scenarios
  • Best For: AI companionship, emotional support, therapy guidance, personalized learning

Dhanishtha-2.0-preview

  • World's First Intermediate Thinking Model: Multi-phase reasoning with self-correction capabilities
  • Unique Features: <think>...</think> blocks for transparent reasoning, structured emotional reasoning (SER)
  • Best For: Complex problem-solving, analytical tasks, educational content, reasoning-heavy applications

# List all available models
models = hai.models.list()
for model in models:
    print(f"Model: {model.id} - {model.description}")

# Get specific model info
model = hai.models.retrieve("Helpingai3-raw")
print(f"Model: {model.name}")

# Use Dhanishtha-2.0 for complex reasoning
response = hai.chat.completions.create(
    model="Dhanishtha-2.0-preview",
    messages=[{"role": "user", "content": "Solve this step by step: What's 15% of 240?"}],
    hide_think=False  # Show reasoning process
)
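
With hide_think=False, Dhanishtha's reasoning arrives inline as <think>...</think> blocks in the response text. If you want to separate the reasoning from the final answer on the client side, a minimal regex sketch (the SDK's own hide_think filtering may be implemented differently):

```python
import re

# Non-greedy so each <think>...</think> block is matched separately;
# DOTALL lets the reasoning span multiple lines.
THINK_RE = re.compile(r"<think>(.*?)</think>", re.DOTALL)

def split_think(text):
    """Return (list of reasoning blocks, answer text without them)."""
    thoughts = [t.strip() for t in THINK_RE.findall(text)]
    answer = THINK_RE.sub("", text).strip()
    return thoughts, answer

sample = "<think>15% of 240 = 0.15 * 240 = 36</think>The answer is 36."
thoughts, answer = split_think(sample)
print(answer)  # The answer is 36.
```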

📚 Documentation

Comprehensive documentation is available in the docs/ directory of the repository.

๐Ÿ—๏ธ Project Structure

HelpingAI-python/
├── HelpingAI/              # Main package
│   ├── __init__.py         # Package initialization
│   ├── client.py           # Main HAI client
│   ├── models.py           # Model management
│   ├── base_models.py      # Data models
│   ├── error.py            # Exception classes
│   └── version.py          # Version information
├── docs/                   # Documentation
├── tests/                  # Test suite
├── setup.py                # Package configuration
└── README.md               # This file

🔧 Requirements

  • Python: 3.7-3.14
  • Dependencies:
    • requests - HTTP client
    • typing_extensions - Type hints support

๐Ÿค Contributing

We welcome contributions! Please see our Contributing Guide for details.

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests if applicable
  5. Submit a pull request

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🆘 Support & Community

🚀 What's New in v1.1.0

  • Extended Python Support: Now supports Python 3.7-3.14
  • Updated Models: Support for latest models (Helpingai3-raw, Dhanishtha-2.0-preview)
  • Dhanishtha-2.0 Integration: World's first intermediate thinking model with multi-phase reasoning
  • HelpingAI3 Support: Enhanced emotional intelligence with advanced contextual awareness
  • Improved Model Management: Better fallback handling and detailed model descriptions
  • OpenAI-Compatible Interface: Familiar API design
  • Enhanced Error Handling: Comprehensive exception types
  • Streaming Support: Real-time response streaming
  • Advanced Filtering: Hide reasoning blocks with hide_think parameter

Built with ❤️ by the HelpingAI Team

Empowering AI with Emotional Intelligence
