
⚡ Python AI SDK - OpenAI, Mistral AI, Anthropic, Google Gemini, xAI Grok, Hugging Face

Project description

AIHuber

aihuber is a free, open-source Python SDK that provides a simple, unified way to communicate with major AI providers.


✨ Features

aihuber is compatible with the main AI providers:

Chat completion (Text generation)

AI Providers Comparison

| AI Provider | Endpoint |
| --- | --- |
| OpenAI | https://api.openai.com/v1/chat/completions |
| Mistral AI | https://api.mistral.ai/v1/chat/completions |
| Anthropic | https://api.anthropic.com/v1/messages |
| Google (Gemini) | https://generativelanguage.googleapis.com/v1/models/{model}:generateContent |
| Cohere | https://api.cohere.com/v2/chat |
| Together AI | https://api.together.xyz/v1/chat/completions |
| Replicate | https://api.replicate.com/v1/predictions |
| Hugging Face | https://api-inference.huggingface.co/models/{model_id} |
| Azure OpenAI | https://{your-resource-name}.openai.azure.com/openai/deployments/{deployment-name}/chat/completions?api-version=2024-02-01 |
| AWS Bedrock | https://bedrock-runtime.{region}.amazonaws.com/model/{model-id}/invoke |
| Perplexity | https://api.perplexity.ai/chat/completions |
| xAI (Grok) | https://api.x.ai/v1/chat/completions |

Legend

  • Text in parentheses = Specific service/model name

🚀 Quick Start

Installation

```shell
pip install aihuber
```

Or with uv:

```shell
uv add aihuber
```

Basic Usage

```python
import asyncio
import os

from dotenv import load_dotenv
from pydantic import SecretStr

from aihuber import LLM, Message

load_dotenv()

MISTRAL_AI_TOKEN = SecretStr(os.getenv("MISTRAL_AI_TOKEN"))  # type: ignore


async def main():
    """Asynchronous streaming chat completion example."""
    llm = LLM(model="mistral:mistral-small-latest", api_key=MISTRAL_AI_TOKEN)

    messages = [
        Message(role="system", content="You are a helpful Python developer assistant."),
        Message(role="user", content="Give me 10 numbers from 0"),
    ]

    async_chat_completion = await llm.chat_completion_async(messages, stream=True)
    async for chunk in async_chat_completion:
        print(chunk)


if __name__ == "__main__":
    asyncio.run(main())
```
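The streaming loop above consumes an async iterable of chunks. For readers new to this pattern, here is a self-contained sketch of the same consumption shape, with a stand-in generator (`fake_stream` is illustrative, not part of aihuber) replacing the live provider call:

```python
import asyncio
from typing import AsyncIterator


async def fake_stream(text: str) -> AsyncIterator[str]:
    """Stand-in for a streaming chat completion: yields chunks one at a time."""
    for word in text.split():
        await asyncio.sleep(0)  # yield control, as a real network call would
        yield word


async def collect() -> list[str]:
    # Same shape as the example above: iterate the stream chunk by chunk.
    chunks = []
    async for chunk in fake_stream("0 1 2 3 4"):
        chunks.append(chunk)
    return chunks


if __name__ == "__main__":
    print(asyncio.run(collect()))
```

With a real provider the chunks arrive as they are generated, so you can display partial output before the full completion is done.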

🐳 Docker Usage

```shell
docker build -t aihuber -f dockerfiles/aihuber.Dockerfile .
docker run -p 8080:8080 aihuber
```

📄 License

MIT

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

aihuber-0.1.0.tar.gz (113.0 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

aihuber-0.1.0-py3-none-any.whl (20.0 kB)

Uploaded Python 3

File details

Details for the file aihuber-0.1.0.tar.gz.

File metadata

  • Download URL: aihuber-0.1.0.tar.gz
  • Upload date:
  • Size: 113.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for aihuber-0.1.0.tar.gz
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | a770fc4c93d56249ef64fe603b923f9979baf19ac3bc6f6db91e84e02eab0a6a |
| MD5 | 34e399df58b4493327251151d71f1ee9 |
| BLAKE2b-256 | b5363fb42352288cba9039d6c7f2c79643911290fd142791ec55e2374061baf3 |

See more details on using hashes here.
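The digests above can be checked locally after downloading the archive. A minimal sketch using only the standard library (the file path is illustrative):

```python
import hashlib


def sha256_of(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in 64 KiB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(65536), b""):
            digest.update(block)
    return digest.hexdigest()


# Compare the result against the SHA256 digest published on this page, e.g.:
# sha256_of("aihuber-0.1.0.tar.gz") should equal the SHA256 value above.
```

Reading in chunks keeps memory usage constant regardless of archive size.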

File details

Details for the file aihuber-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: aihuber-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 20.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for aihuber-0.1.0-py3-none-any.whl
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | bc58fb0dabd4a5de3a573f190240ba088339a0d282069f134249837c8e7b01e1 |
| MD5 | a85388a3a8d634cd9c20e5e709ec2b86 |
| BLAKE2b-256 | 814bb6f5c952b30e07fbd8cf3813cd758ea62165eb5f6bcfb098c952f59aea4e |

See more details on using hashes here.
