
⚡ Python AI SDK - OpenAI, Mistral AI, Anthropic, Google Gemini, xAI Grok, Hugging Face


AIHuber

aihuber is the easiest and quickest way to communicate with AI providers. Free and open-source.


✨ Features

aihuber is compatible with the major AI providers:

Chat completion (Text generation)

AI Providers Comparison

| AI Provider | Buffering (sync & async) | Streaming (sync & async) | Endpoint |
|---|---|---|---|
| OpenAI | | | https://api.openai.com/v1/chat/completions |
| Mistral AI | | | https://api.mistral.ai/v1/chat/completions |
| Anthropic | | | https://api.anthropic.com/v1/messages |
| Google (Gemini) | | | https://generativelanguage.googleapis.com/v1/models/{model}:generateContent |
| Cohere | | | https://api.cohere.com/v2/chat |
| Together AI | | | https://api.together.xyz/v1/chat/completions |
| Replicate | | | https://api.replicate.com/v1/predictions |
| Hugging Face | | | https://api-inference.huggingface.co/models/{model_id} |
| Azure OpenAI | | | https://{your-resource-name}.openai.azure.com/openai/deployments/{deployment-name}/chat/completions?api-version=2024-02-01 |
| AWS Bedrock | | | https://bedrock-runtime.{region}.amazonaws.com/model/{model-id}/invoke |
| Perplexity | | | https://api.perplexity.ai/chat/completions |
| xAI (Grok) | | | https://api.x.ai/v1/chat/completions |

Legend

  • ✅ = Available
  • ❌ = Not Available
  • Text in parentheses = Specific service/model name

🚀 Quick Start

Installation

pip install aihuber

Or with uv:

uv add aihuber

Basic Usage

import asyncio
import os

from dotenv import load_dotenv

from aihuber import LLM, Message
from pydantic import SecretStr

load_dotenv()

MISTRAL_AI_TOKEN = SecretStr(os.environ["MISTRAL_AI_TOKEN"])  # raises KeyError if the variable is unset


async def main():
    """Asynchronous chat completion example with streaming."""
    llm = LLM(model="mistral:mistral-small-latest", api_key=MISTRAL_AI_TOKEN)

    messages = [
        Message(role="system", content="You are a helpful Python developer assistant."),
        Message(role="user", content="Give me 10 numbers from 0"),
    ]

    async_chat_completion = await llm.chat_completion_async(messages, stream=True)
    async for chunk in async_chat_completion:
        print(chunk)

if __name__ == "__main__":
    asyncio.run(main())
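The example above wraps the API key in pydantic's SecretStr, which keeps the raw token out of logs and tracebacks: str() and repr() mask the value, and it must be requested explicitly via get_secret_value(). A minimal illustration (the token string is a dummy, used only for demonstration):

```python
from pydantic import SecretStr

# A dummy token, used only for illustration
token = SecretStr("sk-example-not-a-real-key")

# str() and repr() mask the secret, so accidental logging is safe
print(token)        # **********
print(repr(token))  # SecretStr('**********')

# The raw value must be requested explicitly
print(token.get_secret_value())  # sk-example-not-a-real-key
```

This is why the usage example loads the key from the environment with python-dotenv instead of hard-coding it: the secret stays out of source control and out of printed output.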

📄 License

MIT

Download files

Download the file for your platform.

Source Distribution

  • aihuber-0.3.0.tar.gz (113.4 kB)

Built Distribution

  • aihuber-0.3.0-py3-none-any.whl (20.0 kB)

File details

Details for the file aihuber-0.3.0.tar.gz.

File metadata

  • Download URL: aihuber-0.3.0.tar.gz
  • Size: 113.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for aihuber-0.3.0.tar.gz

| Algorithm | Hash digest |
|---|---|
| SHA256 | 498353b9b871ef8df1745fbc0e3effee20e05028b9af5907c2bf7cfe21f5a975 |
| MD5 | f8dc3556b8b9e31820ec2eb904d8fe65 |
| BLAKE2b-256 | d2cda8f4de78535662c1a729b863cd73ca53a41a189b1a5b51e40e52e0e06b04 |

File details

Details for the file aihuber-0.3.0-py3-none-any.whl.

File metadata

  • Download URL: aihuber-0.3.0-py3-none-any.whl
  • Size: 20.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for aihuber-0.3.0-py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | 9a3b131171bcb4e0f63e0be55fa90701a64637922477a1b9872d716d272f8ca6 |
| MD5 | d505b0bb14fc43be3ec78a94cf03e853 |
| BLAKE2b-256 | 4f2264e2a03c7bf58fee56cc217018ce229823e8d6591a32e7be32115bce164b |
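The digests published above can be checked against a downloaded file before installing. A small sketch using Python's standard hashlib (the chunked read keeps memory use constant for large archives; the file name in the comment refers to the source distribution listed above):

```python
import hashlib


def sha256_of_file(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in 8 KiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


# Compare the result against the digest published on PyPI, e.g.
# sha256_of_file("aihuber-0.3.0.tar.gz") should equal the SHA256 value above.
```

pip can also enforce this automatically when requirements are pinned with hashes, via `pip install --require-hashes -r requirements.txt`.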
