A unified client for various AI providers

Project description

IndoxRouter Client

A unified client for various AI providers, including OpenAI, Anthropic, Google, and Mistral.

Features

  • Unified API: Access multiple AI providers through a single API
  • Simple Interface: Easy-to-use methods for chat, completion, embeddings, and image generation
  • Error Handling: Standardized error handling across providers
  • Authentication: Secure cookie-based authentication

Installation

pip install indoxrouter

Usage

Initialization

from indoxrouter import Client

# Initialize with API key
client = Client(api_key="your_api_key")

# Using environment variables
# Set INDOX_ROUTER_API_KEY environment variable
import os
os.environ["INDOX_ROUTER_API_KEY"] = "your_api_key"
client = Client()

# Connect to a custom server
client = Client(
    api_key="your_api_key",
    base_url="https://your-indoxrouter-server.com"
)

Authentication

IndoxRouter uses cookie-based authentication, which securely transmits your API key in cookies rather than headers. This is handled automatically by the client.

# Authentication is handled automatically when creating the client
client = Client(api_key="your_api_key")

Note: The use_cookies parameter is kept for backward compatibility but should always be left set to True, since the server no longer supports header-based authentication.

Chat Completions

response = client.chat(
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Tell me a joke."}
    ],
    model="openai/gpt-4o-mini",  # Provider/model format
    temperature=0.7
)

print(response["choices"][0]["message"]["content"])

Text Completions

response = client.completion(
    prompt="Once upon a time,",
    model="openai/gpt-4o-mini",
    max_tokens=100
)

print(response["choices"][0]["text"])

Embeddings

response = client.embeddings(
    text=["Hello world", "AI is amazing"],
    model="openai/text-embedding-3-small"
)

print(f"Dimensions: {len(response['data'][0]['embedding'])}")
print(f"First embedding: {response['data'][0]['embedding'][:5]}...")
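
Embedding vectors are typically compared with cosine similarity. A minimal sketch in plain Python with no extra dependencies; the `response["data"][i]["embedding"]` shape it assumes is taken from the example above:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# e.g. compare the two embeddings returned above:
# score = cosine_similarity(response["data"][0]["embedding"],
#                           response["data"][1]["embedding"])
```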

Image Generation

response = client.images(
    prompt="A serene landscape with mountains and a lake",
    model="openai/dall-e-3",
    size="1024x1024"
)

print(f"Image URL: {response['data'][0]['url']}")

Streaming Responses

for chunk in client.chat(
    messages=[{"role": "user", "content": "Write a short story."}],
    model="openai/gpt-4o-mini",
    stream=True
):
    if chunk.get("choices") and len(chunk["choices"]) > 0:
        content = chunk["choices"][0].get("delta", {}).get("content", "")
        print(content, end="", flush=True)
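
If you need the full text after streaming finishes, the per-chunk deltas can be accumulated. A small helper, assuming the chunk shape shown in the loop above:

```python
def collect_stream(chunks):
    """Join streamed chat deltas into the complete response text.

    Assumes each chunk looks like the dicts yielded by
    client.chat(..., stream=True) above:
    {"choices": [{"delta": {"content": "..."}}]}
    """
    parts = []
    for chunk in chunks:
        choices = chunk.get("choices") or []
        if choices:
            parts.append(choices[0].get("delta", {}).get("content") or "")
    return "".join(parts)
```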

Getting Available Models

# Get all providers and models
providers = client.models()
for provider in providers:
    print(f"Provider: {provider['name']}")
    for model in provider["models"]:
        print(f"  - {model['id']}: {model['description'] or ''}")

# Get models for a specific provider
openai_provider = client.models("openai")
print(f"OpenAI models: {[m['id'] for m in openai_provider['models']]}")
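
To build one flat list of "provider/model" identifiers (the format the chat and completion calls expect), the nested structure can be flattened. A sketch assuming the provider/model shape shown in the loop above, and assuming model["id"] does not already include the provider prefix:

```python
def list_model_ids(providers):
    """Flatten client.models() output into 'provider/model' strings."""
    return [
        f"{provider['name']}/{model['id']}"
        for provider in providers
        for model in provider["models"]
    ]
```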

Error Handling

from indoxrouter import Client, ModelNotFoundError, ProviderError

try:
    client = Client(api_key="your_api_key")
    response = client.chat(
        messages=[{"role": "user", "content": "Hello"}],
        model="nonexistent-provider/nonexistent-model"
    )
except ModelNotFoundError as e:
    print(f"Model not found: {e}")
except ProviderError as e:
    print(f"Provider error: {e}")
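
One common use of these exceptions is falling back to another model when the first choice fails. A generic sketch (the try_models helper, the model list, and the policy of treating any raised exception as "try the next model" are illustrative, not part of the library):

```python
def try_models(chat_fn, models, messages):
    """Call chat_fn(messages=..., model=...) for each model in order.

    Returns the first successful response; re-raises the last error
    if every model fails.
    """
    last_error = None
    for model in models:
        try:
            return chat_fn(messages=messages, model=model)
        except Exception as exc:  # e.g. ModelNotFoundError, ProviderError
            last_error = exc
    raise last_error

# Hypothetical usage:
# response = try_models(
#     client.chat,
#     ["openai/gpt-4o-mini", "mistral/mistral-small"],
#     [{"role": "user", "content": "Hello"}],
# )
```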

Context Manager

with Client(api_key="your_api_key") as client:
    response = client.chat(
        messages=[{"role": "user", "content": "Hello!"}],
        model="openai/gpt-4o-mini"
    )
    print(response["choices"][0]["message"]["content"])
# Client is automatically closed when exiting the block

License

This project is licensed under the MIT License - see the LICENSE file for details.

Download files

Download the file for your platform.

Source Distribution

indoxrouter-0.1.9.tar.gz (20.8 kB)

Uploaded Source

Built Distribution

indoxrouter-0.1.9-py3-none-any.whl (11.2 kB)

Uploaded Python 3

File details

Details for the file indoxrouter-0.1.9.tar.gz.

File metadata

  • Download URL: indoxrouter-0.1.9.tar.gz
  • Upload date:
  • Size: 20.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.4

File hashes

Hashes for indoxrouter-0.1.9.tar.gz

  • SHA256: 726deef0772761b216386f7f4f8e47b94de742048df751c6477e56b73ca0b005
  • MD5: ff7b56243f341f9208545516665b0daf
  • BLAKE2b-256: 93567491ab777c4af641a323ef5daed312d71ed41b016a3186612c56bb0920ce

See the pip documentation for more details on using hashes.
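
To check a downloaded file against the SHA256 digest listed above, you can hash it locally. A minimal sketch using Python's standard hashlib (the file path and the commented expected digest are taken from this page):

```python
import hashlib

def sha256_of_file(path, chunk_size=8192):
    """Return the hex SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# expected = "726deef0772761b216386f7f4f8e47b94de742048df751c6477e56b73ca0b005"
# assert sha256_of_file("indoxrouter-0.1.9.tar.gz") == expected
```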

File details

Details for the file indoxrouter-0.1.9-py3-none-any.whl.

File metadata

  • Download URL: indoxrouter-0.1.9-py3-none-any.whl
  • Upload date:
  • Size: 11.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.4

File hashes

Hashes for indoxrouter-0.1.9-py3-none-any.whl

  • SHA256: 419844693356c54052ed1141ddb6111f25402cad591fe427f554de18840b75de
  • MD5: f40085dccf36b36b1ed99228d60968a8
  • BLAKE2b-256: ca0c3bc8019820d2db352eaa30aa54419dd792fcd51ee3242cc98d801780dd8a

See the pip documentation for more details on using hashes.
