
Simple unified API for multiple AI services.

Project description

ClientAI


A unified client for seamless interaction with multiple AI providers.



ClientAI is a Python package that provides a unified interface for interacting with multiple AI providers, including OpenAI, Replicate, Groq, and Ollama. It offers seamless integration and consistent methods for text generation and chat functionality across different AI platforms.

Documentation: igorbenav.github.io/clientai/


Features

  • 🔄 Unified Interface: Consistent methods for text generation and chat across multiple AI providers.
  • 🔌 Multiple Providers: Support for OpenAI, Replicate, Groq and Ollama, with easy extensibility for future providers.
  • 🌊 Streaming Support: Efficient streaming of responses for real-time applications.
  • 🎛️ Flexible Configuration: Easy setup with provider-specific configurations.
  • 🔧 Customizable: Extensible design for adding new providers or customizing existing ones.
  • 🧠 Type Hinting: Comprehensive type annotations for better development experience.
  • 🔒 Provider Isolation: Optional installation of provider-specific dependencies to keep your environment lean.
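A unified interface like this usually boils down to a dispatch pattern: a registry maps provider names to client implementations, and adding a provider means adding one entry. The sketch below is illustrative only — `EchoProvider`, `PROVIDERS`, and `make_client` are hypothetical names, not ClientAI's actual internals:

```python
# Illustrative sketch of a provider registry; EchoProvider and make_client
# are hypothetical examples, not ClientAI's actual internals.
class EchoProvider:
    def generate_text(self, prompt, model):
        # A stand-in "provider" that just echoes the prompt back.
        return f"[{model}] {prompt}"

PROVIDERS = {"echo": EchoProvider}

def make_client(provider_name):
    # Look up the provider class by name and instantiate it.
    try:
        return PROVIDERS[provider_name]()
    except KeyError:
        raise ValueError(f"Unknown provider: {provider_name!r}")
```

With this shape, extending to a new provider is one more registry entry, which is what makes the design easy to grow.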

Requirements

Before installing ClientAI, ensure you have the following:

  • Python: Version 3.9 or newer.
  • Dependencies: The core ClientAI package has minimal dependencies. Provider-specific packages (e.g., openai, replicate, ollama, groq) are optional and can be installed separately.

Installing

To install ClientAI with all providers, run:

pip install clientai[all]

Or, if you prefer to install only specific providers:

pip install clientai[openai]  # For OpenAI support
pip install clientai[replicate]  # For Replicate support
pip install clientai[ollama]  # For Ollama support
pip install clientai[groq]  # For Groq support
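Because the provider packages are optional, it can be handy to check at runtime which ones are actually importable. The helper below is not part of ClientAI — it is a plain-Python sketch using the standard library's `importlib`:

```python
from importlib.util import find_spec

def available_providers():
    # Return the optional provider packages importable in this environment.
    return [name for name in ("openai", "replicate", "ollama", "groq")
            if find_spec(name) is not None]
```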

Usage

ClientAI provides a simple and consistent way to interact with different AI providers. Here are some examples:

Initializing the Client

from clientai import ClientAI

# Initialize with OpenAI
openai_client = ClientAI('openai', api_key="your-openai-key")

# Initialize with Replicate
replicate_client = ClientAI('replicate', api_key="your-replicate-key")

# Initialize with Ollama
ollama_client = ClientAI('ollama', host="your-ollama-host")

# Initialize with Groq
groq_client = ClientAI('groq', api_key="your-groq-key")
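Rather than hardcoding keys in source, a common pattern is to read them from environment variables. `get_api_key` below is a hypothetical helper for that pattern, not part of the ClientAI API:

```python
import os

def get_api_key(env_var):
    # Hypothetical helper: fetch a key from the environment, failing loudly
    # if it is missing so misconfiguration surfaces early.
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(f"Set {env_var} before initializing the client")
    return key
```

You could then initialize, for example, with `ClientAI('openai', api_key=get_api_key("OPENAI_API_KEY"))`.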

Generating Text

# Using OpenAI
response = openai_client.generate_text(
    "Tell me a joke",
    model="gpt-3.5-turbo",
)

# Using Replicate
response = replicate_client.generate_text(
    "Explain quantum computing",
    model="meta/llama-2-70b-chat:latest",
)

# Using Ollama
response = ollama_client.generate_text(
    "What is the capital of France?",
    model="llama2",
)

# Using Groq
response = groq_client.generate_text(
    "Who was the first US president?",
    model="llama3-8b-8192",
)

Chat Functionality

messages = [
    {"role": "user", "content": "What is the capital of France?"},
    {"role": "assistant", "content": "Paris."},
    {"role": "user", "content": "What is its population?"}
]

# Using OpenAI
response = openai_client.chat(
    messages,
    model="gpt-3.5-turbo",
)

# Using Replicate
response = replicate_client.chat(
    messages,
    model="meta/llama-2-70b-chat:latest",
)

# Using Ollama
response = ollama_client.chat(
    messages,
    model="llama2",
)

# Using Groq
response = groq_client.chat(
    messages,
    model="llama3-8b-8192",
)

Streaming Responses

# Streaming works the same way with any provider's client
for chunk in openai_client.generate_text(
    "Tell me a long story",
    model="gpt-3.5-turbo",
    stream=True
):
    print(chunk, end="", flush=True)
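If you also need the complete text afterwards, you can accumulate chunks as they arrive. This sketch assumes each chunk is a plain string, as in the loop above; `collect_stream` is a hypothetical helper, not part of ClientAI:

```python
def collect_stream(chunks, echo=False):
    # Accumulate streamed string chunks into the complete response text,
    # optionally printing each chunk as it arrives.
    parts = []
    for chunk in chunks:
        if echo:
            print(chunk, end="", flush=True)
        parts.append(chunk)
    return "".join(parts)
```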

Contributing

Contributions to ClientAI are welcome! Please refer to our Contributing Guidelines for more information.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Contact

Igor Magalhaes – @igormagalhaesr – igormagalhaesr@gmail.com – github.com/igorbenav

Project details


Download files

Download the file for your platform.

Source Distribution

clientai-0.3.2.tar.gz (22.2 kB)

Uploaded Source

Built Distribution

clientai-0.3.2-py3-none-any.whl (31.1 kB)

Uploaded Python 3

File details

Details for the file clientai-0.3.2.tar.gz.

File metadata

  • Download URL: clientai-0.3.2.tar.gz
  • Upload date:
  • Size: 22.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.5.1 CPython/3.11.4 Darwin/23.6.0

File hashes

Hashes for clientai-0.3.2.tar.gz
Algorithm Hash digest
SHA256 9993d35b08dbd44320e9a62be7bf2b4aa85e5137465f761bc7cfbb0464059d66
MD5 b72ab017cc969827b1a77e1a5076f9f8
BLAKE2b-256 78f0ad4a721fd3a8068e6dbfb2d1a2a53e445aa85e01af9d1b809a3dfab3e84e


File details

Details for the file clientai-0.3.2-py3-none-any.whl.

File metadata

  • Download URL: clientai-0.3.2-py3-none-any.whl
  • Upload date:
  • Size: 31.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.5.1 CPython/3.11.4 Darwin/23.6.0

File hashes

Hashes for clientai-0.3.2-py3-none-any.whl
Algorithm Hash digest
SHA256 b06e7cc4f76a6832b4f128c5252059166927b9f9bcb58e26535d5cbe8e77336e
MD5 138ce0ae956e45ea070f85f4b6179963
BLAKE2b-256 c854deceaa50fe9ce0ab51cab49417685fc42cdf2606e8573e02684df6aad8e4

