ClientAI

A unified client for AI providers with built-in agent support.

ClientAI is a Python package that provides a unified framework for building AI applications, from direct provider interactions to transparent LLM-powered agents, with seamless support for OpenAI, Replicate, Groq, and Ollama.

Documentation: igorbenav.github.io/clientai/


Features

  • Unified Interface: Consistent methods across multiple AI providers (OpenAI, Replicate, Groq, Ollama).
  • Streaming Support: Real-time response streaming and chat capabilities.
  • Intelligent Agents: Framework for building transparent, multi-step LLM workflows with tool integration.
  • Output Validation: Built-in validation system for ensuring structured, reliable outputs from each step.
  • Modular Design: Use components independently, from simple provider wrappers to complete agent systems.
  • Type Safety: Comprehensive type hints for better development experience.

Installing

To install ClientAI with all providers, run:

pip install "clientai[all]"

Or, if you prefer to install only specific providers:

pip install "clientai[openai]"  # For OpenAI support
pip install "clientai[replicate]"  # For Replicate support
pip install "clientai[ollama]"  # For Ollama support
pip install "clientai[groq]"  # For Groq support

Quick Start Examples

Basic Provider Usage

from clientai import ClientAI

# Initialize with OpenAI
client = ClientAI('openai', api_key="your-openai-key")

# Generate text
response = client.generate_text(
    "Tell me a joke",
    model="gpt-3.5-turbo",
)
print(response)

# Chat functionality
messages = [
    {"role": "user", "content": "What is the capital of France?"},
    {"role": "assistant", "content": "Paris."},
    {"role": "user", "content": "What is its population?"}
]

response = client.chat(
    messages,
    model="gpt-3.5-turbo",
)
print(response)
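Streaming, listed under Features, follows the same call shape. A minimal sketch, assuming `generate_text` accepts a `stream=True` flag and then yields text chunks instead of returning a full string:

```python
def collect_stream(chunks):
    """Accumulate streamed text chunks, printing each as it arrives."""
    parts = []
    for chunk in chunks:
        print(chunk, end="", flush=True)  # show partial output in real time
        parts.append(chunk)
    return "".join(parts)

if __name__ == "__main__":
    from clientai import ClientAI

    client = ClientAI("openai", api_key="your-openai-key")
    stream = client.generate_text(
        "Tell me a joke",
        model="gpt-3.5-turbo",
        stream=True,  # assumed flag; check the docs for your provider
    )
    full_response = collect_stream(stream)
```

The accumulator is separate from the API call so partial output can be displayed while the complete response is still retained for later use.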

Quick-Start Agent

from clientai import client
from clientai.agent import create_agent, tool

@tool(name="calculator")
def calculate_average(numbers: list[float]) -> float:
    """Calculate the arithmetic mean of a list of numbers."""
    return sum(numbers) / len(numbers)

analyzer = create_agent(
    client=client("groq", api_key="your-groq-key"),
    role="analyzer", 
    system_prompt="You are a helpful data analysis assistant.",
    model="llama-3.2-3b-preview",
    tools=[calculate_average]
)

result = analyzer.run("Calculate the average of these numbers: [1000, 1200, 950, 1100]")
print(result)
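A nice property of this pattern is that the tool body is ordinary Python, so it can be sanity-checked directly before being wired into an agent. Shown here undecorated, with a hypothetical empty-list guard added for illustration:

```python
def calculate_average(numbers: list[float]) -> float:
    """Calculate the arithmetic mean of a list of numbers."""
    if not numbers:
        raise ValueError("cannot average an empty list")
    return sum(numbers) / len(numbers)

# The agent's example input works out by hand:
# (1000 + 1200 + 950 + 1100) / 4 = 1062.5
print(calculate_average([1000, 1200, 950, 1100]))  # 1062.5
```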

Custom Agent with Validation

For guaranteed output structure and type safety:

from clientai.agent import Agent, think
from pydantic import BaseModel, Field
from typing import List

class Analysis(BaseModel):
    summary: str = Field(min_length=10)
    key_points: List[str] = Field(min_items=1)
    sentiment: str = Field(pattern="^(positive|negative|neutral)$")

class DataAnalyzer(Agent):
    @think(
        name="analyze",
        json_output=True,  # Enable JSON formatting
    )
    def analyze_data(self, data: str) -> Analysis:  # Return type enables validation
        """Analyze data with validated output structure."""
        return """
        Analyze this data and return a JSON with:
        - summary: at least 10 characters
        - key_points: non-empty list
        - sentiment: positive, negative, or neutral

        Data: {data}
        """

# Initialize and use
from clientai import client

analyzer = DataAnalyzer(
    client=client("openai", api_key="your-openai-key"),
    default_model="gpt-4",
)
result = analyzer.run("Sales increased by 25% this quarter")
print(f"Sentiment: {result.sentiment}")
print(f"Key Points: {result.key_points}")
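The structural guarantees here come from the Pydantic model itself. A standalone sketch of what `Analysis` accepts and rejects, using Pydantic v2 constraint names (`min_length` in place of the deprecated `min_items`):

```python
from typing import List

from pydantic import BaseModel, Field, ValidationError

class Analysis(BaseModel):
    summary: str = Field(min_length=10)
    key_points: List[str] = Field(min_length=1)
    sentiment: str = Field(pattern="^(positive|negative|neutral)$")

# A well-formed model response parses into a typed object
good = Analysis(
    summary="Sales grew strongly this quarter.",
    key_points=["25% increase"],
    sentiment="positive",
)
print(good.sentiment)  # positive

# A malformed one is rejected before your code ever sees it
try:
    Analysis(summary="short", key_points=[], sentiment="great")
except ValidationError as e:
    print(f"rejected with {len(e.errors())} validation errors")
```

This is why the agent's return value can be used as `result.sentiment` rather than a raw string: any response that does not match the schema fails loudly instead of propagating.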

See our documentation for more examples, including:

  • Custom workflow agents with multiple steps
  • Complex tool integration and selection
  • Advanced usage patterns and best practices

Design Philosophy

The ClientAI Agent module is built on three core principles:

  1. Prompt-Centric Design: Prompts are explicit, debuggable, and transparent. What you see is what is sent to the model.

  2. Customization First: Every component is designed to be extended or overridden. Create custom steps, tool selectors, or entirely new workflow patterns.

  3. Zero Lock-In: Start with high-level components and drop down to lower levels as needed. You can:

    • Extend Agent for custom behavior
    • Use individual components directly
    • Gradually replace parts with your own implementation
    • Or migrate away entirely - no lock-in

Requirements

  • Python: Version 3.9 or newer
  • Dependencies: Core package has minimal dependencies. Provider-specific packages are optional.

Contributing

Contributions are welcome! Please see our Contributing Guidelines for more information.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Contact

Igor Magalhaes – @igormagalhaesr – igormagalhaesr@gmail.com – github.com/igorbenav

Download files

Download the file for your platform.

Source Distribution

clientai-0.5.0.tar.gz (85.5 kB)

Built Distribution

clientai-0.5.0-py3-none-any.whl (112.0 kB)

File details

Details for the file clientai-0.5.0.tar.gz.

File metadata

  • Download URL: clientai-0.5.0.tar.gz
  • Upload date:
  • Size: 85.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.5.1 CPython/3.11.4 Darwin/24.1.0

File hashes

Hashes for clientai-0.5.0.tar.gz:

  • SHA256: bb445fcefc2cabbfe0715187d936670e5e1eb155d23c1f7834231ff92338753f
  • MD5: 87ce3095a007b72c75f94c3050c98f95
  • BLAKE2b-256: 998b8b299e3b8ae27122d07016fb4cbe8511a45e749dd959050f096943403f34

File details

Details for the file clientai-0.5.0-py3-none-any.whl.

File metadata

  • Download URL: clientai-0.5.0-py3-none-any.whl
  • Upload date:
  • Size: 112.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.5.1 CPython/3.11.4 Darwin/24.1.0

File hashes

Hashes for clientai-0.5.0-py3-none-any.whl:

  • SHA256: 8b6abdbaddb5a312dd1ea725988c4394aac64c32459b6f078c441caf3299c8b0
  • MD5: 785b748d6969550a2aa1391d7ec1e3b3
  • BLAKE2b-256: b36b55fe1f755b7373c189b4815f71d269cea3e841e749b5f7025c9ea9153f06
