A small OpenAI/Anthropic library supporting chat templates and function calls.

Autochat


⚠️ Warning: Since agentic capabilities are evolving fast, expect the API to change.

A lightweight Python library to build AI agents with LLMs.


Key Features

  • 🤝 Support for multiple LLM providers (OpenAI and Anthropic)
  • 🐍 Turn Python functions or classes into tools
  • 🔁 Run conversations as a generator
  • 🙈 Prompt caching handled by default (Anthropic model claude-3-7-sonnet-latest)
  • ✨ And more, including:
    • Simple template system
    • Easy function and tool integration
    • Flexible instruction and example management
    • Support for images
    • Support for MCP servers

Example (search capability)

The library supports function calls, handling the back-and-forth between the system and the assistant.

from autochat import Autochat

def search_wikipedia(title: str):
    """Search Wikipedia for information"""
    import requests
    from bs4 import BeautifulSoup

    response = requests.get(
        f"https://en.wikipedia.org/w/index.php?search={title}&title=Special%3ASearch"
    )
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    body_content = soup.find("div", {"id": "bodyContent"})
    return body_content.text.strip()

classifier_agent = Autochat()
classifier_agent.add_function(search_wikipedia)

text = "Since when is the latest iPhone available?"
for message in classifier_agent.run_conversation(text):
    print(message.to_markdown())

# > ## user
# > Since when is the latest iPhone available?
# > ## assistant
# > search_wikipedia(title=iPhone)
# > ## function
# > Result: (html content)
# > ## assistant
# > The latest flagship iPhone models, the iPhone 16 and 16 Plus, along with the higher-end iPhone 16 Pro and 16 Pro Max, have been available since September 20, 2024.
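The transcript above follows a simple loop: the model either requests a function call, which the library executes and feeds back as a `function` message, or it produces a final answer. The sketch below illustrates that pattern with a stubbed model; it is not Autochat's implementation, and `run_conversation_sketch` is a hypothetical name.

```python
# Schematic of the tool-call loop. `model` is a stub that returns either
# a ("call", name, kwargs) tuple or a final answer string; a real LLM
# client would go here. Illustrative only, not Autochat's internals.
def run_conversation_sketch(user_text, functions, model):
    messages = [("user", user_text)]
    yield messages[-1]
    while True:
        reply = model(messages)
        if isinstance(reply, tuple) and reply[0] == "call":
            _, name, kwargs = reply
            messages.append(("assistant", f"{name}({kwargs})"))
            yield messages[-1]
            result = functions[name](**kwargs)  # execute the tool
            messages.append(("function", str(result)))
            yield messages[-1]
        else:
            messages.append(("assistant", reply))
            yield messages[-1]
            return

# Stubbed model: first requests a call, then answers.
replies = iter([("call", "add", {"a": 2, "b": 3}), "2 + 3 is 5"])
model = lambda messages: next(replies)

for role, content in run_conversation_sketch(
    "What is 2 + 3?", {"add": lambda a, b: a + b}, model
):
    print(f"{role}: {content}")
```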

Quick Start

Initialize with OpenAI (default)

from autochat import Autochat

agent = Autochat(instruction="You are a helpful assistant")

Simple conversation

response = agent.ask("What is the capital of France?")
print(response.content)

Using Anthropic's Claude

agent = Autochat(provider="anthropic")
response = agent.ask("Explain quantum computing in simple terms")
print(response.content)

Run conversation as a generator

for message in agent.run_conversation("Explain quantum computing in simple terms"):
    print(message.to_markdown())

Async Interface

Autochat provides async versions of its core methods for use in async applications:

# Async version of ask
response = await agent.ask_async("What is the capital of France?")
print(response.content)

# Async version of run_conversation
async for message in agent.run_conversation_async("Explain quantum computing"):
    print(message.to_markdown())

# Async function calls are also supported
import asyncio

async def async_calculator(a: int, b: int) -> int:
    await asyncio.sleep(0.1)  # Some async work
    return a + b

agent.add_function(async_calculator)
async for message in agent.run_conversation_async("What is 5 + 3?"):
    print(message.to_markdown())
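Mixing sync and async tools works because a library can check whether a tool is a coroutine function and await it only then. The sketch below shows that dispatch pattern in plain Python; `call_tool` is a hypothetical helper, not Autochat's actual code.

```python
import asyncio
import inspect

async def call_tool(fn, **kwargs):
    # Await coroutine functions, call plain functions directly.
    # Illustrative sketch of mixed sync/async tool dispatch;
    # not Autochat's internals.
    if inspect.iscoroutinefunction(fn):
        return await fn(**kwargs)
    return fn(**kwargs)

async def async_add(a: int, b: int) -> int:
    await asyncio.sleep(0.01)
    return a + b

def sync_mul(a: int, b: int) -> int:
    return a * b

async def main():
    print(await call_tool(async_add, a=5, b=3))  # 8
    print(await call_tool(sync_mul, a=5, b=3))   # 15

asyncio.run(main())
```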

Add a function call as a Python function

def multiply(a: int, b: int) -> int:
    return a * b

agent = Autochat()
agent.add_function(multiply)
text = "What is 343354 * 13243343214?"
for message in agent.run_conversation(text):
    print(message.to_markdown())
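For `add_function` to work, the library has to derive a tool description from the function's signature and docstring. A rough sketch of how that derivation can look, using only the standard library; `function_to_tool` is a hypothetical name and this is not Autochat's actual schema code.

```python
import inspect

def multiply(a: int, b: int) -> int:
    """Multiply two numbers"""
    return a * b

def function_to_tool(fn):
    # Derive a JSON-schema-style tool description from a Python
    # signature. Illustrative only, not Autochat's internals.
    type_names = {int: "integer", float: "number", str: "string", bool: "boolean"}
    properties = {
        name: {"type": type_names.get(param.annotation, "string")}
        for name, param in inspect.signature(fn).parameters.items()
    }
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn) or "",
        "parameters": {
            "type": "object",
            "properties": properties,
            "required": list(properties),
        },
    }

print(function_to_tool(multiply))
```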

Add a Class as a tool

from autochat import Autochat

class Calculator:
    def add(self, a: int, b: int) -> int:
        """Add two numbers"""
        return a + b

    def multiply(self, a: int, b: int) -> int:
        """Multiply two numbers"""
        return a * b

calculator = Calculator()

agent = Autochat()
agent.add_tool(calculator)
for message in agent.run_conversation(
    "What is 343354 * 13243343214?"
):
    print(message)
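Registering a class instance with `add_tool` amounts to exposing each of its public methods as a named tool. A minimal sketch of that enumeration with `inspect.getmembers`; `instance_to_tools` is a hypothetical helper, not Autochat's internals.

```python
import inspect

class Calculator:
    def add(self, a: int, b: int) -> int:
        """Add two numbers"""
        return a + b

    def multiply(self, a: int, b: int) -> int:
        """Multiply two numbers"""
        return a * b

def instance_to_tools(obj):
    # Expose each public bound method of an instance as a named tool.
    # Illustrative sketch only, not Autochat's actual code.
    return {
        name: method
        for name, method in inspect.getmembers(obj, inspect.ismethod)
        if not name.startswith("_")
    }

tools = instance_to_tools(Calculator())
print(sorted(tools))            # ['add', 'multiply']
print(tools["multiply"](6, 7))  # 42
```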

Add a MCP server

Experimental feature. Check out tests/mcp_clients for more information.

Installation

To install the package, you can use pip:

pip install 'autochat[all]'

Image support

from autochat import Autochat, Message

from PIL import Image

agent = Autochat()

image = Image.open("examples/image.jpg")
message = Message(role="user", content="describe the image", image=image)
response = agent.ask(message)
print(response.to_markdown())

Template System

We provide a simple template system for defining the behavior of the chatbot, using markdown-like syntax.

## system
You are a parrot

## user
Hi my name is Bob

## assistant
Hi my name is Bob, hi my name is Bob!

## user
Can you tell me my name?

## assistant
Your name is Bob, your name is Bob!

You can then load the template file using the from_template method:

parrotGPT = Autochat.from_template("./parrot_template.txt")

The template system also supports function calls. Check out the examples/demo_label.py for a complete example.
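The markdown-like format above can be parsed with a few lines of plain Python. The following is an illustrative sketch of how such a parser might work, not Autochat's actual template loader:

```python
import re

def parse_template(text):
    # Split a "## role" delimited template into (role, content) pairs.
    # Illustrative sketch; not Autochat's actual parser.
    messages = []
    for block in re.split(r"^## ", text, flags=re.MULTILINE):
        if not block.strip():
            continue
        role, _, content = block.partition("\n")
        messages.append((role.strip(), content.strip()))
    return messages

template = """## system
You are a parrot

## user
Hi my name is Bob

## assistant
Hi my name is Bob, hi my name is Bob!"""

print(parse_template(template))
```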

Environment Variables

The AUTOCHAT_MODEL environment variable specifies the model to use. If not set, it defaults to "gpt-4o" for OpenAI and "claude-3-7-sonnet-latest" for Anthropic.

We recommend using Anthropic's claude-3-7-sonnet-latest for agentic behavior.

export AUTOCHAT_MODEL="gpt-4o"
export OPENAI_API_KEY=<your-key>

or, with Anthropic:

export AUTOCHAT_MODEL="claude-3-7-sonnet-latest"
export ANTHROPIC_API_KEY=<your-key>

Use AUTOCHAT_HOST to select an alternative provider (openai, anthropic, openpipe, llama_cpp, ...).
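The default resolution described above can be summarized in a few lines. The lookup logic below mirrors the documented defaults but is an illustrative sketch, not Autochat's actual configuration code:

```python
import os

# Documented defaults per provider.
DEFAULT_MODELS = {
    "openai": "gpt-4o",
    "anthropic": "claude-3-7-sonnet-latest",
}

def resolve_model(provider, env=os.environ):
    # AUTOCHAT_MODEL wins when set; otherwise fall back per provider.
    return env.get("AUTOCHAT_MODEL") or DEFAULT_MODELS[provider]

print(resolve_model("openai", env={}))  # gpt-4o
print(resolve_model("anthropic", env={"AUTOCHAT_MODEL": "claude-3-opus-latest"}))
```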

Support

If you encounter any issues or have questions, please file an issue on the GitHub project page.

License

This project is licensed under the terms of the MIT license.
