The highest-level interface for various LLM APIs.

Chatterer

Simplified, Structured AI Assistant Framework

chatterer is a Python library designed as a type-safe LangChain wrapper for interacting with various language models (OpenAI, Anthropic, Gemini, Ollama, etc.). It supports structured outputs via Pydantic models, plain text responses, and asynchronous calls.

The structured reasoning in chatterer is inspired by the Atom-of-Thought pipeline.


Quick Install

pip install chatterer

Quickstart Example

Generate text quickly using OpenAI:

from chatterer import Chatterer

chat = Chatterer.openai("gpt-4o-mini")
response = chat.generate("What is the meaning of life?")
print(response)

Messages can be passed as plain strings or as structured role/content lists:

response = chat.generate([{ "role": "user", "content": "What's 2+2?" }])
print(response)
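For longer conversations, the role/content dictionaries above can be assembled from a history list. A small sketch — the `build_messages` helper below is not part of chatterer, just an illustration of the message format:

```python
# Hypothetical helper (not part of chatterer) that builds the
# role/content message list shown above from (role, text) pairs.
def build_messages(history: list[tuple[str, str]]) -> list[dict[str, str]]:
    return [{"role": role, "content": content} for role, content in history]

messages = build_messages([
    ("system", "You are a concise assistant."),
    ("user", "What's 2+2?"),
])
# response = chat.generate(messages)  # live call requires an API key
```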

Structured Output with Pydantic

from pydantic import BaseModel

class AnswerModel(BaseModel):
    question: str
    answer: str

response = chat.generate_pydantic(AnswerModel, "What's the capital of France?")
print(response.question, response.answer)
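Structured output is not limited to flat models. Since `generate_pydantic` takes a `BaseModel` subclass, nested schemas should work the same way — a sketch, with illustrative models that are not part of the library:

```python
from pydantic import BaseModel

# Illustrative nested schema (not part of chatterer).
class Step(BaseModel):
    description: str

class Plan(BaseModel):
    goal: str
    steps: list[Step]

# With a live model (requires an API key), the call would look like:
# plan = chat.generate_pydantic(Plan, "Plan a weekend trip to Paris in three steps.")
# for step in plan.steps:
#     print(step.description)

# The returned object validates like any Pydantic model:
plan = Plan.model_validate({
    "goal": "Weekend in Paris",
    "steps": [{"description": "Book train tickets"}],
})
```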

Async Example

import asyncio

async def main():
    response = await chat.agenerate("Explain async in Python briefly.")
    print(response)

asyncio.run(main())

Atom-of-Thought Pipeline (AoT)

AoTPipeline provides structured reasoning by:

  • Detecting question domains (general, math, coding, philosophy, multihop).
  • Decomposing questions recursively.
  • Generating direct, decomposition-based, and simplified answers.
  • Combining answers via ensemble.

AoT Usage Example

from chatterer import Chatterer
from chatterer.strategies import AoTStrategy, AoTPipeline

pipeline = AoTPipeline(chatterer=Chatterer.openai(), max_depth=2)
strategy = AoTStrategy(pipeline=pipeline)

question = "What would Newton discover if hit by an apple falling from 100 meters?"
answer = strategy.invoke(question)
print(answer)

Supported Models

  • OpenAI
  • Anthropic
  • Google Gemini
  • Ollama (local models)

Initialize models easily:

openai_chat = Chatterer.openai("gpt-4o-mini")
anthropic_chat = Chatterer.anthropic("claude-3-7-sonnet-20250219")
gemini_chat = Chatterer.google("gemini-2.0-flash")
ollama_chat = Chatterer.ollama("deepseek-r1:1.5b")

Advanced Features

  • Streaming responses
  • Async/Await support
  • Structured outputs with Pydantic models
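Of these, streaming is the only feature not shown above. A sketch of the consumption pattern, assuming chatterer exposes a `generate_stream` method that yields text chunks (the method name is an assumption, not confirmed here):

```python
# Hypothetical consumption pattern for an iterator of text chunks.
def collect_stream(chunks) -> str:
    parts = []
    for chunk in chunks:
        parts.append(chunk)  # in real use: print(chunk, end="", flush=True)
    return "".join(parts)

# With a live model (requires an API key), this might look like:
# for chunk in chat.generate_stream("Tell me a short story."):
#     print(chunk, end="", flush=True)

# The collector works on any iterable of strings:
text = collect_stream(iter(["Hel", "lo"]))
```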

Logging

Built-in logging for easy debugging:

import logging
logging.basicConfig(level=logging.DEBUG)
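To avoid debug noise from every installed library, you can scope verbosity to chatterer's own logger instead — assuming it follows the common convention of a module-named logger:

```python
import logging

# Keep the root logger quiet, but turn on debug output for chatterer.
# Assumption: the library uses the conventional logger named "chatterer".
logging.basicConfig(level=logging.INFO)
logging.getLogger("chatterer").setLevel(logging.DEBUG)
```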

Contributing

Feel free to open an issue or pull request.


License

MIT License
