The highest-level interface for various LLM APIs.
Chatterer
Simplified, Structured AI Assistant Framework
chatterer is a Python library designed as a type-safe LangChain wrapper for interacting with various language models (OpenAI, Anthropic, Gemini, Ollama, etc.). It supports structured outputs via Pydantic models, plain text responses, and asynchronous calls.
The structured reasoning in chatterer is inspired by the Atom-of-Thought pipeline.
Quick Install
pip install chatterer
Quickstart Example
Generate text quickly using OpenAI:
from chatterer import Chatterer
chat = Chatterer.openai("gpt-4o-mini")
response = chat.generate("What is the meaning of life?")
print(response)
Messages can be passed either as plain strings or as structured lists of role/content dicts:
response = chat.generate([{ "role": "user", "content": "What's 2+2?" }])
print(response)
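Multi-turn conversations use the same list format. A minimal sketch, assuming chatterer accepts the standard system/user/assistant roles that LangChain understands (the final call is commented out because it needs a configured API key):

```python
# Build a multi-turn message list; system and assistant turns use the
# same role/content dict format as the user turn above.
messages = [
    {"role": "system", "content": "You are a concise math tutor."},
    {"role": "user", "content": "What's 2+2?"},
    {"role": "assistant", "content": "4"},
    {"role": "user", "content": "And doubled?"},
]
# response = chat.generate(messages)  # requires a configured Chatterer instance
```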
Structured Output with Pydantic
from pydantic import BaseModel
class AnswerModel(BaseModel):
    question: str
    answer: str
response = chat.generate_pydantic(AnswerModel, "What's the capital of France?")
print(response.question, response.answer)
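Nested Pydantic models work the same way. A sketch with illustrative model names (the `generate_pydantic` call is commented out because it needs an API key; the hand-built instance below stands in for the validated object such a call returns):

```python
from pydantic import BaseModel

class Step(BaseModel):
    explanation: str

class MathAnswer(BaseModel):
    steps: list[Step]
    final_answer: str

# answer = chat.generate_pydantic(MathAnswer, "Solve 2x + 3 = 7, step by step.")
# The result is a validated MathAnswer instance, for example:
answer = MathAnswer(
    steps=[Step(explanation="Subtract 3 from both sides, then divide by 2.")],
    final_answer="x = 2",
)
print(answer.final_answer)
```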
Async Example
import asyncio
async def main():
    response = await chat.agenerate("Explain async in Python briefly.")
    print(response)
asyncio.run(main())
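Because `agenerate` is an ordinary coroutine, several prompts can be fanned out concurrently with `asyncio.gather`. A runnable sketch that substitutes a stand-in coroutine for the live API call:

```python
import asyncio

async def fake_agenerate(prompt: str) -> str:
    # Stand-in for chat.agenerate so the pattern runs without an API key.
    await asyncio.sleep(0)
    return f"answer to: {prompt}"

async def main() -> list[str]:
    prompts = ["Define recursion.", "Define iteration."]
    # With a real client: await asyncio.gather(*(chat.agenerate(p) for p in prompts))
    return await asyncio.gather(*(fake_agenerate(p) for p in prompts))

results = asyncio.run(main())
print(results)
```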
Atom-of-Thought Pipeline (AoT)
AoTPipeline provides structured reasoning by:
- Detecting question domains (general, math, coding, philosophy, multihop).
- Decomposing questions recursively.
- Generating direct, decomposition-based, and simplified answers.
- Combining answers via ensemble.
AoT Usage Example
from chatterer import Chatterer
from chatterer.strategies import AoTStrategy, AoTPipeline
pipeline = AoTPipeline(chatterer=Chatterer.openai(), max_depth=2)
strategy = AoTStrategy(pipeline=pipeline)
question = "What would Newton discover if hit by an apple falling from 100 meters?"
answer = strategy.invoke(question)
print(answer)
Supported Models
- OpenAI
- Anthropic
- Google Gemini
- Ollama (local models)
Initialize models easily:
openai_chat = Chatterer.openai("gpt-4o-mini")
anthropic_chat = Chatterer.anthropic("claude-3-7-sonnet-20250219")
gemini_chat = Chatterer.google("gemini-2.0-flash")
ollama_chat = Chatterer.ollama("deepseek-r1:1.5b")
Advanced Features
- Streaming responses
- Async/Await support
- Structured outputs with Pydantic models
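Streaming delivers the response as incremental text chunks. The exact streaming method name is not shown in this README, so the sketch below uses a stand-in generator purely to illustrate the consumption pattern:

```python
def fake_stream():
    # Stand-in for a chatterer streaming call; yields partial text
    # the way a streaming LLM response arrives chunk by chunk.
    yield from ["Stream", "ing ", "works."]

pieces = []
for chunk in fake_stream():
    pieces.append(chunk)  # in a real app: print(chunk, end="", flush=True)
text = "".join(pieces)
print(text)
```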
Logging
Built-in logging for easy debugging:
import logging
logging.basicConfig(level=logging.DEBUG)
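Enabling DEBUG globally also surfaces noise from every other library. Scoping the verbose level to one logger avoids that; the logger name "chatterer" below is an assumption (Python libraries conventionally log under their package name):

```python
import logging

logging.basicConfig(level=logging.INFO)  # quiet default for everything else
# "chatterer" is the assumed package logger name.
logging.getLogger("chatterer").setLevel(logging.DEBUG)
```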
Contributing
Feel free to open an issue or pull request.
License
MIT License
Download files
Source Distribution
Built Distribution
File details
Details for the file chatterer-0.1.12.tar.gz.
File metadata
- Download URL: chatterer-0.1.12.tar.gz
- Upload date:
- Size: 46.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.6.5
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 8879f4c518d69767052887ccf003a1ba9215272b58c69e1224654961cde29e3a |
| MD5 | e84e4612315a5409b3281049121c6d8c |
| BLAKE2b-256 | 07a7f7015e847980214bffdbbf65217ac901016770ca84693a74926a4af001b2 |
File details
Details for the file chatterer-0.1.12-py3-none-any.whl.
File metadata
- Download URL: chatterer-0.1.12-py3-none-any.whl
- Upload date:
- Size: 51.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.6.5
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 48fa851c89a9591101e11b45330d4a18bc9b22cc6b63902228418b1404a9075b |
| MD5 | 2a17557add8c424cbab587f4978a0de8 |
| BLAKE2b-256 | ec759f48bbf8eb6ced31235f23d6c4839096704b8e443765d124d52f3a1140c2 |