Patter Python SDK
Connect AI agents to phone numbers in four lines of code
Quickstart • Features • Configuration • Voice Modes • API Reference • Contributing
Patter is the open-source SDK that gives your AI agent a phone number. Point it at any function that returns a string, and Patter handles the rest: telephony, speech-to-text, text-to-speech, and real-time audio streaming. You build the agent — we connect it to the phone.
Quickstart
```bash
pip install getpatter
```
Set the env vars your carrier and engine need:
```bash
export TWILIO_ACCOUNT_SID=ACxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
export TWILIO_AUTH_TOKEN=your_auth_token
export OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxxxxxxxxx
```
Four lines of Python:
```python
from getpatter import Patter, Twilio, OpenAIRealtime

phone = Patter(carrier=Twilio(), phone_number="+15550001234")
agent = phone.agent(engine=OpenAIRealtime(), system_prompt="You are a friendly receptionist for Acme Corp.", first_message="Hello! How can I help?")
await phone.serve(agent, tunnel=True)
```
`tunnel=True` spawns a Cloudflare tunnel and points your Twilio number at it. In production, pass `webhook_url="api.prod.example.com"` to the constructor instead.
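For production, the same setup without a tunnel looks like the sketch below. The hostname is a placeholder; use whatever public hostname actually fronts your server.

```python
from getpatter import Patter, Twilio

# Production sketch: no tunnel, just a public hostname (no scheme,
# per the constructor docs).
phone = Patter(
    carrier=Twilio(),
    phone_number="+15550001234",
    webhook_url="api.prod.example.com",
)
```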
Features
| Feature | Method | Example |
|---|---|---|
| Inbound calls | `phone.serve(agent)` | Answer calls as an AI |
| Outbound calls + AMD | `phone.call(to, machine_detection=True)` | Place calls with voicemail detection |
| Tool calling | `agent(tools=[Tool(...)])` | Agent calls external APIs mid-conversation |
| Custom STT + TTS | `agent(stt=DeepgramSTT(), tts=ElevenLabsTTS())` | Bring your own voice providers |
| Dynamic variables | `agent(variables={...})` | Personalize prompts per caller |
| Pluggable LLM | `agent(llm=AnthropicLLM())` | 5 built-in providers: OpenAI, Anthropic, Groq, Cerebras, Google |
| Custom LLM (any model) | `serve(on_message=handler)` | Route to anything: local llama.cpp, internal gateways, etc. |
| Call recording | `serve(recording=True)` | Record all calls |
| Call transfer | `transfer_call` (auto-injected) | Transfer to a human |
| Voicemail drop | `call(voicemail_message="...")` | Play a message when voicemail is detected |
Configuration
Environment variables
Every provider reads its credentials from the environment by default. Pass `api_key="..."` to any constructor to override.
| Variable | Used by |
|---|---|
| `TWILIO_ACCOUNT_SID`, `TWILIO_AUTH_TOKEN` | `Twilio()` carrier |
| `TELNYX_API_KEY`, `TELNYX_CONNECTION_ID`, `TELNYX_PUBLIC_KEY` (optional) | `Telnyx()` carrier |
| `OPENAI_API_KEY` | `OpenAIRealtime`, `getpatter.stt.whisper.STT`, `getpatter.tts.openai.TTS` |
| `ELEVENLABS_API_KEY`, `ELEVENLABS_AGENT_ID` | `ElevenLabsConvAI`, `getpatter.tts.elevenlabs.TTS` |
| `DEEPGRAM_API_KEY` | `getpatter.stt.deepgram.STT` |
| `CARTESIA_API_KEY` | `getpatter.stt.cartesia.STT`, `getpatter.tts.cartesia.TTS` |
| `RIME_API_KEY` | `getpatter.tts.rime.TTS` |
| `LMNT_API_KEY` | `getpatter.tts.lmnt.TTS` |
| `SONIOX_API_KEY` | `getpatter.stt.soniox.STT` |
| `SPEECHMATICS_API_KEY` | `getpatter.stt.speechmatics.STT` |
| `ASSEMBLYAI_API_KEY` | `getpatter.stt.assemblyai.STT` |
| `ANTHROPIC_API_KEY` | `AnthropicLLM` / `getpatter.llm.anthropic.LLM` |
| `GROQ_API_KEY` | `GroqLLM` / `getpatter.llm.groq.LLM` |
| `CEREBRAS_API_KEY` | `CerebrasLLM` / `getpatter.llm.cerebras.LLM` |
| `GEMINI_API_KEY` (or `GOOGLE_API_KEY`) | `GoogleLLM` / `getpatter.llm.google.LLM` |
```bash
cp .env.example .env
# Edit .env with your API keys
```
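Since every provider reads plain environment variables, any loader works. `python-dotenv` is the robust choice; as an illustration, a minimal stdlib-only loader (the parsing rules here are an assumption, not the SDK's behavior) can look like this:

```python
import os

def load_env(path: str = ".env") -> None:
    """Load simple KEY=VALUE lines into os.environ, skipping blanks and comments.

    Existing environment variables win (setdefault), so exported values
    are never clobbered by the file.
    """
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```

Call `load_env()` once before constructing any Patter objects.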
Telnyx: Telnyx is fully supported as an alternative carrier to Twilio. Both carriers receive equal support for DTMF, transfer, recording, and metrics.
Voice Modes
| Mode | Latency | Quality | Best For |
|---|---|---|---|
| OpenAI Realtime | Lowest | High | Fluid, low-latency conversations |
| Pipeline (STT + LLM + TTS) | Low | High | Independent control over STT and TTS |
| ElevenLabs ConvAI | Low | High | ElevenLabs-managed conversation flow |
API Reference
Patter constructor
```python
Patter(
    carrier: Twilio | Telnyx,
    phone_number: str,
    webhook_url: str = "",  # Public hostname (no scheme). Mutually exclusive with tunnel=...
    tunnel: CloudflareTunnel | Static | Ngrok | None = None,
    pricing: dict | None = None,
)
```
| Parameter | Type | Description |
|---|---|---|
| `carrier` | `Twilio` / `Telnyx` | Carrier instance. Reads env vars by default. |
| `phone_number` | `str` | Your phone number in E.164 format. |
| `webhook_url` | `str` | Public hostname your local server is reachable on. Use instead of `tunnel=`. |
| `tunnel` | instance | `CloudflareTunnel()`, `Static(hostname=...)`, or `Ngrok()`. |
phone.agent()
```python
phone.agent(
    system_prompt: str,
    engine: OpenAIRealtime | ElevenLabsConvAI | None = None,  # default OpenAIRealtime()
    stt: STTProvider | None = None,  # e.g. DeepgramSTT()
    tts: TTSProvider | None = None,  # e.g. ElevenLabsTTS()
    voice: str = "alloy",
    model: str = "gpt-4o-mini-realtime-preview",
    language: str = "en",
    first_message: str = "",
    tools: list[Tool] | None = None,
    guardrails: list[Guardrail] | None = None,
    variables: dict | None = None,
    ...,
)
```
Pass `engine=` for end-to-end mode, or `stt=` + `tts=` for pipeline mode. Both arguments take plain adapter instances (e.g. `DeepgramSTT()`) that read their API key from the environment.
phone.serve()
```python
await phone.serve(
    agent: Agent,
    port: int = 8000,
    tunnel: bool = False,  # shortcut for Patter(tunnel=CloudflareTunnel())
    dashboard: bool = True,
    recording: bool = False,
    on_call_start: Callable | None = None,
    on_call_end: Callable | None = None,
    on_transcript: Callable | None = None,
    on_message: Callable | str | None = None,
    voicemail_message: str = "",
    dashboard_token: str = "",
)
```
phone.call()
```python
await phone.call(
    to: str,
    agent: Agent | None = None,  # required in local mode
    from_number: str = "",
    first_message: str = "",
    machine_detection: bool = False,
    voicemail_message: str = "",
    ring_timeout: int | None = None,
)
```
STT / TTS catalog
Flat re-exports (short form):
```python
from getpatter import (
    Twilio, Telnyx,
    OpenAIRealtime, ElevenLabsConvAI,
    # STT / TTS classes live in namespaced modules — see below.
)
```
Namespaced imports (one module per provider):
```python
from getpatter.stt import deepgram, whisper, cartesia, soniox, speechmatics, assemblyai
from getpatter.tts import elevenlabs, openai as openai_tts, cartesia as cartesia_tts, rime, lmnt

stt = deepgram.STT()                  # reads DEEPGRAM_API_KEY
tts = elevenlabs.TTS(voice="rachel")  # reads ELEVENLABS_API_KEY
```
Examples
Inbound calls — default engine
```python
import asyncio

from getpatter import Patter, Twilio, OpenAIRealtime

async def main() -> None:
    phone = Patter(carrier=Twilio(), phone_number="+15550001234")
    agent = phone.agent(
        engine=OpenAIRealtime(),
        system_prompt="You are a helpful customer service agent.",
        first_message="Hello! How can I help?",
    )
    await phone.serve(
        agent,
        tunnel=True,
        on_call_start=lambda data: print(f"Call from {data['caller']}"),
        on_call_end=lambda data: print("Call ended"),
    )

asyncio.run(main())
```
Custom voice — Deepgram STT + ElevenLabs TTS
```python
from getpatter import Patter, Twilio
from getpatter.stt import deepgram
from getpatter.tts import elevenlabs

phone = Patter(carrier=Twilio(), phone_number="+15550001234")
agent = phone.agent(
    stt=deepgram.STT(),                  # reads DEEPGRAM_API_KEY
    tts=elevenlabs.TTS(voice="rachel"),  # reads ELEVENLABS_API_KEY
    system_prompt="You are a helpful voice assistant.",
)
await phone.serve(agent, tunnel=True)
```
Pipeline mode — pick STT, LLM, TTS independently
```python
from getpatter import Patter, Twilio, DeepgramSTT, AnthropicLLM, ElevenLabsTTS

phone = Patter(carrier=Twilio(), phone_number="+15550001234")
agent = phone.agent(
    stt=DeepgramSTT(),                     # reads DEEPGRAM_API_KEY
    llm=AnthropicLLM(),                    # reads ANTHROPIC_API_KEY
    tts=ElevenLabsTTS(voice_id="rachel"),  # reads ELEVENLABS_API_KEY
    system_prompt="You are a helpful voice assistant.",
)
await phone.serve(agent, tunnel=True)
```
Available LLM providers: `OpenAILLM`, `AnthropicLLM`, `GroqLLM`, `CerebrasLLM`, `GoogleLLM`. Tool calling works across all five. For fully custom logic, drop `llm=` and pass an `on_message` callback to `serve()` instead.
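A sketch of that custom route: the handler below assumes `on_message` receives the caller's transcribed utterance as a string and returns the text the agent should speak back. The exact callback signature is not documented on this page, so check the SDK before relying on it.

```python
# Hypothetical on_message handler: route each transcript to your own logic
# (an HTTP call to llama.cpp, an internal gateway, a rules engine, etc.).
async def handle_message(message: str) -> str:
    if "hours" in message.lower():
        return "We are open nine to five, Monday through Friday."
    return "Sorry, could you say that again?"

# Hook it up in place of llm=:
# await phone.serve(agent, on_message=handle_message, tunnel=True)
```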
Tool calling
```python
from getpatter import Patter, Twilio, OpenAIRealtime, Tool, tool

@tool
async def check_availability(date: str) -> dict:
    """Check appointment availability for a given ISO date."""
    return {"available": True}

phone = Patter(carrier=Twilio(), phone_number="+15550001234")
agent = phone.agent(
    engine=OpenAIRealtime(),
    system_prompt="You are a booking assistant.",
    tools=[check_availability],
)
await phone.serve(agent, tunnel=True)
```
Outbound calls
```python
from getpatter import Patter, Twilio, OpenAIRealtime

phone = Patter(carrier=Twilio(), phone_number="+15550001234")
agent = phone.agent(
    engine=OpenAIRealtime(),
    system_prompt="You are making reminder calls.",
    first_message="Hi, this is a reminder from Acme Corp.",
)
await phone.serve(agent, tunnel=True)
await phone.call(to="+14155551234", agent=agent)
```
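Combining the documented `phone.call()` parameters, a voicemail-aware variant of the same call might look like this (parameter values are illustrative):

```python
# Outbound call with answering-machine detection and a voicemail drop,
# using only parameters from the phone.call() signature above.
await phone.call(
    to="+14155551234",
    agent=agent,
    machine_detection=True,  # detect answering machines (AMD)
    voicemail_message="Hi, this is Acme Corp. Please call us back.",
    ring_timeout=30,         # give up after 30 seconds of ringing
)
```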
Dynamic variables
```python
agent = phone.agent(
    engine=OpenAIRealtime(),
    system_prompt="You are helping {customer_name}, account #{account_id}.",
    first_message="Hi {customer_name}! How can I help you today?",
    variables={"customer_name": "Jane", "account_id": "A-789"},
)
```
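The placeholder syntax matches Python's `str.format` style; assuming simple substitution (whether the SDK formats exactly this way is an assumption), the rendered prompt comes out as:

```python
# Illustration of the substitution the SDK presumably performs.
system_prompt = "You are helping {customer_name}, account #{account_id}."
variables = {"customer_name": "Jane", "account_id": "A-789"}

rendered = system_prompt.format(**variables)
# → "You are helping Jane, account #A-789."
```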
Contributing
Pull requests are welcome.
```bash
cd sdk-py && pip install -e ".[dev]" && pytest tests/ -v
```
Please open an issue before submitting large changes so we can discuss the approach first.
License
MIT — see LICENSE.