# Delfhos — AI agent SDK with typed connections and tool orchestration

Python SDK for building AI agents that use real tools — Gmail, SQL, Drive, Sheets, REST APIs, and your own functions — with safe, human-in-the-loop execution.

Full documentation at delfhos.com/docs.
## How it works

You describe a task in plain English. Delfhos:

- Picks the relevant tools from the ones you configured
- Writes Python code to accomplish the task
- Executes that code in a sandbox against your real services
- Retries automatically if something fails

You stay in control: restrict which actions each tool can take, and require approval before any write, send, or delete.
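The retry step can be sketched as a plain loop. This is a conceptual illustration only, not Delfhos's actual implementation; `run_with_retries`, `run_step`, and `max_retries` are hypothetical names:

```python
import time

def run_with_retries(run_step, max_retries=3, backoff_s=0.0):
    """Call run_step() until it succeeds or retries are exhausted.

    run_step is any zero-argument callable; each failure is retried,
    with optional linear backoff between attempts.
    """
    last_error = None
    for attempt in range(1, max_retries + 1):
        try:
            return run_step()
        except Exception as exc:  # in practice you would catch narrower errors
            last_error = exc
            if backoff_s:
                time.sleep(backoff_s * attempt)
    raise RuntimeError(f"failed after {max_retries} attempts") from last_error
```

A flaky step that fails once and then succeeds would return normally on the second attempt.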
## Install

```bash
pip install delfhos
```
## API Key

Delfhos supports Gemini, OpenAI, and Anthropic models. Export the key for the provider you want to use:

```bash
export GOOGLE_API_KEY="..."    # Gemini
export OPENAI_API_KEY="..."    # OpenAI
export ANTHROPIC_API_KEY="..." # Claude
```
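A quick way to sanity-check which keys are actually exported (plain Python, not part of the SDK; the variable-to-provider mapping simply mirrors the exports above):

```python
import os

# Environment variable -> provider, per the exports above.
PROVIDER_KEYS = {
    "GOOGLE_API_KEY": "Gemini",
    "OPENAI_API_KEY": "OpenAI",
    "ANTHROPIC_API_KEY": "Anthropic",
}

def available_providers(env=os.environ):
    """Return the providers whose API key is set and non-empty."""
    return [name for var, name in PROVIDER_KEYS.items() if env.get(var)]

print(available_providers())
```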
## Try it instantly (no credentials needed)

The sandbox tools come pre-loaded with dummy data so you can run your first agent right now:

```python
from delfhos import Agent
from delfhos.sandbox import MockEmail, MockDatabase

agent = Agent(
    tools=[MockEmail(confirm=False), MockDatabase(confirm=False)],
    llm="gemini-3.1-flash-lite-preview",
)

agent.run(
    "Read my unread emails. If any mention a support ticket, "
    "look it up in the database and summarise the customer name, "
    "open tickets, and total order value."
)
agent.stop()
```
Or just run the included example:

```bash
python examples/hello_delfhos.py
```
## Custom tools

Decorate any Python function with `@tool` and the agent can call it:

```python
from delfhos import Agent, tool

@tool
def calculate_discount(price: float, pct: float) -> float:
    """Return price after applying a percentage discount."""
    return price * (1 - pct / 100)

agent = Agent(tools=[calculate_discount], llm="gemini-3.1-flash-lite-preview")
agent.run("What is the price of a $120 item with a 15% discount?")
agent.stop()
```
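The decorated function is still ordinary Python, so you can sanity-check it directly before handing it to the agent (shown here without the `@tool` decorator so it runs standalone):

```python
def calculate_discount(price: float, pct: float) -> float:
    """Return price after applying a percentage discount."""
    return price * (1 - pct / 100)

# 15% off a $120 item: 120 * 0.85
print(calculate_discount(120, 15))  # 102.0
```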
## Built-in tools

```python
from delfhos import Gmail, SQL, Sheets, Drive, Calendar, Docs, WebSearch, APITool

gmail = Gmail(oauth_credentials="client_secrets.json", allow=["read", "send"], confirm=["send"])
db = SQL(url="postgresql://user:pass@host/db", allow=["schema", "query"])
drive = Drive(oauth_credentials="client_secrets.json", confirm=True)

agent = Agent(tools=[gmail, db, drive], llm="gemini-3.1-flash-lite-preview")
agent.run("Check unread emails and log any order mentions to the database.")
agent.stop()
```
- `allow` — restrict which actions are available on the tool (`["read", "send"]`, `["schema", "query"]`, …).
- `confirm` — when human approval is required: `True` (all), `False` (none), or a list of specific actions.
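The `confirm` rule can be written out explicitly. This is a sketch of the documented semantics, not the SDK's internal code; `needs_confirmation` is a hypothetical helper:

```python
def needs_confirmation(action, confirm):
    """Apply the documented confirm rule to one action.

    confirm=True  -> every action needs approval
    confirm=False -> no action needs approval
    confirm=list  -> only the listed actions need approval
    """
    if confirm is True:
        return True
    if confirm is False:
        return False
    return action in confirm
```

With `confirm=["send"]`, a `send` is gated while a `read` goes through unprompted.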
## REST API Integration (APITool)

Connect any REST API with an OpenAPI 3.x specification — no custom code needed.
```python
from delfhos import Agent, APITool

# From a public OpenAPI spec
petstore = APITool(
    spec="https://petstore3.swagger.io/api/v3/openapi.json",
    allow=["list_pets", "get_pet_by_id"],
    confirm=["create_pet", "delete_pet"],
)

# From a local spec with authentication
internal = APITool(
    spec="./openapi.yaml",
    base_url="https://api.internal.corp/v1",
    headers={"Authorization": "Bearer sk_..."},
)

# Inspect available endpoints
print(petstore.inspect())             # Compact: endpoint names
print(petstore.inspect(verbose=True)) # Detailed: methods, paths, descriptions

agent = Agent(tools=[petstore, internal], llm="gemini-2.5-flash")
agent.run("List all pets and create a new one named 'Buddy'")
```
Features:

- Automatic endpoint compilation from OpenAPI specs (no LLM needed)
- Path, query, and request body parameters extracted and typed
- `headers=` and `params=` injected automatically — the agent never sees credentials
- `$ref` resolution for complex schemas
- `allow=` and `confirm=` support for fine-grained access control
- Caching: specs compiled once and cached to `~/delfhos/api_cache/`
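Spec caches like this are typically keyed by a digest of the spec source, so a different URL or path maps to a different cache file. A minimal sketch of that idea (the directory matches the path above, but the key scheme and `cache_path_for` helper are assumptions, not Delfhos's actual layout):

```python
import hashlib
from pathlib import Path

CACHE_DIR = Path.home() / "delfhos" / "api_cache"

def cache_path_for(spec_source: str) -> Path:
    """Map a spec URL or file path to a stable, collision-resistant cache file name."""
    digest = hashlib.sha256(spec_source.encode("utf-8")).hexdigest()[:16]
    return CACHE_DIR / f"{digest}.json"
```

The same source always yields the same path, so a recompiled spec overwrites its own cache entry.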
## Interactive chat

```python
from delfhos import Agent, Chat, Gmail

agent = Agent(
    tools=[Gmail(oauth_credentials="client_secrets.json")],
    llm="gemini-3.1-flash-lite-preview",
    chat=Chat(summarizer_llm="gemini-3.1-flash-lite-preview"),
)
agent.run_chat()  # starts a terminal session — type /help for commands
```
## Memory & Long-term Context

Delfhos supports both session memory and persistent semantic memory with 100+ embedding models.

```python
from delfhos import Agent, Chat, Memory

agent = Agent(
    tools=[...],
    llm="gemini-3.1-flash-lite-preview",
    chat=Chat(keep=8, summarize=True, namespace="my_agent"),  # short-term
    memory=Memory(namespace="my_agent"),                      # long-term semantic
)
```
100+ Embedding Models: Automatic detection and compatibility for:
- Proprietary: OpenAI, Cohere, Anthropic, Google
- Open-source: Sentence-Transformers (MiniLM, all-MiniLM, all-mpnet, etc.)
- Specialized: BGE models (BAAI), Jina, Nomic Embed, NV-Embed
- Local-first: Run models locally via Ollama or Hugging Face Transformers
Auto-detects model requirements:

- `trust_remote_code` toggles (for BGE, Jina, etc.)
- Instruction/prefix tokens (e.g., Nomic's "search_document:" prefix)
- Model dimensions (inferred after loading)
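The prefix handling can be illustrated in isolation. The prefixes follow Nomic Embed's documented convention (`search_document:` for corpus text, `search_query:` for queries); the `with_nomic_prefix` helper itself is hypothetical:

```python
def with_nomic_prefix(text: str, kind: str = "document") -> str:
    """Prepend the task prefix Nomic Embed expects before encoding."""
    prefixes = {"document": "search_document: ", "query": "search_query: "}
    return prefixes[kind] + text

# Documents and queries get different prefixes before embedding
doc = with_nomic_prefix("Delfhos is a Python SDK.")
q = with_nomic_prefix("what is delfhos", kind="query")
```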
See EMBEDDING_MODELS_GUIDE.md for the full compatibility matrix.
## Response object

`agent.run()` returns a `Response` with the result, status, cost, and trace:

```python
r = agent.run("How many users signed up this week?")

print(r.text)        # agent's answer
print(r.status)      # True if task succeeded
print(r.cost_usd)    # cost in dollars (e.g. 0.0003)
print(r.duration_ms) # wall-clock time in milliseconds
```
## Model support

Cloud providers: Gemini, OpenAI, or Anthropic

```python
# Gemini
agent = Agent(tools=[...], llm="gemini-2.0-flash-lite")
agent = Agent(tools=[...], llm="gemini-2.0-flash")

# OpenAI
agent = Agent(tools=[...], llm="gpt-5")
agent = Agent(tools=[...], llm="gpt-4o")

# Anthropic
agent = Agent(tools=[...], llm="claude-4-5-haiku")
agent = Agent(tools=[...], llm="claude-4-6-sonnet")
```
Local & custom models: use `LLMConfig` for any OpenAI-compatible endpoint

```python
from delfhos import Agent, LLMConfig

# Local Ollama model
agent = Agent(
    tools=[...],
    llm=LLMConfig(model="llama3.2", base_url="http://localhost:11434/v1")
)

# Enterprise vLLM server
agent = Agent(
    tools=[...],
    llm=LLMConfig(
        model="mistral-7b-instruct",
        base_url="https://llm.corp.internal/v1",
        api_key="internal-token"
    )
)

# Any OpenAI-compatible provider (Groq, Together, Anyscale, etc.)
agent = Agent(
    tools=[...],
    llm=LLMConfig(
        model="meta-llama/Llama-3-70b-chat-hf",
        base_url="https://api.together.xyz/v1",
        api_key="..."
    )
)
```
Dual-LLM optimization: use a fast local model alongside a strong cloud model

```python
agent = Agent(
    tools=[...],
    light_llm=LLMConfig(model="qwen2.5:7b", base_url="http://localhost:11434/v1"),
    heavy_llm="gemini-2.5-flash",  # or Claude, OpenAI, etc.
)
```
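One common way a light/heavy split works is to route routine steps to the cheap local model and escalate hard ones to the cloud model. The sketch below is purely illustrative: the keyword rule and `pick_model` helper are invented for this example, and Delfhos may decide the split differently:

```python
def pick_model(step: str, escalate_keywords=("plan", "debug", "analyze")) -> str:
    """Route a step to 'light' or 'heavy' based on a naive keyword rule."""
    text = step.lower()
    return "heavy" if any(k in text for k in escalate_keywords) else "light"

print(pick_model("Plan the database migration"))  # heavy
print(pick_model("Format this date as ISO 8601")) # light
```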
## Context manager

The agent cleans up automatically when used as a context manager:

```python
with Agent(tools=[...], llm="gemini-3.1-flash-lite-preview") as agent:
    agent.run("Summarise last week's sales and email it to the team.")
```
For the full API reference and advanced guides see DOCS.md or delfhos.com/docs.
## License

Apache-2.0
## File details

### delfhos-0.6.7.tar.gz

- Size: 283.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.20

| Algorithm | Hash digest |
|---|---|
| SHA256 | `210bccd05d9179bf92169a8a2cc55b80ec8701e23d08b78b51b2efe7688452c3` |
| MD5 | `adacd9ae19d5f8d81b486d1046065443` |
| BLAKE2b-256 | `072e72c20dc1591df0c1bbceb8b9cd1692740e920c7309d3f90f36b4f6cb3fe5` |
### delfhos-0.6.7-py3-none-any.whl

- Size: 306.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.20

| Algorithm | Hash digest |
|---|---|
| SHA256 | `6887206d89ffcd2c452a01bdfe8c9cfd2d044aa3141f1d6d0385586f0cc02a8a` |
| MD5 | `6bab82f667f6664654b9997498f684e1` |
| BLAKE2b-256 | `952e868421800566ca51c914c1ab52319faf872c636088763cb089f7af31fd56` |