Alloy (Python): Python for logic. English for intelligence.
License: MIT
This repository contains an early scaffold of the Alloy library per alloy-spec-v1.md.
Quick start
- Install (all providers): `pip install 'alloy-ai[providers]'`
- Or minimal (OpenAI only): `pip install alloy-ai`
- Create `.env` with `OPENAI_API_KEY=...`
- Use the API:

```python
from dotenv import load_dotenv
from alloy import command, ask, configure

load_dotenv()

# Optional: configure(); the default model is gpt-5-mini if omitted
# configure(model="gpt-5-mini", temperature=0.7)

@command(output=float)
def ExtractPrice(text: str) -> str:
    """Extract price from text."""
    return f"Extract the price (number only) from: {text}"

print(ExtractPrice("This item costs $49.99."))
print(ask("Say hi"))
```
Notes
- The OpenAI backend is implemented for sync, async, and streaming calls.
- Streaming with tools is not yet supported.
- For structured outputs, Alloy attempts to use OpenAI structured responses (JSON schema). If unavailable, the model may still return JSON, which Alloy parses best-effort.
- Configuration defaults: Alloy uses `model=gpt-5-mini` if `configure(...)` is not called. You can also set process environment variables instead of a `.env` file: `ALLOY_MODEL`, `ALLOY_TEMPERATURE`, `ALLOY_MAX_TOKENS`, `ALLOY_SYSTEM`/`ALLOY_DEFAULT_SYSTEM`, `ALLOY_RETRY`.
- Example: `export ALLOY_MODEL=gpt-4o`, then run your script.
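Best-effort JSON parsing of a model reply can be sketched as follows. This is a simplified illustration, not Alloy's actual parser; the helper name `extract_json` is hypothetical:

```python
import json
import re

def extract_json(text: str):
    """Best-effort: parse the whole reply as JSON, else try the first
    JSON object embedded in surrounding prose."""
    try:
        return json.loads(text)
    except json.JSONDecodeError:
        pass
    # Fall back to the first {...} block found in the text.
    match = re.search(r"\{.*\}", text, re.DOTALL)
    if match:
        try:
            return json.loads(match.group(0))
        except json.JSONDecodeError:
            return None
    return None

print(extract_json('Sure! Here you go: {"price": 49.99}'))  # {'price': 49.99}
```

The fallback matters because models sometimes wrap JSON in explanatory prose even when asked for JSON only.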
Examples
- See `examples/basic_usage.py` and `examples/tools_demo.py` (tools + contracts).
Offline mode (dev only)
- To run examples without network access or API keys, set `ALLOY_BACKEND=fake`.
- Example: `ALLOY_BACKEND=fake python examples/basic_usage.py`
Config precedence
- Defaults: `model=gpt-5-mini`, `max_tool_turns=2`.
- Process env (`ALLOY_*`) overrides defaults.
- Context/`use_config` and `configure(...)` override env/defaults.
- Per-call overrides (e.g., `ask(..., model=...)`) override everything above.
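The precedence order above can be illustrated with a small resolver. This is a hedged sketch for illustration only; `resolve_model` and its arguments are hypothetical, not Alloy's internals:

```python
import os

# Assumed default, matching the README's stated default model.
DEFAULT_MODEL = "gpt-5-mini"

def resolve_model(configured=None, per_call=None):
    """Resolve a model name by precedence:
    per-call > configure(...) > ALLOY_MODEL env var > default."""
    if per_call is not None:
        return per_call
    if configured is not None:
        return configured
    return os.environ.get("ALLOY_MODEL", DEFAULT_MODEL)

print(resolve_model())                                    # env var or default
print(resolve_model(configured="gpt-4o"))                 # configure() beats env/default
print(resolve_model(configured="gpt-4o", per_call="o3"))  # per-call beats everything
```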
Make targets
- `make setup` installs dev deps and the package in editable mode.
- `make test`, `make lint`, `make typecheck` run CI-like checks.
- `make examples` runs `examples/basic_usage.py` and `examples/tools_demo.py`.
- Tip: `ALLOY_BACKEND=fake make examples` runs the examples offline.
Troubleshooting
- API key: ensure `OPENAI_API_KEY` is set (process env or `.env`).
- Model choice: prefer `gpt-5-mini` for the fastest latency; switch via `configure(model=...)` or `ALLOY_MODEL`.
- Timeouts/slow runs: reduce `max_tokens`, lower `temperature`, prefer smaller models, and cap tool loops.
- Tool loops: Alloy caps tool iterations by default (`max_tool_turns=2`). Adjust via `configure(max_tool_turns=1)` or env `ALLOY_MAX_TOOL_TURNS`.
- Rate limits (429): shorten prompts/outputs, add retries with backoff, or use lower-throughput settings.
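Retries with backoff for 429s can be wrapped around any call. A minimal generic sketch, not part of Alloy's API; `RateLimitError` and `call_with_backoff` are stand-ins:

```python
import random
import time

class RateLimitError(Exception):
    """Stand-in for a provider's 429 error."""

def call_with_backoff(fn, max_attempts=4, base_delay=0.01):
    """Retry fn() on rate limits with exponential backoff plus jitter."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except RateLimitError:
            if attempt == max_attempts - 1:
                raise
            # Exponential backoff: base * 2^attempt, plus a little jitter.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))

# Simulated flaky call: fails twice, then succeeds.
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RateLimitError()
    return "ok"

print(call_with_backoff(flaky))  # "ok" after two retries
```

In production you would catch the provider SDK's actual rate-limit exception and use a larger base delay.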
Integration tests
- OpenAI: set `OPENAI_API_KEY` (and optionally `ALLOY_IT_MODEL`, default `gpt-5-mini`). Run `pytest -q`; OpenAI integration tests auto-enable.
- Anthropic: set `ANTHROPIC_API_KEY` and `ALLOY_IT_MODEL=claude-3.5-sonnet` (or another Claude model). Run `pytest -q`; Anthropic integration tests auto-enable.
- Gemini: set `GOOGLE_API_KEY` and `ALLOY_IT_MODEL=gemini-1.5-pro` (or another Gemini model). Run `pytest -q`; Gemini integration tests auto-enable.
- SDK note: Gemini support uses `google-genai` (GA).
How to run locally
- Install the providers bundle: `pip install 'alloy-ai[providers]'`
- Create `.env` with `OPENAI_API_KEY=...`
- Option A (no install): `python examples/basic_usage.py`, `python examples/tools_demo.py` (the examples add `src/` to `sys.path` for you).
- Option B (editable install): `pip install -e '.[providers]'`, then run from anywhere.
.env example

```
OPENAI_API_KEY=sk-...
```
Support matrix (v1)
- OpenAI (GPT-4/5 and o-series): completions, typed commands, ask, streaming (no tools in stream), tool-calling, structured JSON for object schemas, tool-loop cap.
- Anthropic (Claude 3/3.5): completions and tool-calling loop (no streaming yet).
- Google (Gemini 1.5): basic completions (no tools/streaming in scaffold). Uses `google-genai` by default; `google-generativeai` supported via a legacy extra.
- Ollama (local): basic completions via `model="ollama:<name>"` (no tools/streaming in scaffold).
- ReAct fallback: not implemented yet (planned for local models/LLMs without native tools).
Install options
- Base: `pip install alloy-ai` (includes OpenAI + python-dotenv).
- All providers: `pip install 'alloy-ai[providers]'` (OpenAI, Anthropic, Gemini via `google-genai`, Ollama).
- Specific extras: `pip install 'alloy-ai[anthropic]'`, `pip install 'alloy-ai[gemini]'`, `pip install 'alloy-ai[ollama]'`.

Documentation
- Full docs: https://openai.github.io/alloy-py/
Releases
- Changelog: CHANGELOG.md
- Publishing: create a tag like `v0.1.1` on main; CI builds and uploads to PyPI (requires Trusted Publishing for `alloy-ai` or a configured token).