# agloom

Production agent framework on LangChain/LangGraph: nine execution patterns, persistent memory, skills, feedback, multi-level HITL, MCP, AGP protocol, runtime bridge, and observability hooks.
Build agents that route themselves.
One familiar API — classification, memory, streaming, guardrails, and learning included.
Nine execution patterns. Auto-selected per task. Skills improve over time.
Documentation · Quick Start · PyPI · Examples · Issues
## Start here
agloom is a Python framework for production-minded agents on LangChain / LangGraph. You describe the model and tools; agloom picks how to run the task (single-shot, ReAct, supervisor-style delegation, pipelines, and more), tracks steps and tokens, and can learn reusable skills from what worked.
If you already use LangChain’s agent APIs, think of `create_agent` as your main entrypoint — with orchestration, memory, streaming, and safety knobs in one place.
## Install

```bash
pip install agloom
# optional extras, e.g. Groq support (quoted so shells like zsh
# don't expand the brackets):
pip install "agloom[groq]"
```
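The quick start below uses Groq via `langchain_groq`, whose chat model reads your API key from the `GROQ_API_KEY` environment variable (the key value shown is a placeholder):

```shell
export GROQ_API_KEY="gsk_..."  # placeholder; use your own key from Groq
```

Any other LangChain-compatible provider works the same way — swap in that provider's model class and credential variable.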
## Your first agent

```python
import asyncio

from langchain_groq import ChatGroq

from agloom import create_agent


async def main():
    llm = ChatGroq(model="meta-llama/llama-4-scout-17b-16e-instruct")
    agent = await create_agent(model=llm, name="my-agent")
    result = await agent.ainvoke("What causes auroras?")
    print(result.output)


asyncio.run(main())
```
`create_agent` is async (use `await`). From synchronous code, use `create_agent_sync`.
Next steps: Why agloom? · Patterns explained · All parameters
## What you get (in plain language)
| You want to… | agloom helps by… |
|---|---|
| Ship faster | Picking a strategy per query instead of hand-writing routers and graphs |
| Keep context | Session memory by default; optional long-term memory and skills |
| Show progress | Token streaming plus structured events for “thinking” / tool UIs |
| Stay safe | Human-in-the-loop levels, timeouts, retries, rate limits — configurable |
| Improve over time | Skill library and feedback hooks so behavior compounds |
For the full feature tour, see What you get in the docs — the README stays short on purpose.
## agloom CLI & web workspace

- Terminal: the agloom CLI (npm package `agloom-cli`, repo folder `agloom_cli/`) is the React-based terminal client. From that folder: `npm install` → `npm run build` → `npm start`. It talks to `agloom-runtime` over AGP (stdio by default). See the CLI quick start.
- Browser: `agloom_web/` is the Vite workspace for sessions and observability — same idea, run the commands inside that folder.
PyPI’s `agloom` package includes the library and `agloom-runtime`. The `agloom` command prints a short pointer to the agloom CLI (repo folder `agloom_cli/`) for backwards compatibility.
## Learn more (documentation hub)
| Guide | What it’s for |
|---|---|
| Quick Start | Smallest path to a running agent |
| Execution patterns | How routing works (conceptual + diagrams) |
| Streaming & events | Responsive UI patterns |
| Production | Deploying, testing, operating |
| Errors & fixes | When something goes wrong |
## Requirements

- Python 3.12.x (see `pyproject.toml` on GitHub for the exact pin)
- Node.js ≥ 24.15.0 — only if you hack on `agloom_cli/` or `agloom_web/`
- An LLM API key (Groq, OpenAI, NVIDIA, Hugging Face, or another LangChain-compatible provider)
## Contributing & license
Contributions welcome — see CONTRIBUTING.md.
Licensed under Apache 2.0.
## File details

Details for the file `agloom-0.1.78.tar.gz` (source distribution).

- Download URL: agloom-0.1.78.tar.gz
- Size: 443.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.11.6

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `9d004d57f474dfcf20773a48a911fd351a1b755c414f8b798cdac76349cc2439` |
| MD5 | `e7a288f683bcb785e17fccbe8680bca3` |
| BLAKE2b-256 | `79c92e8e6544b63b699e35971de6014e3141b3f07e8fd8a5321e85fea97ee733` |
## File details

Details for the file `agloom-0.1.78-py3-none-any.whl` (built distribution).

- Download URL: agloom-0.1.78-py3-none-any.whl
- Size: 350.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.11.6

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `39658fe4a94447f49915ce27716c3b3b28bb3e5c66ad431e0c02c8b84abe6a88` |
| MD5 | `fd0801212ceb6b7b88c80c1934f04b4d` |
| BLAKE2b-256 | `4de477de74ac3c68389b5f3ad1836a9deb77b69d3db3d08c863b5faa028066d9` |