
Minimal Python framework for composing agents, tools, and multi-agent workflows

Project description

Quark Agents


Experimental. An ongoing exploration into the simplest possible agentic framework — use it to learn, hack, and break agentic things.

Minimal Python framework for composing agents, tools, and multi-agent workflows. Define agents with a system prompt and tools, then compose them using the >> operator. Provider-agnostic via litellm.

Despite being a single ~300-line file, you get:

  • OpenTelemetry tracing
  • 100+ model providers via litellm
  • Multi-agent workflows with >>
  • Parallel fan-out and tool execution
  • Streaming
  • Conversation memory

Install

pip install quark-agents

# From source
git clone https://github.com/awslabs/quark-agents
cd quark-agents
pip install .

# With OpenTelemetry support
pip install "quark-agents[otel]"

Install with uv

git clone https://github.com/awslabs/quark-agents
cd quark-agents
uv venv
source .venv/bin/activate

# Core + dev dependencies (pytest, mkdocs)
uv pip install ".[dev]"

# With OpenTelemetry
uv pip install ".[dev,otel]"

# With AWS Bedrock support
uv pip install ".[dev,bedrock]"

# All extras
uv pip install ".[dev,otel,bedrock]"

Note: Editable installs (-e) require setuptools>=75. If you see ModuleNotFoundError: No module named 'setuptools.backends', make sure pyproject.toml has requires = ["setuptools>=75"] under [build-system], or use a non-editable install (uv pip install ".[dev]" without -e).
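For source checkouts, the fix described above amounts to pinning the build backend in pyproject.toml. A minimal sketch of the relevant section (the actual file may declare more):

```toml
[build-system]
requires = ["setuptools>=75"]
build-backend = "setuptools.build_meta"
```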

Usage

Single agent

from quark import Agent

agent = Agent(
    system="You are a helpful assistant.",
    model="gpt-5.4",  # or any litellm-supported model
    name="assistant",
)

print(agent.run("What is the capital of France?"))

Agent with tools

def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"Sunny, 22°C in {city}"

agent = Agent(
    system="You are a weather assistant.",
    model="gpt-5.4",
    tools={"get_weather": get_weather},
)

print(agent.run("What's the weather in Paris?"))
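Quark's tool-calling internals aren't shown here, but frameworks like this typically derive a tool spec from the callable's signature and docstring. A rough, hypothetical sketch of that introspection pattern using only the standard library (this is not Quark's actual implementation):

```python
import inspect

# Illustrative mapping from Python annotations to JSON-Schema type names.
_JSON_TYPES = {str: "string", int: "integer", float: "number", bool: "boolean"}

def tool_schema(fn):
    """Build a tool spec from a function's signature and docstring.

    Hypothetical sketch of the common pattern: the docstring becomes the
    tool description, and each parameter becomes a JSON-Schema property.
    """
    sig = inspect.signature(fn)
    props = {
        name: {"type": _JSON_TYPES.get(param.annotation, "string")}
        for name, param in sig.parameters.items()
    }
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "parameters": {
            "type": "object",
            "properties": props,
            "required": [
                name for name, param in sig.parameters.items()
                if param.default is inspect.Parameter.empty
            ],
        },
    }

def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"Sunny, 22°C in {city}"

schema = tool_schema(get_weather)
```

The dict-of-callables API (tools={"get_weather": get_weather}) fits this pattern: each function's docstring is what the model sees, so write it for the model, not just for humans.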

Pipelines with >>

Chain agents and plain functions using >>. The output of each step becomes the input to the next.

from quark import Agent

def fetch_article(url: str) -> str:
    """Fetch article content from a URL."""
    return "..."  # your fetch logic

summarizer = Agent(system="Summarize the article in 3 bullet points.", name="summarizer")
critic     = Agent(system="List 2 weaknesses in this summary.", name="critic")
editor     = Agent(system="Write a final improved summary given the feedback.", name="editor")

pipeline = fetch_article >> summarizer >> critic >> editor
result = pipeline.run("https://example.com/article")

Parallel fan-out with lists

Wrap steps in a list to run them in parallel. Their outputs are combined and passed to the next step.

fact_checker = Agent(system="Verify the factual claims in this summary.", name="fact_checker")

pipeline = fetch_article >> summarizer >> [critic, fact_checker] >> editor
result = pipeline.run("https://example.com/article")

Composing workflows

research = fetch_article >> summarizer
review   = [critic, fact_checker] >> editor

pipeline = research >> review
result = pipeline.run("https://example.com/article")
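The >> composition above can be mimicked in plain Python. This toy Pipeline class is a sketch of the pattern, not Quark's source: __rshift__ accumulates steps, and a list step fans out in parallel threads before joining the outputs for the next step.

```python
from concurrent.futures import ThreadPoolExecutor

class Pipeline:
    """Toy illustration of >> composition; not Quark's actual code."""

    def __init__(self, steps=None):
        self.steps = steps or []

    def __rshift__(self, other):
        # a >> b extends this pipeline with b's steps (or b itself).
        tail = other.steps if isinstance(other, Pipeline) else [other]
        return Pipeline(self.steps + tail)

    def run(self, value):
        for step in self.steps:
            if isinstance(step, list):
                # Fan-out: run each branch on the same input in parallel,
                # then join the branch outputs for the next step.
                with ThreadPoolExecutor() as pool:
                    outs = list(pool.map(lambda s: s(value), step))
                value = "\n".join(outs)
            else:
                value = step(value)
        return value

# Plain functions stand in for agents here.
upper = lambda s: s.upper()
exclaim = lambda s: s + "!"
reverse = lambda s: s[::-1]

pipeline = Pipeline([upper]) >> [exclaim, reverse] >> (lambda s: s.strip())
result = pipeline.run("hi")  # "HI" -> fan-out -> "HI!\nIH" -> strip
```

Returning a new Pipeline from __rshift__ is what makes sub-workflows like research and review composable: each >> produces an independent value you can store, reuse, and chain again.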

Streaming

for chunk in agent.stream("Tell me a story."):
    print(chunk, end="", flush=True)

Provider-agnostic

# OpenAI
agent = Agent(model="gpt-5.4")

# Anthropic
agent = Agent(model="claude-opus-4-6")

# AWS Bedrock
agent = Agent(model="bedrock/anthropic.claude-3-5-haiku-20241022-v1:0")

# Gemini
agent = Agent(model="gemini/gemini-2.0-flash")

# Ollama (local)
agent = Agent(model="ollama/llama3")

Observability (OpenTelemetry)

Set environment variables — tracing is enabled automatically.

export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
export OTEL_SERVICE_NAME=my-app

Every Agent.run(), Workflow.run(), and tool call emits OTel spans. Compatible with Jaeger, Honeycomb, Grafana Tempo, Datadog, and any OTLP-compatible backend.

API

Agent(*, system, tools, model, max_turns, name)

Parameter   Default                          Description
system      "You are a helpful assistant."   System prompt
tools       {}                               Dict of {name: callable}
model       "gpt-5.4"                        Any litellm model string
max_turns   10                               Max LLM iterations per run() call
name        "agent"                          Name used in traces and pipeline display

Methods:

  • agent.run(user: str) -> str — blocking, returns final answer
  • agent.stream(user: str) -> Generator — yields tokens as they arrive
  • agent.reset() — clears conversation history, keeps system prompt
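Conversation memory here means run() appends to a message history that persists across calls, while reset() drops everything except the system prompt. A hypothetical sketch of that bookkeeping (not Quark's source; the reply is stubbed in place of the LLM call):

```python
class MemorySketch:
    """Illustrates run()/reset() memory semantics; not Quark's actual code."""

    def __init__(self, system="You are a helpful assistant."):
        self.system = system
        self.messages = [{"role": "system", "content": system}]

    def run(self, user):
        # Each turn appends the user message and the reply, so later
        # turns see the full history.
        self.messages.append({"role": "user", "content": user})
        reply = f"echo: {user}"  # stand-in for the LLM call
        self.messages.append({"role": "assistant", "content": reply})
        return reply

    def reset(self):
        # Keep only the system prompt.
        self.messages = [{"role": "system", "content": self.system}]

agent = MemorySketch()
agent.run("first")
agent.run("second")
turns_before_reset = len(agent.messages)  # 1 system + 2 user + 2 assistant
agent.reset()
turns_after_reset = len(agent.messages)   # system prompt only
```

The practical consequence: reuse one Agent for a multi-turn conversation, and call reset() between unrelated tasks so stale context doesn't leak into the next run.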

Workflow

Created automatically by >>. Call .run(input: str) -> str to execute.

workflow = agent_a >> agent_b >> agent_c
result = workflow.run("input")

Tests

# Unit tests (no API calls)
pytest tests/

# Integration tests (requires API credentials)
pytest tests/ -m integration

If using uv, prefix with uv run to ensure the venv's Python is used (avoids conflicts with conda or system Python):

uv run pytest tests/
uv run pytest tests/ -m "not integration"
uv run pytest tests/ -m integration
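Recent pytest versions warn about unregistered marks (PytestUnknownMarkWarning), so the integration marker used above is presumably registered in the project config. If you replicate this setup, the registration typically looks like the following (shown as an assumption about the config, not a confirmed excerpt):

```toml
[tool.pytest.ini_options]
markers = [
    "integration: tests that call real model provider APIs",
]
```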

Why Quark?

Named after the smallest known fundamental particles — quarks need gluons to bind them together. Quark is the minimal binding layer for AI agents.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

quark_agents-0.2.0.tar.gz (13.6 kB)


Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

quark_agents-0.2.0-py3-none-any.whl (8.5 kB)


File details

Details for the file quark_agents-0.2.0.tar.gz.

File metadata

  • Download URL: quark_agents-0.2.0.tar.gz
  • Upload date:
  • Size: 13.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for quark_agents-0.2.0.tar.gz
Algorithm Hash digest
SHA256 5a04b403c1bce90d996f4529c2bcb68fc6cad98a9286edfeb826d788d57a49ac
MD5 5f7e7811c5c5dddd4cb957a82e3e2614
BLAKE2b-256 410cefdfc15cca094fb13ec5c0aa6642034c1087539ece19523ebe83ee6fc0ac


Provenance

The following attestation bundles were made for quark_agents-0.2.0.tar.gz:

Publisher: publish.yml on awslabs/quark-agents

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file quark_agents-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: quark_agents-0.2.0-py3-none-any.whl
  • Upload date:
  • Size: 8.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for quark_agents-0.2.0-py3-none-any.whl
Algorithm Hash digest
SHA256 e12cd788f0f126b8d22eee11aad0b1878123fe4cddbeb9c165ef1fd8c431d04c
MD5 974761658524eae1415445df80301ccc
BLAKE2b-256 414151a51dc8393b3359f2b4c612509bf34d94b6bc2b8b62d3fe6c19e1582e40


Provenance

The following attestation bundles were made for quark_agents-0.2.0-py3-none-any.whl:

Publisher: publish.yml on awslabs/quark-agents

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
