
Minimal Python framework for composing agents, tools, and multi-agent workflows

Project description

Quark Agents

Experimental. An ongoing exploration into the simplest possible agentic framework — use it to learn, hack, and break agentic things.

Minimal Python framework for composing agents, tools, and multi-agent workflows. Define agents with a system prompt and tools, then compose them using the >> operator. Provider-agnostic via litellm.

Install

pip install quark-agents

# From source
git clone https://github.com/awslabs/quark-agents
cd quark-agents
pip install .

# With OpenTelemetry support
pip install "quark-agents[otel]"

Install with uv

git clone https://github.com/awslabs/quark-agents
cd quark-agents
uv venv
source .venv/bin/activate

# Core + dev dependencies (pytest, mkdocs)
uv pip install ".[dev]"

# With OpenTelemetry
uv pip install ".[dev,otel]"

# With AWS Bedrock support
uv pip install ".[dev,bedrock]"

# All extras
uv pip install ".[dev,otel,bedrock]"

Note: Editable installs (-e) require setuptools>=75. If you see ModuleNotFoundError: No module named 'setuptools.backends', make sure pyproject.toml has requires = ["setuptools>=75"] under [build-system], or use a non-editable install (uv pip install ".[dev]" without -e).
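For reference, the [build-system] table the note describes looks like this (a minimal sketch; your pyproject.toml may declare additional build requirements):

```toml
[build-system]
requires = ["setuptools>=75"]
build-backend = "setuptools.build_meta"
```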

Usage

Single agent

from quark import Agent

agent = Agent(
    system="You are a helpful assistant.",
    model="gpt-5.4",  # or any litellm-supported model
    name="assistant",
)

print(agent.run("What is the capital of France?"))

Agent with tools

def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"Sunny, 22°C in {city}"

agent = Agent(
    system="You are a weather assistant.",
    model="gpt-5.4",
    tools={"get_weather": get_weather},
)

print(agent.run("What's the weather in Paris?"))
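quark does not document how it converts the tools dict into a provider's function-calling format, but the general technique can be sketched as follows: derive a JSON-schema-style description from each function's signature and docstring using inspect. All names below are illustrative, not quark internals.

```python
import inspect

# Map Python annotations to JSON-schema type names (illustrative subset).
_TYPE_MAP = {str: "string", int: "integer", float: "number", bool: "boolean"}

def tool_schema(fn):
    """Build a function-calling schema from a plain Python function."""
    sig = inspect.signature(fn)
    properties = {}
    required = []
    for name, param in sig.parameters.items():
        json_type = _TYPE_MAP.get(param.annotation, "string")
        properties[name] = {"type": json_type}
        if param.default is inspect.Parameter.empty:
            required.append(name)
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn) or "",
        "parameters": {
            "type": "object",
            "properties": properties,
            "required": required,
        },
    }

def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"Sunny, 22°C in {city}"

schema = tool_schema(get_weather)
# schema["name"] == "get_weather"; schema["parameters"]["required"] == ["city"]
```

This is why plain type-annotated functions with docstrings work as tools: the signature carries everything the model needs to call them.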

Pipelines with >>

Chain agents and plain functions using >>. Output of each step becomes input to the next.

from quark import Agent

def fetch_article(url: str) -> str:
    """Fetch article content from a URL."""
    return "..."  # your fetch logic

summarizer = Agent(system="Summarize the article in 3 bullet points.", name="summarizer")
critic     = Agent(system="List 2 weaknesses in this summary.", name="critic")
editor     = Agent(system="Write a final improved summary given the feedback.", name="editor")

pipeline = fetch_article >> summarizer >> critic >> editor
result = pipeline.run("https://example.com/article")
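The >> operator can be understood as ordinary function composition. A minimal stand-alone sketch of the idea, using __rshift__ (this is not quark's implementation, just the semantics):

```python
class Step:
    """Wraps a callable so steps can be chained with >>."""
    def __init__(self, fn):
        self.fn = fn

    def run(self, value):
        return self.fn(value)

    def __rshift__(self, other):
        nxt = other if isinstance(other, Step) else Step(other)
        # The output of this step becomes the input of the next.
        return Step(lambda value: nxt.run(self.run(value)))

pipeline = Step(lambda s: s.strip()) >> (lambda s: s.upper()) >> (lambda s: s + "!")
print(pipeline.run("  hello "))  # HELLO!
```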

Parallel fan-out with lists

Wrap steps in a list to run them in parallel. Their outputs are combined and passed to the next step.

fact_checker = Agent(system="Check the summary for factual errors.", name="fact_checker")

pipeline = fetch_article >> summarizer >> [critic, fact_checker] >> editor
result = pipeline.run("https://example.com/article")
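The fan-out semantics can be sketched with concurrent.futures: each branch receives the same input concurrently, and the branch outputs are joined into one value for the next step. This is an illustration of the behavior, not quark's code.

```python
from concurrent.futures import ThreadPoolExecutor

def fan_out(branches, value):
    """Run each branch on the same input in parallel and combine the outputs."""
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(lambda fn: fn(value), branches))
    # Combined output is handed to the next step as one string.
    return "\n\n".join(results)

critic = lambda text: f"Critique of: {text}"
fact_checker = lambda text: f"Fact-check of: {text}"

combined = fan_out([critic, fact_checker], "summary")
print(combined)
```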

Composing workflows

research = fetch_article >> summarizer
review   = [critic, fact_checker] >> editor

pipeline = research >> review
result = pipeline.run("https://example.com/article")

Streaming

for chunk in agent.stream("Tell me a story."):
    print(chunk, end="", flush=True)

Provider-agnostic

# OpenAI
agent = Agent(model="gpt-5.4")

# Anthropic
agent = Agent(model="claude-opus-4-6")

# AWS Bedrock
agent = Agent(model="bedrock/anthropic.claude-3-5-haiku-20241022-v1:0")

# Gemini
agent = Agent(model="gemini/gemini-2.0-flash")

# Ollama (local)
agent = Agent(model="ollama/llama3")

Observability (OpenTelemetry)

Set environment variables — tracing is enabled automatically.

export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
export OTEL_SERVICE_NAME=my-app

Every Agent.run(), Workflow.run(), and tool call emits OTel spans. Compatible with Jaeger, Honeycomb, Grafana Tempo, Datadog, and any OTLP-compatible backend.

API

Agent(*, system, tools, model, max_turns, name)

  • system: system prompt. Default: "You are a helpful assistant."
  • tools: dict of {name: callable}. Default: {}
  • model: any litellm model string. Default: "gpt-5.4"
  • max_turns: max LLM iterations per run() call. Default: 10
  • name: name used in traces and pipeline display. Default: "agent"

Methods:

  • agent.run(user: str) -> str — blocking, returns final answer
  • agent.stream(user: str) -> Generator — yields tokens as they arrive
  • agent.reset() — clears conversation history, keeps system prompt
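The reset() contract (drop conversation history, keep the system prompt) can be illustrated with a minimal stand-in class; this is not quark's actual Agent, just the behavior the method guarantees:

```python
class MiniAgent:
    """Stand-in showing reset(): history goes, system prompt stays."""
    def __init__(self, system):
        self.system = system
        self.history = []

    def remember(self, role, content):
        self.history.append({"role": role, "content": content})

    def reset(self):
        # Clears conversation history; the system prompt is untouched.
        self.history.clear()

agent = MiniAgent(system="You are a helpful assistant.")
agent.remember("user", "hi")
agent.reset()
print(len(agent.history), agent.system)  # 0 You are a helpful assistant.
```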

Workflow

Created automatically by >>. Call .run(input: str) -> str to execute.

workflow = agent_a >> agent_b >> agent_c
result = workflow.run("input")

Tests

# Unit tests only (no API calls)
pytest tests/ -m "not integration"

# Integration tests (requires API credentials)
pytest tests/ -m integration

If using uv, prefix with uv run to ensure the venv's Python is used (avoids conflicts with conda or system Python):

uv run pytest tests/
uv run pytest tests/ -m "not integration"
uv run pytest tests/ -m integration

Why Quark?

Named after the smallest known fundamental particles — quarks need gluons to bind them together. Quark is the minimal binding layer for AI agents.

Download files

Download the file for your platform.

Source Distribution

quark_agents-0.1.2.tar.gz (11.7 kB)

Uploaded Source

Built Distribution


quark_agents-0.1.2-py3-none-any.whl (7.9 kB)

Uploaded Python 3

File details

Details for the file quark_agents-0.1.2.tar.gz.

File metadata

  • Download URL: quark_agents-0.1.2.tar.gz
  • Upload date:
  • Size: 11.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for quark_agents-0.1.2.tar.gz
Algorithm Hash digest
SHA256 85595bc45c81cff69fed49348f2e4c7b3af1607dffb9744fcfd36465918c9919
MD5 17aa665f2c3973643643c22a2635d3eb
BLAKE2b-256 a65ffd40b17cd0eaa7be18e38dfa284fea917bea62038defe5085c0274710c94


Provenance

The following attestation bundles were made for quark_agents-0.1.2.tar.gz:

Publisher: publish.yml on awslabs/quark-agents

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file quark_agents-0.1.2-py3-none-any.whl.

File metadata

  • Download URL: quark_agents-0.1.2-py3-none-any.whl
  • Upload date:
  • Size: 7.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for quark_agents-0.1.2-py3-none-any.whl
Algorithm Hash digest
SHA256 2a1f5abcc885997dbee5dd1db1b0a040d501a14c7fa3354b2c94e6cb9b203ab2
MD5 2fac368246d9b968cc8f66baf75d817a
BLAKE2b-256 e83dc30f92e76053fd05e329f876bb36c4758f2901729ab36a0d97a4cdeb7e8b


Provenance

The following attestation bundles were made for quark_agents-0.1.2-py3-none-any.whl:

Publisher: publish.yml on awslabs/quark-agents

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
