
Minimal Python framework for composing agents, tools, and multi-agent workflows

Project description

Quark

A <300-line Python agentic framework. Define agents with a system prompt and tools, then compose them into pipelines using the >> operator — just like Airflow, but for LLMs. Provider-agnostic via litellm.

Install

# From PyPI (once available)
pip install quark-agents

# From source
git clone https://github.com/awslabs/quark-agents
cd quark-agents
pip install .

# With OpenTelemetry support
pip install "quark-agents[otel]"

Install with uv

git clone https://github.com/awslabs/quark-agents
cd quark-agents
uv venv
source .venv/bin/activate

# Core + dev dependencies (pytest, mkdocs)
uv pip install ".[dev]"

# With OpenTelemetry
uv pip install ".[dev,otel]"

# With AWS Bedrock support
uv pip install ".[dev,bedrock]"

# All extras
uv pip install ".[dev,otel,bedrock]"

Note: Editable installs (-e) require setuptools>=75. If you see ModuleNotFoundError: No module named 'setuptools.backends', make sure pyproject.toml has requires = ["setuptools>=75"] under [build-system], or use a non-editable install (uv pip install ".[dev]" without -e).

Usage

Single agent

from quark import Agent

agent = Agent(
    system="You are a helpful assistant.",
    model="gpt-5.4",  # or any litellm-supported model
    name="assistant",
)

print(agent.run("What is the capital of France?"))

Agent with tools

from quark import Agent

def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"Sunny, 22°C in {city}"

agent = Agent(
    system="You are a weather assistant.",
    model="gpt-5.4",
    tools={"get_weather": get_weather},
)

print(agent.run("What's the weather in Paris?"))
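Under the hood, quark presumably converts each callable into a tool schema for the model's function-calling API. How it does so is not shown here, but a minimal sketch built from the function's type hints and docstring (the `tool_spec` helper below is hypothetical, not part of quark) could look like this:

```python
import inspect
from typing import Callable, get_type_hints

def tool_spec(name: str, fn: Callable) -> dict:
    """Build an OpenAI-style tool spec from a plain function.

    Illustrative only: parameter types are mapped naively, and the first
    line of the docstring becomes the tool description.
    """
    hints = get_type_hints(fn)
    hints.pop("return", None)
    type_map = {str: "string", int: "integer", float: "number", bool: "boolean"}
    params = {
        pname: {"type": type_map.get(hints.get(pname, str), "string")}
        for pname in inspect.signature(fn).parameters
    }
    doc_lines = (fn.__doc__ or "").strip().splitlines()
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": doc_lines[0] if doc_lines else "",
            "parameters": {
                "type": "object",
                "properties": params,
                "required": list(params),
            },
        },
    }

def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"Sunny, 22°C in {city}"

spec = tool_spec("get_weather", get_weather)
```

Anything beyond the basic scalar types would need a richer mapping, but this is enough to see why plain type-hinted functions with docstrings make good tools.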

Pipelines with >>

Chain agents and plain functions using >>. Output of each step becomes input to the next.

from quark import Agent

def fetch_article(url: str) -> str:
    """Fetch article content from a URL."""
    return "..."  # your fetch logic

summarizer = Agent(system="Summarize the article in 3 bullet points.", name="summarizer")
critic     = Agent(system="List 2 weaknesses in this summary.", name="critic")
editor     = Agent(system="Write a final improved summary given the feedback.", name="editor")

pipeline = fetch_article >> summarizer >> critic >> editor
result = pipeline.run("https://example.com/article")
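The `>>` chaining can be sketched in plain Python. The `Step` and `Pipeline` classes below are illustrative stand-ins, not quark's actual implementation: `__rshift__` builds a pipeline, and `__rrshift__` is what lets a plain function like `fetch_article` appear on the left-hand side of `>>`:

```python
from typing import Callable

class Step:
    """Wraps a callable so it supports >> chaining (illustrative sketch)."""
    def __init__(self, fn: Callable[[str], str]):
        self.fn = fn

    def run(self, text: str) -> str:
        return self.fn(text)

    def __rshift__(self, other):   # self >> other
        return Pipeline([self, _as_step(other)])

    def __rrshift__(self, other):  # plain_function >> self
        return Pipeline([_as_step(other), self])

class Pipeline(Step):
    def __init__(self, steps):
        self.steps = steps

    def run(self, text: str) -> str:
        # Each step's output becomes the next step's input.
        for step in self.steps:
            text = step.run(text)
        return text

    def __rshift__(self, other):
        return Pipeline(self.steps + [_as_step(other)])

def _as_step(obj) -> Step:
    return obj if isinstance(obj, Step) else Step(obj)

# Plain functions standing in for agents:
upper = Step(str.upper)
pipeline = (lambda s: s + "!") >> upper  # Step.__rrshift__ handles the lambda
result = pipeline.run("hi")              # "HI!"
```

Because `__rshift__` on a `Pipeline` extends the step list rather than nesting, composed workflows stay flat, which is the same trick Airflow uses for `task_a >> task_b` DAG edges.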

Parallel fan-out with lists

Wrap steps in a list to run them in parallel. Their outputs are combined and passed to the next step.

fact_checker = Agent(system="Verify the summary's factual claims.", name="fact_checker")

pipeline = fetch_article >> summarizer >> [critic, fact_checker] >> editor
result = pipeline.run("https://example.com/article")
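How the parallel branch executes and combines its outputs is an implementation detail of the framework; one plausible sketch, assuming the branch outputs are joined as text before being handed to the next step, uses a thread pool (the `fan_out` helper below is hypothetical):

```python
from concurrent.futures import ThreadPoolExecutor

def fan_out(steps, text: str, sep: str = "\n\n") -> str:
    """Run independent steps on the same input in parallel and join their
    outputs in order (a sketch; how quark combines outputs is an assumption)."""
    with ThreadPoolExecutor(max_workers=len(steps)) as pool:
        results = list(pool.map(lambda fn: fn(text), steps))
    return sep.join(results)

# Plain functions standing in for the critic and fact_checker agents:
combined = fan_out([str.upper, str.lower], "Hello")
```

`pool.map` preserves the order of `steps`, so the combined output is deterministic even though the branches run concurrently.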

Composing workflows

research = fetch_article >> summarizer
review   = [critic, fact_checker] >> editor

pipeline = research >> review
result = pipeline.run("https://example.com/article")

Streaming

for chunk in agent.stream("Tell me a story."):
    print(chunk, end="", flush=True)
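If you need the full reply as well as live output, accumulate the chunks as you print them. The sketch below uses a stand-in generator (`fake_stream`) in place of `agent.stream()`:

```python
def fake_stream(text: str, chunk_size: int = 4):
    """Stand-in for agent.stream(): yields the reply in small chunks."""
    for i in range(0, len(text), chunk_size):
        yield text[i:i + chunk_size]

chunks = []
for chunk in fake_stream("Once upon a time..."):
    chunks.append(chunk)   # in real use: print(chunk, end="", flush=True)
full = "".join(chunks)     # the complete reply, reassembled
```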

Provider-agnostic

# OpenAI
agent = Agent(model="gpt-5.4")

# Anthropic
agent = Agent(model="claude-opus-4-6")

# AWS Bedrock
agent = Agent(model="bedrock/anthropic.claude-3-5-haiku-20241022-v1:0")

# Gemini
agent = Agent(model="gemini/gemini-2.0-flash")

# Ollama (local)
agent = Agent(model="ollama/llama3")

Observability (OpenTelemetry)

Set environment variables — tracing is enabled automatically.

export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
export OTEL_SERVICE_NAME=my-app

Every Agent.run(), Workflow.run(), and tool call emits OTel spans. Compatible with Jaeger, Honeycomb, Grafana Tempo, Datadog, and any OTLP-compatible backend.
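The resulting span hierarchy (workflow, then agent, then tool) can be illustrated with a toy recorder. The `span` context manager below is a stand-in for real OTel spans, not quark's actual instrumentation:

```python
from contextlib import contextmanager

SPANS = []   # collected (name, depth) pairs; a stand-in for an OTel exporter
_depth = 0

@contextmanager
def span(name: str):
    """Minimal stand-in for an OTel span, recording name and nesting depth."""
    global _depth
    SPANS.append((name, _depth))
    _depth += 1
    try:
        yield
    finally:
        _depth -= 1

def run_agent(name, tool_calls):
    with span(f"agent.run:{name}"):
        for tool in tool_calls:
            with span(f"tool:{tool}"):
                pass  # the tool would execute here

def run_workflow(agents):
    with span("workflow.run"):
        for name, tools in agents:
            run_agent(name, tools)

run_workflow([("summarizer", []), ("critic", ["get_weather"])])
```

A real backend like Jaeger would render this as one workflow trace containing an agent span per step, with tool calls nested under the agent that made them.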

API

Agent(*, system, tools, model, max_turns, name)

Parameters:

  • system (default "You are a helpful assistant."): system prompt
  • tools (default {}): dict of {name: callable}
  • model (default "gpt-5.4"): any litellm model string
  • max_turns (default 10): max LLM iterations per run() call
  • name (default "agent"): name used in traces and pipeline display

Methods:

  • agent.run(user: str) -> str — blocking, returns final answer
  • agent.stream(user: str) -> Generator — yields tokens as they arrive
  • agent.reset() — clears conversation history, keeps system prompt
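The interplay of `tools` and `max_turns` can be sketched as a simple loop: call the model, execute any tool it requests, feed the result back, and stop after `max_turns` iterations. `agent_loop` and `fake_llm` below are hypothetical stand-ins for quark's internals and litellm, respectively:

```python
def agent_loop(llm, tools, user: str, max_turns: int = 10) -> str:
    """Sketch of a tool-calling loop bounded by max_turns.

    `llm` stands in for a litellm call: given the message list it returns
    either ("tool", name, arg) or ("final", text).
    """
    messages = [{"role": "user", "content": user}]
    for _ in range(max_turns):
        kind, *rest = llm(messages)
        if kind == "final":
            return rest[0]
        name, arg = rest
        result = tools[name](arg)          # dispatch to the registered callable
        messages.append({"role": "tool", "name": name, "content": result})
    raise RuntimeError("max_turns exceeded")

# A scripted fake LLM: first asks for the weather tool, then answers.
def fake_llm(messages):
    if any(m["role"] == "tool" for m in messages):
        return ("final", f"It is {messages[-1]['content']}.")
    return ("tool", "get_weather", "Paris")

answer = agent_loop(
    fake_llm,
    {"get_weather": lambda city: f"sunny in {city}"},
    "Weather in Paris?",
)
```

The `max_turns` bound is what keeps a confused model from looping on tool calls forever; hitting it is treated as an error rather than silently returning a partial answer.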

Workflow

Created automatically by >>. Call .run(input: str) -> str to execute.

workflow = agent_a >> agent_b >> agent_c
result = workflow.run("input")

Tests

# Unit tests (no API calls)
pytest tests/

# Integration tests (requires API credentials)
pytest tests/ -m integration

If using uv, prefix with uv run to ensure the venv's Python is used (avoids conflicts with conda or system Python):

uv run pytest tests/
uv run pytest tests/ -m "not integration"
uv run pytest tests/ -m integration

Why Quark?

Named after the smallest known fundamental particles — quarks need gluons to bind them together. Quark is the minimal binding layer for AI agents.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

quark_agents-0.1.0.tar.gz (11.5 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

quark_agents-0.1.0-py3-none-any.whl (7.9 kB)

Uploaded Python 3

File details

Details for the file quark_agents-0.1.0.tar.gz.

File metadata

  • Download URL: quark_agents-0.1.0.tar.gz
  • Upload date:
  • Size: 11.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for quark_agents-0.1.0.tar.gz:

  • SHA256: a8ddd8f4603e060f65bfd01a17ac963a90f288e69a945eebd72f32ffb5dd1b4d
  • MD5: b6b04b39adfb4bbf6a5979d5fae66614
  • BLAKE2b-256: a45a4af5e40b115a110c01a4fc6a3acf1e238bff9f170552f05180ce77d46062

See more details on using hashes here.

Provenance

The following attestation bundles were made for quark_agents-0.1.0.tar.gz:

Publisher: publish.yml on awslabs/quark-agents

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file quark_agents-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: quark_agents-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 7.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for quark_agents-0.1.0-py3-none-any.whl:

  • SHA256: a4769e8b5f90d608481b3b0db065db86ac508ae51eb1915bb32c19b11ee7923a
  • MD5: 0e368e8c49088bbbd0822a0b9ffd3c23
  • BLAKE2b-256: f4a5a2b013b54b45a4243513757cc35d04a2cdb9bceca8822df6fd3098aa1f73

See more details on using hashes here.

Provenance

The following attestation bundles were made for quark_agents-0.1.0-py3-none-any.whl:

Publisher: publish.yml on awslabs/quark-agents

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
