
A very opinionated package for creating agents

Project description

Schwarm

An Opinionated Agent Framework inspired by OpenAI's swarm.

Incredibly simple yet amazingly deep, it literally wrote parts of itself.

Because I'm creative, I called it Schwarm, the German word for swarm.

(THIS IS AN ALPHA - Seriously, this is an alpha in the literal sense. It is still a playground for personal agent-based PoCs of all kinds. The alpha will end in 2025 Q1 with the release of a web UI for Schwarm)

Features

  • Extend the capabilities of your agents with "Providers"

    • A zep_provider integrates zep (https://github.com/getzep/zep) into an agent, giving it near-infinite memory with the help of knowledge graphs.
    • An api_provider gives your agent its own API.
    • Use hundreds of LLM providers (via LiteLLM)
    • Budget tracking
    • Token tracking
    • Caching
    • And many more...
  • Don't waste your time designing your agent state machine in so much detail that you accidentally end up building an ordinary static service

    I will never understand why people use agents just to remove everything agentic from them with over-engineered graphs. Not on my watch.

    • Let your agents be agents!
    • Give them dynamic instructions.
    • Give them dynamic functions/tools.
    • Let them figure out the rest.
  • Extensive logging and visualization

    • Tell your agents to wait for your approval after every step.
    • Log everything that happens, presented in an actually readable and interpretable way.
    • A modern web UI is coming soon...
  • Lightweight with no overhead

    • Agents are not live objects sitting in memory, calling each other and being all happy while idling on your VRAM.
    • Nope. It's basically just one agent switching configurations every time it's called (sketched below).
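
    Conceptually, it works roughly like the sketch below. This is not Schwarm's actual code, just an illustration of what "switching configurations" means: the runner re-resolves the active agent's instructions and tools on every turn instead of keeping a crowd of live agent objects around.

    # Conceptual sketch only -- not Schwarm's real internals.
    from dataclasses import dataclass, field
    from typing import Callable

    @dataclass
    class AgentConfig:
        name: str
        instructions: Callable[[dict], str]            # rebuilt on every turn
        functions: list = field(default_factory=list)

    def run_turn(active: AgentConfig, context: dict) -> AgentConfig:
        prompt = active.instructions(context)          # dynamic instructions resolved now
        # ... call the LLM with `prompt` and `active.functions` here ...
        return active                                  # a handoff would simply return a different config

    writer = AgentConfig("writer", lambda ctx: "Write the next chapter.")
    run_turn(writer, {})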

Quickstart

  1. Install Schwarm:

    pip install schwarm
    
  2. Export your OpenAI API key:

    export OPENAI_API_KEY=sk-xxx
    
  3. Create your agent

    stephen_king_agent = Agent(name="mr_stephen_king", provider_config=LiteLLMConfig(enable_cache=True))
    

    Mr. Stephen King is ready to rock! And has his cache with him! All in one line!

    (Caching means that every message interaction will be cached, so if you send the exact same prompt to the LLM again, you receive the cached answer instead of newly generated text. Saves money and makes debugging easier!)
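
    (The one-liner above, and the snippets in the next steps, also rely on a few things this quickstart never spells out: the Schwarm imports, a Zep client, a user_id, and a MIN_FACT_RATING threshold. Here is a minimal setup sketch; the exact schwarm and zep-cloud import paths are assumptions, so check the packages for the real module layout.)

    # Assumed imports: the module paths are a guess, adjust them to the actual packages.
    from schwarm import Agent, ContextVariables, LiteLLMConfig, Result, Schwarm
    from zep_cloud.client import Zep

    zep = Zep(api_key="z_xxx")   # Zep Cloud client used as long-term memory below
    user_id = "user_agent"       # session/user id reused by the later snippets
    MIN_FACT_RATING = 0.3        # only surface facts Zep rates at least this relevant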

  4. How can I help you?

    Tell it what to do with dynamic instructions that can change every time it's the agent's turn again, and carry objects and other data from agent to agent and step to step with the help of context_variables.

    def instruction_stephen_king_agent(context_variables: ContextVariables) -> str:
        """Return the instructions for the Stephen King agent."""
        instruction = """
        You are one of the best authors in the world. You are tasked to write your newest story.
        Execute "write_batch" to write something down to paper.
        Execute "remember_things" to remember things you aren't sure about or to check if something is at odds with previously established facts.
        """
        if "book" in context_variables:
            book = context_variables["book"]
        addendum = f"\n\nYour current story has this many words right now (goal: 10,000): {len(book) / 8}"  # highly accurate and performant word counting algorithm
            memory = zep.memory.get("user_agent", min_rating=MIN_FACT_RATING)
            facts = f"\n\nRelevant facts about the story so far:\n{memory.relevant_facts}"
            instruction += addendum + facts
        return instruction
    
    stephen_king_agent.instructions = instruction_stephen_king_agent
    
  5. The toolbox

    Give your agent skills it wouldn’t have otherwise! Also, pass the baton to other agents by setting them in the agent property of the Result object. Just not in this example... Mr. King works alone!

    With handoffs like this you can implement every state graph you could also build with LangGraph. But this way you keep your sanity.
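
    For reference, a handoff would look roughly like this; editor_agent is a hypothetical second Agent that is not part of this example:

    def hand_over_to_editor(context_variables: ContextVariables) -> Result:
        """Hand the story over for editing."""
        # Returning a different agent in the Result makes it take the next turn.
        return Result(
            value="First draft is done, over to the editor.",
            context_variables=context_variables,
            agent=editor_agent,  # hypothetical second Agent, assumed to be defined elsewhere
        )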

    def write_batch(context_variables: ContextVariables, text: str) -> Result:
        """Write down your story."""
        # split_text: a helper (not shown here) that chunks the text into Zep messages
        zep.memory.add(session_id="user_agent", messages=split_text(text))
        if "book" not in context_variables:
            context_variables["book"] = ""
        context_variables["book"] += text
        return Result(value=f"{text}", context_variables=context_variables, agent=stephen_king_agent)
    
    def remember_things(context_variables: ContextVariables, what_you_want_to_remember: str) -> Result:
        """If you aren’t sure about something that happened in the story, use this tool to remember it."""
        response = zep.memory.search_sessions(
            text=what_you_want_to_remember,
            user_id=user_id,
            search_scope="facts",
            min_fact_rating=MIN_FACT_RATING,
        )
        result = ""
        if response.results:
            for res in response.results:
                result += f"\n{res.fact}"
        return Result(value=f"{result}", context_variables=context_variables, agent=stephen_king_agent)
    
    stephen_king_agent.functions = [write_batch, remember_things]
    

    (Based on the function name, the parameter names and types, and the docstring, a valid OpenAI function spec JSON is generated, so this will only work if your model understands that format. Support for other tool specs is coming!)
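
    For example, remember_things above would turn into a tool spec shaped roughly like this (illustrative only, not Schwarm's exact output; context_variables is assumed to be injected by the framework and therefore left out of the spec):

    remember_things_spec = {
        "name": "remember_things",
        "description": "If you aren't sure about something that happened in the story, use this tool to remember it.",
        "parameters": {
            "type": "object",
            "properties": {
                "what_you_want_to_remember": {"type": "string"},
            },
            "required": ["what_you_want_to_remember"],
        },
    }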

  6. Kick off!

    input = """
    Write a story set in the SCP universe. It should follow a group of personnel from the SCP Foundation and the adventures their work provides.
    The story should be around 10,000 words long, and should be a mix of horror and science fiction.
    Start by creating an outline for the story, and then write the first chapter.
    """
    
    response = Schwarm().quickstart(stephen_king_agent, input)
    

    Let your agent system loose! Don't worry about losing all your money: with this quickstart configuration, the agent system will ask for your approval before performing any money-consuming task.

    (Currently all interaction happens over the terminal. A slick web app is coming!)

Examples

tbd.

Upcoming

  • more examples and apps
  • real documentation
  • a web UI
  • an extensive arsenal of providers

