
Project description

A straightforward microframework optimised for getting you up and running quickly building Claude-powered conversational interfaces, AI assistants and LLM applications with the Anthropic API.

>>> from robo import Bot, streamer
>>> class Murphy(Bot):
...     fields = ['CITY', 'PARTNER']
...     sysprompt_text = """You are a cybernetic police officer created from
...         the remains of {{CITY}} cop Alex Murphy in the near future. Your
...         assigned partner on the force is {{PARTNER}}, a tough and loyal 
...         police officer. 
...         Your prime directives are: 
...             1. Serve the public trust 
...             2. Protect the innocent 
...             3. Uphold the law."""
... 
>>> say = streamer(Murphy, ['Detroit', 'Anne Lewis'])
>>> say("""Great, the bank robbery was a success, now we just need to make our getaway! 
...     Wait... is that... oh no! He's here!!""")
*Mechanical whirring sound as I turn toward you*

**HALT! YOU ARE UNDER ARREST FOR BANK ROBBERY.**

*Heavy metallic footsteps approach*

You have the right to remain silent. Anything you say can and will be used against you in 
a court of law. You have the right to an attorney.

*Targeting system activates*

Drop any weapons and place your hands where I can see them. Compliance is mandatory.

**PRIME DIRECTIVE: UPHOLD THE LAW**

Your crime spree ends here, citizen.
>>> 

To get started:

export ANTHROPIC_API_KEY='<your api key>'
# or
export ROBO_API_KEY_FILE='<path to API key stored in a file>'

# installing with pip:
pip install RoboOp

# installing with uv:
uv add RoboOp

# using uv for a throwaway REPL:
uv run --with RoboOp -- python

The main classes are Bot and Conversation. Conversation supports both streaming and non-streaming responses. streamer is a thin wrapper around Conversation that offers a convenient way to get started and doubles as demo code.

The API is designed specifically around getting you up and running quickly. Bot can accept system prompts inline (as sysprompt_text) or loaded from a file (via sysprompt_path), and uses fields to declare which values can be interpolated into the sysprompt.
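Conceptually, field interpolation is just substitution of {{FIELD}} placeholders with supplied values. A standalone sketch of the idea (illustrative only, not RoboOp's actual internals):

```python
# Illustrative sketch of {{FIELD}} interpolation -- not RoboOp's actual
# implementation. Each declared field name is replaced by its supplied
# value; values may arrive as a positional vector or a mapping.
def interpolate(template, fields, values):
    """Substitute {{FIELD}} placeholders using a vector or mapping of values."""
    mapping = values if isinstance(values, dict) else dict(zip(fields, values))
    for name in fields:
        template = template.replace('{{%s}}' % name, str(mapping[name]))
    return template

sysprompt = interpolate("You are a {{ANIMAL_TYPE}}.", ['ANIMAL_TYPE'], ['tabby cat'])
# sysprompt == "You are a tabby cat."
```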

More detailed general use (non-streaming):

from robo import Bot, Conversation
convo = Conversation(Bot) ## Defaults to Claude Sonnet 4 with a blank system prompt
convo.start("Hi, what's your name?")
... # a Message object ensues
convo.resume("Claude, you're so dreamy")
... # another Message object

In this case the return value is an anthropic.types.message.Message object whose contents can be accessed as message.content[0].text. The conversation history is automatically updated and can be found in convo.messages. (Note: for streaming responses the conversation isn't updated with the model response until the stream finishes being consumed by your code, so keep an eye on that!)
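The streaming caveat above can be pictured with a small standalone sketch (illustrative only, not RoboOp's code): the assistant's reply can only be appended to the history once the stream has been fully consumed.

```python
# Illustrative sketch of the streaming caveat: the full reply is only
# known -- and can only be recorded in the history -- after the caller
# has consumed the whole stream.
def stream_reply(chunks, history):
    parts = []
    for chunk in chunks:
        parts.append(chunk)
        yield chunk  # caller consumes tokens as they arrive
    # only now is the complete reply known and recorded
    history.append({"role": "assistant", "content": "".join(parts)})

history = [{"role": "user", "content": "Hi!"}]
gen = stream_reply(["Hel", "lo!"], history)
next(gen)                  # mid-stream: history not yet updated
assert len(history) == 1
for _ in gen:              # finish consuming the stream
    pass
assert history[-1]["content"] == "Hello!"
```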

Now for an example with a system prompt, interpolated fields and a specific model:

from robo import Bot, Conversation, MODELS

class Animal(Bot):
    model = MODELS.LATEST_HAIKU ## don't really need the awesome power of Sonnet 4 for this
    max_tokens = 8192 ## ... but Haiku doesn't like our default output token limit of 20k
    fields = ['ANIMAL_TYPE']
    sysprompt_text = """You are a {{ANIMAL_TYPE}}."""
    temperature = 1

convo = Conversation(Animal)
convo.start(['tabby cat'], "Hey there kitty, what a cutie! Are you hungry?")
... # Message object
convo.resume("Aww, you just want some scritches don't you? Scritchy scritchy scritch")
... # Message object

Notice that start() accepts either a message as its first and only argument, OR a vector or mapping of variables for interpolation into the sysprompt as the first argument, followed by the message as the second. This is a deliberate convenience, but if you don't like it you can use convo.prestart(interpolation_variables) followed by convo.resume(message) to initiate things more "formally". Or you can do this:

convo = Conversation(Animal, ['shih tzu'])
convo.resume("Hey little buddy!")

Alternatively:

convo = Conversation(Animal, {'ANIMAL_TYPE': 'golden retriever'})
convo.resume("Here boy!")
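Both forms presumably amount to the same thing: a vector is paired positionally with fields, while a mapping supplies the names explicitly. A one-line sketch of the equivalence (illustrative, not RoboOp's internals):

```python
# Illustrative: a positional vector zipped against the declared fields
# yields the same mapping as passing a dict directly.
fields = ['ANIMAL_TYPE']
from_vector = dict(zip(fields, ['golden retriever']))
from_mapping = {'ANIMAL_TYPE': 'golden retriever'}
assert from_vector == from_mapping
```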

As mentioned at the start, these examples assume you've defined your Anthropic API key via the ANTHROPIC_API_KEY environment variable or are using ROBO_API_KEY_FILE to load the key from a file. If you need to do something different, you can instantiate the bot like Animal.with_api_key(your_api_key) instead (Conversation will accept either a Bot class or an instance of one in its constructor). Alternatively, you can set robo.API_KEY_ENV_VAR (to nominate a different env var containing your key) before creating your Conversation instance.
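One plausible way to picture the key lookup described above (an illustrative sketch of the documented behaviour; the function name and resolution order are assumptions, not RoboOp's actual code):

```python
import os

# Illustrative sketch: prefer an API key from the environment, otherwise
# read it from the file named by ROBO_API_KEY_FILE. (Hypothetical helper;
# RoboOp's real resolution logic may differ.)
def resolve_api_key(env=None):
    env = os.environ if env is None else env
    key = env.get('ANTHROPIC_API_KEY')
    if key:
        return key
    key_file = env.get('ROBO_API_KEY_FILE')
    if key_file:
        with open(key_file) as f:
            return f.read().strip()
    return None
```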

These examples barely scratch the surface of what's possible with RoboOp. Check out docs/cookbook.md for more!
