Claudette is Claude's friend
claudette
Claudette is a wrapper for Anthropic’s Python SDK.
TODO: This README is incomplete.
Install
pip install claudette
Getting started
Anthropic’s Python SDK will automatically be installed with Claudette, if you don’t already have it.
You’ll need to set the ANTHROPIC_API_KEY environment variable to the key provided by Anthropic.
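One way to do that from Python itself (the key value shown is a placeholder, not a real key):

```python
import os

# Set the key for the current process only; a real key from the
# Anthropic console would go here (placeholder shown).
os.environ['ANTHROPIC_API_KEY'] = 'sk-ant-...'
```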
import os
# os.environ['ANTHROPIC_LOG'] = 'debug'
To print every HTTP request and response in full, uncomment the above line.
from claudette import *
Claudette only exports the symbols that are needed to use the library, so you can use import * to import them. Alternatively, just use:
import claudette
…and then add the prefix claudette. to any usages of the module.
models
('claude-3-opus-20240229',
'claude-3-sonnet-20240229',
'claude-3-haiku-20240307')
These are the models currently available from the SDK.
model = models[-1]
For examples, we’ll use Haiku, since it’s fast and cheap (and surprisingly good!)
Chat
Chat
Chat (model:Optional[str]=None, cli:Optional[claudette.core.Client]=None, sp='', tools:Optional[list]=None)
Anthropic chat client.
| | Type | Default | Details |
|---|---|---|---|
| model | Optional | None | Model to use (leave empty if passing `cli`) |
| cli | Optional | None | Client to use (leave empty if passing `model`) |
| sp | str | '' | Optional system prompt |
| tools | Optional | None | List of tools to make available to Claude |
chat = Chat(model, sp="You are a helpful assistant.")
Chat.__call__
Chat.__call__ (pr, sp='', temp=0, maxtok=4096, stop:Optional[list[str]]=None, ns:Optional[collections.abc.Mapping]=None, prefill='', **kw)
Add prompt pr to dialog and get a response from Claude.
| | Type | Default | Details |
|---|---|---|---|
| pr | | | Prompt / message |
| sp | str | '' | The system prompt |
| temp | int | 0 | Temperature |
| maxtok | int | 4096 | Maximum tokens |
| stop | Optional | None | Stop sequences |
| ns | Optional | None | Namespace to search for tools, defaults to `globals()` |
| prefill | str | '' | Optional prefill to pass to Claude as start of its response |
| kw | | | |
chat("I'm Jeremy")
contents(chat("What's my name?"))
'Your name is Jeremy, as you told me earlier.'
Claude supports adding an extra assistant message at the end, which contains the prefill – i.e. the text we want Claude to assume the response starts with. Let’s try it out:
q = "Concisely, what is the meaning of life?"
pref = 'According to Douglas Adams,'
chat(q, prefill=pref)
According to Douglas Adams, “The answer to the ultimate question of life, the universe, and everything is 42.”
- id: msg_011BL35YKAgwg8UR7nKjM1p2
- content: [{‘text’: ‘According to Douglas Adams, “The answer to the ultimate question of life, the universe, and everything is 42.”’, ‘type’: ‘text’}]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: end_turn
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 109, ‘output_tokens’: 23}
Chat.stream
Chat.stream (pr, sp='', temp=0, maxtok=4096, stop:Optional[list[str]]=None, prefill='', **kw)
Add prompt pr to dialog and stream the response from Claude.
| | Type | Default | Details |
|---|---|---|---|
| pr | | | Prompt / message |
| sp | str | '' | The system prompt |
| temp | int | 0 | Temperature |
| maxtok | int | 4096 | Maximum tokens |
| stop | Optional | None | Stop sequences |
| prefill | str | '' | Optional prefill to pass to Claude as start of its response |
| kw | | | |
for o in chat.stream("And what is the question?"): print(o, end='')
Unfortunately, the book never explicitly states what the "ultimate question" is that corresponds to the answer of 42. That remains a mystery in the Hitchhiker's Guide to the Galaxy series. The meaning of life is left open to interpretation.
Tool use
sp = "If asked to add things up, use the `sums` function instead of doing it yourself. Never mention what tools you use."
We automagically get streamlined tool use as well:
pr = f"What is {a}+{b}?"
pr
'What is 604542+6458932?'
chat = Chat(model, sp=sp, tools=[sums])
r = chat(pr)
r
ToolUseBlock(id=‘toolu_018m6yuZwQtn7xZozny37CrZ’, input={‘a’: 604542, ‘b’: 6458932}, name=‘sums’, type=‘tool_use’)
- id: msg_01MSiGKYedwdpr41VciqydB7
- content: [{‘id’: ‘toolu_018m6yuZwQtn7xZozny37CrZ’, ‘input’: {‘a’: 604542, ‘b’: 6458932}, ‘name’: ‘sums’, ‘type’: ‘tool_use’}]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: tool_use
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 418, ‘output_tokens’: 72}
chat(r)
The sum of 604542 and 6458932 is 7063474.
- id: msg_016NBFCx5L3HMvY5kwVDdjDE
- content: [{‘text’: ‘The sum of 604542 and 6458932 is 7063474.’, ‘type’: ‘text’}]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: end_turn
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 505, ‘output_tokens’: 23}
It should be correct, because it actually used our Python function to do the addition. Let’s check:
a+b
7063474
Images
Claude can handle image data as well. As everyone knows, when testing image APIs you have to use a cute puppy.
# Image is Cute_dog.jpg from Wikimedia
fn = Path('puppy.jpg')
display.Image(filename=fn, width=200)
img = fn.read_bytes()
img_msg
img_msg (data:bytes)
Convert image data into an encoded dict.
Anthropic have documented the particular dict structure that they expect image data to be in, so we have a little function to create that for us.
text_msg
text_msg (s:str)
Convert s to a text message.
A Claude message can be a list of image and text parts. So we’ve also created a helper for making the text parts.
q = "In brief, what color flowers are in this image?"
msg = mk_msg([img_msg(img), text_msg(q)])
c([msg])
The image contains purple and yellow daisy-like flowers, which appear to be daisies or a similar type of flower.
- id: msg_01GSzzitXbvkzEJtfJquzSXE
- content: [{‘text’: ‘The image contains purple and yellow daisy-like flowers, which appear to be daisies or a similar type of flower.’, ‘type’: ‘text’}]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: end_turn
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 1665, ‘output_tokens’: 29}
There’s no need to manually choose the type of message, since we figure that out from the type of the source data.
_mk_content('Hi')
{'type': 'text', 'text': 'Hi'}
c([[img, q]])
The image contains purple and yellow daisy-like flowers, which appear to be daisies or a similar type of flower.
- id: msg_01ArrMvaZoXa1JTjULMentQJ
- content: [{‘text’: ‘The image contains purple and yellow daisy-like flowers, which appear to be daisies or a similar type of flower.’, ‘type’: ‘text’}]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: end_turn
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 1665, ‘output_tokens’: 29}
Claude also supports uploading an image without any text, in which case it’ll make a general comment about what it sees. You can then use Chat to ask questions:
chat = Chat(model, sp=sp)
chat(img)
The image shows a cute puppy, likely a Cavalier King Charles Spaniel, sitting in a grassy area surrounded by purple daisy flowers. The puppy has a friendly, curious expression on its face as it gazes directly at the camera. The contrast between the puppy’s soft, fluffy fur and the vibrant flowers creates a charming and picturesque scene.
- id: msg_01535kuKhiN6Do5PTcTmTst7
- content: [{‘text’: “The image shows a cute puppy, likely a Cavalier King Charles Spaniel, sitting in a grassy area surrounded by purple daisy flowers. The puppy has a friendly, curious expression on its face as it gazes directly at the camera. The contrast between the puppy’s soft, fluffy fur and the vibrant flowers creates a charming and picturesque scene.”, ‘type’: ‘text’}]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: end_turn
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 1681, ‘output_tokens’: 83}
chat('What direction is the puppy facing?')
The puppy in the image is facing the camera directly, looking straight ahead with a curious expression.
- id: msg_01Ge4M4Z4J6ywg9V8cCXy2aN
- content: [{‘text’: ‘The puppy in the image is facing the camera directly, looking straight ahead with a curious expression.’, ‘type’: ‘text’}]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: end_turn
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 1775, ‘output_tokens’: 23}
chat('What color is it?')
The puppy in the image has a combination of colors - it has a white and brown/tan coat. The head and ears appear to be a reddish-brown color, while the body is mostly white with some tan/brown patches.
- id: msg_01JbUH6MvqWMvkF8UJVjo33z
- content: [{‘text’: ‘The puppy in the image has a combination of colors - it has a white and brown/tan coat. The head and ears appear to be a reddish-brown color, while the body is mostly white with some tan/brown patches.’, ‘type’: ‘text’}]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: end_turn
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 1806, ‘output_tokens’: 53}
mk_msg
mk_msg (content, role='user', **kw)
Helper to create a dict appropriate for a Claude message. kw are added as key/value pairs to the message.
| | Type | Default | Details |
|---|---|---|---|
| content | | | A string, list, or dict containing the contents of the message |
| role | str | user | Must be ‘user’ or ‘assistant’ |
| kw | | | |
mk_msgs
mk_msgs (msgs:list, **kw)
Helper to set ‘assistant’ role on alternate messages.
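The idea can be sketched like this (a hypothetical re-implementation for illustration, not claudette’s actual code):

```python
# Hypothetical sketch: assign 'user'/'assistant' roles to alternating
# messages, as mk_msgs does for a list of plain strings.
def alternate_roles(msgs):
    return [{'role': ('user', 'assistant')[i % 2], 'content': m}
            for i, m in enumerate(msgs)]

alternate_roles(['Hi', 'Hello!', 'How are you?'])
# roles come out as: user, assistant, user
```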
Client
Client (model, cli=None)
Basic Anthropic messages client.
Client.__call__
Client.__call__ (msgs:list, sp='', temp=0, maxtok=4096, stop:Optional[list[str]]=None, **kw)
Make a call to Claude without streaming.
| | Type | Default | Details |
|---|---|---|---|
| msgs | list | | List of messages in the dialog |
| sp | str | '' | The system prompt |
| temp | int | 0 | Temperature |
| maxtok | int | 4096 | Maximum tokens |
| stop | Optional | None | Stop sequences |
| kw | | | |
Defining __call__ lets us use an object like a function (i.e. it’s callable). We use it as a small wrapper over messages.create.
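To see what “callable object” means in isolation, here’s a toy class (unrelated to the Anthropic SDK) that defines __call__:

```python
# A toy class: defining __call__ makes instances usable like functions.
class Greeter:
    def __init__(self, name): self.name = name
    def __call__(self, msg): return f"{self.name}: {msg}"

g = Greeter('Claude')
g('Hi')  # 'Claude: Hi'
```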
c('Hi')
Hello! How can I assist you today?
- id: msg_01Vr6t6QdodntSMvHthnRDBc
- content: [{‘text’: ‘Hello! How can I assist you today?’, ‘type’: ‘text’}]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: end_turn
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 8, ‘output_tokens’: 12}
c.use
In: 18; Out: 64; Total: 82
Client.stream
Client.stream (msgs:list, sp='', temp=0, maxtok=4096, stop:Optional[list[str]]=None, **kw)
Make a call to Claude, streaming the result.
| | Type | Default | Details |
|---|---|---|---|
| msgs | list | | List of messages in the dialog |
| sp | str | '' | The system prompt |
| temp | int | 0 | Temperature |
| maxtok | int | 4096 | Maximum tokens |
| stop | Optional | None | Stop sequences |
| kw | | | |
We also define a wrapper over messages.stream, which is like messages.create, but streams the response back incrementally.
for o in c.stream('Hi'): print(o, end='')
Hello! How can I assist you today?
Tool use
Tool use lets Claude use external tools.
We’ll use docments to make defining Python functions as ergonomic as possible. Each parameter (and the return value) should have a type, and a docments comment with the description of what it is. As an example we’ll write a simple function that adds numbers together:
def sums(
    a:int,  # First thing to sum
    b:int=1 # Second thing to sum
) -> int:   # The sum of the inputs
    "Adds a + b."
    return a + b
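Before handing the function to Claude, we can sanity-check it directly (the definition is repeated here so the snippet stands alone):

```python
def sums(
    a:int,  # First thing to sum
    b:int=1 # Second thing to sum
) -> int:   # The sum of the inputs
    "Adds a + b."
    return a + b

sums(604542, 6458932)  # 7063474
```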
get_schema
get_schema (f:callable)
Convert function f into a JSON schema dict for tool use.
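For reference, the kind of dict this produces looks roughly like the following – a hand-written sketch of Anthropic’s documented tool-schema format; get_schema’s exact output may differ in detail:

```python
# Sketch of Anthropic's tool schema shape for the `sums` function above
# (field names per Anthropic's tool-use docs; illustrative only).
schema = {
    'name': 'sums',
    'description': 'Adds a + b.',
    'input_schema': {
        'type': 'object',
        'properties': {
            'a': {'type': 'integer', 'description': 'First thing to sum'},
            'b': {'type': 'integer', 'description': 'Second thing to sum',
                  'default': 1},
        },
        'required': ['a'],
    },
}
```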
a,b = 604542,6458932
pr = f"What is {a}+{b}?"
sp = "You must use the `sums` function instead of adding yourself, but don't mention what tools you use."
tools=[get_schema(sums)]
We’ll start a dialog with Claude now. We’ll store the messages of our dialog in msgs. The first message will be our prompt pr, and we’ll pass our tools schema.
msgs = mk_msgs(pr)
r = c(msgs, sp=sp, tools=tools)
r
ToolUseBlock(id=‘toolu_01CsuZfPAas75MkDABXAvjWD’, input={‘a’: 604542, ‘b’: 6458932}, name=‘sums’, type=‘tool_use’)
- id: msg_01StvQvvrnwaBtuUwHQLrpFt
- content: [{‘id’: ‘toolu_01CsuZfPAas75MkDABXAvjWD’, ‘input’: {‘a’: 604542, ‘b’: 6458932}, ‘name’: ‘sums’, ‘type’: ‘tool_use’}]
- model: claude-3-haiku-20240307
- role: assistant
- stop_reason: tool_use
- stop_sequence: None
- type: message
- usage: {‘input_tokens’: 414, ‘output_tokens’: 72}
When Claude decides that it should use a tool, it passes back a ToolUseBlock with the name of the tool to call, and the params to use.
We need to append the response to the dialog so Claude knows what’s happening (since it’s stateless).
msgs.append(mk_msg(r))
We don’t want to allow it to call just any possible function (that would be a security disaster!) so we create a namespace – that is, a dictionary of allowable function names to call.
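For instance, if sums is the only function we want to expose, the namespace can be as small as this (illustrative; in the notebook, ns is built from the notebook’s own functions):

```python
def sums(a:int, b:int=1) -> int:
    "Adds a + b."
    return a + b

# Only functions listed here may be called; anything else is rejected.
ns = {'sums': sums}
ns['sums'](604542, 6458932)  # 7063474
```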
call_func
call_func (tr:collections.abc.Mapping, ns:Optional[collections.abc.Mapping]=None)
Call the function in the tool response tr, using namespace ns.
| | Type | Default | Details |
|---|---|---|---|
| tr | Mapping | | Tool use request response from Claude |
| ns | Optional | None | Namespace to search for tools, defaults to `globals()` |
We can now use the function requested by Claude. We look it up in ns, and pass in the provided parameters.
res = call_func(r, ns=ns)
res
7063474
mk_toolres
mk_toolres (r:collections.abc.Mapping, res=None, ns:Optional[collections.abc.Mapping]=None)
Create a tool_result message from response r.
| | Type | Default | Details |
|---|---|---|---|
| r | Mapping | | Tool use request response from Claude |
| res | NoneType | None | The result of calling the tool (calculated with call_func by default) |
| ns | Optional | None | Namespace to search for tools |
In order to tell Claude the result of the tool call, we pass back a tool_result message, created by calling mk_toolres.
tr = mk_toolres(r, res=res, ns=ns)
tr
{'role': 'user',
'content': [{'type': 'tool_result',
'tool_use_id': 'toolu_01CsuZfPAas75MkDABXAvjWD',
'content': '7063474'}]}
We add this to our dialog, and now Claude has all the information it needs to answer our question.
msgs.append(tr)
contents(c(msgs, sp=sp, tools=tools))
'The sum of 604542 and 6458932 is 7063474.'
XML helpers
Claude works well with XML inputs, but XML can be a bit clunky to work with manually. Therefore, we create a couple of more streamlined approaches for XML generation. You don’t need to use these if you don’t find them useful – you can always just use plain strings for XML directly.
xt
xt (tag:str, c:Optional[list]=None, **kw)
Helper to create appropriate data structure for to_xml.
| | Type | Default | Details |
|---|---|---|---|
| tag | str | | XML tag name |
| c | Optional | None | Children |
| kw | | | |
An XML node contains a tag, optional children, and optional attributes. xt creates a tuple of these three things, which we will use to generate XML shortly. Attributes are passed as kwargs; since these might conflict with reserved words in Python, you can optionally add a _ prefix and it’ll be stripped off.
xt('x-custom', ['hi'], _class='bar')
('x-custom', ['hi'], {'class': 'bar'})
from claudette.core import div,img,h1,h2,p,hr,html
If you have to use a lot of tags of the same type, it’s convenient to use partial to create specialised functions for them. Here, we’re creating functions for some common HTML tags. Here’s an example of using them:
a = html([
    p('This is a paragraph'),
    hr(),
    img(src='http://example.prg'),
    div([
        h1('This is a header'),
        h2('This is a sub-header', style='k:v'),
    ], _class='foo')
])
a
('html',
[('p', 'This is a paragraph', {}),
('hr', None, {}),
('img', None, {'src': 'http://example.prg'}),
('div',
[('h1', 'This is a header', {}),
('h2', 'This is a sub-header', {'style': 'k:v'})],
{'class': 'foo'})],
{})
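The same trick can be sketched with nothing but functools.partial and a toy xt – a standalone re-implementation for illustration, assuming xt(tag, c, **kw) returns a (tag, children, attrs) tuple with leading underscores stripped from attribute names:

```python
from functools import partial

# Toy xt: mirrors the (tag, children, attrs) tuple shape shown above.
def xt(tag, c=None, **kw):
    return (tag, c, {k.lstrip('_'): v for k, v in kw.items()})

# Specialised per-tag helpers built with partial.
p, hr = partial(xt, 'p'), partial(xt, 'hr')
p('This is a paragraph')  # ('p', 'This is a paragraph', {})
```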
hl_md
hl_md (s, lang='xml')
Syntax highlight s using lang.
When we display XML in a notebook, it’s nice to highlight it, so we create a function to simplify that:
hl_md('<test><xml foo="bar">a child</xml></test>')
<test><xml foo="bar">a child</xml></test>
to_xml
to_xml (node:tuple, hl=False)
Convert node to an XML string.
| | Type | Default | Details |
|---|---|---|---|
| node | tuple | | XML structure in xt format |
| hl | bool | False | Syntax highlight response? |
Now we can convert that HTML data structure we created into XML:
to_xml(a, hl=True)
<html>
<p>This is a paragraph</p>
<hr />
<img src="http://example.prg" />
<div class="foo">
<h1>This is a header</h1>
<h2 style="k:v">This is a sub-header</h2>
</div>
</html>
json_to_xml
json_to_xml (d:dict, rnm:str)
Convert d to XML.
| | Type | Details |
|---|---|---|
| d | dict | JSON dictionary to convert |
| rnm | str | Root name |
| Returns | str | |
JSON doesn’t map as nicely to XML as the data structure used in the previous section, but for simple XML trees it can be convenient – for example:
a = dict(surname='Howard', firstnames=['Jeremy','Peter'],
         address=dict(state='Queensland',country='Australia'))
hl_md(json_to_xml(a, 'person'))
<person>
<surname>Howard</surname>
<firstnames>
<item>Jeremy</item>
<item>Peter</item>
</firstnames>
<address>
<state>Queensland</state>
<country>Australia</country>
</address>
</person>
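A minimal version of this conversion can be sketched with the standard library – a toy re-implementation, not claudette’s json_to_xml, which may differ in details such as formatting:

```python
import xml.etree.ElementTree as ET

def json_to_xml_sketch(d, rnm):
    "Toy JSON-to-XML converter: dicts nest, list entries become <item>."
    def add(parent, k, v):
        if isinstance(v, dict):
            e = ET.SubElement(parent, k)
            for k2, v2 in v.items(): add(e, k2, v2)
        elif isinstance(v, (list, tuple)):
            e = ET.SubElement(parent, k)
            for o in v: add(e, 'item', o)
        else:
            ET.SubElement(parent, k).text = str(v)
    root = ET.Element(rnm)
    for k, v in d.items(): add(root, k, v)
    return ET.tostring(root, encoding='unicode')

json_to_xml_sketch({'surname': 'Howard'}, 'person')
# '<person><surname>Howard</surname></person>'
```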