Exchange - a uniform Python SDK for message generation with LLMs
- Provides a flexible layer for message handling and generation
- Directly integrates python functions into tool calling
- Persistently surfaces errors to the underlying models to support reflection
Example
[!NOTE] Before you can run this example, you need to set up an API key with
export OPENAI_API_KEY=your-key-here
from exchange import Exchange, Message, Tool
from exchange.providers import OpenAiProvider

def word_count(text: str):
    """Get the count of words in text

    Args:
        text (str): The text with words to count
    """
    return len(text.split(" "))

ex = Exchange(
    provider=OpenAiProvider.from_env(),
    model="gpt-4o",
    system="You are a helpful assistant.",
    tools=[Tool.from_function(word_count)],
)

ex.add(Message.user("Count the number of words in this current message"))

# The model sees it has a word_count tool, and should use it along the way to answer
# This will call all the tools as needed until the model replies with the final result
reply = ex.reply()
print(reply.text)

# You can see all the tool calls in the message history
print(ex.messages)
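The loop that reply() performs can be sketched without the library or an API key. Everything below (the fake_model stand-in, the TOOLS dispatch table, the message dicts) is a hypothetical illustration of the general tool-calling pattern, not the exchange API:

```python
# A minimal, library-free sketch of a tool-calling loop: keep asking the
# "model" until it answers with text instead of a tool request.

def word_count(text: str) -> int:
    return len(text.split(" "))

TOOLS = {"word_count": word_count}  # name -> callable, as Tool.from_function implies

def fake_model(messages):
    # Hard-coded stand-in for an LLM: first request the tool, then answer.
    last = messages[-1]
    if last.get("role") == "tool":
        return {"role": "assistant", "text": f"There are {last['result']} words."}
    return {"role": "assistant", "tool_call": ("word_count", {"text": last["text"]})}

def reply(messages):
    # Execute tool calls and feed results back until a final text reply appears.
    while True:
        response = fake_model(messages)
        if "tool_call" not in response:
            return response
        name, args = response["tool_call"]
        result = TOOLS[name](**args)
        messages.append({"role": "tool", "result": result})

answer = reply([{"role": "user", "text": "Count the words in this message"}])
print(answer["text"])  # There are 6 words.
```

The key design point mirrored here is that tool results re-enter the message history, so the model can reflect on them (including errors) before producing its final reply.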
Plugins
exchange has a plugin mechanism to add support for additional providers and moderators. If you need a provider that isn't supported here, we'd be happy to review contributions, but you can also build and use your own plugin.
To create a Provider plugin, subclass exchange.providers.Provider. You will need to implement the complete method. For example, this is what we use as a mock in our tests. You can see a full implementation example in OpenAiProvider. We also generally recommend implementing a from_env classmethod to instantiate the provider.
from typing import List

from exchange import Message, Tool
from exchange.providers import Provider

class MockProvider(Provider):
    def __init__(self, sequence: List[Message]):
        # We'll use init to provide a preplanned reply sequence
        self.sequence = sequence
        self.call_count = 0

    def complete(
        self, model: str, system: str, messages: List[Message], tools: List[Tool]
    ) -> Message:
        output = self.sequence[self.call_count]
        self.call_count += 1
        return output
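The preplanned-sequence pattern above can be exercised in isolation. This standalone sketch drops the library types so the stateful behavior is easy to see; the class is a simplified stand-in, not the real Provider subclass:

```python
class MockProvider:
    # Library-free stand-in: returns one canned reply per complete() call,
    # advancing through the sequence each time.
    def __init__(self, sequence):
        self.sequence = sequence
        self.call_count = 0

    def complete(self, model, system, messages, tools):
        output = self.sequence[self.call_count]
        self.call_count += 1
        return output

mock = MockProvider(["first reply", "second reply"])
print(mock.complete("gpt-4o", "system", [], []))  # first reply
print(mock.complete("gpt-4o", "system", [], []))  # second reply
```

Because each call advances call_count, a test can script an exact conversation (for example, a tool call followed by a final answer) and assert the exchange consumed it in order.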
Then use Python packaging's entry points to register your plugin.
[project.entry-points.'exchange.provider']
example = 'path.to.plugin:ExampleProvider'
Your plugin will then be available in your application or other applications built on exchange through:
from exchange.providers import get_provider
provider = get_provider('example').from_env()