a uniform python SDK for message generation with LLMs

Project description


Exchange - a uniform python SDK for message generation with LLMs

  • Provides a flexible layer for message handling and generation
  • Directly integrates Python functions into tool calling
  • Persistently surfaces errors to the underlying models to support reflection

Example

[!NOTE] Before you can run this example, you need to set up an API key with export OPENAI_API_KEY=your-key-here

from exchange import Exchange, Message, Tool
from exchange.providers import OpenAiProvider

def word_count(text: str):
    """Get the count of words in text

    Args:
        text (str): The text with words to count
    """
    return len(text.split(" "))

ex = Exchange(
    provider=OpenAiProvider.from_env(),
    model="gpt-4o",
    system="You are a helpful assistant.",
    tools=[Tool.from_function(word_count)],
)
ex.add(Message.user("Count the number of words in this current message"))

# The model sees it has a word count tool, and should use it along the way to answer
# This will call all the tools as needed until the model replies with the final result
reply = ex.reply()
print(reply.text)

# you can see all the tool calls in the message history
print(ex.messages)

Plugins

exchange has a plugin mechanism for adding support for additional providers and moderators. If you need a provider that isn't supported here, we'd be happy to review contributions, but you can also build and use your own plugin.

To create a Provider plugin, subclass exchange.providers.Provider. You will need to implement the complete method. For example, this is what we use as a mock in our tests. You can see a full implementation in OpenAiProvider. We also generally recommend implementing a from_env classmethod to instantiate the provider; a minimal sketch of that follows the mock below.

from typing import List

from exchange import Message, Tool
from exchange.providers import Provider


class MockProvider(Provider):
    def __init__(self, sequence: List[Message]):
        # We'll use init to provide a preplanned reply sequence
        self.sequence = sequence
        self.call_count = 0

    def complete(
        self, model: str, system: str, messages: List[Message], tools: List[Tool]
    ) -> Message:
        output = self.sequence[self.call_count]
        self.call_count += 1
        return output
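
A from_env classmethod can be as simple as reading credentials from the environment. The sketch below is illustrative only: the ExampleProvider class name, its constructor arguments, and the EXAMPLE_API_KEY variable are placeholders, not part of exchange; the imports mirror the mock above.

import os
from typing import List

from exchange import Message, Tool
from exchange.providers import Provider


class ExampleProvider(Provider):
    def __init__(self, api_key: str):
        self.api_key = api_key

    @classmethod
    def from_env(cls) -> "ExampleProvider":
        # EXAMPLE_API_KEY is a hypothetical variable name for this sketch
        return cls(api_key=os.environ["EXAMPLE_API_KEY"])

    def complete(
        self, model: str, system: str, messages: List[Message], tools: List[Tool]
    ) -> Message:
        # A real provider would call its backend API here and return a Message
        raise NotImplementedError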

Then use Python packaging's entry points (for example, in your pyproject.toml) to register your plugin.

[project.entry-points.'exchange.provider']
example = 'path.to.plugin:ExampleProvider'

Your plugin will then be available in your application or other applications built on exchange through:

from exchange.providers import get_provider

provider = get_provider('example').from_env()
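
From there it can be plugged into an Exchange like any built-in provider. A minimal sketch, assuming the 'example' entry point registered above, an illustrative model name, and the same constructor shape as the earlier example:

from exchange import Exchange, Message
from exchange.providers import get_provider

ex = Exchange(
    provider=get_provider("example").from_env(),
    model="example-model",  # illustrative model name
    system="You are a helpful assistant.",
    tools=[],  # no tools needed for this sketch
)
ex.add(Message.user("Hello!"))
print(ex.reply().text)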

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
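
If you just want to use the library rather than grab a specific file, installing from PyPI is the usual route (the distribution name below is inferred from the file names on this page):

pip install ai-exchange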

Source Distribution

ai_exchange-0.9.9.tar.gz (227.8 kB)

Uploaded Source

Built Distribution

ai_exchange-0.9.9-py3-none-any.whl (37.4 kB)

Uploaded Python 3

File details

Details for the file ai_exchange-0.9.9.tar.gz.

File metadata

  • Download URL: ai_exchange-0.9.9.tar.gz
  • Upload date:
  • Size: 227.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for ai_exchange-0.9.9.tar.gz

  • SHA256: 0e9c822d1c6eec9fa22866fe3d37a77c14701c1e89afa9e857e40e0e6573c3f9
  • MD5: 439f48b753641d24502423bafb35a7f4
  • BLAKE2b-256: 4344b8b85fe82ce493150144bdcc6eba91dd5b5323e972766d4c18ffc8448f40

See more details on using hashes here.

Provenance

The following attestation bundles were made for ai_exchange-0.9.9.tar.gz:

Publisher: publish.yaml on block/goose

Attestations:

File details

Details for the file ai_exchange-0.9.9-py3-none-any.whl.

File metadata

  • Download URL: ai_exchange-0.9.9-py3-none-any.whl
  • Upload date:
  • Size: 37.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for ai_exchange-0.9.9-py3-none-any.whl

  • SHA256: 4f2ac6cf54cc0993f10d813e6b4db78efea7933b091ad306e10ec9d7f4f6e3dd
  • MD5: 72a6ed68efdf56c85ff554b9b57a2860
  • BLAKE2b-256: 4825409f7c16c8e4a335434db318ab4042dd8737a302e558d63b257d276f477d

See more details on using hashes here.

Provenance

The following attestation bundles were made for ai_exchange-0.9.9-py3-none-any.whl:

Publisher: publish.yaml on block/goose

Attestations:
