This is an autogen>=0.4 extension for integrating external models through the OpenAI API.

Project description

autogen-openaiext-client

This Autogen client helps you interface quickly with non-OpenAI LLMs through the OpenAI API.

See the Autogen documentation for more information on using custom LLM clients.

This repository simply includes clients you can use to initialize your LLMs easily, since Autogen >= v0.4 supports non-OpenAI LLMs within the autogen_ext package itself, thanks to a clean set of changes from jackgerrits.

Install

pip install autogen-openaiext-client

Usage

import asyncio
import os

from autogen_openaiext_client import GeminiChatCompletionClient
# Note: in some Autogen 0.4 pre-releases this import path was
# autogen_core.components.models instead of autogen_core.models.
from autogen_core.models import UserMessage

# Initialize the client
client = GeminiChatCompletionClient(model="gemini-1.5-flash", api_key=os.environ["GEMINI_API_KEY"])

# Use the client like any other Autogen client. For example:
result = asyncio.run(
    client.create(
        [UserMessage(content="What is the capital of France?", source="user")]
    )
)
print(result.content)
# Paris

Currently, Gemini, TogetherAI, and Groq clients are supported through the GeminiChatCompletionClient, TogetherAIChatCompletionClient, and GroqChatCompletionClient, respectively.

Install Magentic-One and run python examples/magentic_coder_example.py for an example of usage with other Autogen-based frameworks.

Known Issues

  1. Tool calling in Gemini through OpenAI API runs into issues.

Contributing

  1. Adding a new model to an existing external provider
    1. For example, adding a new model to GeminiChatCompletionClient involves modifying the GeminiInfo class in info.py and adding the new model to the _MODEL_CAPABILITIES and _MODEL_TOKEN_LIMITS dictionaries.
  2. Adding a new external provider
    1. Add a new client class in client.py and a corresponding ProviderInfo class in info.py, then add both to __init__.py for easy import.

Disclaimer

This is a community project for Autogen. Feel free to contribute via issues and PRs, and I will do my best to get to them every 3 days.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

autogen_openaiext_client-0.0.3.tar.gz (7.4 kB)

Uploaded Source

Built Distribution

autogen_openaiext_client-0.0.3-py3-none-any.whl (6.8 kB)

Uploaded Python 3

File details

Details for the file autogen_openaiext_client-0.0.3.tar.gz.

File metadata

File hashes

Hashes for autogen_openaiext_client-0.0.3.tar.gz:

SHA256: 8dcd7ca43f41d8362f2930deeb96c33aacbb5b9a7b366c5ef02e191d5ec8441c
MD5: cf9330052ab3ba83a77c554e0f8f40a5
BLAKE2b-256: 4609929af370ec810f2cc653fb9ffe41a9e18bcc5c2c9a4ed4540f2c43a2bb41

See the PyPI documentation for more details on using hashes.
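To check a downloaded file against the digests published above, you can compute the hash locally with Python's standard library. A minimal sketch (the file path is a placeholder for wherever you downloaded the archive):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA256 hex digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the digest published on PyPI, e.g. for the sdist:
# assert sha256_of("autogen_openaiext_client-0.0.3.tar.gz") == \
#     "8dcd7ca43f41d8362f2930deeb96c33aacbb5b9a7b366c5ef02e191d5ec8441c"
```

Alternatively, pip's hash-checking mode (pip install --require-hashes with a pinned requirements file) performs the same verification automatically.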

Provenance

The following attestation bundles were made for autogen_openaiext_client-0.0.3.tar.gz:

Publisher: python-publish.yml on vballoli/autogen-openaiext-client

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file autogen_openaiext_client-0.0.3-py3-none-any.whl.

File metadata

File hashes

Hashes for autogen_openaiext_client-0.0.3-py3-none-any.whl:

SHA256: 6f275fe960de3aef97a9022b7a5899900883728985c9d36494ef2828acbb6153
MD5: 973a4a1bac0953fdefbd3d8940110eea
BLAKE2b-256: 6fc5a8d744502d05cb8a31ee1bc8e3a7b71a05e837c863f53310f6ea3ffe5bd6

See the PyPI documentation for more details on using hashes.

Provenance

The following attestation bundles were made for autogen_openaiext_client-0.0.3-py3-none-any.whl:

Publisher: python-publish.yml on vballoli/autogen-openaiext-client

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
