This is an autogen>=0.4 extension for integrating external models through the OpenAI API.
Project description
autogen-openaiext-client
This AutoGen client helps you interface quickly with non-OpenAI LLMs through the OpenAI API.
See here for more information on using AutoGen with custom LLMs.
This repository simply includes clients you can use to initialize your LLMs easily, since AutoGen >= v0.4 already supports non-OpenAI LLMs within the autogen_ext package itself, thanks to some really nice and clean changes from jackgerrits here.
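For reference, here is a minimal sketch of that upstream route, assuming a recent autogen 0.4 release where OpenAIChatCompletionClient lives in autogen_ext.models.openai and accepts a base_url plus a model_info mapping; the module path, parameter names, endpoint URL, and capability keys shown are illustrative and may differ in your version.

import os

from autogen_ext.models.openai import OpenAIChatCompletionClient

# Point the stock OpenAI-compatible client at an external provider's endpoint.
# The base_url and model_info values below are illustrative assumptions, not
# prescriptions -- consult the autogen and provider docs for current values.
client = OpenAIChatCompletionClient(
    model="gemini-1.5-flash",
    base_url="https://generativelanguage.googleapis.com/v1beta/openai/",
    api_key=os.environ["GEMINI_API_KEY"],
    model_info={
        "vision": True,
        "function_calling": True,
        "json_output": True,
        "family": "unknown",
    },
)

The clients in this package simply save you from spelling out each provider's base URL and capability flags by hand.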
Install
pip install autogen-openaiext-client
Usage
import asyncio
import os

from autogen_core.models import UserMessage  # import path may differ across autogen 0.4 releases
from autogen_openaiext_client import GeminiChatCompletionClient

# Initialize the client
client = GeminiChatCompletionClient(
    model="gemini-1.5-flash", api_key=os.environ["GEMINI_API_KEY"]
)

# Use the client like any other autogen client. For example:
result = asyncio.run(
    client.create(
        [UserMessage(content="What is the capital of France?", source="user")]
    )
)
print(result.content)
# Paris
Currently, Gemini, TogetherAI, and Groq clients are supported through GeminiChatCompletionClient, TogetherAIChatCompletionClient, and GroqChatCompletionClient respectively.
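The other clients follow the same constructor pattern; as a hedged illustration (the environment variable names and model identifiers below are placeholders, not values mandated by this package):

import os

from autogen_openaiext_client import (
    GroqChatCompletionClient,
    TogetherAIChatCompletionClient,
)

# Same pattern as the Gemini client above; swap in a model identifier the
# provider currently offers and whichever environment variable holds your key.
groq_client = GroqChatCompletionClient(
    model="llama-3.1-8b-instant", api_key=os.environ["GROQ_API_KEY"]
)
together_client = TogetherAIChatCompletionClient(
    model="meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",
    api_key=os.environ["TOGETHER_API_KEY"],
)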
Install Magentic-One and run python examples/magentic_coder_example.py for a sample of using these clients with other autogen-based frameworks.
Known Issues
- Tool calling with Gemini through the OpenAI API currently runs into issues.
Contributing
- Adding a new model to an existing external provider
  - For example, adding a new model to GeminiChatCompletionClient involves modifying the GeminiInfo class in info.py and adding the new model to the _MODEL_CAPABILITIES and _MODEL_TOKEN_LIMITS dictionaries (a rough sketch follows this list).
- Adding a new external provider
  - Add a new client class in client.py, a corresponding ProviderInfo class in info.py, and add it to __init__.py for easy import.
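A rough sketch of the first path, assuming the dictionaries map model names to capability flags and context sizes; the actual schema and base class live in info.py, so mirror whatever the existing entries look like:

# Hypothetical sketch of extending GeminiInfo in info.py. The capability keys
# and token limits below are assumptions -- copy the shape of existing entries.
class GeminiInfo:
    _MODEL_CAPABILITIES = {
        "gemini-1.5-flash": {"vision": True, "function_calling": True, "json_output": True},
        # New model: add an entry with its capability flags.
        "gemini-1.5-pro": {"vision": True, "function_calling": True, "json_output": True},
    }

    _MODEL_TOKEN_LIMITS = {
        "gemini-1.5-flash": 1_048_576,
        # New model: add its context window size.
        "gemini-1.5-pro": 2_097_152,
    }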
Disclaimer
This is a community project for AutoGen. Feel free to contribute via issues and PRs, and I will try my best to get to them every 3 days.
Project details
Release history
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file autogen_openaiext_client-0.0.3.tar.gz.
File metadata
- Download URL: autogen_openaiext_client-0.0.3.tar.gz
- Upload date:
- Size: 7.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.0.1 CPython/3.12.8
File hashes
Algorithm | Hash digest
---|---
SHA256 | 8dcd7ca43f41d8362f2930deeb96c33aacbb5b9a7b366c5ef02e191d5ec8441c
MD5 | cf9330052ab3ba83a77c554e0f8f40a5
BLAKE2b-256 | 4609929af370ec810f2cc653fb9ffe41a9e18bcc5c2c9a4ed4540f2c43a2bb41
Provenance
The following attestation bundles were made for autogen_openaiext_client-0.0.3.tar.gz:
Publisher: python-publish.yml on vballoli/autogen-openaiext-client
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: autogen_openaiext_client-0.0.3.tar.gz
- Subject digest: 8dcd7ca43f41d8362f2930deeb96c33aacbb5b9a7b366c5ef02e191d5ec8441c
- Sigstore transparency entry: 164684568
- Sigstore integration time:
- Permalink: vballoli/autogen-openaiext-client@dd89d1f1f331f7ae70e737131f484cba5cb59a06
- Branch / Tag: refs/tags/0.0.3
- Owner: https://github.com/vballoli
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: python-publish.yml@dd89d1f1f331f7ae70e737131f484cba5cb59a06
- Trigger Event: release
File details
Details for the file autogen_openaiext_client-0.0.3-py3-none-any.whl.
File metadata
- Download URL: autogen_openaiext_client-0.0.3-py3-none-any.whl
- Upload date:
- Size: 6.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.0.1 CPython/3.12.8
File hashes
Algorithm | Hash digest
---|---
SHA256 | 6f275fe960de3aef97a9022b7a5899900883728985c9d36494ef2828acbb6153
MD5 | 973a4a1bac0953fdefbd3d8940110eea
BLAKE2b-256 | 6fc5a8d744502d05cb8a31ee1bc8e3a7b71a05e837c863f53310f6ea3ffe5bd6
Provenance
The following attestation bundles were made for autogen_openaiext_client-0.0.3-py3-none-any.whl:
Publisher: python-publish.yml on vballoli/autogen-openaiext-client
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: autogen_openaiext_client-0.0.3-py3-none-any.whl
- Subject digest: 6f275fe960de3aef97a9022b7a5899900883728985c9d36494ef2828acbb6153
- Sigstore transparency entry: 164684573
- Sigstore integration time:
- Permalink: vballoli/autogen-openaiext-client@dd89d1f1f331f7ae70e737131f484cba5cb59a06
- Branch / Tag: refs/tags/0.0.3
- Owner: https://github.com/vballoli
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: python-publish.yml@dd89d1f1f331f7ae70e737131f484cba5cb59a06
- Trigger Event: release