A Python module that provides a simple way to create and manage ChatGPT tools via Python decorators.
openai_tools_decorator
A lightweight Python library that streamlines creating and invoking “tools” (functions) in your OpenAI ChatCompletion-based projects. It lets you register and call both synchronous and asynchronous functions via decorators.
Installation
```shell
pip install openai_tools_decorator
```
Quick Start
1. Import and Initialization
```python
from openai_tools_decorator import OpenAIT

client = OpenAIT()
```
2. Adding Tools
Wrap your function (sync or async) with @client.add_tool(...). The decorator registers the function together with a JSON Schema describing its parameters:
```python
@client.add_tool(
    {
        "description": "Get current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "The city name in English"}
            },
            "required": ["city"],
        },
    }
)
def get_weather(city: str):
    # Or async def get_weather(...) if you prefer
    return f"Weather in {city}: 25°C"
```
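Under the hood, a decorator like this plausibly just records the function and its schema in a registry and returns the function unchanged. The sketch below uses a hypothetical `ToolRegistry` class to illustrate the pattern; it is not the library's actual internals:

```python
class ToolRegistry:
    """Minimal sketch of a decorator-based tool registry
    (hypothetical, not openai_tools_decorator's real implementation)."""

    def __init__(self):
        self.tools = {}  # tool name -> (function, JSON Schema)

    def add_tool(self, schema):
        def decorator(func):
            # Register under the function's own name, since the decorator
            # in the README takes no explicit name argument.
            self.tools[func.__name__] = (func, schema)
            return func  # the function stays directly callable
        return decorator

registry = ToolRegistry()

@registry.add_tool({
    "description": "Get current weather for a city",
    "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
})
def get_weather(city: str):
    return f"Weather in {city}: 25°C"

print(get_weather("Paris"))             # Weather in Paris: 25°C
print("get_weather" in registry.tools)  # True
```

Because the decorator returns the original function, registered tools remain ordinary Python callables you can test in isolation.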
3. Using Tools with Chat
When you call run_with_tool(...) or run_with_tool_by_thread_id(...), the ChatCompletion model can opt to invoke any matching tool. For example:
```python
user_input = "How cold is it in Moscow right now?"

response = await client.run_with_tool(
    user_input,
    messages=[],
    model="gpt-4o",
)
print(response)  # The assistant's response, possibly including a tool call
```
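Conceptually, this wraps the standard OpenAI tool-calling loop: send the messages, and if the model responds with tool calls, run the matching registered functions, append their results as `tool` messages, and query the model again. The sketch below simulates the dispatch step with a stubbed model response; the helper name and payload shape are illustrative, not the library's API:

```python
import json

def dispatch_tool_calls(tool_calls, tools):
    """Run each requested tool and build 'tool'-role messages
    (illustrative helper, not part of openai_tools_decorator)."""
    results = []
    for call in tool_calls:
        func = tools[call["name"]]
        # Tool arguments arrive from the model as a JSON string.
        args = json.loads(call["arguments"])
        results.append({
            "role": "tool",
            "name": call["name"],
            "content": str(func(**args)),
        })
    return results

# Registered tools, keyed by name as the decorator would store them.
tools = {"get_weather": lambda city: f"Weather in {city}: 25°C"}

# A stubbed assistant turn requesting a tool call.
fake_tool_calls = [
    {"name": "get_weather", "arguments": json.dumps({"city": "Moscow"})}
]

messages = dispatch_tool_calls(fake_tool_calls, tools)
print(messages[0]["content"])  # Weather in Moscow: 25°C
```

The real loop would feed these `tool` messages back into a second ChatCompletion call so the model can phrase its final answer.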
Example
```python
import asyncio

import aiohttp
from openai_tools_decorator import OpenAIT

client = OpenAIT()
api_key = "<YOUR_API_KEY>"  # OpenWeatherMap API key


async def fetch_url(url):
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as resp:
            return await resp.text()


@client.add_tool(
    {
        "description": "Fetch weather from an API",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {
                    "type": "string",
                    "description": "The city name in English",
                }
            },
            "required": ["city"],
        },
    }
)
async def get_weather(city: str):
    url = (
        "https://api.openweathermap.org/data/2.5/weather"
        f"?q={city}&appid={api_key}&units=metric"
    )
    return await fetch_url(url)


async def main():
    question = "What's the temperature in London?"
    result = await client.run_with_tool(question, messages=[])
    print("Assistant says:", result)


asyncio.run(main())
```
Key Points
- You can decorate both synchronous and asynchronous functions.
- Tools get automatically registered and described for the OpenAI model.
- The model decides whether or not to call your tool during the conversation.
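Supporting both synchronous and asynchronous tools usually comes down to a single check at call time. A minimal sketch of that dispatch (assumed behavior, not the library's exact code):

```python
import asyncio
import inspect

async def call_tool(func, **kwargs):
    """Await async tools, call sync ones directly (illustrative sketch)."""
    if inspect.iscoroutinefunction(func):
        return await func(**kwargs)
    return func(**kwargs)

def sync_tool(city: str):
    return f"sync: {city}"

async def async_tool(city: str):
    return f"async: {city}"

async def main():
    print(await call_tool(sync_tool, city="Oslo"))   # sync: Oslo
    print(await call_tool(async_tool, city="Oslo"))  # async: Oslo

asyncio.run(main())
```

This is why the decorator can accept either kind of function without separate registration paths.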
License
Distributed under the MIT License. Contributions and feedback are always welcome!
File details
Details for the file openai_tools_decorator-1.0.1.tar.gz.
File metadata
- Download URL: openai_tools_decorator-1.0.1.tar.gz
- Upload date:
- Size: 9.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.0.1 CPython/3.12.6
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 18b16b16f6c4a4ea66ccb5659380d8f0a90683404b26c0f9d3d140485fccce15 |
| MD5 | fa8f928a83cf605a789189b1dde7ac87 |
| BLAKE2b-256 | 60fb47e990f9c61fbd92b913ebfd89f5a116362208dd58d78acf62bbfbdafd2c |
File details
Details for the file openai_tools_decorator-1.0.1-py3-none-any.whl.
File metadata
- Download URL: openai_tools_decorator-1.0.1-py3-none-any.whl
- Upload date:
- Size: 8.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.0.1 CPython/3.12.6
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 8881c8b0a9061c8f807e2eded5fb1566604150eaa78d4ed478535f7740fe48be |
| MD5 | 6a6c98c8f5ded2289c464cc0d55a0fe2 |
| BLAKE2b-256 | bd00e11dac14b2a824a8c56b1e2e27dbdb56bd778ecc3b64caabd8c28f978449 |