Python module for asynchronous interaction with OpenAI
Technologies
- Python >= 3.8
- aiohttp >= 3.8
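Assuming the package is published on PyPI under the name shown in the source distribution (`ai_openchat`), it can be installed with pip:

```shell
pip install ai_openchat
```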
Quick start
- Example #1 Chat:
```python
import asyncio

from ai_openchat import AsyncOpenAI
from ai_openchat.base.model import Model


async def chat():
    openai_client = AsyncOpenAI(token='API-KEY')
    resp = await openai_client.send_message('Your request?', Model().chat())
    print(resp)


if __name__ == '__main__':
    asyncio.run(chat())
```
- Example #2 Movie to Emoji:
```python
import asyncio

from ai_openchat import AsyncOpenAI
from ai_openchat.base.model import Model


async def movie_to_emoji():
    openai_client = AsyncOpenAI(token='API-KEY')
    resp = await openai_client.send_message(
        'Convert movie titles into emoji.\n\n'
        'Back to the Future: 👨👴🚗🕒 \n'
        'Batman: 🤵🦇 \n'
        'Transformers: 🚗🤖 \n'
        'Star Wars:',
        Model().movie_to_emoji()
    )
    print(resp)  # ⭐️⚔️


if __name__ == '__main__':
    asyncio.run(movie_to_emoji())
```
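Because the client is asynchronous, several prompts can be dispatched concurrently with `asyncio.gather` instead of one at a time. The sketch below uses a stub coroutine in place of the real `AsyncOpenAI.send_message` (which would need an API key and network access), so only the concurrency pattern itself is shown:

```python
import asyncio


async def send_message(prompt: str) -> str:
    # Stub standing in for AsyncOpenAI.send_message; the real call
    # would await an HTTP request to the OpenAI API via aiohttp.
    await asyncio.sleep(0)  # yield control, as a network call would
    return f"response to: {prompt}"


async def main() -> list:
    prompts = ["Star Wars:", "The Matrix:", "Titanic:"]
    # gather() schedules all coroutines on the event loop at once
    # and returns their results in the original order.
    return await asyncio.gather(*(send_message(p) for p in prompts))


if __name__ == '__main__':
    print(asyncio.run(main()))
```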
This project aims to provide a convenient asynchronous library for working with the OpenAI API. You can check out the rest of the available models here: https://beta.openai.com/examples.
I continue to develop the project and will add image generation soon.
Source distribution: ai_openchat-1.0.2.tar.gz (4.0 kB)