lmclient

LM Async Client for OpenAI, Azure OpenAI, and more.
Install
pip install lmclient-core
Usage
from lmclient import LMClient, AzureCompletion, OpenAICompletion
openai_completion = OpenAICompletion(model='gpt-3.5-turbo')
# azure_completion = AzureCompletion()
client = LMClient(openai_completion, async_capacity=5, max_requests_per_minute=20)
prompts = [
    'Hello, my name is',
    'can you please tell me your name?',
    'i want to know your name',
    'what is your name?',
]
values = client.async_run(prompts=prompts)
print(values)
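The `async_capacity` setting above caps how many requests are in flight at once. As an illustrative sketch only (not lmclient's internals), the same effect can be had in plain asyncio with a semaphore; `fetch_completion` here is a hypothetical stand-in for a real API call:

```python
import asyncio

async def fetch_completion(prompt: str) -> str:
    # Stand-in for a real API call; sleep simulates network latency.
    await asyncio.sleep(0.01)
    return f"completion for: {prompt}"

async def run_all(prompts, async_capacity=5):
    # The semaphore caps the number of concurrent in-flight requests,
    # mirroring what an async_capacity-style setting does.
    semaphore = asyncio.Semaphore(async_capacity)

    async def bounded(prompt):
        async with semaphore:
            return await fetch_completion(prompt)

    # gather preserves input order in its results.
    return await asyncio.gather(*(bounded(p) for p in prompts))

results = asyncio.run(run_all(['Hello, my name is', 'what is your name?']))
```

With `async_capacity=5`, at most five coroutines ever hold the semaphore at the same time, regardless of how many prompts are queued.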
Advanced Usage
# limit max_requests_per_minute to 20
# limit async_capacity to 5 (at most 5 concurrent async requests)
# cache responses under cache_dir
# set error_mode to 'ignore' (either 'ignore' or 'raise')
from lmclient import LMClient, OpenAICompletion

openai_completion = OpenAICompletion(model='gpt-3.5-turbo')
client = LMClient(
    openai_completion,
    max_requests_per_minute=20,
    async_capacity=5,
    cache_dir='openai_cache',
    error_mode='ignore',
)
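The `cache_dir` option suggests responses are persisted on disk and reused across runs. A minimal sketch of such a cache, assuming one hash-keyed JSON file per prompt (the `cached_completion` helper is hypothetical, not part of lmclient):

```python
import hashlib
import json
import tempfile
from pathlib import Path

def cached_completion(prompt: str, cache_dir: str, compute) -> str:
    # Key each prompt by a stable hash so a response can be reused
    # on later runs instead of re-calling the model.
    cache = Path(cache_dir)
    cache.mkdir(parents=True, exist_ok=True)
    key = hashlib.sha256(prompt.encode('utf-8')).hexdigest()
    path = cache / f'{key}.json'
    if path.exists():
        return json.loads(path.read_text())['response']
    response = compute(prompt)
    path.write_text(json.dumps({'prompt': prompt, 'response': response}))
    return response

# Fake model that records how many times it is actually invoked.
calls = []
def fake_model(prompt):
    calls.append(prompt)
    return prompt.upper()

with tempfile.TemporaryDirectory() as d:
    a = cached_completion('what is your name?', d, fake_model)  # computed
    b = cached_completion('what is your name?', d, fake_model)  # from disk
```

The second call is served from disk, so the model is only invoked once for the repeated prompt.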