LM Async Client, openai client, azure openai client ...
lmclient
A client designed for large-scale asynchronous requests to the OpenAI API. Typical use cases include self-instruct data generation and large-scale translation.
Features
- Large-scale asynchronous requests to the OpenAI API
- Progress bar
- Cap on the maximum number of requests per minute
- Bounded async capacity (analogous to a thread-pool size)
- Disk cache
- 100% type hints
- Easy to use
- Supports OpenAI, Azure, Minimax, MinimaxPro, Zhipu, Baidu Wenxin, Tencent Hunyuan
- Supports FunctionCall
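The disk cache lets repeated runs skip requests that were already answered. A minimal sketch of the general idea (not lmclient's internals; `cache_key` and `cached_call` are hypothetical names): key each request by a hash of its full payload and store the reply as a file.

```python
import hashlib
import json
import tempfile
from pathlib import Path


def cache_key(model: str, prompt: str, **params) -> str:
    """Derive a stable key from the full request payload."""
    payload = json.dumps({'model': model, 'prompt': prompt, **params}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()


def cached_call(cache_dir: Path, model: str, prompt: str, call, **params) -> str:
    """Return a cached reply if one exists, otherwise call the API and cache it."""
    path = cache_dir / f'{cache_key(model, prompt, **params)}.json'
    if path.exists():  # cache hit: skip the network round-trip
        return json.loads(path.read_text())
    reply = call(model, prompt, **params)
    path.write_text(json.dumps(reply))
    return reply
```

Because the key covers the model name and every sampling parameter, changing `temperature` (for example) produces a fresh request instead of a stale cache hit.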
Installation

Requires Python 3.8 or later.

```shell
pip install lmclient-core
```
Usage
- CompletionEngine
```python
from lmclient import CompletionEngine, OpenAIChat, OpenAIChatParameters

model = OpenAIChat('gpt-3.5-turbo', parameters=OpenAIChatParameters(temperature=0))
# Cap requests at 20 per minute, with an async capacity of 5
client = CompletionEngine(model, async_capacity=5, max_requests_per_minute=20)
prompts = [
    'Hello, my name is',
    'can you please tell me your name?',
    [{'role': 'user', 'content': 'hello, who are you?'}],
    'what is your name?',
]
outputs = client.async_run(prompts=prompts)
for output in outputs:
    print(output.reply)
```
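Conceptually, `async_capacity` bounds how many requests are in flight at once, while `max_requests_per_minute` spaces out request starts. A minimal sketch of that pattern with a semaphore and a start-time budget (illustrative only, `bounded_gather` is not part of lmclient):

```python
import asyncio
import time


async def bounded_gather(coros, async_capacity: int, max_requests_per_minute: int):
    """Run coroutines with at most `async_capacity` in flight and a
    per-minute budget on how often new requests may start."""
    sem = asyncio.Semaphore(async_capacity)   # caps concurrent requests
    interval = 60 / max_requests_per_minute   # minimum spacing between starts
    last_start = 0.0
    lock = asyncio.Lock()

    async def run(coro):
        nonlocal last_start
        async with sem:
            async with lock:                  # serialize start-time bookkeeping
                now = time.monotonic()
                wait = max(0.0, last_start + interval - now)
                last_start = now + wait
            await asyncio.sleep(wait)
            return await coro

    # gather preserves input order regardless of completion order
    return await asyncio.gather(*(run(c) for c in coros))
```

The two limits are independent: a small capacity protects your own process (memory, open connections), while the per-minute cap respects the provider's rate limits.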
- ChatEngine
```python
from lmclient import ChatEngine, OpenAIChat

model = OpenAIChat('gpt-3.5-turbo')
chat_engine = ChatEngine(model)
print(chat_engine.chat('Hello, I am chat_engine'))
print(chat_engine.chat('What was my previous message?'))
```
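The second call can answer "What was my previous message?" because a chat engine carries conversation state between turns. A toy illustration of that pattern (`MiniChatEngine` is hypothetical, not lmclient's implementation): every turn appends to a shared message list, so the model always sees the whole conversation.

```python
class MiniChatEngine:
    """Toy stateful chat: each turn appends to a shared message list."""

    def __init__(self, model):
        self.model = model              # callable: list[dict] -> str
        self.history: list = []

    def chat(self, user_message: str) -> str:
        self.history.append({'role': 'user', 'content': user_message})
        reply = self.model(self.history)  # model sees the full history
        self.history.append({'role': 'assistant', 'content': reply})
        return reply
```

This is also why long conversations grow more expensive: the full history is resent on every turn.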
Download files
Source Distribution: lmclient_core-0.8.2.post1.tar.gz (20.3 kB)

Built Distribution: lmclient_core-0.8.2.post1-py3-none-any.whl
Hashes for lmclient_core-0.8.2.post1.tar.gz

Algorithm | Hash digest
---|---
SHA256 | 5ed3ab3e2bffe33198633377038a27b5fd24ff46bf570f1e87adfc65115f4754
MD5 | 7d8aa24cbd7fa2e17e67b035b172567e
BLAKE2b-256 | 5cc331d4e6a1717b0886db66d7e455d688de218d623d8c95ffc9f813aa3e05d0
Hashes for lmclient_core-0.8.2.post1-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 9aba547a4801fcf24b7fb30c8d712f68e89b0a318d037e6e5da9c6ccbed85aca
MD5 | 11b6aca96dd911a53bfdc545372ff231
BLAKE2b-256 | ff3674619873537c2672503d52a14902730025a5db4767a49068debe7807476f