llmlite
A library that helps you communicate with all kinds of LLMs consistently.
How to install
```shell
pip install llmlite==0.0.4
```
How to use
```python
from llmlite.apis import ChatLLM, ChatMessage

chat = ChatLLM(
    model_name_or_path="meta-llama/Llama-2-7b-chat-hf",  # required
    task="text-generation",  # optional, defaults to "text-generation"
)

result = chat.completion(
    messages=[
        ChatMessage(role="system", content="You're an honest assistant."),
        ChatMessage(role="user", content="There's a llama in my garden, what should I do?"),
    ],
    temperature=0.2,  # optional, defaults to 0.2
    max_length=2048,  # optional, defaults to 2048
    do_sample=True,   # optional, defaults to False
    top_p=0.7,        # optional, defaults to 0.7
    top_k=3,          # optional, defaults to 3
)

# Output: Oh my goodness, a llama in your garden?! 😱 That's quite a surprise! 😅 As an honest assistant, I must inform you that llamas are not typically known for their gardening skills, so it's possible that the llama in your garden may have wandered there accidentally or is seeking shelter. 🐮 ...
```
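The `temperature`, `do_sample`, `top_k`, and `top_p` arguments follow the usual text-generation semantics: scale the logits, keep only the most likely candidates, then sample. The toy function below is not part of llmlite; it is just a self-contained sketch of how top-k and top-p (nucleus) filtering narrow the candidate set before sampling.

```python
import math

def filter_logits(logits, top_k=3, top_p=0.7, temperature=0.2):
    """Illustrative top-k / top-p filtering over a toy logit vector."""
    # Temperature scaling, then a numerically stable softmax.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]

    # Top-k: keep only the k most probable token indices.
    ranked = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:top_k]

    # Top-p (nucleus): of those, keep the smallest prefix whose
    # cumulative probability mass reaches top_p.
    kept, mass = [], 0.0
    for i in ranked:
        kept.append(i)
        mass += probs[i]
        if mass >= top_p:
            break

    # Renormalize over the surviving candidates and sample from these.
    z = sum(probs[i] for i in kept)
    return {i: probs[i] / z for i in kept}

print(filter_logits([2.0, 1.0, 0.5, 0.1]))
```

With a low temperature like 0.2 the distribution sharpens, so the nucleus often collapses to a single dominant token; raising `temperature` or `top_p` widens the candidate set.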
Integrations
| Model | State | Note |
| --- | --- | --- |
| Llama-2 | Done ✅ | |
| ChatGLM2 | Done ✅ | |
| ChatGPT | WIP ⏳ | issue#6 |
| Claude-2 | RoadMap 📋 | issue#7 |
| Falcon | RoadMap 📋 | issue#8 |
| StableLM | RoadMap 📋 | issue#11 |
| ... | ... | ... |
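Supporting a chat model generally means flattening the role-tagged message list into that model's own prompt template before generation. The helper below is a hypothetical sketch (not llmlite's internals) of that step for Llama-2's single-turn chat format, which wraps the system prompt in `<<SYS>>` markers inside an `[INST] ... [/INST]` block.

```python
def to_llama2_prompt(messages):
    """Flatten system/user messages into Llama-2's single-turn chat template."""
    system = ""
    users = []
    for m in messages:
        if m["role"] == "system":
            system = m["content"]
        elif m["role"] == "user":
            users.append(m["content"])

    # System prompt goes inside <<SYS>> markers, before the first user turn.
    sys_block = f"<<SYS>>\n{system}\n<</SYS>>\n\n" if system else ""
    return f"<s>[INST] {sys_block}{users[0]} [/INST]"

prompt = to_llama2_prompt([
    {"role": "system", "content": "You're an honest assistant."},
    {"role": "user", "content": "There's a llama in my garden, what should I do?"},
])
print(prompt)
```

Each backend (ChatGLM2, ChatGPT, ...) needs its own such mapping, which is why the wrapper dispatches on `model_name_or_path`.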
Contributions
🚀 All kinds of contributions are welcome! Please follow Contributing.
Contributors
🎉 Thanks to all these contributors.