Languru
The general-purpose LLM app stack: deploy AI services quickly and (stupidly) simply.
 _
| |    __ _ _ __   __ _ _   _ _ __ _   _
| |   / _` | '_ \ / _` | | | | '__| | | |
| |__| (_| | | | | (_| | |_| | |  | |_| |
|_____\__,_|_| |_|\__, |\__,_|_|   \__,_|
                  |___/
Documentation: GitHub Pages
Install Languru
pip install languru
# Install all extras for LLM deployment.
pip install languru[all]
# Install development dependencies.
poetry install -E <extras> --with dev
# Or just install all dependencies.
poetry install -E all --with dev --with docs
OpenAI Clients
Supported OpenAI clients:
openai.OpenAI
openai.AzureOpenAI
languru.openai_plugins.clients.anthropic.AnthropicOpenAI
languru.openai_plugins.clients.google.GoogleOpenAI
languru.openai_plugins.clients.groq.GroqOpenAI
languru.openai_plugins.clients.pplx.PerplexityOpenAI
languru.openai_plugins.clients.voyage.VoyageOpenAI
OpenAI Server
languru server run  # Remember to set the required API keys for the OpenAI clients.
Query the LLM service, which is fully compatible with the OpenAI API:
from openai import OpenAI
client = OpenAI(base_url="http://localhost:8682/v1")
res = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)
for choice in res.choices:
    print(f"{choice.message.role}: {choice.message.content}")
# assistant: Hello! How can I assist you today?
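Under the hood, the client above issues an OpenAI-style POST to `/v1/chat/completions`. As a rough sketch (field names follow the standard OpenAI chat completions schema; no Languru-specific fields are assumed), the request body looks like:

```python
import json

# Sketch of the JSON body sent to POST /v1/chat/completions.
# Field names follow the standard OpenAI chat completions schema.
payload = {
    "model": "gpt-3.5-turbo",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
}
body = json.dumps(payload)
```

Because the server speaks this schema, any OpenAI-compatible client or plain HTTP tooling can talk to it.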
Chat streaming:
client = OpenAI(base_url="http://localhost:8682/v1")
res = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
    stream=True,
)
for chunk in res:
    for choice in chunk.choices:
        if choice.delta.content:
            print(choice.delta.content, end="", flush=True)
# Hello! How can I assist you today?
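Each streamed chunk carries a small text delta, and the loop above prints the deltas as they arrive. A minimal sketch with simulated deltas (no server required) shows how the pieces concatenate into the full reply; the trailing `None` mimics the empty delta of a stream's final chunk:

```python
# Simulated stream deltas; the trailing None stands in for the
# empty delta carried by the final chunk of a real stream.
deltas = ["Hello", "!", " How", " can", " I", " assist", " you", " today", "?", None]

# Concatenate the non-empty deltas, as the streaming loop above does.
reply = "".join(d for d in deltas if d)
print(reply)  # Hello! How can I assist you today?
```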
OpenAI plugin clients:
client = OpenAI(base_url="http://localhost:8682/v1")
res = client.chat.completions.create(
    model="google/gemini-1.5-flash",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)
for choice in res.choices:
    print(f"{choice.message.role}: {choice.message.content}")
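The `google/gemini-1.5-flash` model name suggests the server routes requests to a plugin client by provider prefix. A hypothetical sketch of such prefix routing (`split_model` and the `openai` fallback are illustrative assumptions, not Languru's actual routing code):

```python
def split_model(name: str) -> tuple[str, str]:
    """Split a 'provider/model' name into its provider and model parts.

    Names without a provider prefix fall back to 'openai' here; this
    fallback is an assumption for illustration.
    """
    provider, sep, model = name.partition("/")
    return (provider, model) if sep else ("openai", name)

print(split_model("google/gemini-1.5-flash"))  # ('google', 'gemini-1.5-flash')
print(split_model("gpt-3.5-turbo"))            # ('openai', 'gpt-3.5-turbo')
```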