
Languru

A general-purpose LLM app stack for deploying AI services quickly and (stupidly) simply.

 _
| |    __ _ _ __   __ _ _   _ _ __ _   _
| |   / _` | '_ \ / _` | | | | '__| | | |
| |__| (_| | | | | (_| | |_| | |  | |_| |
|_____\__,_|_| |_|\__, |\__,_|_|   \__,_|
                  |___/


Documentation: GitHub Pages

Install Languru

pip install languru

# Install extras for LLM deployment (quoted so shells like zsh don't expand the brackets).
pip install "languru[all]"

# Install development dependencies.
poetry install -E <extras> --with dev

# Or just install all dependencies.
poetry install -E all --with dev --with docs

OpenAI Clients

Supported OpenAI clients (a direct-usage sketch follows the list):

  • openai.OpenAI
  • openai.AzureOpenAI
  • languru.openai_plugins.clients.anthropic.AnthropicOpenAI
  • languru.openai_plugins.clients.google.GoogleOpenAI
  • languru.openai_plugins.clients.groq.GroqOpenAI
  • languru.openai_plugins.clients.pplx.PerplexityOpenAI
  • languru.openai_plugins.clients.voyage.VoyageOpenAI
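
These plugin clients can also be used directly as drop-in replacements for openai.OpenAI. A minimal sketch, assuming they accept the usual constructor arguments and read their provider's key (e.g. ANTHROPIC_API_KEY) from the environment:

from languru.openai_plugins.clients.anthropic import AnthropicOpenAI

# Drop-in client for Anthropic models behind the OpenAI interface.
# The zero-argument constructor and the environment-variable lookup are assumptions.
client = AnthropicOpenAI()
res = client.chat.completions.create(
    model="claude-3-haiku-20240307",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(res.choices[0].message.content)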

OpenAI Server

export OPENAI_API_KEY="..."  # Remember to set the API keys for every OpenAI client you use.
languru server run
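
To confirm the server is up, you can list the models it exposes through the standard OpenAI /v1/models endpoint (a quick sanity check; that the server implements model listing is an assumption based on its OpenAI compatibility):

from openai import OpenAI

# List the models served by the local OpenAI-compatible server.
client = OpenAI(base_url="http://localhost:8682/v1")
for model in client.models.list().data:
    print(model.id)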

Query the LLM service, which is fully compatible with the OpenAI API:

from openai import OpenAI

client = OpenAI(base_url="http://localhost:8682/v1")
res = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)
for choice in res.choices:
    print(f"{choice.message.role}: {choice.message.content}")
# assistant: Hello! How can I assist you today?

Chat streaming:

from openai import OpenAI

client = OpenAI(base_url="http://localhost:8682/v1")
res = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
    stream=True,
)
for chunk in res:
    for choice in chunk.choices:
        if choice.delta.content:
            print(choice.delta.content, end="", flush=True)
# Hello! How can I assist you today?

OpenAI plugin clients (model names carry a provider prefix, e.g. google/...):

from openai import OpenAI

client = OpenAI(base_url="http://localhost:8682/v1")
res = client.chat.completions.create(
    model="google/gemini-1.5-flash",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
    stream=True,
)
# With stream=True the response is iterated chunk by chunk.
for chunk in res:
    for choice in chunk.choices:
        if choice.delta.content:
            print(choice.delta.content, end="", flush=True)
