
Project description

Languru

General-purpose LLM app stacks for deploying AI services quickly and (stupidly) simply.

 _
| |    __ _ _ __   __ _ _   _ _ __ _   _
| |   / _` | '_ \ / _` | | | | '__| | | |
| |__| (_| | | | | (_| | |_| | |  | |_| |
|_____\__,_|_| |_|\__, |\__,_|_|   \__,_|
                  |___/


Documentation: GitHub Pages

Install Languru

pip install languru

# Install all extras for LLM deployment.
pip install "languru[all]"

# Install development dependencies.
poetry install -E <extras> --with dev

# Or just install all dependencies.
poetry install -E all --with dev --with docs
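After installing, a quick sanity check with the standard library confirms the package is importable and which version is present (a generic sketch, not a Languru command):

```python
from importlib.metadata import PackageNotFoundError, version


def installed_version(package):
    """Return the installed version string for a package, or None if absent."""
    try:
        return version(package)
    except PackageNotFoundError:
        return None


print(installed_version("languru"))  # e.g. "0.22.0", or None if not installed
```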

OpenAI Clients

Supported OpenAI clients:

  • openai.OpenAI
  • openai.AzureOpenAI
  • languru.openai_plugins.clients.anthropic.AnthropicOpenAI
  • languru.openai_plugins.clients.google.GoogleOpenAI
  • languru.openai_plugins.clients.groq.GroqOpenAI
  • languru.openai_plugins.clients.pplx.PerplexityOpenAI
  • languru.openai_plugins.clients.voyage.VoyageOpenAI
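The dotted paths above can be resolved lazily, so only the provider you actually use needs its SDK installed. The registry pattern below is an assumption about how one might organize these clients, not Languru's own API; the module paths are taken verbatim from the list above:

```python
from importlib import import_module

# Provider name -> dotted path of its OpenAI-compatible client class.
PLUGIN_CLIENTS = {
    "anthropic": "languru.openai_plugins.clients.anthropic.AnthropicOpenAI",
    "google": "languru.openai_plugins.clients.google.GoogleOpenAI",
    "groq": "languru.openai_plugins.clients.groq.GroqOpenAI",
    "pplx": "languru.openai_plugins.clients.pplx.PerplexityOpenAI",
    "voyage": "languru.openai_plugins.clients.voyage.VoyageOpenAI",
}


def load_client_class(provider):
    """Import and return the client class for a provider at call time."""
    module_path, _, class_name = PLUGIN_CLIENTS[provider].rpartition(".")
    return getattr(import_module(module_path), class_name)
```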

OpenAI Server

languru server run  # Remember to set the required API keys for the OpenAI clients.

Query the LLM service, which is fully compatible with the OpenAI API:

from openai import OpenAI

client = OpenAI(base_url="http://localhost:8682/v1")
res = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)
for choice in res.choices:
    print(f"{choice.message.role}: {choice.message.content}")
# assistant: Hello! How can I assist you today?

Chat streaming:

client = OpenAI(base_url="http://localhost:8682/v1")
res = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
    stream=True,
)
for chunk in res:
    for choice in chunk.choices:
        if choice.delta.content:
            print(choice.delta.content, end="", flush=True)
            # Hello! How can I assist you today?
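
When streaming, the full reply has to be stitched back together from the per-chunk deltas (each `choice.delta.content` may be `None`). A minimal accumulator in plain Python, independent of the client:

```python
def accumulate_deltas(deltas):
    """Join streamed content deltas (which may include None) into the full reply."""
    return "".join(d for d in deltas if d)


# e.g. deltas collected from chunk.choices[0].delta.content
assert accumulate_deltas(["Hel", None, "lo", "!"]) == "Hello!"
```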

OpenAI plugins clients:

client = OpenAI(base_url="http://localhost:8682/v1")
res = client.chat.completions.create(
    model="google/gemini-1.5-flash",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)
for choice in res.choices:
    print(f"{choice.message.role}: {choice.message.content}")
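Plugin models are addressed with a provider prefix, as in `google/gemini-1.5-flash` above. The routing convention can be sketched as follows; this is an assumption about the naming scheme, not Languru's actual dispatch code:

```python
def split_model(model):
    """Split a provider-prefixed model name; unprefixed names default to "openai"."""
    provider, sep, name = model.partition("/")
    return (provider, name) if sep else ("openai", provider)


assert split_model("google/gemini-1.5-flash") == ("google", "gemini-1.5-flash")
assert split_model("gpt-4o-mini") == ("openai", "gpt-4o-mini")
```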

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

languru-0.22.0.tar.gz (119.6 kB)

Uploaded Source

Built Distribution

languru-0.22.0-py3-none-any.whl (171.0 kB)

Uploaded Python 3

File details

Details for the file languru-0.22.0.tar.gz.

File metadata

  • Download URL: languru-0.22.0.tar.gz
  • Upload date:
  • Size: 119.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.4 CPython/3.10.13 Darwin/24.0.0

File hashes

Hashes for languru-0.22.0.tar.gz:

  • SHA256: b103223e8daa42344044d85b6ffce703cdaad17af50d3ae84f3304242c1fd583
  • MD5: bebbce155f3c2cc01e37dd0d7102ff99
  • BLAKE2b-256: ae02f500a36e7c476a1a8afeca4800a5f0e89bf6fe6ff24c6106450c9f16e46c

See more details on using hashes here.
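
Verifying a downloaded file against a published digest needs only the standard library. A sketch using the sdist's SHA256 from the table above:

```python
import hashlib


def sha256_of(path, chunk_size=8192):
    """Compute the SHA256 hex digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


EXPECTED = "b103223e8daa42344044d85b6ffce703cdaad17af50d3ae84f3304242c1fd583"
# After downloading: sha256_of("languru-0.22.0.tar.gz") should equal EXPECTED.
```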

File details

Details for the file languru-0.22.0-py3-none-any.whl.

File metadata

  • Download URL: languru-0.22.0-py3-none-any.whl
  • Upload date:
  • Size: 171.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.4 CPython/3.10.13 Darwin/24.0.0

File hashes

Hashes for languru-0.22.0-py3-none-any.whl:

  • SHA256: 00d7a9c3df4c6f3bcdfaa53a6c275015d1f71af086e5610192acfb7f61992396
  • MD5: 37c69b4ff41358e6d0e18bccbb7195dc
  • BLAKE2b-256: 0939720a8c3880209f1320a612aeb2bd426780c4988b91e41e57deda897354ff

See more details on using hashes here.
