Languru
The general-purpose LLM app stack: deploy AI services quickly and (stupidly) simply.
Getting Started
Install Languru:
pip install languru
Run the agent server:

languru server run

Run the LLM action server:

languru llm run
Usage
from openai import OpenAI

# Point the OpenAI SDK at the local Languru server. The SDK still expects
# an API key (set the OPENAI_API_KEY environment variable or pass api_key=...).
client = OpenAI(base_url="http://localhost:8680/v1")

res = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)
print(res)
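Since Languru exposes an OpenAI-compatible endpoint, responses follow the standard chat-completions shape, and the assistant's reply lives at choices[0].message.content. A minimal sketch of pulling the reply out of a raw response body (the JSON below is an illustrative sample, not real server output):

```python
import json

# Illustrative response body in the OpenAI chat-completions format.
sample = """{
  "id": "chatcmpl-123",
  "object": "chat.completion",
  "model": "gpt-3.5-turbo",
  "choices": [
    {
      "index": 0,
      "message": {"role": "assistant", "content": "Hello! How can I help you today?"},
      "finish_reason": "stop"
    }
  ],
  "usage": {"prompt_tokens": 19, "completion_tokens": 9, "total_tokens": 28}
}"""

data = json.loads(sample)
# The assistant's text is the content of the first choice's message.
reply = data["choices"][0]["message"]["content"]
print(reply)
```

With the SDK's typed response object, the equivalent access is res.choices[0].message.content.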