Litegen
Litegen is a lightweight Python wrapper for managing LLM interactions, supporting both local Ollama models and the OpenAI API. It provides a simple, unified interface for chat completions with built-in streaming.
Installation
```bash
pip install litegen
```
Features
- 🚀 Simple unified interface for LLM interactions
- 🤖 Support for both local Ollama models and OpenAI
- 📡 Built-in streaming capabilities
- 🛠 Function calling support
- 🔄 Context management for conversations
- 🎯 GPU support for enhanced performance
Quick Start
```python
from litegen import completion, pp_completion

# Simple completion
response = completion(
    model="mistral",  # or any Ollama/OpenAI model
    prompt="What is the capital of France?"
)
print(response.choices[0].message.content)

# Streaming completion with pretty print
pp_completion(
    model="llama2",
    prompt="Write a short story about a robot",
    temperature=0.7
)
```
Advanced Usage
System Prompts and Context
```python
response = completion(
    model="mistral",
    system_prompt="You are a helpful math tutor",
    prompt="Explain the Pythagorean theorem",
    context=[
        {"role": "user", "content": "Can you help me with math?"},
        {"role": "assistant", "content": "Of course! What would you like to know?"}
    ]
)
```
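Under the hood, a wrapper like this has to merge the system prompt, the prior conversation, and the new user prompt into a single messages list. The sketch below shows one plausible way to do that; `build_messages` is a hypothetical helper for illustration, not part of litegen's public API.

```python
# Hypothetical sketch: merge system_prompt, context, and prompt
# into a single OpenAI-style messages list.
def build_messages(system_prompt, prompt, context=None):
    messages = [{"role": "system", "content": system_prompt}]
    if context:
        messages.extend(context)  # prior user/assistant turns, in order
    messages.append({"role": "user", "content": prompt})
    return messages

msgs = build_messages(
    "You are a helpful math tutor",
    "Explain the Pythagorean theorem",
    context=[
        {"role": "user", "content": "Can you help me with math?"},
        {"role": "assistant", "content": "Of course! What would you like to know?"},
    ],
)
# msgs holds four entries: the system prompt, the two prior turns,
# and the new user prompt
```

To continue a conversation, append the model's reply and your next prompt to `context` and call `completion` again.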
Function Calling
```python
def get_weather(location: str, unit: str = "celsius"):
    """Get weather for a location"""
    pass

response = completion(
    model="gpt-3.5-turbo",
    prompt="What's the weather in Paris?",
    tools=[get_weather]
)
```
GPU Support
```python
response = completion(
    model="mistral",
    prompt="Complex calculation task",
    gpu=True  # Enable GPU acceleration
)
```
Configuration
The client can be configured with custom settings:
```python
from litegen import get_client

client = get_client(gpu=True)  # Enable GPU support
# Use the client directly for more control
```
Environment Variables
- `OPENAI_API_KEY`: Your OpenAI API key (optional; required only when using OpenAI models)
API Reference
completion(...)
Main function for chat completions.
```python
completion(
    model: str,                                             # Model name
    messages: Optional[List[Dict[str, str]]] | str = None,  # Raw messages or prompt string
    system_prompt: str = "You are helpful Assistant",       # System prompt
    prompt: str = "",                                       # User prompt
    context: Optional[List[Dict[str, str]]] = None,         # Conversation history
    temperature: Optional[float] = None,                    # Temperature for response randomness
    max_tokens: Optional[int] = None,                       # Max tokens in response
    stream: bool = False,                                   # Enable streaming
    stop: Optional[List[str]] = None,                       # Stop sequences
    tools: Optional[List] = None,                           # Function calling tools
    gpu: bool = False,                                      # Enable GPU
    **kwargs                                                # Additional parameters
)
```
pp_completion(...)
Streaming-enabled completion with pretty printing. Takes the same parameters as `completion()`.
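A streaming pretty-printer like this typically iterates over the response chunks, prints each text delta as it arrives, and accumulates the full reply. The sketch below illustrates that loop with a plain list standing in for the chunk stream; `consume_stream` is a hypothetical helper, not litegen's internal function.

```python
# Hypothetical sketch of a streaming print loop: emit each text
# delta as it arrives and accumulate the complete response.
def consume_stream(chunks):
    pieces = []
    for delta in chunks:  # each delta is a partial text fragment
        print(delta, end="", flush=True)
        pieces.append(delta)
    print()
    return "".join(pieces)

# Stand-in for streamed deltas from the model
full = consume_stream(["Once ", "upon ", "a ", "time."])
```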
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
MIT License
File details
Details for the file litegen-0.0.60.tar.gz.
File metadata
- Download URL: litegen-0.0.60.tar.gz
- Upload date:
- Size: 16.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.4 CPython/3.11.0rc1 Linux/6.8.0-52-generic
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 5410691a5ef3c08a14ec5092520370acd5f68ef4072e3e71bfb08ae5d4fdcd41 |
| MD5 | f3c67895d25dd2bafe55e6cca61ba280 |
| BLAKE2b-256 | ed3418de48591c28249c30c6bac528d8a1f472b7d2dd50894cd41a563adce366 |
File details
Details for the file litegen-0.0.60-py3-none-any.whl.
File metadata
- Download URL: litegen-0.0.60-py3-none-any.whl
- Upload date:
- Size: 20.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.4 CPython/3.11.0rc1 Linux/6.8.0-52-generic
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 354e6be043c036cf7a411d99cb9ce57fce981251240308da7724942f6d6c3c10 |
| MD5 | 87c40ab430959aa3ad960fac333f535c |
| BLAKE2b-256 | 1260084cd9366b0de7f883105eb5c19920e09863264a5e94acc962e834e66e2a |