# OffGrid Python Client

Python client library for OffGrid LLM - Run AI models completely offline.
## Installation

```bash
pip install offgrid
```
## Quick Start

```python
import offgrid

# Connect to server
client = offgrid.Client()  # localhost:11611

# Chat
response = client.chat("What is Python?")
print(response)

# List available models
models = client.list_models()
for m in models:
    print(f"- {m['id']}")
```
## Full Usage

### Chat

```python
from offgrid import Client

client = Client()

# Basic chat (uses first available model)
response = client.chat("Explain quantum computing")
print(response)

# Specify model
response = client.chat("Hello!", model="Llama-3.2-3B-Instruct")

# With system prompt
response = client.chat(
    "Write a poem about AI",
    model="Llama-3.2-3B-Instruct",
    system="You are a creative poet.",
    temperature=0.9
)

# Streaming
for chunk in client.chat("Tell me a long story", stream=True):
    print(chunk, end="", flush=True)

# Full conversation
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
    {"role": "assistant", "content": "Hi there! How can I help?"},
    {"role": "user", "content": "What's the weather like?"}
]
response = client.chat(messages=messages)
```
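The messages-list pattern above can be wrapped in a small helper that keeps history across turns. This is a sketch, not part of the library: `ChatSession` and its `send` parameter are illustrative names, and the sender is injected as a callable so the class works with `client.chat` or any stand-in.

```python
class ChatSession:
    """Minimal multi-turn wrapper around the messages-list pattern.

    `send` is any callable accepting messages=[...] and returning a reply
    string (e.g. client.chat). History is appended after every exchange.
    """

    def __init__(self, send, system=None):
        self._send = send
        self.messages = []
        if system:
            self.messages.append({"role": "system", "content": system})

    def ask(self, text):
        self.messages.append({"role": "user", "content": text})
        reply = self._send(messages=self.messages)
        self.messages.append({"role": "assistant", "content": reply})
        return reply
```

Usage would look like `session = ChatSession(client.chat, system="You are helpful.")` followed by repeated `session.ask(...)` calls.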
### Model Management

```python
# List installed models
for model in client.list_models():
    print(model['id'])

# Search for models
results = client.models.search("llama", ram=8)
for model in results:
    print(f"{model['id']} - {model['size_gb']}GB")

# Download a model
client.models.download(
    "bartowski/Llama-3.2-3B-Instruct-GGUF",
    "Llama-3.2-3B-Instruct-Q4_K_M.gguf",
    progress_callback=lambda pct, done, total: print(f"\r{pct:.1f}%", end="")
)

# Delete a model
client.models.delete("old-model")

# Import from USB
imported = client.models.import_usb("/media/usb")

# Export to USB
client.models.export_usb("Llama-3.2-3B-Instruct-Q4_K_M", "/media/usb")
```
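The `progress_callback` above receives a percentage plus bytes done/total. A sketch of a richer callback that renders a text progress bar; the formatting helper is our own, not something the library provides:

```python
def format_progress(pct, done, total, width=30):
    """Render a one-line progress bar like '[=====     ] 50.0% (3.0/6.0 MB)'."""
    filled = int(width * pct / 100)
    bar = "=" * filled + " " * (width - filled)
    mb = 1024 * 1024
    return f"[{bar}] {pct:.1f}% ({done / mb:.1f}/{total / mb:.1f} MB)"

# Pass it as the callback, overwriting the same terminal line on each call:
# client.models.download(repo, filename,
#     progress_callback=lambda p, d, t: print("\r" + format_progress(p, d, t), end=""))
```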
### Knowledge Base (RAG)

```python
# Add documents
client.kb.add("notes.md")
client.kb.add("meeting", content="Meeting notes from today...")
client.kb.add_directory("./docs", extensions=[".md", ".txt"])

# List documents
for doc in client.kb.list():
    print(f"{doc['id']}: {doc['chunks']} chunks")

# Search
results = client.kb.search("project deadline")
for r in results:
    print(f"[{r['score']:.2f}] {r['content'][:100]}...")

# Chat with Knowledge Base context
response = client.chat(
    "What are the main action items from the meeting?",
    use_kb=True
)

# Remove documents
client.kb.remove("notes.md")
client.kb.clear()  # Remove all
```
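If you prefer to assemble retrieval context yourself instead of relying on `use_kb=True`, the `score`/`content` fields returned by `kb.search()` are enough. A sketch; the score threshold, character budget, and prompt layout are arbitrary choices of this example, not library behavior:

```python
def build_context(results, min_score=0.5, max_chars=2000):
    """Join kb.search() hits above a score threshold into one context string,
    best matches first, stopping once the character budget is spent."""
    parts = []
    used = 0
    for r in sorted(results, key=lambda r: r["score"], reverse=True):
        if r["score"] < min_score:
            continue
        chunk = r["content"].strip()
        if used + len(chunk) > max_chars:
            break
        parts.append(chunk)
        used += len(chunk)
    return "\n\n".join(parts)

# context = build_context(client.kb.search("project deadline"))
# response = client.chat(f"Context:\n{context}\n\nQuestion: What is the deadline?")
```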
### Embeddings

```python
# Single text
embedding = client.embed("Hello world")
print(f"Dimensions: {len(embedding)}")

# Multiple texts
embeddings = client.embed(["Hello", "World", "AI"])
```
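Since embeddings come back as plain lists of floats, similarity between two texts can be computed with the stdlib alone. A sketch; `cosine_similarity` is our helper, not part of the client:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors (lists of floats)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# vecs = client.embed(["Hello", "World"])
# print(cosine_similarity(vecs[0], vecs[1]))
```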
### System Info

```python
# Check server health
if client.health():
    print("Server is running")

# Get detailed info
info = client.info()
print(f"Uptime: {info['uptime']}")
print(f"CPU: {info['system']['cpu_percent']}%")
print(f"Memory: {info['system']['memory_percent']}%")
```
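In scripts that start the server themselves, it is handy to block until `health()` succeeds. A polling sketch; `wait_until_healthy` is our helper, and the health check is injected as a callable so it works with `client.health` or any stand-in:

```python
import time

def wait_until_healthy(health, timeout=30.0, interval=0.5):
    """Poll a zero-argument health() callable until it returns True.

    Returns True once healthy, False if `timeout` seconds pass first.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if health():
            return True
        time.sleep(interval)
    return False

# if not wait_until_healthy(client.health):
#     raise RuntimeError("OffGrid server did not come up")
```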
## Configuration

```python
from offgrid import Client

# Default: localhost:11611
client = Client()

# Custom server URL
client = Client(host="http://192.168.1.100:11611")

# Just hostname (auto-adds http://)
client = Client(host="192.168.1.100:11611")

# Custom timeout (for slow models)
client = Client(timeout=600)  # 10 minutes
```
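A common deployment pattern is to take these settings from environment variables. A sketch of such a shim; note that the `OFFGRID_HOST`/`OFFGRID_TIMEOUT` names are this example's convention, and the client does not read them itself:

```python
import os

def client_kwargs_from_env(environ=None):
    """Build Client(...) keyword arguments from environment variables.

    Reads OFFGRID_HOST and OFFGRID_TIMEOUT (both optional). The variable
    names are this sketch's convention, not something the library reads.
    """
    environ = os.environ if environ is None else environ
    kwargs = {}
    if "OFFGRID_HOST" in environ:
        kwargs["host"] = environ["OFFGRID_HOST"]
    if "OFFGRID_TIMEOUT" in environ:
        kwargs["timeout"] = float(environ["OFFGRID_TIMEOUT"])
    return kwargs

# client = Client(**client_kwargs_from_env())
```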
## Error Handling

```python
from offgrid import Client, OffGridError

client = Client()

try:
    response = client.chat("Hello")
except OffGridError as e:
    print(f"Error: {e.message}")
    if e.code:
        print(f"Code: {e.code}")
```
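Transient failures (for example, a model that is still loading) can be retried with exponential backoff. A generic sketch; `retry` is our helper, not part of the library:

```python
import time

def retry(fn, attempts=3, base_delay=1.0, exceptions=(Exception,)):
    """Call fn(); on a listed exception, back off exponentially and retry.

    Re-raises the last exception once all attempts are exhausted.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except exceptions:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# response = retry(lambda: client.chat("Hello"), exceptions=(OffGridError,))
```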
## Requirements

- Python 3.8+
- OffGrid LLM server running (`offgrid serve`)
- No external dependencies (uses only stdlib)
## Links

- OffGrid LLM - Main project
- API Reference
- Issue Tracker
## License

MIT License