# OffGrid Python Client

Python client for OffGrid LLM - Run AI models completely offline.
## Installation

```bash
pip install offgrid
```
## Quick Start

```python
import offgrid

client = offgrid.Client()
response = client.chat("Hello!")
print(response)
```
## Usage

### Connect to Server

```python
import offgrid

# Default: localhost:11611
client = offgrid.Client()

# Custom server
client = offgrid.Client(host="http://192.168.1.100:11611")

# With timeout
client = offgrid.Client(host="http://192.168.1.100:11611", timeout=600)
```
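If a script may start before the server does, a small polling loop can wait until a health check such as `client.health()` succeeds. This is an illustrative sketch: `wait_for` and `fake_health` are hypothetical helpers, with a stub standing in for the real health check so the snippet runs without a server.

```python
import time

def wait_for(check, retries=5, delay=0.1):
    """Poll a health-check callable until it returns True or retries run out."""
    for _ in range(retries):
        if check():
            return True
        time.sleep(delay)
    return False

# Stub standing in for client.health(): succeeds on the third call.
calls = {"n": 0}
def fake_health():
    calls["n"] += 1
    return calls["n"] >= 3

print(wait_for(fake_health))  # True once the stub reports healthy
```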
### Chat

```python
# Basic
response = client.chat("What is Python?")

# With system prompt
response = client.chat(
    "Write a poem",
    system="You are a poet",
    temperature=0.9,
)

# Streaming
for chunk in client.chat("Tell me a story", stream=True):
    print(chunk, end="", flush=True)

# Full conversation
messages = [
    {"role": "system", "content": "You are helpful."},
    {"role": "user", "content": "Hello!"},
]
response = client.chat(messages=messages)
```
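The `messages` list format above can be wrapped in a small history helper for multi-turn chats. `Conversation` and `echo_chat` below are illustrative sketches, not part of the library; the stub stands in for `client.chat` so the example runs without a server.

```python
class Conversation:
    """Accumulate a multi-turn message list in the format client.chat(messages=...) expects."""
    def __init__(self, system=None):
        self.messages = []
        if system:
            self.messages.append({"role": "system", "content": system})

    def ask(self, chat_fn, text):
        self.messages.append({"role": "user", "content": text})
        reply = chat_fn(messages=self.messages)
        self.messages.append({"role": "assistant", "content": reply})
        return reply

# Stub standing in for client.chat so the sketch runs offline.
def echo_chat(messages):
    return f"echo: {messages[-1]['content']}"

conv = Conversation(system="You are helpful.")
print(conv.ask(echo_chat, "Hello!"))  # echo: Hello!
print(len(conv.messages))             # 3
```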
### Model Management

```python
# List installed models
for model in client.list_models():
    print(model["id"])

# Search HuggingFace
results = client.models.search("llama", ram=8)
for r in results:
    print(f"{r['id']} - {r['size_gb']}GB")

# Download
client.models.download(
    "bartowski/Llama-3.2-3B-Instruct-GGUF",
    "Llama-3.2-3B-Instruct-Q4_K_M.gguf",
)

# Delete
client.models.delete("old-model")

# USB import/export
client.models.import_usb("/media/usb")
client.models.export_usb("model-name", "/media/usb")
```
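The `size_gb` field shown in the search results can be used to pre-filter candidates against available memory before downloading. `fits_in_ram` is a hypothetical helper, and the sample results below are made up, not real search output:

```python
def fits_in_ram(results, ram_gb, headroom=0.8):
    """Keep models whose file size leaves some headroom within available RAM."""
    return [r for r in results if r["size_gb"] <= ram_gb * headroom]

# Made-up results mirroring the documented 'id' and 'size_gb' fields.
sample = [
    {"id": "tiny-1b", "size_gb": 0.8},
    {"id": "mid-7b", "size_gb": 4.4},
    {"id": "big-70b", "size_gb": 40.0},
]
print([r["id"] for r in fits_in_ram(sample, ram_gb=8)])  # ['tiny-1b', 'mid-7b']
```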
### Knowledge Base (RAG)

```python
# Add documents
client.kb.add("notes.txt")
client.kb.add("meeting", content="Meeting notes...")
client.kb.add_directory("./docs")

# List
for doc in client.kb.list():
    print(f"{doc['id']}: {doc['chunks']} chunks")

# Search
results = client.kb.search("deadline")

# Chat with context
response = client.chat("Summarize notes", use_kb=True)

# Remove
client.kb.remove("notes.txt")
client.kb.clear()
```
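One way to picture the `chunks` count reported by `client.kb.list()` is overlapping character windows over a document. `chunk_text` is a simplified sketch of that idea, not the client's actual chunking logic:

```python
def chunk_text(text, size=200, overlap=50):
    """Split text into overlapping character windows, as a KB might chunk documents."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

chunks = chunk_text("x" * 500)
print(len(chunks))  # 3 overlapping chunks for 500 characters
```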
### Embeddings

```python
# Single
embedding = client.embed("Hello world")

# Batch
embeddings = client.embed(["Hello", "World"])
```
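Vectors returned by `client.embed` can be compared with cosine similarity, e.g. to build a custom semantic search on top of the batch form. A minimal stdlib-only sketch, with toy vectors standing in for real embeddings:

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy vectors in place of real client.embed(...) output.
print(round(cosine([1.0, 0.0], [1.0, 0.0]), 3))  # 1.0
print(round(cosine([1.0, 0.0], [0.0, 1.0]), 3))  # 0.0
```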
### System Info

```python
# Health check
if client.health():
    print("Server running")

# Detailed info
info = client.info()
print(f"Uptime: {info['uptime']}")
```
### Error Handling

```python
from offgrid import Client, OffGridError

client = Client()

try:
    response = client.chat("Hello")
except OffGridError as e:
    print(f"Error: {e.message}")
```
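Transient failures (e.g. the server restarting) can be handled by wrapping any client call in a small retry helper. `with_retries` is an illustrative sketch, and the `flaky` stub stands in for a call that raises `OffGridError`:

```python
import time

def with_retries(fn, attempts=3, delay=0.05, exc=Exception):
    """Call fn, retrying with exponential backoff on the given exception type."""
    for i in range(attempts):
        try:
            return fn()
        except exc:
            if i == attempts - 1:
                raise
            time.sleep(delay * (2 ** i))

# Stub call that fails twice, then succeeds.
state = {"n": 0}
def flaky():
    state["n"] += 1
    if state["n"] < 3:
        raise RuntimeError("transient")
    return "ok"

print(with_retries(flaky))  # ok
```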
## Requirements

- Python 3.8+
- OffGrid server running (`offgrid serve`)
- No external dependencies
## License

MIT License