LocalLab: Run language models locally or in Google Colab with a friendly API
LocalLab: Your Personal AI Lab
Run powerful AI language models on your own computer or Google Colab - no cloud services needed! Think of it as having ChatGPT-like capabilities right on your machine.
What is LocalLab?
LocalLab brings AI to your fingertips with two key components:
```mermaid
graph TD
    A[Your Code] -->|Uses| B[LocalLab Client]
    B -->|Talks to| C[LocalLab Server]
    C -->|Runs| D[AI Models]
    C -->|Manages| E[Memory & Resources]
    C -->|Optimizes| F[Performance]
```
Key Features
- Easy Setup
- Privacy First
- Free GPU Access
- Multiple Models
- Memory Efficient
- Auto-Optimization
- Local or Colab
- Fast Response
- Simple API
Two Ways to Run
1. On Your Computer (Local Mode)
   Your Computer → LocalLab Server → AI Model → Auto-optimization
2. On Google Colab (Free GPU Mode)
   Google Colab → Free GPU → LocalLab Server → AI Model → GPU Acceleration
Installation & Setup

1. Install Required Packages

```shell
# Install both server and client packages
pip install locallab locallab-client
```

2. Configure the Server (Recommended)

```shell
# Run interactive configuration
locallab config

# This will help you set up:
# - Model selection
# - Memory optimizations
# - GPU settings
# - System resources
```

3. Start the Server

```shell
# Start with saved configuration
locallab start

# Or start with specific options
locallab start --model microsoft/phi-2 --quantize --quantize-type int8
```
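Before pointing a client at the server, it can be handy to confirm that something is actually listening. The sketch below is a plain TCP probe using only the Python standard library; the default port 8000 is an assumption taken from the client examples in this README, not something the probe can discover on its own:

```python
import socket

def server_reachable(host: str = "localhost", port: int = 8000,
                     timeout: float = 2.0) -> bool:
    """Return True if a TCP listener answers at host:port."""
    try:
        # create_connection completes the TCP handshake, then we close it
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(server_reachable())  # True once `locallab start` is serving on port 8000
```

This only checks that the port is open; it does not verify that the model has finished loading.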
Basic Usage

Synchronous Usage (Easier for Beginners)

```python
from locallab_client import SyncLocalLabClient

# Connect to server
client = SyncLocalLabClient("http://localhost:8000")

try:
    # Generate text
    response = client.generate("Write a story")
    print(response)

    # Chat with AI
    response = client.chat([
        {"role": "system", "content": "You are helpful."},
        {"role": "user", "content": "Hello!"}
    ])
    print(response.choices[0]["message"]["content"])
finally:
    # Always close the client
    client.close()
```
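The try/finally pattern above can be written more compactly with the standard library's `contextlib.closing`, which calls `.close()` on exit for any object that provides it. The sketch below uses a stub class in place of `SyncLocalLabClient` so it runs without a server; the stub is purely illustrative and not part of the locallab_client API:

```python
from contextlib import closing

class StubClient:
    """Stand-in for SyncLocalLabClient, for illustration only."""
    def __init__(self):
        self.closed = False

    def generate(self, prompt):
        return f"echo: {prompt}"

    def close(self):
        self.closed = True

# closing() guarantees close() runs on exit, even if generate() raises,
# replacing the explicit try/finally shown above.
with closing(StubClient()) as client:
    print(client.generate("Write a story"))  # echo: Write a story
```

`closing()` works with any client object exposing `close()`, so the same pattern should carry over unchanged to the real synchronous client.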
Asynchronous Usage (For Advanced Users)

```python
import asyncio
from locallab_client import LocalLabClient

async def main():
    # Connect to server
    client = LocalLabClient("http://localhost:8000")
    try:
        # Generate text
        response = await client.generate("Write a story")
        print(response)

        # Stream responses
        async for token in client.stream_generate("Tell me a story"):
            print(token, end="", flush=True)

        # Chat with AI
        response = await client.chat([
            {"role": "system", "content": "You are helpful."},
            {"role": "user", "content": "Hello!"}
        ])
        print(response.choices[0]["message"]["content"])
    finally:
        # Always close the client
        await client.close()

# Run the async function
asyncio.run(main())
```
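One payoff of the async client is that several prompts can be in flight at once via `asyncio.gather`. The sketch below substitutes a stub coroutine for `await client.generate(prompt)` so it runs without a server; the stub and its canned responses are illustrative assumptions, not LocalLab behavior:

```python
import asyncio

async def generate(prompt: str) -> str:
    # Stand-in for `await client.generate(prompt)`; sleeps to mimic I/O
    await asyncio.sleep(0.01)
    return f"response to: {prompt}"

async def main() -> list:
    prompts = ["Write a story", "Explain recursion", "Summarize this text"]
    # gather() runs all three coroutines concurrently and
    # returns their results in the same order as the prompts
    return await asyncio.gather(*(generate(p) for p in prompts))

results = asyncio.run(main())
print(results)
```

With the real client, the same `gather` call would overlap the network round-trips, though the server itself may still process requests one at a time.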
Google Colab Usage

Run LocalLab on Google's free GPUs:

```python
# 1. Install packages
!pip install locallab locallab-client

# 2. Configure with CLI (note the ! prefix for shell commands)
!locallab config

# 3. Start server with CLI (ngrok exposes a public URL)
!locallab start --use-ngrok

# 4. Connect the client (from another cell, or from your own machine)
from locallab_client import LocalLabClient

client = LocalLabClient("https://your-server-ngrok-url.app")
response = await client.generate("Hello!")  # top-level await works in Colab
```
Requirements
Local Computer
- Python 3.8+
- 4GB RAM minimum (8GB+ recommended)
- GPU optional but recommended
- Internet connection for downloading models
Google Colab
- Just a Google account!
- Free tier works fine
Features
- Easy Setup: Just pip install and run
- Multiple Models: Use any Hugging Face model
- Resource Efficient: Automatic optimization
- Privacy First: All local, no data sent to cloud
- Free GPU: Google Colab integration
- Flexible Client API: Both async and sync clients available
- Automatic Resource Management: Sessions close automatically
Documentation
- Getting Started
- Advanced Topics
- Deployment
Need Help?
- Check FAQ
- Visit Troubleshooting
- Ask in Discussions
Additional Resources

Star Us!
If you find LocalLab helpful, please star our repository! It helps others discover the project.
Made with ❤️ by Utkarsh Tiwari. GitHub • Twitter • LinkedIn
File details

Details for the file locallab-0.5.0.tar.gz.

File metadata
- Download URL: locallab-0.5.0.tar.gz
- Upload date:
- Size: 56.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.9.22

File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | ab68644bbf2d3376ba2f97aad7cfd9baab1f90d34ba86bb62a182769a0b91946 |
| MD5 | 2706d5ec02bcbafd78f34387c0541ca0 |
| BLAKE2b-256 | aa373ec674c5fee7cca1f00f6a7e344e3bab79a3c9ba90084d6c18ab13374417 |
File details

Details for the file locallab-0.5.0-py3-none-any.whl.

File metadata
- Download URL: locallab-0.5.0-py3-none-any.whl
- Upload date:
- Size: 60.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.9.22

File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | e994d0486fd516ce855cae725e402b3bfab4b6be7b3b66354fa1e6d14c637cbf |
| MD5 | 3e41e502b4be0af0188018ecbc26d3e4 |
| BLAKE2b-256 | 4635e1aa07b9d9c0713118058ca2d2df2366df03e26ef6d47721f4fb73d75636 |