
LocalLab: Run language models locally or in Google Colab with a friendly API


🚀 LocalLab: Your Personal AI Lab

Run powerful AI language models on your own computer or in Google Colab, with no paid cloud AI services required. Think of it as having ChatGPT-like capabilities right on your machine.

🤔 What is LocalLab?

LocalLab brings AI to your fingertips with two key components:

graph TD
    A[Your Code] -->|Uses| B[LocalLab Client]
    B -->|Talks to| C[LocalLab Server]
    C -->|Runs| D[AI Models]
    C -->|Manages| E[Memory & Resources]
    C -->|Optimizes| F[Performance]

🎯 Key Features

📦 Easy Setup         🔒 Privacy First       🎮 Free GPU Access
🤖 Multiple Models    💾 Memory Efficient    🔄 Auto-Optimization
🌐 Local or Colab     ⚡ Fast Response       🔧 Simple API

🌟 Two Ways to Run

  1. On Your Computer (Local Mode)

    💻 Your Computer
    └── 🚀 LocalLab Server
        └── 🤖 AI Model
            └── 🔧 Auto-optimization

  2. On Google Colab (Free GPU Mode)

    ☁️ Google Colab
    └── 🎮 Free GPU
        └── 🚀 LocalLab Server
            └── 🤖 AI Model
                └── ⚡ GPU Acceleration
    

📦 Installation & Setup

1. Install Required Packages

# Install both server and client packages
pip install locallab locallab-client

2. Configure the Server (Recommended)

# Run interactive configuration
locallab config

# This will help you set up:
# - Model selection
# - Memory optimizations
# - GPU settings
# - System resources

3. Start the Server

# Start with saved configuration
locallab start

# Or start with specific options
locallab start --model microsoft/phi-2 --quantize --quantize-type int8
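The `--quantize-type int8` flag trades a little numerical precision for memory. A rough, back-of-the-envelope sketch of why it helps, assuming microsoft/phi-2's roughly 2.7B parameters (weights only; activations and KV-cache need additional memory):

```python
PHI2_PARAMS = 2.7e9  # microsoft/phi-2 has ~2.7 billion parameters

def weight_memory_gb(num_params: float, bytes_per_param: float) -> float:
    """Approximate memory needed for model weights, in GiB."""
    return num_params * bytes_per_param / (1024 ** 3)

fp16_gb = weight_memory_gb(PHI2_PARAMS, 2)  # 16-bit floats: 2 bytes per weight
int8_gb = weight_memory_gb(PHI2_PARAMS, 1)  # 8-bit ints: 1 byte per weight

print(f"fp16: {fp16_gb:.1f} GiB, int8: {int8_gb:.1f} GiB")
```

Halving bytes per weight halves weight memory, which is often the difference between a model fitting in 4GB of RAM or not.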

💡 Basic Usage

Synchronous Usage (Easier for Beginners)

from locallab_client import SyncLocalLabClient

# Connect to server
client = SyncLocalLabClient("http://localhost:8000")

try:
    # Generate text
    response = client.generate("Write a story")
    print(response)

    # Chat with AI
    response = client.chat([
        {"role": "system", "content": "You are helpful."},
        {"role": "user", "content": "Hello!"}
    ])
    print(response.choices[0]["message"]["content"])

finally:
    # Always close the client
    client.close()
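Right after startup the server may still be loading a model, so the first request can fail. A small generic retry helper you could wrap around calls like `client.generate()` (this is an illustrative sketch, not part of the locallab-client API):

```python
import time

def call_with_retry(fn, retries=3, delay=1.0):
    """Call fn(); on exception, wait `delay` seconds and retry up to `retries` times."""
    for attempt in range(retries):
        try:
            return fn()
        except Exception:
            if attempt == retries - 1:
                raise  # out of attempts: surface the last error
            time.sleep(delay)

# Usage with the sync client (requires a running server):
# response = call_with_retry(lambda: client.generate("Write a story"))
```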

Asynchronous Usage (For Advanced Users)

import asyncio
from locallab_client import LocalLabClient

async def main():
    # Connect to server
    client = LocalLabClient("http://localhost:8000")
    
    try:
        # Generate text
        response = await client.generate("Write a story")
        print(response)

        # Stream responses
        async for token in client.stream_generate("Tell me a story"):
            print(token, end="", flush=True)

        # Chat with AI
        response = await client.chat([
            {"role": "system", "content": "You are helpful."},
            {"role": "user", "content": "Hello!"}
        ])
        print(response.choices[0]["message"]["content"])

    finally:
        # Always close the client
        await client.close()

# Run the async function
asyncio.run(main())
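The async client pays off when you issue several prompts at once. A sketch using asyncio.gather, with a stub coroutine standing in for `client.generate` so it runs without a server (swap in the real client call in practice):

```python
import asyncio

async def fake_generate(prompt: str) -> str:
    # Stand-in for `await client.generate(prompt)`; replace with the
    # real LocalLabClient call when a server is running.
    await asyncio.sleep(0.01)
    return f"response to: {prompt}"

async def generate_many(prompts):
    # Fire all requests concurrently; results come back in input order.
    return await asyncio.gather(*(fake_generate(p) for p in prompts))

results = asyncio.run(generate_many(["Write a haiku", "Explain DNS"]))
print(results)
```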

๐ŸŒ Google Colab Usage

Run LocalLab on Google's free GPUs:

# 1. Install packages
!pip install locallab locallab-client

# 2. Configure with CLI (notice the ! prefix)
!locallab config

# 3. Start server with CLI
!locallab start --use-ngrok

# 4. Connect the client (top-level await works in notebook cells)
from locallab_client import LocalLabClient
client = LocalLabClient("https://your-server-ngrok-url.app")
response = await client.generate("Hello!")
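The ngrok tunnel can take a few seconds to come up after `locallab start --use-ngrok`. A generic readiness poll using only the standard library (it just checks that the URL answers an HTTP request; no specific health endpoint is assumed):

```python
import time
import urllib.error
import urllib.request

def wait_for_server(url: str, timeout: float = 30.0, interval: float = 1.0) -> bool:
    """Poll `url` until it answers an HTTP request or `timeout` seconds elapse."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            urllib.request.urlopen(url, timeout=5)
            return True  # got any successful HTTP response
        except (urllib.error.URLError, OSError):
            time.sleep(interval)  # not up yet; try again
    return False

# Usage:
# wait_for_server("https://your-server-ngrok-url.app")
```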

💻 Requirements

Local Computer

  • Python 3.8+
  • 4GB RAM minimum (8GB+ recommended)
  • GPU optional but recommended
  • Internet connection for downloading models

Google Colab

  • Just a Google account!
  • Free tier works fine

🌟 Features

  • Easy Setup: Just pip install and run
  • Multiple Models: Use any Hugging Face model
  • Resource Efficient: Automatic optimization
  • Privacy First: All local, no data sent to cloud
  • Free GPU: Google Colab integration
  • Flexible Client API: Both async and sync clients available
  • Automatic Resource Management: Sessions close automatically

โžก๏ธ See All Features

📚 Documentation

Getting Started

  1. Installation Guide
  2. Basic Examples
  3. CLI Usage

Advanced Topics

  1. API Reference
  2. Client Libraries
  3. Advanced Features
  4. Performance Guide

Deployment

  1. Local Setup
  2. Google Colab Guide

๐Ÿ” Need Help?

📖 Additional Resources

🌟 Star Us!

If you find LocalLab helpful, please star our repository! It helps others discover the project.


Made with ❤️ by Utkarsh Tiwari GitHub • Twitter • LinkedIn
