
A lightweight AI inference server for running models locally or in Google Colab


🚀 LocalLab


LocalLab is a powerful, lightweight AI inference server designed to deliver cutting-edge language model capabilities on your local machine or through Google Colab. It lets developers and researchers run sophisticated AI models on their own hardware, optimizing resources with advanced features such as dynamic model loading, memory optimizations, and real-time system monitoring.

What Problem Does LocalLab Solve?

  • Local Inference: Run advanced language models without relying on expensive cloud services.
  • Optimized Performance: Utilize state-of-the-art techniques like quantization, attention slicing, and CPU offloading for maximum efficiency.
  • Seamless Deployment: Easily switch between local deployment and Google Colab, leveraging ngrok for public accessibility.
  • Effective Resource Management: Automatically monitor and manage CPU, RAM, and GPU usage to ensure smooth operation.

System Requirements

Minimum Requirements

| Component | Local Deployment | Google Colab |
|-----------|------------------|--------------|
| RAM | 4GB | Free tier (12GB) |
| CPU | 2 cores | 2 cores |
| Python | 3.8+ | 3.8+ |
| Storage | 2GB free | - |
| GPU | Optional | Available in free tier |

Recommended Requirements

| Component | Local Deployment | Google Colab |
|-----------|------------------|--------------|
| RAM | 8GB+ | Pro tier (24GB) |
| CPU | 4+ cores | Pro tier (4 cores) |
| Python | 3.9+ | 3.9+ |
| Storage | 5GB+ free | - |
| GPU | CUDA-compatible | Pro tier GPU |
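
Before installing, you can sanity-check a local machine against the minimum requirements with a short stdlib-only script. This is an illustration, not part of LocalLab; the thresholds mirror the table above, and the RAM check uses `os.sysconf`, which is only available on POSIX systems:

    import os
    import shutil

    def check_minimum_requirements(path="/"):
        """Compare this machine against LocalLab's minimum local-deployment specs."""
        report = {}

        # CPU: at least 2 cores
        cores = os.cpu_count() or 1
        report["cpu_ok"] = cores >= 2

        # Storage: at least 2 GB free on the target filesystem
        free_gb = shutil.disk_usage(path).free / (1024 ** 3)
        report["storage_ok"] = free_gb >= 2

        # RAM: at least 4 GB (POSIX only; undetermined elsewhere)
        if hasattr(os, "sysconf") and "SC_PHYS_PAGES" in os.sysconf_names:
            ram_gb = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES") / (1024 ** 3)
            report["ram_ok"] = ram_gb >= 4
        else:
            report["ram_ok"] = None  # could not determine on this platform

        return report

    print(check_minimum_requirements())

A GPU is optional for local deployment, so the script does not check for one; if you plan to use CUDA, verify it separately with your framework of choice.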

Key Features

  • Multiple Model Support: Pre-configured models along with the ability to load custom ones on demand.
  • Advanced Optimizations: Support for FP16, INT8, and INT4 quantization, Flash Attention, and attention slicing.
  • Robust Resource Monitoring: Real-time insights into system performance and resource usage.
  • Flexible Client Libraries: Comprehensive clients available for both Python and Node.js.
  • Google Colab Friendly: Dedicated workflow for deploying via Google Colab with public URL access.
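
To see why the quantization options matter, here is a back-of-the-envelope estimate (plain Python, no LocalLab APIs) of the memory a model's weights alone require at each precision the feature list mentions:

    def weight_memory_gb(n_params: float, bits: int) -> float:
        """Approximate memory for model weights: parameters x bits, in GiB.

        Ignores activations, KV cache, and framework overhead,
        so real usage is noticeably higher.
        """
        return n_params * bits / 8 / (1024 ** 3)

    # A 7B-parameter model at each supported precision:
    n = 7e9
    for name, bits in [("FP32", 32), ("FP16", 16), ("INT8", 8), ("INT4", 4)]:
        print(f"{name}: ~{weight_memory_gb(n, bits):.1f} GiB")

Halving the bit width halves the weight footprint, which is what makes a model that would not fit in the 4GB minimum RAM viable under INT8 or INT4 quantization.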

Architecture Overview

Below is a high-level diagram of LocalLab's architecture.

graph TD
    A["User"] --> B["LocalLab Client (Python/Node.js)"]
    B --> C["LocalLab Server"]
    C --> D["Model Manager"]
    D --> E["Hugging Face Models"]
    C --> F["Optimizations"]
    C --> G["Resource Monitoring"]

Google Colab Workflow

sequenceDiagram
    participant U as "User (Colab)"
    participant S as "LocalLab Server"
    participant N as "Ngrok Tunnel"
    U->>S: Run start_server(ngrok=True)
    S->>N: Establish public tunnel
    N->>U: Return public URL
    U->>S: Connect via public URL

Documentation & Usage Guides

For full documentation and detailed guides, please visit our documentation page.

Get Started

  1. Installation:

    pip install locallab
    
  2. Starting the Server Locally:

    from locallab import start_server
    start_server()
    
  3. Starting the Server on Google Colab:

    !pip install locallab
    import os
    os.environ["NGROK_AUTH_TOKEN"] = "your_token_here"
    from locallab import start_server
    start_server(ngrok=True)
    
  4. Connecting your Client:

    from locallab.client import LocalLabClient
    client = LocalLabClient("http://localhost:8000")  # Use ngrok URL for Colab deployment
    

Join the Community


LocalLab is designed to bring the power of advanced language models directly to your workspace—efficiently, flexibly, and affordably. Give it a try and revolutionize your AI projects!

Project details


Release history

This version

0.1.8

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

locallab-0.1.8.tar.gz (25.8 kB)

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

locallab-0.1.8-py3-none-any.whl (23.7 kB)

File details

Details for the file locallab-0.1.8.tar.gz.

File metadata

  • Download URL: locallab-0.1.8.tar.gz
  • Upload date:
  • Size: 25.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.9.21

File hashes

Hashes for locallab-0.1.8.tar.gz

| Algorithm | Hash digest |
|-----------|-------------|
| SHA256 | 00295e0338780c2547b9bef7dbdc03f0f5effb6f6b592fb9fd8f4bd15ef9fd6a |
| MD5 | 473a50757e71c12d71b83d2cc48f1a52 |
| BLAKE2b-256 | b6a78d071716a685b24b51dafeedcbe3ce785e0858279ec440aa71598409a5d8 |

See more details on using hashes here.

File details

Details for the file locallab-0.1.8-py3-none-any.whl.

File metadata

  • Download URL: locallab-0.1.8-py3-none-any.whl
  • Upload date:
  • Size: 23.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.9.21

File hashes

Hashes for locallab-0.1.8-py3-none-any.whl

| Algorithm | Hash digest |
|-----------|-------------|
| SHA256 | 633085cbcc2685699120764f2b05ad048dedae390a839e5c5c289480a6f303b8 |
| MD5 | 37b9cb2e7089e73bdca31c2d9444e944 |
| BLAKE2b-256 | 100f5316de6c0505b4204ab18d70c15027d15c5ec8fc6c0f022ecf8b0244c911 |

See more details on using hashes here.
