
AI-powered terminal shell with real-time autocomplete and multi-model debate engine

Project description

lac-cli

lac-cli is a terminal shell built by lacai.io that brings AI directly into your command line. It autocompletes what you are typing in real time and understands plain English so you can describe what you want to do instead of memorizing commands.

New in v0.2.0: the LacMind multi-model debate engine and GenDoc, an AI-powered API documentation generator.

Install

pip install lac-cli

Getting Started

Terminal Shell

Run lac to launch the shell. The first time you run it, a setup wizard will walk you through picking your AI provider and entering your API key. After that it goes straight to the shell every time.

lac

To redo the setup at any time:

lac --setup

To run without an internet connection or server:

lac --offline

To adjust autocomplete speed (default 150ms):

lac --debounce 50  # faster
lac --debounce 300 # slower

LacMind Multi-Model Debate

Launch the web-based debate interface for complex queries:

lac mind

LacMind runs multiple AI models in a debate format where they challenge and refine each other's ideas, then vote on the best response. Perfect for research, code generation, and complex problem-solving.

GenDoc API Documentation

Generate beautiful API documentation from your codebase:

lac gendoc /path/to/project

GenDoc scans your project, detects the framework, and uses AI to analyze your routes and controllers to generate comprehensive API documentation. Supports Laravel, Django, FastAPI, Flask, Express, and Rails.

Optional flags:

lac gendoc /path/to/project --prompt "Focus on authentication endpoints"
lac gendoc /path/to/project --integrate  # Add /docs route to your framework
lac gendoc /path/to/project --output custom.html

How It Works

Terminal Shell

When you launch lac, it automatically starts a local server in the background that handles communication with your AI model. You do not need to start anything manually.

As you type, the shell sends your input to the AI and shows a suggested completion as ghost text. Press Tab to accept it. If you type something in plain English like "show all files bigger than 100mb", the shell converts it to the right command and asks you to confirm before running it.

The shell now tracks your session history (commands + outputs) and passes it to the AI for smarter context-aware suggestions.
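The debounce-plus-history flow above can be sketched in Python. Everything here is illustrative (class, method, and parameter names are invented for the sketch, not lac-cli's actual internals): a keystroke only triggers an AI call once the user has paused longer than the debounce window, and recent session history is passed along for context.

```python
import time

class GhostTextShell:
    """Minimal sketch of the flow above: debounce keystrokes, then ask the
    AI for a ghost-text completion with recent session history as context.
    All names here are hypothetical, not lac-cli's actual internals."""

    def __init__(self, suggest_fn, debounce_ms=150, history_limit=20):
        self.suggest_fn = suggest_fn        # (buffer, history) -> suggestion text
        self.debounce_ms = debounce_ms      # mirrors the --debounce flag default
        self.history_limit = history_limit
        self.history = []                   # recent (command, output) pairs
        self._last_keystroke = float("-inf")

    def on_keystroke(self, buffer, now=None):
        """Return a suggestion only if the user paused since the previous
        keystroke; otherwise suppress the AI call while they are typing."""
        now = time.monotonic() if now is None else now
        elapsed_ms = (now - self._last_keystroke) * 1000
        self._last_keystroke = now
        if elapsed_ms < self.debounce_ms:
            return None
        return self.suggest_fn(buffer, self.history[-self.history_limit:])

    def record(self, command, output):
        """Track session history so later suggestions get context."""
        self.history.append((command, output))
```

Raising `debounce_ms` trades responsiveness for fewer AI calls, which is the same trade-off the `--debounce` flag exposes.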

LacMind Debate Engine

LacMind orchestrates multiple AI models in a structured debate:

  1. Debate Rounds - Models discuss and challenge each other's ideas sequentially
  2. Voting Phase - Models vote on who provided the best reasoning
  3. Consensus Summary - The winning model delivers the final response

Models are labeled anonymously (Model A, B, C) during debate to prevent bias. You can configure debate duration and select which models participate.
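The three phases above can be sketched as a simple orchestration loop. This is an assumption-laden illustration, not LacMind's real implementation: here `models` maps an anonymous label ("Model A", ...) to a callable taking a prompt plus the transcript so far, and the `vote:`/`final:` prompt prefixes are invented for the sketch.

```python
from collections import Counter

def run_debate(models, prompt, rounds=2):
    """Sketch of the three-phase flow: sequential debate rounds, an
    anonymous vote, then a consensus answer from the winning model."""
    transcript = []
    # 1. Debate rounds: each model responds in turn, seeing the transcript
    for _ in range(rounds):
        for label, model in models.items():
            reply = model(prompt, transcript)
            transcript.append((label, reply))
    # 2. Voting phase: each model names the label with the best reasoning
    votes = Counter(model("vote:" + prompt, transcript)
                    for model in models.values())
    winner, _ = votes.most_common(1)[0]
    # 3. Consensus summary: the winner delivers the final response
    return winner, models[winner]("final:" + prompt, transcript)
```

Labeling models only by "Model A/B/C" inside `transcript` is what keeps the vote anonymous, matching the bias-prevention point above.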

GenDoc Documentation Generator

GenDoc uses AI to automatically generate API documentation:

  1. Project Scanning - Detects framework and finds route/controller files
  2. AI Analysis - Sends files to lacai.io backend for intelligent analysis
  3. HTML Generation - Creates beautiful, interactive documentation
  4. Framework Integration - Optionally adds /docs route to your app

The first time you run lac gendoc, you'll be prompted for an API key from lacai.io/dashboard/keys. Documentation includes endpoint details, parameters, request/response examples, and export to Swagger/Postman formats.
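Step 1, project scanning, can be sketched with marker-file checks. The marker list below is illustrative only; GenDoc's real detection runs against the lacai.io backend and may use entirely different signals.

```python
from pathlib import Path

def detect_framework(project_dir):
    """Sketch of framework detection via well-known marker files."""
    root = Path(project_dir)
    if (root / "artisan").exists():               # Laravel's CLI entry point
        return "Laravel"
    if (root / "manage.py").exists():             # Django management script
        return "Django"
    if (root / "config" / "routes.rb").exists():  # Rails routing file
        return "Rails"
    if (root / "package.json").exists():          # Node project: assume Express
        return "Express"
    for py in root.glob("*.py"):                  # FastAPI/Flask: inspect app files
        src = py.read_text(errors="ignore")
        if "FastAPI(" in src:
            return "FastAPI"
        if "Flask(" in src:
            return "Flask"
    return None
```

Checking the most distinctive markers first (e.g. `artisan` before `package.json`) avoids misclassifying a Laravel project that also ships a Node toolchain.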

Supported Providers

Provider   Notes
claude     Anthropic API
openai     OpenAI API
ollama     Local models, no API key needed
custom     Any OpenAI-compatible endpoint

Commands

Terminal Shell

Command    What it does
exit       Quit the shell
logout     Delete your config and start fresh
clear      Clear the screen and session history
cd <path>  Change directory

LacMind

Command            What it does
lac mind           Launch the LacMind web interface

Web UI control     What it does
Settings page      Add, edit, or delete AI models
Duration selector  Set debate time (30 seconds to 5 minutes)
Stop button        End the debate early
Export             Save conversations to PDF

GenDoc

Command / flag      What it does
lac gendoc <path>   Generate API docs from a project
--prompt "text"     Custom instructions for the AI analysis
--integrate         Add a /docs route to your framework
--output file.html  Custom output filename

Web UI control      What it does
Export buttons      Download as Swagger or Postman

Config

Your config is saved at ~/.lac/config.json after setup. You can edit it directly if needed.

{
  "provider": "claude",
  "api_key": "sk-...",
  "model": "claude-haiku-4-5-20251001",
  "base_url": "https://api.anthropic.com",
  "server": "ws://localhost:8765"
}
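If you script against this file, a small loader with a sanity check can catch a hand-edit that dropped a key. A minimal sketch, assuming the key set shown in the example above (lac-cli's own validation may be stricter or looser):

```python
import json
from pathlib import Path

CONFIG_PATH = Path.home() / ".lac" / "config.json"
# Keys taken from the example config above; illustrative, not authoritative.
REQUIRED_KEYS = {"provider", "api_key", "model", "base_url", "server"}

def load_config(path=CONFIG_PATH):
    """Load the lac config and verify the expected keys are present."""
    cfg = json.loads(Path(path).read_text())
    missing = REQUIRED_KEYS - cfg.keys()
    if missing:
        raise ValueError(f"config missing keys: {sorted(missing)}")
    return cfg
```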

Features

Terminal Shell

  • Ghost text autocomplete as you type, powered by your AI model
  • Plain English to shell command conversion with confirmation before running
  • Session history tracking - AI sees your recent commands and outputs for better context
  • Configurable autocomplete debounce delay (--debounce flag)
  • Works with any major AI provider or local models via Ollama
  • Offline mode falls back to history and static completions
  • Local server starts automatically in the background, no manual setup needed
  • Logout clears your credentials and resets the config

LacMind

  • Multi-model debate engine with sequential discussion rounds
  • Anonymous voting system to select best reasoning
  • Real-time streaming of debate progress
  • Conversation history with chat persistence
  • Model management (add/edit/delete models)
  • Configurable debate duration
  • Stop debate early if consensus is reached
  • Export conversations to PDF
  • Clean, minimal dark theme UI

GenDoc

  • Automatic framework detection (Laravel, Django, FastAPI, Flask, Express, Rails)
  • AI-powered endpoint analysis and documentation generation
  • Interactive HTML documentation with search and navigation
  • Resizable sidebar with endpoint filtering
  • Try It feature to test endpoints directly from docs
  • Export to Swagger (OpenAPI 3.0) and Postman Collection formats
  • Optional framework integration to serve docs at /docs route
  • Custom prompts to guide AI analysis
  • Separate API key system from lac shell (stored in ~/.lac/gendoc.json)
  • Credit-based usage with backend session management

About

lac-cli is part of lacai.io. Built for developers who live in the terminal.

License

MIT

Download files

Download the file for your platform.

Source Distribution

lac_cli-0.2.1.tar.gz (174.7 kB, Source)

Built Distribution

lac_cli-0.2.1-py3-none-any.whl (61.2 kB, Python 3)

File details

Details for the file lac_cli-0.2.1.tar.gz.

File metadata

  • Download URL: lac_cli-0.2.1.tar.gz
  • Size: 174.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.12

File hashes

Hashes for lac_cli-0.2.1.tar.gz

Algorithm    Hash digest
SHA256       dfb23cf3bf77b83081ff1a4ea6894bd63c75691d61f912d44e202b1a17046740
MD5          9448dd5db82c267e9c8cac1508c6c603
BLAKE2b-256  ba55d144ed23d25e642c40c901ce81c483502ccc7de187a7135a9d1d057ea16b


File details

Details for the file lac_cli-0.2.1-py3-none-any.whl.

File metadata

  • Download URL: lac_cli-0.2.1-py3-none-any.whl
  • Size: 61.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.12

File hashes

Hashes for lac_cli-0.2.1-py3-none-any.whl

Algorithm    Hash digest
SHA256       e61fd8f8b5c01f00152f8c350fc5e42eaf788232329069c469688563a0220331
MD5          68e35f2e67104f7f9b60ed6de0d0b54b
BLAKE2b-256  4ecd67af62ba9b4f4e9283be94300364179f7b700917b61bcb0164c344128527

