
AI-powered terminal shell with real-time autocomplete and multi-model debate engine


lac-cli

lac-cli is a terminal shell built by lacai.io that brings AI directly into your command line. It autocompletes what you are typing in real time and understands plain English so you can describe what you want to do instead of memorizing commands.

New in v0.2.0: LacMind multi-model debate engine for complex queries and research tasks.

Install

pip install lac-cli

Getting Started

Terminal Shell

Run lac to launch the shell. The first time you run it, a setup wizard will walk you through picking your AI provider and entering your API key. After that it goes straight to the shell every time.

lac

To redo the setup at any time:

lac --setup

To run without an internet connection or server:

lac --offline

To adjust the autocomplete debounce delay, in milliseconds (default: 150 ms):

lac --debounce 50  # faster
lac --debounce 300 # slower
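A debounce delay like this means a completion request is only sent after typing has paused for the configured interval. The sketch below is illustrative, not lac's actual implementation; the `Debouncer` class and its names are hypothetical.

```python
import threading
import time

class Debouncer:
    """Run a callback only after input has been idle for `delay_ms`."""

    def __init__(self, delay_ms, callback):
        self.delay = delay_ms / 1000.0
        self.callback = callback
        self._timer = None

    def trigger(self, text):
        # Each keystroke cancels the pending request and restarts the timer,
        # so only a pause in typing actually fires a completion request.
        if self._timer is not None:
            self._timer.cancel()
        self._timer = threading.Timer(self.delay, self.callback, args=(text,))
        self._timer.start()

results = []
d = Debouncer(50, results.append)
for partial in ("g", "gi", "git st"):
    d.trigger(partial)      # rapid keystrokes: earlier timers are cancelled
time.sleep(0.2)
print(results)              # → ['git st']
```

A lower delay means more requests (and faster suggestions); a higher delay sends fewer requests while you type continuously.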

LacMind Multi-Model Debate

Launch the web-based debate interface for complex queries:

lac mind

LacMind runs multiple AI models in a debate format where they challenge and refine each other's ideas, then vote on the best response. Perfect for research, code generation, and complex problem-solving.

How It Works

Terminal Shell

When you launch lac, it automatically starts a local server in the background that handles communication with your AI model. You do not need to start anything manually.

As you type, the shell sends your input to the AI and shows a suggested completion as ghost text. Press Tab to accept it. If you type something in plain English like "show all files bigger than 100mb", the shell converts it to the right command and asks you to confirm before running it.
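The confirm-before-run flow for plain-English input can be sketched as follows. Everything here is hypothetical: the function names, the known-command check, and the stubbed translator stand in for whatever lac actually does internally.

```python
def handle_input(text, translate_with_ai, confirm):
    """If input isn't a recognized command, translate it and confirm before running."""
    known = {"ls", "cd", "git", "find", "grep"}
    first = text.split()[0] if text.split() else ""
    if first in known:
        return text  # already a shell command: run as-is
    suggestion = translate_with_ai(text)
    # Ask the user before executing anything the AI produced.
    return suggestion if confirm(suggestion) else None

# Example with a stub standing in for the AI call:
cmd = handle_input(
    "show all files bigger than 100mb",
    translate_with_ai=lambda t: "find . -type f -size +100M",
    confirm=lambda s: True,
)
print(cmd)  # → find . -type f -size +100M
```

The key design point is that AI-generated commands never execute without an explicit confirmation step.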

The shell now tracks your session history (commands and their outputs) and passes it to the AI for smarter, context-aware suggestions.
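A session-history buffer like this is typically a bounded, most-recent-first store whose contents get serialized into the AI prompt. This is a minimal sketch under that assumption; the class and method names are not lac's actual API.

```python
from collections import deque

class SessionHistory:
    """Keep the last N commands and outputs as context for the AI (sketch)."""

    def __init__(self, max_entries=20):
        # deque with maxlen silently evicts the oldest entry when full
        self.entries = deque(maxlen=max_entries)

    def record(self, command, output):
        self.entries.append((command, output))

    def as_prompt_context(self):
        # Serialize recent history into a transcript the model can read.
        return "\n".join(f"$ {cmd}\n{out}" for cmd, out in self.entries)

h = SessionHistory(max_entries=2)
h.record("ls", "file.txt")
h.record("pwd", "/home/dev")
h.record("whoami", "dev")        # oldest entry ("ls") is evicted
print(h.as_prompt_context())
```

Bounding the buffer keeps the prompt small, so suggestion latency stays low even in long sessions.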

LacMind Debate Engine

LacMind orchestrates multiple AI models in a structured debate:

  1. Debate Rounds - Models discuss and challenge each other's ideas sequentially
  2. Voting Phase - Models vote on who provided the best reasoning
  3. Consensus Summary - The winning model delivers the final response

Models are labeled anonymously (Model A, B, C) during debate to prevent bias. You can configure debate duration and select which models participate.
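The three phases above can be sketched as a small orchestration loop. This is a toy illustration of the described flow, not LacMind's implementation; the `respond`/`vote` interface and the stub models are hypothetical.

```python
def run_debate(models, prompt, rounds=2):
    """Sketch of the debate flow: anonymous labels, sequential rounds, then a vote."""
    # Anonymous labels (Model A, B, C, ...) so votes aren't biased by identity.
    labels = {m: f"Model {chr(65 + i)}" for i, m in enumerate(models)}
    transcript = []
    for _ in range(rounds):
        for m in models:
            # Each model sees the prompt plus all prior (anonymised) turns.
            transcript.append((labels[m], m.respond(prompt, transcript)))
    votes = {}
    for m in models:
        choice = m.vote(transcript)          # vote for the best-reasoned label
        votes[choice] = votes.get(choice, 0) + 1
    winner = max(votes, key=votes.get)       # winner writes the final response
    return winner, votes

class StubModel:
    """Stand-in for a real model client, for illustration only."""
    def __init__(self, name, favourite):
        self.name, self.favourite = name, favourite
    def respond(self, prompt, transcript):
        return f"{self.name}'s take on: {prompt}"
    def vote(self, transcript):
        return self.favourite

models = [StubModel("m1", "Model A"), StubModel("m2", "Model A"), StubModel("m3", "Model B")]
winner, votes = run_debate(models, "best sorting algorithm?")
print(winner, votes)  # → Model A {'Model A': 2, 'Model B': 1}
```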

Supported Providers

Provider  Notes
claude    Anthropic API
openai    OpenAI API
ollama    Local models, no API key needed
custom    Any OpenAI-compatible endpoint

Commands

Terminal Shell

Command    What it does
exit       Quit the shell
logout     Delete your config and start fresh
clear      Clear the screen and session history
cd <path>  Change directory

LacMind

Control            What it does
lac mind           Launch the LacMind web interface
Settings page      Add, edit, or delete AI models
Duration selector  Set debate time (30 s to 5 min)
Stop button        End a debate early
Export             Save conversations to PDF

Config

Your config is saved at ~/.lac/config.json after setup. You can edit it directly if needed.

{
  "provider": "claude",
  "api_key": "sk-...",
  "model": "claude-haiku-4-5-20251001",
  "base_url": "https://api.anthropic.com",
  "server": "ws://localhost:8765"
}

Features

Terminal Shell

  • Ghost text autocomplete as you type, powered by your AI model
  • Plain English to shell command conversion with confirmation before running
  • Session history tracking - AI sees your recent commands and outputs for better context
  • Configurable autocomplete debounce delay (--debounce flag)
  • Works with any major AI provider or local models via Ollama
  • Offline mode falls back to history and static completions
  • Local server starts automatically in the background, no manual setup needed
  • Logout clears your credentials and resets the config
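The offline fallback to history-based completion mentioned above can be sketched as a simple most-recent prefix match. This is an assumption about how such a fallback might work, not lac's actual algorithm.

```python
def offline_complete(prefix, history):
    """Offline fallback sketch: suggest the most recent history entry matching prefix."""
    for cmd in reversed(history):          # newest entries first
        if cmd.startswith(prefix) and cmd != prefix:
            return cmd
    return None                            # no suggestion available

print(offline_complete("git ", ["ls", "git status", "git push"]))  # → git push
```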

LacMind

  • Multi-model debate engine with sequential discussion rounds
  • Anonymous voting system to select best reasoning
  • Real-time streaming of debate progress
  • Conversation history with chat persistence
  • Model management (add/edit/delete models)
  • Configurable debate duration
  • Stop debate early if consensus is reached
  • Export conversations to PDF
  • Clean, minimal dark theme UI

About

lac-cli is part of lacai.io. Built for developers who live in the terminal.

License

MIT
