
A CLI assistant powered by local AI (Msty Studio) or OpenAI


x - Your Smart CLI Assistant

x (install via pip install xcli-ai) is an AI-powered command-line interface assistant that helps you find and execute the right commands for your tasks. It supports local AI (via Msty Studio or MLX) and cloud AI (OpenAI) to convert natural language queries into system commands, with built-in safety confirmations and OS-specific command generation.

It came in a dream. "Thank you for this gift."

Features

  • 🤖 Natural language to CLI command conversion
  • 💡 Command explanations for better understanding
  • ✅ Command confirmation before execution
  • 🔒 Secure API key storage
  • 🏠 Local AI support via Msty Studio - Run completely offline and private
  • 🍎 MLX support - Apple Silicon optimized local AI
  • ☁️ OpenAI support - Fastest cloud AI option
  • 💻 OS-specific command generation (macOS, Linux, Windows)
  • 🎨 Rich terminal output formatting
  • ⚙️ Easy configuration with x --configure

Installation

Option 1: Install from PyPI (Recommended)

pip install xcli-ai

Option 2: Install from Source

  1. Clone the repository and install:
git clone https://github.com/caraveo/solai.git
cd solai
pip install -e .

Quick Start

After installation, you can use the x command:

Using Local AI (Msty Studio) - Recommended

  1. Install and start Msty Studio

    • Download Msty Studio from: https://msty.ai
    • Launch Msty Studio and ensure it's running locally
    • Msty Studio typically runs on http://localhost:1234/v1
  2. First-time setup

    • Run any x command to trigger the setup wizard
    • Choose option 1 for "Local AI (Msty Studio)"
    • Enter your Msty Studio API base URL (default: http://localhost:1234/v1)
    • Enter your model name (default: mistral)
    • Configuration will be saved to ~/.solai.env
  3. Run a command:

x find large files
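Since Msty Studio exposes an OpenAI-compatible API, you can confirm it is reachable before invoking x by querying its /models route. (A quick sanity check assuming the default port from step 1; the `|| echo` fallback just reports when nothing is listening.)

```shell
# List available models if Msty Studio is up; otherwise print a short notice.
curl -s http://localhost:1234/v1/models \
  || echo "Msty Studio does not appear to be running on http://localhost:1234"
```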

Using MLX (Apple Silicon) - Optimized for Mac

  1. Install and start MLX server

    • Set up an MLX-compatible server and start it locally
    • MLX server typically runs on http://localhost:11973/v1
  2. First-time setup

    • Run any x command to trigger the setup wizard
    • Choose option 2 for "MLX - Apple Silicon optimized local AI"
    • Enter your MLX API base URL (default: http://localhost:11973/v1)
    • Enter your model name (default: mlx-community/Qwen2.5-0.5B-Instruct-4bit)
    • Configuration will be saved to ~/.solai.env
  3. Run a command:

x find large files
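The MLX server also speaks the OpenAI-compatible API, so you can verify it is listening before invoking x (a quick check assuming the default port from step 1):

```shell
# List the models served by the local MLX server, or report that it is down.
curl -s http://localhost:11973/v1/models \
  || echo "No MLX server reachable on http://localhost:11973"
```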

Using OpenAI (Cloud) - Fastest Option

  1. First-time setup

    • Run any x command to trigger the setup wizard
    • Choose option 3 for "OpenAI - Hyper Speed Most Efficient (Fastest)"
    • Get your API key from: https://platform.openai.com/api-keys
    • Configuration will be securely stored in ~/.solai.env
  2. Run a command:

x find large files

Example output:

Suggested command:
find ~ -type f -size +100M
→ Searches your home directory for files larger than 100 megabytes

Do you want to execute this command? [y/n]:
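The suggested command is ordinary find syntax, so you can also run or adapt it yourself. For example, a variant that prints each oversized file with a human-readable size (`-exec ls -lh` is a standard find/ls combination, not something x generates):

```shell
# Find files over 100 MB in the home directory and show their sizes;
# permission errors are silenced so the output stays readable.
find ~ -type f -size +100M -exec ls -lh {} + 2>/dev/null
```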

Sol Screenshot

Usage Examples

# Find files
x find all pdf files in downloads

# System maintenance
x clean up system cache

# Network commands
x check if google.com is up

# File operations
x create a backup of my documents

# With admin privileges
x --admin install package
x -a update system

Development

To install in development mode:

git clone https://github.com/caraveo/solai.git
cd solai
pip install -e .

Requirements

  • Python 3.6+
  • For Local AI: Msty Studio installed and running
  • For MLX: MLX server installed and running (Apple Silicon optimized)
  • For OpenAI: an OpenAI API key
  • Required packages:
    • click
    • python-dotenv
    • openai
    • rich

Configuration

Configuration is stored in ~/.solai.env. The setup wizard will guide you through the initial configuration.

Local AI Configuration (Msty Studio)

AI_PROVIDER=local
API_BASE_URL=http://localhost:1234/v1
API_KEY=not-needed
MODEL=mistral

MLX Configuration (Apple Silicon)

AI_PROVIDER=mlx
API_BASE_URL=http://localhost:11973/v1
API_KEY=not-needed
MODEL=mlx-community/Qwen2.5-0.5B-Instruct-4bit

OpenAI Configuration (Cloud)

AI_PROVIDER=openai
API_KEY=your-api-key-here
MODEL=gpt-3.5-turbo

To reconfigure, delete ~/.solai.env and run any x command to trigger the setup wizard again, or use x --configure for an interactive configuration menu.
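In shell form, the two reconfiguration paths look like this (`rm -f` is safe even if the file is already gone):

```shell
# Path 1: remove the saved configuration; the next x command re-runs the setup wizard.
rm -f ~/.solai.env

# Path 2: keep the file and use the interactive configuration menu instead:
#   x --configure
```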

Contributing

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -am 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

MIT License

Contact

Jon Caraveo - caraveo@me.com

Project Link: https://github.com/caraveo/solai
