
A CLI assistant powered by local AI (Msty Studio) or OpenAI

Project description

x - Your Smart CLI Assistant

x (install via pip install xcli-ai) is an AI-powered command-line interface assistant that helps you find and execute the right commands for your tasks. It supports local AI (via Msty Studio or MLX) and cloud AI (OpenAI) to convert natural language queries into system commands, with built-in safety confirmations and OS-specific command generation.

It came in a dream. "Thank you for this gift."

Features

  • 🤖 Natural language to CLI command conversion
  • 💡 Command explanations for better understanding
  • ✅ Command confirmation before execution
  • 🔒 Secure API key storage
  • 🏠 Local AI support via Msty Studio - Run completely offline and private
  • 🍎 MLX support - Apple Silicon optimized local AI
  • ☁️ OpenAI support - Fastest cloud AI option
  • 💻 OS-specific command generation (macOS, Linux, Windows)
  • 🎨 Rich terminal output formatting
  • ⚙️ Easy configuration with x --configure

Installation

Option 1: Install from PyPI (Recommended)

pip install xcli-ai

Option 2: Install from Source

  1. Clone the repository:
git clone https://github.com/caraveo/solai.git
cd solai
  2. Install in editable mode:
pip install -e .

Quick Start

After installation, you can use the x command:

Using Local AI (Msty Studio) - Recommended

  1. Install and start Msty Studio

    • Download Msty Studio from: https://msty.ai
    • Launch Msty Studio and ensure it's running locally
    • Msty Studio typically runs on http://localhost:1234/v1
  2. First-time setup

    • Run any x command to trigger the setup wizard
    • Choose option 1 for "Local AI (Msty Studio)"
    • Enter your Msty Studio API base URL (default: http://localhost:1234/v1)
    • Enter your model name (default: mistral)
    • Configuration will be saved to ~/.solai.env
  3. Run a command:

x find large files
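Msty Studio exposes an OpenAI-compatible chat endpoint, so any HTTP client can talk to it. Below is a minimal stdlib sketch of building such a request; the prompt wording and payload shape are illustrative assumptions, not the package's actual internals:

```python
import json
from urllib import request

def build_payload(query: str, model: str = "mistral", os_name: str = "macOS") -> dict:
    """Build an OpenAI-style chat payload asking for one shell command (illustrative)."""
    return {
        "model": model,
        "messages": [
            {"role": "system",
             "content": f"You are a CLI assistant. Reply with a single {os_name} shell command."},
            {"role": "user", "content": query},
        ],
    }

payload = build_payload("find large files")
req = request.Request(
    "http://localhost:1234/v1/chat/completions",   # default Msty Studio endpoint
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},  # no real API key needed locally
)
# request.urlopen(req) would return the model's suggestion (needs a running server).
```

Because the endpoint follows the OpenAI wire format, the same request works against MLX or OpenAI by swapping the base URL and model name.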

Using MLX (Apple Silicon) - Optimized for Mac

  1. Install and start MLX server

    • Set up an MLX-compatible server running locally
    • MLX server typically runs on http://localhost:11973/v1
  2. First-time setup

    • Run any x command to trigger the setup wizard
    • Choose option 2 for "MLX - Apple Silicon optimized local AI"
    • Enter your MLX API base URL (default: http://localhost:11973/v1)
    • Enter your model name (default: mlx-community/Qwen2.5-0.5B-Instruct-4bit)
    • Configuration will be saved to ~/.solai.env
  3. Run a command:

x find large files

Using OpenAI - Fastest Cloud Option

  1. First-time setup

    • Run any x command to trigger the setup wizard
    • Choose option 3 for "OpenAI - Hyper Speed Most Efficient (Fastest)"
    • Get your API key from: https://platform.openai.com/api-keys
    • Configuration will be securely stored in ~/.solai.env
  2. Run a command:

x find large files

Example output:

Suggested command:
find ~ -type f -size +100M
→ Searches your home directory for files larger than 100 megabytes

Do you want to execute this command? [y/n]:
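The confirmation step above can be sketched with the standard library; this mirrors the flow, not the package's actual code:

```python
import subprocess

def confirm_and_run(command: str, answer: str):
    """Execute the suggested command only on an explicit yes (illustrative sketch)."""
    if answer.strip().lower() not in ("y", "yes"):
        print("Aborted.")
        return None
    # shell=True so the suggested string runs exactly as it was shown to the user
    return subprocess.run(command, shell=True, capture_output=True, text=True)

result = confirm_and_run("echo hello", answer="y")
```

In the real tool the answer comes from an interactive prompt; the package depends on click, whose click.confirm provides exactly this kind of yes/no gate.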


Usage Examples

# Find files
x find all pdf files in downloads

# System maintenance
x clean up system cache

# Network commands
x check if google.com is up

# File operations
x create a backup of my documents

# With admin privileges
x --admin install package
x -a update system
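One plausible reading of the --admin/-a flag is a sudo prefix on the generated command; the sketch below is a hypothetical illustration, and the real behavior may differ:

```python
def with_privileges(command: str, admin: bool = False) -> str:
    """Prefix sudo when admin mode was requested (hypothetical sketch)."""
    return f"sudo {command}" if admin else command
```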

Development

To install in development mode:

git clone https://github.com/caraveo/solai.git
cd solai
pip install -e .

Requirements

  • Python 3.6+
  • For Local AI: Msty Studio installed and running
  • For MLX: MLX server installed and running (Apple Silicon optimized)
  • For OpenAI: an OpenAI API key
  • Required packages:
    • click
    • python-dotenv
    • openai
    • rich

Configuration

Configuration is stored in ~/.solai.env. The setup wizard will guide you through the initial configuration.

Local AI Configuration (Msty Studio)

AI_PROVIDER=local
API_BASE_URL=http://localhost:1234/v1
API_KEY=not-needed
MODEL=mistral

MLX Configuration (Apple Silicon)

AI_PROVIDER=mlx
API_BASE_URL=http://localhost:11973/v1
API_KEY=not-needed
MODEL=mlx-community/Qwen2.5-0.5B-Instruct-4bit

OpenAI Configuration

AI_PROVIDER=openai
API_KEY=your-api-key-here
MODEL=gpt-3.5-turbo
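All three configurations share the same KEY=VALUE format. The package depends on python-dotenv for loading it; below is a stdlib-only sketch of the parsing, for illustration only:

```python
def parse_env(text: str) -> dict[str, str]:
    """Parse simple KEY=VALUE lines like those in ~/.solai.env (illustrative)."""
    config = {}
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            config[key.strip()] = value.strip()
    return config

cfg = parse_env("AI_PROVIDER=local\nAPI_BASE_URL=http://localhost:1234/v1\nMODEL=mistral")
```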

To reconfigure, simply delete ~/.solai.env and run any x command to trigger the setup process again, or use x --configure for an interactive configuration menu.

Contributing

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -am 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

MIT License

Contact

Jon Caraveo - caraveo@me.com

Project Link: https://github.com/caraveo/solai


Download files

Download the file for your platform.

Source Distribution

xcli_ai-1.0.1.tar.gz (13.6 kB)


Built Distribution


xcli_ai-1.0.1-py3-none-any.whl (12.2 kB)


File details

Details for the file xcli_ai-1.0.1.tar.gz.

File metadata

  • Download URL: xcli_ai-1.0.1.tar.gz
  • Upload date:
  • Size: 13.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.9

File hashes

Hashes for xcli_ai-1.0.1.tar.gz:

  • SHA256: acc40b646ead344c020eff9ac9db1db663e07e386d20feb18d6d1a6484f39889
  • MD5: 8b9584326810b16397acb22989be7e4f
  • BLAKE2b-256: a0b685f909196f3df0f591b7139e41844c7e9b3b319024313e8b45b93b43270c


File details

Details for the file xcli_ai-1.0.1-py3-none-any.whl.

File metadata

  • Download URL: xcli_ai-1.0.1-py3-none-any.whl
  • Upload date:
  • Size: 12.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.9

File hashes

Hashes for xcli_ai-1.0.1-py3-none-any.whl:

  • SHA256: 4d37ddb2795f592978750e308d27b8424ab309a06ca74b28c4ac38d95d72e537
  • MD5: 2733cd56b286de7fd26f22c282e226cc
  • BLAKE2b-256: ae53f6c3a84c3f7a47c31727c73be45f2f24290d497ebd32bdc5082bb634f555

