A CLI assistant powered by local AI (Msty Studio) or OpenAI

Project description

x - Your Smart CLI Assistant

x is an AI-powered command-line interface assistant that helps you find and execute the right commands for your tasks. It supports local AI (via Msty Studio or MLX) and cloud AI (OpenAI) to convert natural language queries into system commands, with built-in safety confirmations and OS-specific command generation.

It came in a dream. "Thank you for this gift."

Features

  • 🤖 Natural language to CLI command conversion
  • 💡 Command explanations for better understanding
  • ✅ Command confirmation before execution
  • 🔒 Secure API key storage
  • 🏠 Local AI support via Msty Studio - Run completely offline and private
  • 🍎 MLX support - Apple Silicon optimized local AI
  • ☁️ OpenAI support - Fastest cloud AI option
  • 💻 OS-specific command generation (macOS, Linux, Windows)
  • 🎨 Rich terminal output formatting
  • ⚙️ Easy configuration with x --configure

Installation

Option 1: Install from PyPI (Recommended)

pip install xcli-ai

Option 2: Install from Source

  1. Clone the repository and install:
git clone https://github.com/caraveo/solai.git
cd solai
pip install -e .

Quick Start

After installation, you can use x or ! as commands:

Using Local AI (Msty Studio) - Recommended

  1. Install and start Msty Studio

    • Download Msty Studio from: https://msty.ai
    • Launch Msty Studio and ensure it's running locally
    • Msty Studio typically runs on http://localhost:1234/v1
  2. First-time setup

    • Run any x or ! command to trigger the setup wizard
    • Choose option 1 for "Local AI (Msty Studio)"
    • Enter your Msty Studio API base URL (default: http://localhost:1234/v1)
    • Enter your model name (default: mistral)
    • Configuration will be saved to ~/.solai.env
  3. Run a command:

x find large files
# or
! find large files
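
Under the hood, Msty Studio exposes an OpenAI-compatible API, so x presumably sends a chat-completion request to the local endpoint. A minimal sketch of what such a payload might look like (the prompt wording and function name here are illustrative assumptions, not the project's actual code):

```python
import json

def build_chat_request(query: str, os_name: str = "macOS", model: str = "mistral") -> dict:
    """Build an OpenAI-compatible chat payload asking for a shell command.

    Illustrative only: the real system prompt used by x is not documented here.
    """
    return {
        "model": model,
        "messages": [
            {"role": "system",
             "content": f"Convert the user's request into a single {os_name} shell command."},
            {"role": "user", "content": query},
        ],
        "temperature": 0,
    }

payload = build_chat_request("find large files")
print(json.dumps(payload, indent=2))
```

POSTing this JSON to http://localhost:1234/v1/chat/completions is the standard shape for OpenAI-compatible servers such as Msty Studio.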

Using MLX (Apple Silicon) - Optimized for Mac

  1. Install and start MLX server

    • Set up an MLX-compatible server running locally
    • MLX server typically runs on http://localhost:11973/v1
  2. First-time setup

    • Run any x or ! command to trigger the setup wizard
    • Choose option 2 for "MLX - Apple Silicon optimized local AI"
    • Enter your MLX API base URL (default: http://localhost:11973/v1)
    • Enter your model name (default: mlx-community/Qwen2.5-0.5B-Instruct-4bit)
    • Configuration will be saved to ~/.solai.env
  3. Run a command:

x find large files
# or
! find large files

Using OpenAI - Fastest Cloud Option

  1. First-time setup

    • Run any x or ! command to trigger the setup wizard
    • Choose option 3 for "OpenAI - Hyper Speed Most Efficient (Fastest)"
    • Get your API key from: https://platform.openai.com/api-keys
    • Configuration will be securely stored in ~/.solai.env
  2. Run a command:

x find large files
# or
! find large files

Example output:

Suggested command:
find ~ -type f -size +100M
→ Searches your home directory for files larger than 100 megabytes

Do you want to execute this command? [y/n]:
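
The built-in safety confirmation shown above can be sketched roughly as follows (function names here are hypothetical; the actual tool uses click and rich for prompting and formatting):

```python
import subprocess
from typing import Optional

def confirm(answer: str) -> bool:
    """Interpret a y/n response; anything other than y/yes counts as no."""
    return answer.strip().lower() in ("y", "yes")

def maybe_run(command: str, answer: str) -> Optional[int]:
    """Execute the suggested command only if the user confirmed; return its exit code."""
    if not confirm(answer):
        return None
    return subprocess.run(command, shell=True).returncode

print(maybe_run("echo hello", "n"))  # None  (declined commands never run)
print(maybe_run("echo hello", "y"))  # prints hello, then 0
```

Defaulting to "no" on anything but an explicit yes is the conservative choice for a tool that executes generated shell commands.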


Usage Examples

# Find files
x find all pdf files in downloads
! find all pdf files in downloads

# System maintenance
x clean up system cache
! clean up system cache

# Network commands
x check if google.com is up
! check if google.com is up

# File operations
x create a backup of my documents
! create a backup of my documents

# With admin privileges
x --admin install package
x -a update system
! --admin install package

Development

To install in development mode:

git clone https://github.com/caraveo/solai.git
cd solai
pip install -e .

Requirements

  • Python 3.6+
  • For Local AI: Msty Studio installed and running
  • For MLX: MLX server installed and running (Apple Silicon optimized)
  • For OpenAI: an OpenAI API key
  • Required packages:
    • click
    • python-dotenv
    • openai
    • rich

Configuration

Configuration is stored in ~/.solai.env. The setup wizard will guide you through the initial configuration.

Local AI Configuration (Msty Studio)

AI_PROVIDER=local
API_BASE_URL=http://localhost:1234/v1
API_KEY=not-needed
MODEL=mistral

MLX Configuration (Apple Silicon)

AI_PROVIDER=mlx
API_BASE_URL=http://localhost:11973/v1
API_KEY=not-needed
MODEL=mlx-community/Qwen2.5-0.5B-Instruct-4bit

OpenAI Configuration

AI_PROVIDER=openai
API_KEY=your-api-key-here
MODEL=gpt-3.5-turbo
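
Whichever provider is selected, x reads these settings from ~/.solai.env at startup (the project lists python-dotenv as a dependency for this). A stdlib-only sketch of the parsing, for illustration:

```python
def parse_env(text: str) -> dict:
    """Parse simple KEY=VALUE lines, skipping blanks and # comments."""
    config = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        config[key.strip()] = value.strip()
    return config

# In the real tool the text would be read from ~/.solai.env (via python-dotenv).
sample = "AI_PROVIDER=openai\nAPI_KEY=your-api-key-here\nMODEL=gpt-3.5-turbo\n"
config = parse_env(sample)
print(config["MODEL"])  # gpt-3.5-turbo
```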

To reconfigure, delete ~/.solai.env and run any x or ! command to trigger the setup process again.

Contributing

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -am 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

MIT License

Contact

Jon Caraveo - caraveo@me.com

Project Link: https://github.com/caraveo/solai


Download files

Download the file for your platform.

Source Distribution

xcli_ai-1.0.0.tar.gz (13.7 kB)

Uploaded Source

Built Distribution


xcli_ai-1.0.0-py3-none-any.whl (12.2 kB)

Uploaded Python 3

File details

Details for the file xcli_ai-1.0.0.tar.gz.

File metadata

  • Download URL: xcli_ai-1.0.0.tar.gz
  • Size: 13.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.9

File hashes

Hashes for xcli_ai-1.0.0.tar.gz

  • SHA256: bb1aa2c7a1cbfa6fa22e9a5211431919fe8ac09ed32f170509db320442b70fcd
  • MD5: a9ec4be7546065774f38d446174eb0fe
  • BLAKE2b-256: e9456af3b8a18c32370b498f5915a62bdcae52a1d83550f4642eb0c8fe706275


File details

Details for the file xcli_ai-1.0.0-py3-none-any.whl.

File metadata

  • Download URL: xcli_ai-1.0.0-py3-none-any.whl
  • Size: 12.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.9

File hashes

Hashes for xcli_ai-1.0.0-py3-none-any.whl

  • SHA256: 092c1e196942df8bf265c16d3805ae57767d694f460748cb575ac91c7429718d
  • MD5: 69a6a0ec6a062b792052b32f143506da
  • BLAKE2b-256: 8e777775b39b741afcac56e1ad3fe23e9642e5bc4ccbecacbbe16731e757d882

