AI command-line agent for terminal automation and task execution
VEXIS-CLI-1
AI-Powered Command Line Interface for Intelligent Automation
VEXIS-CLI-1 is an AI agent derived from VEXIS-1.1 that performs tasks through command execution. It leverages large language models to intelligently interpret natural language commands and execute them through terminal operations, enabling automated workflow management and system administration.
Quick Start • Documentation • Models • Configuration • Contributing
Key Features
AI-Powered Intelligence
- Natural Language Processing: Execute commands using plain English descriptions
- Context-Aware Execution: Understands your workflow and adapts to your needs
- Multi-Model Support: Compatible with 150+ AI models from 20 major providers
- Smart Verification: Automatic task completion validation with confidence scoring
Advanced Automation
- Two-Phase Engine: Planning and execution phases for reliable task completion
- Cross-Platform Compatibility: Works seamlessly on macOS, Linux, and Windows
- GUI Automation: Integrate terminal commands with graphical interface interactions
- Screenshot Integration: Visual context capture for enhanced understanding
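The two-phase engine listed above can be sketched as a plan-then-execute loop. The class and function names below are illustrative assumptions, not the actual VEXIS-CLI internals:

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch of a two-phase (plan, then execute) agent loop.
# None of these names come from the real VEXIS-CLI codebase.
@dataclass
class Task:
    goal: str
    steps: List[str] = field(default_factory=list)

def plan(goal: str) -> Task:
    # Phase 1: a real agent would ask an LLM to break the goal into
    # shell commands; hard-coded here so the sketch is self-contained.
    return Task(goal=goal, steps=["ls *.py"])

def execute(task: Task) -> List[str]:
    # Phase 2: run each planned step and collect output for verification.
    results = []
    for step in task.steps:
        results.append(f"ran: {step}")  # a real agent would spawn a subprocess
    return results

task = plan("List all Python files in the current directory")
print(execute(task))
```

Separating planning from execution lets the agent validate a whole command sequence before anything touches the system, which is what enables the completion verification mentioned above.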
Developer Experience
- Rich Terminal UI: Beautiful, informative output with progress indicators
- Flexible Configuration: YAML-based settings with environment variable overrides
- Extensible Architecture: Plugin-ready design for custom integrations
- Comprehensive Logging: Structured logging for debugging and monitoring
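As an illustration of the YAML-plus-environment-variable configuration mentioned above, a settings file might look like the following. The key names are assumptions for the sketch, not the documented VEXIS-CLI schema:

```yaml
# Hypothetical config.yaml; key names are illustrative only
model:
  name: llama3.2:latest     # any supported Ollama model tag
  provider: ollama
logging:
  level: INFO
# Any value here could be overridden by an environment variable,
# e.g. VEXIS_MODEL_NAME=qwen2.5:7b (variable name assumed)
```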
🚀 Quick Start
Prerequisites
- Python 3.8 or higher
- Ollama installed and running (for local AI models)
- Git (for cloning the repository)
Installation

1. Clone the repository:

   ```bash
   git clone https://github.com/AInohogosya-team/VEXIS-CLI-1.git
   cd VEXIS-CLI-1
   ```

2. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

3. Set up Ollama (optional, for local models):

   ```bash
   # Install Ollama
   curl -fsSL https://ollama.com/install.sh | sh
   # Pull a recommended model
   ollama pull llama3.2:latest
   ```

4. Run VEXIS-CLI:

   ```bash
   python run.py
   ```
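If you prefer to keep the dependencies isolated, the install can be done inside a virtual environment first. This uses only standard Python tooling and is not specific to VEXIS-CLI:

```shell
# Optional: create and activate a virtual environment before installing
python3 -m venv .venv
. .venv/bin/activate        # on Windows: .venv\Scripts\activate
python -c "import sys; print(sys.prefix)"   # confirms the venv is active
```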
Your First Command
```bash
# Start the interactive interface
python run.py

# Or use direct commands
vexis-cli "List all Python files in the current directory"
```
Supported AI Models
VEXIS-CLI-1 supports 150+ models from 20 major providers through Ollama:
Core Providers
- Meta: Llama 3.1/3.2 series (8B, 70B, 1B, 3B variants)
- Google: Gemma 2/3 series (1B-27B parameters, multimodal capabilities)
- DeepSeek: R1/V3/Coder series (8B-671B, reasoning and coding specialists)
- Microsoft: Phi-3/4 series (3.8B-14B, efficient small models)
- Mistral: Mistral/Large/Ministral series (7B-675B, European open-source leader)
Advanced Providers
- Alibaba (Qwen): Qwen 2.5/3 series (0.5B-235B, multilingual with 128K+ context)
- IBM: Granite/Code series (350M-34B, enterprise-grade models)
- BigCode: StarCoder 2 series (3B-15B, specialized for code generation)
- Cohere: Command R series (7B-35B, retrieval-augmented generation)
- 01.AI: Yi series (1.5B-34B, bilingual models)
Specialized Models
- Vision-Language: LLaVA, Moondream, Qwen3-VL (7B-235B)
- Coding: DeepSeek Coder, Qwen Coder, Granite Code, StarCoder 2
- Agentic: Hermes 3, Reflection, Devstral Small 2 (3B-405B)
- Cloud-Only: GPT-OSS, Gemini 3, GLM-5, MiniMax, Kimi (20B-744B)
Cloud & Local Models
- Local Models: Run entirely on your machine with Ollama
- Cloud Models: Access high-performance models via API
- Hybrid Mode: Seamlessly switch between local and cloud models
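Hybrid mode can be thought of as a model-selection switch. The sketch below is illustrative only: the `VEXIS_CLOUD_API_KEY` variable name and the selection logic are assumptions, not documented VEXIS-CLI behavior; the model tags come from the lists above:

```python
import os

# Hypothetical selector illustrating hybrid local/cloud switching.
# VEXIS_CLOUD_API_KEY is an assumed variable name, not a documented one.
def select_model(env=None) -> str:
    env = os.environ if env is None else env
    if env.get("VEXIS_CLOUD_API_KEY"):
        return "deepseek-v3:671b"   # cloud: high-performance MoE model
    return "llama3.2:latest"        # local default served by Ollama

print(select_model())
```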
Complete Model List
Popular Local Models:
- `llama3.2:latest` (3B) - Balanced performance with 128K context
- `qwen2.5:7b` - Multilingual capabilities with 128K context
- `mistral:7b` - Fast and efficient with 32K context
- `deepseek-r1:8b` - Advanced reasoning with 128K context
- `gemma2:9b` - High-performing with 8K context
- `phi3:mini` - Efficient small model with 4K context
High-Performance Cloud Models:
- `deepseek-v3:671b` - State-of-the-art MoE with 160K context
- `qwen3:235b` - Advanced MoE with 256K context
- `mistral-large-3:675b-cloud` - Multimodal enterprise model
- `gpt-oss:120b-cloud` - Frontier performance
- `gemini-3-flash-preview:cloud` - Built for speed
Usage Examples
Quick Start
```bash
# Start the interactive interface
python run.py

# Or use direct commands
vexis-cli "List all Python files in the current directory"
```
For detailed usage examples, see our Detailed Guide.
Documentation
- 📖 Detailed Guide - Comprehensive usage examples and advanced features
- 🔧 Troubleshooting - Common issues and solutions
- 📚 API Reference
- 🏗️ Architecture
- ⚙️ Configuration
- 🤝 Contributing
Community
- GitHub Issues: Report bugs and request features
- Discussions: Join the community discussion
License
This project is licensed under the MIT License - see the LICENSE file for details.
Support
- 📧 Email: AInohogosya@proton.me
- X: AInohogosya
- Home Page: https://ainohogosya.github.io/home-page/
Made with ❤️ by the AInohogosya-team
Project details
Download files
Source Distribution
Built Distribution
File details
Details for the file vexis_cli-1.0.7.tar.gz.
File metadata
- Download URL: vexis_cli-1.0.7.tar.gz
- Upload date:
- Size: 101.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.9.6
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `e84df59cca7361bf7f407d68ffda11b13d64c98f5611d39290103b9b94efb14e` |
| MD5 | `66bc38523ed1a97f535507b1d83c4ce6` |
| BLAKE2b-256 | `3c2c3c6e007dc3c6479d61298d6b049f56319428b3ae97085fe019bdc9080215` |
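A downloaded archive can be checked against the published digests using only Python's standard `hashlib`; the filename below assumes the sdist was saved in the current directory:

```python
import hashlib

# SHA256 digest published for vexis_cli-1.0.7.tar.gz (from the table above)
EXPECTED_SHA256 = "e84df59cca7361bf7f407d68ffda11b13d64c98f5611d39290103b9b94efb14e"

def sha256_of(path: str) -> str:
    # Hash the file in chunks so large archives don't need to fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# After downloading, compare against the published value:
# assert sha256_of("vexis_cli-1.0.7.tar.gz") == EXPECTED_SHA256
```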
File details
Details for the file vexis_cli-1.0.7-py3-none-any.whl.
File metadata
- Download URL: vexis_cli-1.0.7-py3-none-any.whl
- Upload date:
- Size: 121.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.9.6
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `5453e885fae1e865ac3823980a313896d5bb86eeccd42c288bb9b5578eeb164b` |
| MD5 | `a40316644a40b5c1a96802af9f55b66b` |
| BLAKE2b-256 | `c573a4c58c83086dcbf5eca0d783da9ca7d27a6576e7907f888ca509e73ae8e7` |