
ARIA CLI v0.2.0

    █████╗ ██████╗ ██╗ █████╗      ██████╗██╗     ██╗
   ██╔══██╗██╔══██╗██║██╔══██╗    ██╔════╝██║     ██║
   ███████║██████╔╝██║███████║    ██║     ██║     ██║
   ██╔══██║██╔══██╗██║██╔══██║    ██║     ██║     ██║
   ██║  ██║██║  ██║██║██║  ██║    ╚██████╗███████╗██║
   ╚═╝  ╚═╝╚═╝  ╚═╝╚═╝╚═╝  ╚═╝     ╚═════╝╚══════╝╚═╝

Advanced Recursive Intelligence Architecture

PyPI · npm · MIT License · Python 3.9+


Natural Language Quantum AI Command Line Interface with full Q0-Q38 Cognitive Architecture

✨ What's New in v0.2.0

  • 39 Cognitive Layers (Q0-Q38) - Four domains: Foundation, Reasoning, Action, Transcendence
  • Multi-backend LLM Integration - OpenAI, Anthropic, Ollama, llama.cpp, Transformers
  • Q38 Cluster Connector - Native integration with 64-direction distributed processing
  • CLI-GUI Sync - Real-time synchronization between CLI and GUI interfaces
  • Plugin System - Extensible architecture for custom functionality
  • Rich Terminal UI - Beautiful output with typer + rich
  • 🔐 Authentication System - Secure login with local, API key, and SSH key auth
  • 🔗 SSH Router - Multi-host SSH management, tunnels, and remote execution

📦 Installation

From PyPI (Recommended)

pip install aria-cli

With LLM Support

pip install aria-cli[openai]      # OpenAI backend
pip install aria-cli[anthropic]   # Anthropic backend
pip install aria-cli[local]       # Local models (llama.cpp, transformers)
pip install aria-cli[full]        # All backends + rich UI

From npm (Node.js wrapper)

npm install -g @alphamatt/aria-cli

From Source

git clone https://github.com/universal-crown-prime/aria-cli.git
cd aria-cli
pip install -e .

🚀 Quick Start

# Ask a question
aria ask "What is quantum computing?"

# Search for information
aria search "trading systems"

# Show Q-System layers
aria layers

# Show system status
aria status

# Enter interactive chat mode
aria chat

# Remember something
aria remember "Meeting at 3pm tomorrow" --key meeting

# Recall memories
aria recall --key meeting

# Login
aria login

# SSH to a host
aria ssh connect production

🔐 Authentication

ARIA CLI includes a secure authentication system for managing credentials and sessions.

Login Commands

# Interactive login
aria login

# Login with API key
aria login --api-key sk-... --provider openai

# Login with SSH key
aria login --ssh-key ~/.ssh/id_rsa

# Check current user
aria whoami

# Logout
aria logout

Python API

from aria_cli import AuthManager, get_auth

# Get auth manager
auth = get_auth()

# Register a new user
auth.register("username", "password", email="user@example.com")

# Login
session = auth.login("username", "password")
print(f"Logged in as {session.user.username}")

# Check authentication
if auth.is_authenticated:
    print(f"Welcome, {auth.current_user.username}")

# Login with API key
session = auth.login_with_api_key("sk-...", provider="openai")

# Logout
auth.logout()

Credential Storage

Credentials are stored securely using:

  • System keyring (when available)
  • Encrypted file storage (fallback)
~/.aria/
├── session.json         # Current session (encrypted token)
├── .credentials         # Encrypted credential storage
└── .key                 # Encryption key (restricted permissions)

🔗 SSH Router

Manage SSH connections, execute remote commands, and create tunnels.

CLI Commands

# Add a host
aria ssh add production --hostname prod.example.com --user deploy --key ~/.ssh/id_rsa

# List hosts
aria ssh list

# Test connection
aria ssh test production

# Connect interactively
aria ssh connect production

# Execute remote command
aria ssh exec production "ls -la /var/www"

# Execute on multiple hosts
aria ssh exec production,staging "uptime"

# Create tunnel (local:8080 → remote:80)
aria ssh tunnel production --local 8080 --remote 80

# Upload file
aria ssh upload production ./local-file.txt /remote/path/

# Download file
aria ssh download production /remote/file.txt ./local-path/

Python API

from aria_cli import SSHRouter, SSHHost, get_router
from pathlib import Path

# Get router
router = get_router()

# Add a host
router.add_host(SSHHost(
    name="production",
    hostname="prod.example.com",
    username="deploy",
    key_file=Path("~/.ssh/id_rsa").expanduser(),
    port=22,
    tags=["prod", "web"],
))

# List hosts
hosts = router.list_hosts()
for host in hosts:
    print(f"{host.name}: {host.hostname}")

# Test connection
success, message = router.test_connection("production")
print(f"Connection: {message}")

# Execute command
result = router.execute("production", "ls -la")
print(result.stdout)
print(f"Exit code: {result.exit_code}")

# Execute on multiple hosts in parallel
results = router.execute_multi(
    ["production", "staging"],
    "uptime",
    parallel=True
)
for host, result in results.items():
    print(f"{host}: {result.stdout}")

# Create SSH tunnel
router.create_tunnel(
    "production",
    local_port=8080,
    remote_port=80,
)

# Create reverse tunnel
router.create_reverse_tunnel(
    "production",
    remote_port=9000,
    local_port=3000,
)

# Upload file
router.upload("production", "./local-file.txt", "/remote/path/")

# Download file
router.download("production", "/remote/file.txt", "./local-path/")

# Interactive session
router.connect_interactive("production")

# Close tunnels
router.close_tunnel("production", 8080)
router.disconnect_all()

SSH Configuration

The router automatically loads hosts from ~/.ssh/config:

# ~/.ssh/config
Host production
    HostName prod.example.com
    User deploy
    IdentityFile ~/.ssh/id_rsa
    Port 22

Host staging
    HostName staging.example.com
    User deploy
    ProxyJump bastion

Additional hosts can be stored in ~/.aria/ssh/hosts.json.
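The auto-loading described above amounts to reading `Host` blocks out of `~/.ssh/config`. A minimal pure-Python sketch of that parsing (the real router may rely on a library such as paramiko's `SSHConfig`; `parse_ssh_config` is illustrative and handles only whitespace-separated `Key value` lines):

```python
def parse_ssh_config(text: str) -> dict:
    """Parse Host blocks from ssh_config text into {alias: {option: value}}."""
    hosts: dict = {}
    current = None
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        key, _, value = line.partition(" ")
        key = key.lower()
        if key == "host":
            # start a new block keyed by the host alias
            current = hosts.setdefault(value.strip(), {})
        elif current is not None:
            current[key] = value.strip()
    return hosts
```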

📱 AriaSpace - Phone ↔ Computer Sync

AriaSpace is a personal codespace system for securely syncing files between your phone (Android/Termux) and computer. Think GitHub Codespaces, but for your personal devices.

Architecture

    Phone (Termux)                    Computer
    ┌─────────────────┐              ┌─────────────────┐
    │  AriaSpace      │    SSH       │  AriaSpace      │
    │  Client         │◄────────────►│  Server         │
    │                 │              │                 │
    │  /storage/      │   Sync       │  ~/aria-space/  │
    │  emulated/0/    │◄────────────►│  workspaces/    │
    │  Meta AI/       │              │                 │
    └─────────────────┘              └─────────────────┘
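Bidirectional sync of this kind typically decides a per-file direction by comparing each side's modification time against the state recorded at the last successful sync. A sketch of that decision, with `sync_action` as a hypothetical helper (AriaSpace's actual conflict handling may differ):

```python
def sync_action(local_mtime, remote_mtime, last_synced):
    """Decide the per-file direction for a bidirectional sync (illustrative).

    local_mtime / remote_mtime: current timestamps (None = file absent);
    last_synced: timestamp recorded at the previous successful sync.
    Returns "push", "pull", "conflict", or "skip".
    """
    local_changed = local_mtime is not None and (
        last_synced is None or local_mtime > last_synced
    )
    remote_changed = remote_mtime is not None and (
        last_synced is None or remote_mtime > last_synced
    )
    if local_changed and remote_changed:
        return "conflict"  # both sides edited since last sync
    if local_changed:
        return "push"      # computer -> phone
    if remote_changed:
        return "pull"      # phone -> computer
    return "skip"
```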

Termux Setup (On Android)

  1. Install Termux from F-Droid
  2. Run the ARIA setup script:
# Get the setup script
aria termux setup-script | bash

# Or manually:
pkg update && pkg install openssh -y
termux-setup-storage
sshd
  3. Note your IP address: ip addr show wlan0

Connect from Computer

# Discover devices on network
aria termux discover

# Or add device manually
aria termux add-host termux-phone 192.168.1.100 --port 8022

# Test connection
aria termux test termux-phone

# Find Meta AI folder
aria termux find-meta-ai termux-phone

# List Android folders
aria termux folders termux-phone

Create Workspace

# Quick setup for Meta AI folder
aria space create-meta-ai termux-phone

# Or create custom workspace
aria space create my-docs termux-phone "/storage/emulated/0/Documents" \
    --local ~/aria-space/my-docs

# List workspaces
aria space list

Sync Files

# Bidirectional sync
aria space sync meta-ai

# Download only (phone → computer)
aria space pull meta-ai

# Upload only (computer → phone)
aria space push meta-ai

# Preview changes without syncing
aria space sync meta-ai --dry-run

Browse & Manage

# List files with status
aria space files meta-ai

# Browse remote directory
aria space browse meta-ai --path images

# Check workspace status
aria space status meta-ai

Python API

from aria_cli import AriaSpace, TermuxSetup, get_space

# Setup Termux connection
setup = TermuxSetup()
setup.wizard()  # Interactive setup

# Or programmatically
setup.append_ssh_config("termux-phone", "192.168.1.100", port=8022)

# Create workspace
space = get_space()
ws = space.create_workspace(
    name="meta-ai",
    local_root="~/aria-space/meta-ai",
    remote_root="/storage/emulated/0/Meta AI",
    remote_host="termux-phone",
)

# List files
files = space.list_files("meta-ai")
for f in files:
    print(f"{f.status.value}: {f.relative_path}")

# Sync workspace
result = space.sync("meta-ai")
print(f"Uploaded: {result.files_uploaded}, Downloaded: {result.files_downloaded}")

# Download specific file
space.download_file("meta-ai", "images/photo.jpg")

# Upload file
space.upload_file("meta-ai", "notes.txt")

# Browse remote
entries = space.browse_remote("meta-ai", "images")
for entry in entries:
    print(f"{'📁' if entry['is_dir'] else '📄'} {entry['name']}")

# Watch for changes (auto-sync)
space.watch("meta-ai", interval=60)

Supported Android Folders

Folder       Path
Meta AI      /storage/emulated/0/Meta AI
Downloads    /storage/emulated/0/Download
Documents    /storage/emulated/0/Documents
Pictures     /storage/emulated/0/Pictures
DCIM         /storage/emulated/0/DCIM
Movies       /storage/emulated/0/Movies
Music        /storage/emulated/0/Music
Termux Home  /data/data/com.termux/files/home

Storage Structure

~/.aria/
├── spaces/
│   ├── workspaces.json      # Workspace configurations
│   ├── state/               # Sync state per workspace
│   │   ├── meta-ai.json
│   │   └── ...
│   └── snapshots/           # Workspace snapshots
│       └── meta-ai/
│           └── 20241222_143052.json
└── termux/
    ├── devices.json         # Discovered devices
    └── aria_termux_setup.sh # Setup script

🧠 Q-System Architecture

The Q-System is a 39-layer recursive cognitive architecture spanning four domains: Foundation (Q0-Q9), Reasoning (Q10-Q19), Action (Q20-Q29), and Transcendence (Q30-Q38).
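The domain boundaries can be expressed as a small lookup; `q_domain` is an illustrative helper matching the layer tables below, not part of the published API:

```python
# Layer ranges per domain, per the Q-System layer tables.
DOMAINS = [
    (range(0, 10), "Foundation"),
    (range(10, 20), "Reasoning"),
    (range(20, 30), "Action"),
    (range(30, 39), "Transcendence"),
]


def q_domain(layer: int) -> str:
    """Map a Q-layer index (0-38) to its domain name."""
    for span, name in DOMAINS:
        if layer in span:
            return name
    raise ValueError(f"unknown layer Q{layer}")
```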

Configuration

ARIA CLI stores configuration in ~/.aria/:

~/.aria/
├── config.json          # CLI configuration
├── sync_state.json      # CLI-GUI sync state
└── q-memory/            # Q-System memory storage
    ├── memory_*.json    # Saved memories
    └── ...

Configuration Options

Edit ~/.aria/config.json:

{
  "llm": {
    "backend": "openai",
    "model": "gpt-4o-mini",
    "temperature": 0.7
  },
  "q_system": {
    "default_layer": 8,
    "trace_enabled": true
  },
  "ui": {
    "theme": "dark",
    "show_layer_info": true
  }
}

Environment Variables

# LLM backends
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
export OLLAMA_HOST="http://localhost:11434"

# Q38 Cluster
export Q38_API_URL="http://localhost:8080"
export Q38_SYNC_PORT="9000"

🧩 Q-System Layers

Foundation (Q0-Q9)

Layer  Name          Description
Q0     VOID          The null state - pure potential
Q1     PERCEPTION    Raw sensory input processing
Q2     ATTENTION     Focus and salience detection
Q3     PATTERN       Pattern recognition
Q4     MEMORY_SHORT  Working memory
Q5     MEMORY_LONG   Long-term storage
Q6     ASSOCIATION   Concept linking
Q7     CONTEXT       Contextual understanding
Q8     LANGUAGE      Linguistic processing
Q9     EMOTION       Affective processing

Reasoning (Q10-Q19)

Layer  Name         Description
Q10    LOGIC        Formal logical reasoning
Q11    INFERENCE    Drawing conclusions
Q12    HYPOTHESIS   Generating hypotheses
Q13    ABSTRACTION  Abstract concept formation
Q14    ANALOGY      Analogical reasoning
Q15    CAUSATION    Causal reasoning
Q16    PREDICTION   Future projection
Q17    EVALUATION   Assessment
Q18    SYNTHESIS    Combining ideas
Q19    CREATIVITY   Novel generation

Action (Q20-Q29)

Layer  Name          Description
Q20    INTENTION     Goal setting
Q21    PLANNING      Strategy formation
Q22    DECISION      Choice making
Q23    EXECUTION     Action taking
Q24    MONITORING    Progress tracking
Q25    FEEDBACK      Loop processing
Q26    CORRECTION    Error correction
Q27    OPTIMIZATION  Performance tuning
Q28    LEARNING      Knowledge acquisition
Q29    ADAPTATION    Behavioral adjustment

Transcendence (Q30-Q38)

Layer  Name           Description
Q30    AWARENESS      Self-awareness
Q31    REFLECTION     Self-reflection
Q32    METACOGNITION  Thinking about thinking
Q33    IDENTITY       Self-identity modeling
Q34    VALUES         Value system and ethics
Q35    WISDOM         Applied knowledge
Q36    INTEGRATION    Holistic processing
Q37    EMERGENCE      Emergent properties
Q38    TRANSCENDENCE  Beyond individual layers

🔌 Python API

from aria_cli import QSystem, QLayer, QCommand, QOperator

# Initialize Q-System
q = QSystem()

# Create and execute a command
cmd = QCommand(
    operator=QOperator.ASK,
    payload={"question": "What is consciousness?"},
    layer=QLayer.Q30_AWARENESS,
)
result = q.execute(cmd)
print(result)

With LLM Integration

from aria_cli import QSystem, LLMConnector, LLMProvider

# Initialize with OpenAI
q = QSystem()
llm = LLMConnector(LLMProvider.OPENAI)

# Generate with Q-System context
response = llm.generate(
    "Explain quantum consciousness",
    system="You are ARIA at Q-Layer 38"
)
print(response)

NLP Processing

from aria_cli import NLPEngine

nlp = NLPEngine()

# Parse natural language
parsed = nlp.parse("Search for files containing quantum algorithms")
print(f"Intent: {parsed.intent}")       # Intent.SEARCH
print(f"Keywords: {parsed.keywords}")   # ['files', 'quantum', 'algorithms']
print(f"Q-Layer: Q{parsed.q_layer}")    # Q3 (PATTERN)

CLI-GUI Synchronization

from aria_cli import AriaConnector, SyncMode

# File-based sync
connector = AriaConnector(mode=SyncMode.FILE, source='cli')
connector.start()

# Update state
connector.update_state(current_layer=30)

# Stop sync
connector.stop()
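In file-based mode, both processes share state through `~/.aria/sync_state.json`. A sketch of the write side, assuming an atomic write-and-rename so neither process ever observes a partial file (`write_sync_state` is illustrative, not ARIA's API):

```python
import json
import os
import tempfile
from pathlib import Path


def write_sync_state(path: Path, **state) -> None:
    """Atomically publish shared state for file-based CLI<->GUI sync (sketch).

    Writing to a temporary file in the same directory and then renaming it
    over the target means a reader never sees a half-written JSON file.
    """
    path.parent.mkdir(parents=True, exist_ok=True)
    fd, tmp = tempfile.mkstemp(dir=path.parent)
    with os.fdopen(fd, "w") as f:
        json.dump(state, f)
    os.replace(tmp, path)  # atomic rename on POSIX and Windows
```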

🔄 Migration from v0.1.x

See MIGRATION.md for detailed upgrade instructions.

Key Changes

Feature            v0.1.x     v0.2.0
Q-System Layers    8          39 (Q0-Q38)
LLM Integration    Optional   Multi-backend
Package Structure  aria_cli/  src/aria_cli/
CLI Framework      argparse   typer + rich
Async Support      ❌         ✅
GUI Sync           ❌         ✅

📚 Commands Reference

Command                  Description
aria ask <question>      Ask a natural language question
aria search <query>      Search for information
aria chat                Enter interactive chat mode
aria layers              Show all 39 Q-System layers
aria status              Show system status
aria remember <content>  Save to memory
aria recall              List/retrieve memories
aria help                Show help

🔧 Development

Setup

git clone https://github.com/universal-crown-prime/aria-cli.git
cd aria-cli
pip install -e ".[dev]"

Testing

pytest

Building

pip install build
python -m build

📄 License

MIT License - see LICENSE for details.

🤝 Contributing

Contributions are welcome! Please see CONTRIBUTING.md for guidelines.

📬 Support
