
Project description

AI-Cortex

An agentic CLI coding assistant powered entirely by local LLMs via Ollama.

AI-Cortex gives you an agentic coding assistant that runs 100% locally — no API keys, no cloud, full privacy. It uses a local LLM (default: qwen3.5:9b) and an agentic loop that decides which tools to call, executes them, and feeds results back to the model until the task is done.

Features

  • Agentic loop — model autonomously decides which tools to call and chains them together
  • 6 built-in tools — read_file, write_file, run_bash, list_dir, search_files, fetch_url
  • Streaming output — responses appear token by token
  • Permission prompts — asks before running bash commands or writing files
  • Loop detection — catches the model repeating the same action
  • Error handling — retries on failures, catches malformed tool calls
  • Configurable — set the model and Ollama host via ~/.ai-cortex/config.yaml or CLI flags
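The built-in tools can be pictured as a name-to-handler registry that the agentic loop dispatches into. Below is a minimal sketch, not the package's actual API — the handler implementations, argument names, and JSON call format are assumptions for illustration:

```python
import json
from pathlib import Path

def read_file(path):
    """Return the text contents of a file."""
    return Path(path).read_text()

def list_dir(path="."):
    """Return directory entries, sorted for stable output."""
    return sorted(p.name for p in Path(path).iterdir())

# Registry mapping tool names to handlers. Only two are implemented here;
# write_file, run_bash, search_files, and fetch_url would follow the same shape.
TOOLS = {
    "read_file": read_file,
    "list_dir": list_dir,
}

def dispatch(call_json):
    """Execute a tool call of the (assumed) form {"tool": name, "args": {...}}."""
    call = json.loads(call_json)
    handler = TOOLS[call["tool"]]
    return handler(**call.get("args", {}))
```

Keeping tools in a flat registry like this is what lets the loop treat the model's output uniformly: parse the call, look up the handler, run it, feed the result back.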

Prerequisites

  • Ollama installed and running
  • A local model pulled (e.g. ollama pull qwen3.5:9b)

Install

pip install ai-cortex

Or install from source:

git clone https://github.com/hasanjawad001/ai-cortex.git
cd ai-cortex
uv venv --python 3.12
source .venv/bin/activate
uv pip install -e .

Usage

# Start ai-cortex
ai-cortex

# Skip permission prompts
ai-cortex --yolo

# Use a different model
ai-cortex --model=llama3:8b

Example

you> read pyproject.toml and tell me the project name
  → read_file({"path": "pyproject.toml"})
  [project]
  name = "ai-cortex"
  ...
ai-cortex> The project name is "ai-cortex", version 0.1.0.

Configuration

AI-Cortex creates ~/.ai-cortex/config.yaml on first run:

model: qwen3.5:9b
host: http://localhost:11434
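Loading that file amounts to merging its keys over built-in defaults. The sketch below hand-parses the flat two-key file with the standard library only; the real package presumably uses a YAML parser, and `load_config` is a hypothetical helper, not its actual API:

```python
from pathlib import Path

# Defaults mirror the generated config shown above.
DEFAULTS = {"model": "qwen3.5:9b", "host": "http://localhost:11434"}

def load_config(path="~/.ai-cortex/config.yaml"):
    """Merge flat "key: value" lines over the defaults.

    Only handles the flat two-key file above; a real implementation
    would use a YAML library.
    """
    cfg = dict(DEFAULTS)
    p = Path(path).expanduser()
    if p.exists():
        for line in p.read_text().splitlines():
            if ":" in line and not line.lstrip().startswith("#"):
                key, value = line.split(":", 1)  # maxsplit=1 keeps "http://..." intact
                cfg[key.strip()] = value.strip()
    return cfg
```

CLI flags like `--model` would then simply override the merged dict, which is the usual precedence: flags over config file over defaults.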

How It Works

  1. You type a request
  2. The LLM decides which tool(s) to call
  3. AI-Cortex executes the tool and feeds the result back
  4. The LLM either calls another tool or responds with a summary
  5. Repeat until done (max 15 turns)
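The five steps above can be sketched as a single loop. The stub model in the demo is scripted so the example runs offline; the real assistant streams replies from Ollama. Function names and message shapes here are illustrative assumptions, but the 15-turn cap and the repeated-action check mirror the behavior described:

```python
MAX_TURNS = 15

def agent_loop(model, tools, request):
    """Run the tool-calling loop described above with a pluggable model."""
    messages = [{"role": "user", "content": request}]
    last_call = None
    for _ in range(MAX_TURNS):
        reply = model(messages)                       # 2. model decides
        call = reply.get("tool_call")
        if call is None:                              # 4. final answer, no tool
            return reply["content"]
        if call == last_call:                         # loop detection
            return "stopped: model repeated the same action"
        last_call = call
        result = tools[call["name"]](**call["args"])  # 3. execute the tool
        messages.append({"role": "tool", "content": str(result)})
    return "stopped: turn limit reached"              # 5. max 15 turns

# Demo with a scripted stub model and a trivial tool:
replies = [
    {"tool_call": {"name": "echo", "args": {"text": "hi"}}},
    {"tool_call": None, "content": "done: hi"},
]
print(agent_loop(lambda msgs: replies.pop(0), {"echo": lambda text: text}, "say hi"))
# prints "done: hi"
```

The loop-detection check is deliberately crude (exact repetition of the previous call); anything smarter, such as hashing recent call history, follows the same pattern.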

License

MIT

Project details


Download files

Download the file for your platform.

Source Distribution

ai_cortex-0.1.0.tar.gz (6.0 kB)

Uploaded Source

Built Distribution


ai_cortex-0.1.0-py3-none-any.whl (7.5 kB)

Uploaded Python 3

File details

Details for the file ai_cortex-0.1.0.tar.gz.

File metadata

  • Download URL: ai_cortex-0.1.0.tar.gz
  • Upload date:
  • Size: 6.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.13

File hashes

Hashes for ai_cortex-0.1.0.tar.gz
Algorithm Hash digest
SHA256 4e2ff872925589f18ea2c4dda768afdd27edda9bf1e5f8b166cf3f08b1df28e6
MD5 e8212f5f0cc27b087e6512fd6ebfbb20
BLAKE2b-256 94a4d9cea9f0ca50965720fe0998365579f59936cd722767b70f839b82322873


File details

Details for the file ai_cortex-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: ai_cortex-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 7.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.13

File hashes

Hashes for ai_cortex-0.1.0-py3-none-any.whl
Algorithm Hash digest
SHA256 0268e2f9b894263eed5c0d3885f81a5676bb96d3f819ddc72815d9561eb74820
MD5 18bf14d1fdff0b11544babbc7e00ed79
BLAKE2b-256 4f543a6b03488a55f1c8ac68b0c1be8ec938b1a92a00b87dd4969a65a710bc5c

