
A CLI tool for agentic coding tasks using local LLMs with tool calling


Ollama Code CLI


Ollama Code CLI is an open-source AI agent that brings the power of local LLMs, served through Ollama, right to your terminal, with advanced tool-calling capabilities.


Features

  • 🎨 Elegant CLI Interface: Rich colors and structured output
  • 🤖 Local AI Power: Interact with local LLMs through Ollama
  • 🛠️ Tool Calling: Execute coding-related tools (file operations, code execution, etc.)
  • 🔒 Permission Prompts: Safety prompts before executing potentially dangerous operations
  • 💬 Interactive Mode: Maintain conversation context for multi-turn interactions
  • 📝 Markdown Support: Elegantly formatted responses with syntax highlighting
  • 📋 Structured Output: Clear panels and tables for tool calls and results

Installation

First, install a compatible model in Ollama:

# Choose one of these models:
ollama pull qwen3:4b
ollama pull qwen2.5:3b

Then install the CLI:

pip install ollama-code-cli

Requirements

  • Python 3.13+
  • Ollama installed and running
  • An Ollama model that supports tool calling (e.g., Qwen3 or Qwen2.5)

Usage

Start an interactive session:

ollama-code-cli --model qwen3:4b

Run a single command:

ollama-code-cli "Create a Python function to calculate factorial"

Use a specific model:

ollama-code-cli --model qwen3:4b "Explain how async/await works in Python"

Disable permission prompts (use with caution):

ollama-code-cli --no-permission "Create and run a Python script"

Security Features

The CLI includes built-in security features to protect against potentially dangerous operations:

Permission Prompts

By default, the CLI will ask for your permission before executing potentially dangerous operations such as:

  • Writing or modifying files
  • Executing code
  • Running shell commands
  • Running Python files

Safe Operations

These operations are considered safe and don't require permission:

  • Reading files
  • Listing directory contents

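The safe/dangerous split above amounts to an allow-list check before each tool call. Below is a minimal sketch of that gate; the set contents follow the lists in this section, but the function name and structure are illustrative assumptions, not the package's actual code:

```python
# Illustrative sketch of the permission gate described above.
# Tool names come from the Available Tools list; the function itself
# is an assumption, not the package's real implementation.
SAFE_TOOLS = {"read_file", "list_files"}

def needs_permission(tool_name: str, no_permission: bool = False) -> bool:
    """Return True when the CLI should prompt before running a tool.

    no_permission mirrors the --no-permission flag: when set, every
    tool runs without confirmation.
    """
    if no_permission:
        return False
    return tool_name not in SAFE_TOOLS
```

Anything not explicitly marked safe triggers a prompt, so newly added tools are dangerous by default.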
Bypassing Permission Prompts

You can disable permission prompts using the --no-permission flag, but this should be used with caution:

ollama-code-cli --no-permission "Your prompt here"

Warning: Disabling permission prompts allows the AI to execute operations without user confirmation. Only use this in trusted environments.


Available Tools

  • read_file: Read the contents of a file
  • write_file: Write content to a file
  • execute_code: Execute code in a subprocess
  • list_files: List files in a directory
  • run_command: Run a shell command
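When a tool-calling model decides to use one of these tools, it emits the tool's name plus JSON arguments, and the CLI routes that call to a local implementation. A minimal sketch of such a dispatcher, assuming the five tool names above; the bodies are stand-ins, not the package's code in tool_manager.py:

```python
import subprocess
import sys
from pathlib import Path

# Illustrative stand-ins for the five tools listed above; the real
# implementations in tool_manager.py may behave differently.
def read_file(path: str) -> str:
    return Path(path).read_text()

def write_file(path: str, content: str) -> str:
    Path(path).write_text(content)
    return f"wrote {len(content)} characters to {path}"

def list_files(directory: str = ".") -> str:
    return "\n".join(sorted(p.name for p in Path(directory).iterdir()))

def execute_code(code: str) -> str:
    # Run the snippet in a fresh interpreter subprocess.
    out = subprocess.run([sys.executable, "-c", code],
                         capture_output=True, text=True)
    return out.stdout + out.stderr

def run_command(command: str) -> str:
    out = subprocess.run(command, shell=True, capture_output=True, text=True)
    return out.stdout + out.stderr

TOOLS = {f.__name__: f for f in
         (read_file, write_file, list_files, execute_code, run_command)}

def dispatch(name: str, arguments: dict) -> str:
    """Route a model-issued tool call (name + JSON arguments) to a tool."""
    if name not in TOOLS:
        return f"unknown tool: {name}"
    return TOOLS[name](**arguments)
```

The dispatcher's string result is what gets sent back to the model as the tool's output for the next turn.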

Examples

1. Create a Python script and save it to a file:

ollama-code-cli "Create a Python script that calculates factorial and save it to a file named factorial.py"

2. Read a file and explain its contents:

ollama-code-cli "Read the contents of main.py and explain what it does"

3. Execute a shell command:

ollama-code-cli "List all files in the current directory"

Interactive Mode

Launch the interactive mode for a conversational experience:

ollama-code-cli

In interactive mode, you can:

  • Have multi-turn conversations with the AI
  • See elegantly formatted responses with Markdown support
  • Watch tool calls and results in real-time with visual panels
  • Clear conversation history with the clear command
  • Exit gracefully with the exit command
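Multi-turn context is typically kept as a growing list of role-tagged messages that is resent to the model each turn. A hedged sketch of how the clear and exit commands might interact with that history (the function name and return values are illustrative, not the package's code):

```python
def handle_input(line: str, history: list) -> str:
    """Hypothetical sketch of interactive-mode input handling.

    'clear' empties the conversation history, 'exit' ends the session,
    and anything else is appended as a user turn for the model.
    """
    command = line.strip().lower()
    if command == "clear":
        history.clear()
        return "cleared"
    if command == "exit":
        return "exited"
    history.append({"role": "user", "content": line})
    return "queued"
```

Because the whole history list accompanies every request, clearing it is what resets the model's context.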

Project Structure

ollama-code-cli/
├── ollama_code_cli/
│   ├── __init__.py
│   ├── cli/
│   │   ├── __init__.py
│   │   └── cli.py          # Main CLI interface
│   ├── tools/
│   │   ├── __init__.py
│   │   └── tool_manager.py # Tool implementations
├── pyproject.toml          # Project configuration
├── LICENSE
└── README.md

Dependencies


Contributing

Contributions are welcome! Please open an issue or submit a pull request for any improvements, bug fixes, or suggestions.


License

This project is licensed under the MIT License.

Download files

Source Distribution

ollama_code_cli-1.0.3.tar.gz (11.0 kB)

Built Distribution

ollama_code_cli-1.0.3-py3-none-any.whl (14.2 kB)

File details: ollama_code_cli-1.0.3.tar.gz

  • Size: 11.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.8.14

Hashes for ollama_code_cli-1.0.3.tar.gz

  • SHA256: c822aac6abada4f460af639d22779d9729bdff35eace2b604b88310eb8193b89
  • MD5: eb621409b90ee32ccb807d32ee96d7c9
  • BLAKE2b-256: 9f764613f9105686fea485eed902eb1fc43878fbb5b6e7bb31a3ee2152b3616b

Hashes for ollama_code_cli-1.0.3-py3-none-any.whl

  • SHA256: 87cb5446f16945eacf160aa47f47a8ef32e96718e1ca60ef28a05e9be81b5bcb
  • MD5: 59e971b9cf1acd12772dec38eeafedc0
  • BLAKE2b-256: 074afe5f6ad2fb6d8176cf9703692c27ed92b0f5f6bd77c515155a34c409341a
