
A next-generation AI-integrated TUI terminal emulator.


Null Terminal

"Shell in the Void."

Requires Python 3.12+ · Built with Textual

> [!NOTE]
> Null Terminal is under active development and not yet officially released. Features and APIs may change.

Null is a next-generation TUI (Terminal User Interface) designed for the modern AI-integrated workflow. Built on Textual, it blends the raw power of the command line with the intelligence of LLMs, all wrapped in a sleek, cyber-noir aesthetic.

Null Terminal Demo

Why Null?

  • Two Modes, One Interface: Seamlessly switch between CLI and AI mode with Ctrl+Space
  • Block-Based Output: Every command and response is a distinct, interactive block
  • 20+ AI Providers: From local Ollama to cloud providers like OpenAI, Anthropic, and Google
  • Agent Mode: Let the AI execute multi-step tasks autonomously
  • MCP Integration: Extend AI capabilities with Model Context Protocol servers

Features

AI Integration

| Feature | Description |
| --- | --- |
| Multi-Provider | Ollama, OpenAI, Anthropic, Google, Azure, Bedrock, Groq, Mistral, DeepSeek, and more |
| Agent Mode | Autonomous multi-step task execution with tool calling |
| RAG / Code Search | Index your codebase with `/index build` for semantic search |
| Reasoning Display | See the AI's thinking process for compatible models |
| Context Inspector | View exactly what the AI sees with `/context` |
| Cost Tracking | Real-time token usage and cost display in the status bar |

Developer Tools

| Feature | Description |
| --- | --- |
| Task Manager | Integrated todo dashboard (`/todo`) |
| Prompt Editor | Custom system prompts and personas (`/prompts`) |
| Git Integration | Branch and status in the status bar |
| File Explorer | Sidebar file tree (`Ctrl+\`) |
| Session Export | Export conversations to Markdown/JSON |
| SSH Manager | Save and connect to remote hosts |

MCP (Model Context Protocol)

| Feature | Description |
| --- | --- |
| Server Catalog | Pre-configured popular MCP servers |
| Tool Discovery | Automatic tool registration from servers |
| Resource Access | Read external resources (databases, APIs) |
| Management UI | `/mcp` commands for full control |

UX

| Feature | Description |
| --- | --- |
| Block Interface | Distinct visual blocks for each interaction |
| 10+ Themes | Null Dark, Monokai, Dracula, and custom themes |
| Command Palette | Quick access with `Ctrl+P` |
| Interactive TUI | Run `vim`, `htop`, `ssh` inside blocks |
| History Search | `Ctrl+R` for command history |

Quick Start

Installation

Via pipx (Recommended):

```shell
pipx install null-terminal
null
```

Via Docker:

```shell
docker run -it --rm ghcr.io/starhound/null-terminal:latest
```

From Source:

```shell
git clone https://github.com/starhound/null-terminal.git
cd null-terminal
uv sync
uv run main.py
```

See Installation Guide for Windows, advanced options, and troubleshooting.

First Run

  1. Configure AI Provider: Type /settings or press F3
  2. Select a Model: Press F2 or type /model
  3. Toggle AI Mode: Press Ctrl+Space to switch between CLI and AI

Basic Usage

```text
# CLI Mode (default)
ls -la                    # Run shell commands
cd ~/projects             # Navigate directories

# AI Mode (Ctrl+Space to toggle)
Explain this error        # Ask questions
Refactor this function    # Get code help

# Slash Commands (always available)
/help                     # Show help
/model                    # Select AI model
/agent                    # Toggle agent mode
/todo                     # Task manager
/theme dracula            # Change theme
```

Keyboard Shortcuts

| Shortcut | Action |
| --- | --- |
| `Ctrl+Space` | Toggle CLI / AI mode |
| `Ctrl+P` | Command palette |
| `Ctrl+\` | Toggle file sidebar |
| `Ctrl+R` | History search |
| `Ctrl+F` | Search blocks |
| `Ctrl+L` | Clear history |
| `F1` | Help screen |
| `F2` | Model selector |
| `F3` | Theme selector |
| `F4` | Provider selector |
| `Escape` | Cancel / Close |

AI Providers

Local (Free)

| Provider | Setup |
| --- | --- |
| Ollama | `ollama pull llama3.2`, then `/provider ollama` |
| LM Studio | Start the server, then `/provider lm_studio` |
| Llama.cpp | Start the server, then `/provider llama_cpp` |

Cloud

| Provider | Models |
| --- | --- |
| OpenAI | GPT-4o, GPT-4 Turbo, o1 |
| Anthropic | Claude 3.5 Sonnet, Claude 3 Opus |
| Google | Gemini 2.0 Flash, Gemini 1.5 Pro |
| Azure | Azure OpenAI deployments |
| AWS Bedrock | Claude, Titan, Llama |
| Groq | Llama 3.3, Mixtral (fast) |
| Mistral | Mistral Large, Codestral |
| DeepSeek | DeepSeek Chat, DeepSeek Coder |

See Providers Guide for full list and configuration.


Agent Mode

Enable autonomous task execution:

```text
/agent                    # Toggle agent mode

# Then ask:
"Create a Python script that fetches weather data and saves it to weather.json"
```

The agent will:

  1. Plan the approach
  2. Execute tools (read/write files, run commands)
  3. Iterate until the task is complete

Safety features:

  • Tool approval prompts for dangerous operations
  • Maximum 10 iterations per task
  • Cancel anytime with Escape
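Conceptually, the plan/execute/iterate loop with an iteration cap can be sketched in a few lines of Python. This is a minimal illustration only: `call_llm`, the message format, and the tool registry below are hypothetical stand-ins, not Null Terminal's actual internals.

```python
# Minimal agent-loop sketch. `call_llm` and the tool registry are
# hypothetical stand-ins, not Null Terminal's actual API.
MAX_ITERATIONS = 10  # mirrors the documented safety cap

def run_agent(task, call_llm, tools):
    """Plan -> execute tools -> iterate until done or the cap is hit."""
    history = [{"role": "user", "content": task}]
    for _ in range(MAX_ITERATIONS):
        step = call_llm(history)                 # model picks the next action
        if step.get("done"):                     # task complete
            return step["result"]
        output = tools[step["tool"]](**step.get("args", {}))
        history.append({"role": "tool", "content": str(output)})
    return "stopped: iteration cap reached"      # safety: never loops forever
```

The cap guarantees the loop terminates even if the model never declares the task done; a real implementation would also gate dangerous tool calls behind approval prompts, as described above.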

MCP Integration

Add external tools via Model Context Protocol:

```text
/mcp catalog              # Browse available servers
/mcp add                  # Add a server manually
/mcp tools                # List available tools
```

Popular MCP servers:

  • Brave Search - Web search
  • Filesystem - File operations
  • PostgreSQL - Database queries
  • GitHub - Repository management
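For reference, a manually added server entry might look like the following `mcp.json` fragment. The schema here is an illustrative sketch in the common MCP-client style, not necessarily Null's exact format, and the path is a placeholder:

```json
{
  "servers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/home/user/projects"]
    }
  }
}
```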

Configuration

Settings are stored in ~/.null/:

| File | Purpose |
| --- | --- |
| `config.json` | User preferences |
| `null.db` | Sessions, encrypted API keys |
| `mcp.json` | MCP server configs |
| `themes/` | Custom themes |
| `prompts/` | Custom system prompts |
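As an illustration, `config.json` might hold preferences like these. The key names below are hypothetical; `/settings` is the authoritative place to change options:

```json
{
  "provider": "ollama",
  "model": "llama3.2",
  "theme": "null-dark",
  "agent": {
    "max_iterations": 10
  }
}
```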

Documentation

| Guide | Description |
| --- | --- |
| User Guide | Complete usage instructions |
| Commands Reference | All slash commands |
| Providers Guide | AI provider setup |
| MCP Guide | MCP server configuration |
| Themes Guide | Customizing appearance |
| SSH Guide | Remote connections |

For Contributors

| Guide | Description |
| --- | --- |
| Architecture | System design overview |
| Development | Dev environment setup |
| Contributing | How to contribute |
| Feature Specs | Planned features |

Tech Stack

  • Textual - TUI framework
  • httpx - Async HTTP client
  • Python 3.12+ - Async/await, type hints
  • SQLite - Local storage
  • Fernet - API key encryption
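To illustrate the last item: Fernet (from the `cryptography` package) provides authenticated symmetric encryption, a natural fit for keeping API keys unreadable at rest. The snippet below is a generic sketch of the pattern, not Null's actual key-handling code:

```python
# Generic Fernet round-trip: encrypt a secret for storage, decrypt on load.
# Illustrative pattern only, not Null Terminal's actual implementation.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice, kept separate from the DB
cipher = Fernet(key)

token = cipher.encrypt(b"sk-example-api-key")   # ciphertext safe to persist
assert cipher.decrypt(token) == b"sk-example-api-key"
```

Fernet tokens are authenticated, so a tampered ciphertext fails to decrypt rather than yielding garbage.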

Contributing

We welcome contributions! See CONTRIBUTING.md for guidelines.

```shell
# Development setup
git clone https://github.com/starhound/null-terminal.git
cd null-terminal
uv sync
uv run pytest                    # Run tests
uv run main.py                   # Run app
uv run textual console           # Debug console
```

Built with 🖤 by Starhound

Star us on GitHub



Download files

Download the file for your platform.

Source Distribution

null_terminal-0.0.6.tar.gz (7.2 MB)


Built Distribution


null_terminal-0.0.6-py3-none-any.whl (370.6 kB)


File details

Details for the file null_terminal-0.0.6.tar.gz.

File metadata

  • Download URL: null_terminal-0.0.6.tar.gz
  • Upload date:
  • Size: 7.2 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.14.2

File hashes

Hashes for null_terminal-0.0.6.tar.gz
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `0da25cb76dbbb3f536a9634ba2010b6acd6fecd8aa1c00df0743cfd83bb75c28` |
| MD5 | `8a6ce3eeab7d6fc5d691d39b5442f30b` |
| BLAKE2b-256 | `d6623c71e89255083339544bb4f4136841952e69ace8dda9aaab5da4523f1c1f` |


File details

Details for the file null_terminal-0.0.6-py3-none-any.whl.

File metadata

  • Download URL: null_terminal-0.0.6-py3-none-any.whl
  • Upload date:
  • Size: 370.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.14.2

File hashes

Hashes for null_terminal-0.0.6-py3-none-any.whl
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `bdf2ea7ce796fb41d58da64d149bca9b7d9b828dff81921393c57337aa637465` |
| MD5 | `a7cc0ed3287ce054444900fbaa39bcc4` |
| BLAKE2b-256 | `e9641230df7f30b0ce762cf6c1b2abb38a5c747f409fec76ced09ab4449b228a` |

