# Null Terminal

> "Shell in the Void."

A next-generation AI-integrated TUI terminal emulator.

> [!NOTE]
> Null Terminal is under active development and not yet officially released. Features and APIs may change.
Null is a next-generation TUI (Terminal User Interface) designed for the modern AI-integrated workflow. Built on Textual, it blends the raw power of the command line with the intelligence of LLMs, all wrapped in a sleek, cyber-noir aesthetic.
## Why Null?

- Two Modes, One Interface: Seamlessly switch between CLI and AI mode with `Ctrl+Space`
- Block-Based Output: Every command and response is a distinct, interactive block
- 20+ AI Providers: From local Ollama to cloud providers like OpenAI, Anthropic, and Google
- Agent Mode: Let the AI execute multi-step tasks autonomously
- MCP Integration: Extend AI capabilities with Model Context Protocol servers
## Features

### AI Integration
| Feature | Description |
|---|---|
| Multi-Provider | Ollama, OpenAI, Anthropic, Google, Azure, Bedrock, Groq, Mistral, DeepSeek, and more |
| Agent Mode | Autonomous multi-step task execution with tool calling |
| RAG / Code Search | Index your codebase with `/index build` for semantic search |
| Reasoning Display | See the AI's thinking process for compatible models |
| Context Inspector | View exactly what the AI sees with `/context` |
| Cost Tracking | Real-time token usage and cost display in the status bar |
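The cost display described above boils down to simple per-token pricing. A minimal sketch of the idea — the model name and per-million-token prices here are illustrative placeholders, not Null's actual rate table:

```python
# Hypothetical per-million-token prices in USD; real rates vary by provider.
PRICES = {
    "gpt-4o": {"input": 2.50, "output": 10.00},
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD cost of one request from its token counts."""
    p = PRICES[model]
    return (input_tokens * p["input"] + output_tokens * p["output"]) / 1_000_000

cost = estimate_cost("gpt-4o", 1000, 500)  # 0.0075 USD
```

A status bar would accumulate these per-request estimates across the session.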
### Developer Tools
| Feature | Description |
|---|---|
| Task Manager | Integrated todo dashboard (`/todo`) |
| Prompt Editor | Custom system prompts and personas (`/prompts`) |
| Git Integration | Branch and status in the status bar |
| File Explorer | Sidebar file tree (`Ctrl+\`) |
| Session Export | Export conversations to Markdown/JSON |
| SSH Manager | Save and connect to remote hosts |
### MCP (Model Context Protocol)
| Feature | Description |
|---|---|
| Server Catalog | Pre-configured popular MCP servers |
| Tool Discovery | Automatic tool registration from servers |
| Resource Access | Read external resources (databases, APIs) |
| Management UI | `/mcp` commands for full control |
### UX
| Feature | Description |
|---|---|
| Block Interface | Distinct visual blocks for each interaction |
| 10+ Themes | Null Dark, Monokai, Dracula, and custom themes |
| Command Palette | Quick access with `Ctrl+P` |
| Interactive TUI | Run `vim`, `htop`, and `ssh` inside blocks |
| History Search | `Ctrl+R` for command history |
## Quick Start

### Installation

Via pipx (Recommended):

```bash
pipx install null-terminal
null
```

Via Docker:

```bash
docker run -it --rm ghcr.io/starhound/null-terminal:latest
```

From Source:

```bash
git clone https://github.com/starhound/null-terminal.git
cd null-terminal
uv sync
uv run main.py
```

See the Installation Guide for Windows, advanced options, and troubleshooting.
### First Run

1. Configure AI Provider: Type `/settings` or press `F3`
2. Select a Model: Press `F2` or type `/model`
3. Toggle AI Mode: Press `Ctrl+Space` to switch between CLI and AI
### Basic Usage

```
# CLI Mode (default)
ls -la                   # Run shell commands
cd ~/projects            # Navigate directories

# AI Mode (Ctrl+Space to toggle)
Explain this error       # Ask questions
Refactor this function   # Get code help

# Slash Commands (always available)
/help                    # Show help
/model                   # Select AI model
/agent                   # Toggle agent mode
/todo                    # Task manager
/theme dracula           # Change theme
```
## Keyboard Shortcuts

| Shortcut | Action |
|---|---|
| `Ctrl+Space` | Toggle CLI / AI mode |
| `Ctrl+P` | Command palette |
| `Ctrl+\` | Toggle file sidebar |
| `Ctrl+R` | History search |
| `Ctrl+F` | Search blocks |
| `Ctrl+L` | Clear history |
| `F1` | Help screen |
| `F2` | Model selector |
| `F3` | Theme selector |
| `F4` | Provider selector |
| `Escape` | Cancel / Close |
## AI Providers

### Local (Free)

| Provider | Setup |
|---|---|
| Ollama | `ollama pull llama3.2`, then `/provider ollama` |
| LM Studio | Start the server, then `/provider lm_studio` |
| Llama.cpp | Start the server, then `/provider llama_cpp` |
### Cloud

| Provider | Models |
|---|---|
| OpenAI | GPT-4o, GPT-4 Turbo, o1 |
| Anthropic | Claude 3.5 Sonnet, Claude 3 Opus |
| Google | Gemini 2.0 Flash, Gemini 1.5 Pro |
| Azure | Azure OpenAI deployments |
| AWS Bedrock | Claude, Titan, Llama |
| Groq | Llama 3.3, Mixtral (fast) |
| Mistral | Mistral Large, Codestral |
| DeepSeek | DeepSeek Chat, DeepSeek Coder |
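Many of these cloud providers expose OpenAI-compatible chat endpoints, which is one common way a multi-provider client can be built. A stdlib sketch of the wire format (Null itself uses httpx, and the base URL, model name, and header scheme here are illustrative, not Null's internals):

```python
import json
import urllib.request

def build_chat_request(base_url: str, api_key: str, model: str,
                       messages: list[dict]) -> urllib.request.Request:
    """Construct (but do not send) an OpenAI-style chat completion request."""
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps({"model": model, "messages": messages}).encode(),
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request(
    "https://api.openai.com/v1", "sk-...", "gpt-4o",
    [{"role": "user", "content": "Explain this error"}],
)
```

Switching providers then largely means swapping the base URL, key, and model name.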
See the Providers Guide for the full list and configuration.
## Agent Mode

Enable autonomous task execution:

```
/agent   # Toggle agent mode

# Then ask:
"Create a Python script that fetches weather data and saves it to weather.json"
```
The agent will:
- Plan the approach
- Execute tools (read/write files, run commands)
- Iterate until the task is complete
Safety features:

- Tool approval prompts for dangerous operations
- Maximum 10 iterations per task
- Cancel anytime with `Escape`
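The loop behind those guardrails can be pictured as follows — a minimal sketch with a made-up tool registry and approval hook, not Null's actual agent code:

```python
MAX_ITERATIONS = 10  # mirrors the per-task cap described above

def run_agent(plan_step, tools, approve, is_done):
    """Plan -> execute -> iterate until done, with an approval gate per tool call."""
    history = []
    for _ in range(MAX_ITERATIONS):
        tool_name, args = plan_step(history)        # the model picks the next action
        tool = tools[tool_name]
        if tool.get("dangerous") and not approve(tool_name, args):
            history.append((tool_name, "denied"))   # record the refusal, keep going
            continue
        result = tool["fn"](**args)
        history.append((tool_name, result))
        if is_done(history):                        # task complete before the cap
            return history
    return history                                  # hit the iteration cap

# Toy usage: one "write_file" step, auto-approved, done after the first success.
tools = {"write_file": {"fn": lambda path, text: f"wrote {path}",
                        "dangerous": True}}
steps = iter([("write_file", {"path": "weather.json", "text": "{}"})])
history = run_agent(
    plan_step=lambda h: next(steps),
    tools=tools,
    approve=lambda name, args: True,
    is_done=lambda h: len(h) >= 1,
)
```

The real agent would plan via the LLM and carry tool results back into the next prompt; the cap and approval gate are the safety features listed above.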
## MCP Integration

Add external tools via the Model Context Protocol:

```
/mcp catalog   # Browse available servers
/mcp add       # Add a server manually
/mcp tools     # List available tools
```
Popular MCP servers:
- Brave Search - Web search
- Filesystem - File operations
- PostgreSQL - Database queries
- GitHub - Repository management
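A server entry in `mcp.json` might look like the following — this is a guess based on the common MCP client configuration convention, not Null's documented schema, and the path is a placeholder:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/home/user/projects"]
    }
  }
}
```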
## Configuration

Settings are stored in `~/.null/`:

| File | Purpose |
|---|---|
| `config.json` | User preferences |
| `null.db` | Sessions, encrypted API keys |
| `mcp.json` | MCP server configs |
| `themes/` | Custom themes |
| `prompts/` | Custom system prompts |
## Documentation
| Guide | Description |
|---|---|
| User Guide | Complete usage instructions |
| Commands Reference | All slash commands |
| Providers Guide | AI provider setup |
| MCP Guide | MCP server configuration |
| Themes Guide | Customizing appearance |
| SSH Guide | Remote connections |
### For Contributors
| Guide | Description |
|---|---|
| Architecture | System design overview |
| Development | Dev environment setup |
| Contributing | How to contribute |
| Feature Specs | Planned features |
## Tech Stack
- Textual - TUI framework
- httpx - Async HTTP client
- Python 3.12+ - Async/await, type hints
- SQLite - Local storage
- Fernet - API key encryption
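As an illustration of the Fernet item above, this is how symmetric key encryption with the `cryptography` package's Fernet works in general — a sketch of the technique, not Null's storage code, and the key handling is simplified (a real app persists one key rather than regenerating it):

```python
from cryptography.fernet import Fernet

# In practice the key is generated once and stored securely, not per run.
key = Fernet.generate_key()
f = Fernet(key)

token = f.encrypt(b"sk-my-api-key")   # ciphertext safe to store in SQLite
plaintext = f.decrypt(token)          # recoverable only with the key
assert plaintext == b"sk-my-api-key"
```

Storing `token` in `null.db` means a leaked database alone does not expose API keys.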
## Contributing

We welcome contributions! See CONTRIBUTING.md for guidelines.

```bash
# Development setup
git clone https://github.com/starhound/null-terminal.git
cd null-terminal
uv sync

uv run pytest            # Run tests
uv run main.py           # Run app
uv run textual console   # Debug console
```
Built with 🖤 by Starhound
## Download files
### File details: null_terminal-0.0.5.tar.gz

File metadata

- Download URL: null_terminal-0.0.5.tar.gz
- Size: 7.0 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.2

File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `7f807602771ef57a0cb3ef9c50a8099e0596ac18df322d6207a422b123875599` |
| MD5 | `99d8313d8bbd2ee6949f5e95e5f24c23` |
| BLAKE2b-256 | `83fda41b4642a4e1cc20add8dd0f0d581135db3041e9923bd6e667b42d9b5451` |
### File details: null_terminal-0.0.5-py3-none-any.whl

File metadata

- Download URL: null_terminal-0.0.5-py3-none-any.whl
- Size: 345.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.2

File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `79dca618c70b6a01520171db55380297874becffb2851f87393add72f61a6f1e` |
| MD5 | `651e2c4ec37b2222a0e15b368302853c` |
| BLAKE2b-256 | `3401c7108d53a0801742d4276462e303097d547f307b41fe52628eb693111ee4` |