Bgpt: Advanced AI Shell Command Assistant
Bgpt is a customizable, AI-powered terminal assistant. It converts natural-language requests into shell commands, explains what each command does, applies safety checks, and executes commands with confirmation and timeout controls. It supports Gemini, OpenAI, Anthropic, and local Ollama models, so developers, DevOps engineers, and power users can automate terminal workflows faster and more safely.
What You Get
- Multi-provider AI: Gemini, OpenAI, Anthropic, and local (Ollama)
- Safety pipeline with risk scoring, blocking rules, warnings, and confirmation
- Redesigned terminal UX with profile-based customization
- Agentic decision mode (auto decides run vs confirm based on risk)
- Interactive chat mode and one-shot mode
- Command history with execution metadata
- Setup wizard and diagnostics command
Installation
From source (recommended for this repository)
git clone https://github.com/primecodez01/Bgpt.git
cd Bgpt
python -m venv .venv
source .venv/bin/activate
pip install -e .
Optional dependencies
If you need all providers and extras:
pip install -r requirements.txt
Quick Start
1) Run setup wizard
bgpt --setup
2) Ask in one-shot mode
bgpt "find all python files larger than 50MB"
3) Enter chat mode
bgpt --chat
4) Explain an existing command
bgpt --explain "ls -la | grep py"
Provider Setup
Bgpt reads API keys from either:
- Environment variables
- System keyring (when saved by setup wizard)
Supported environment variables:
GEMINI_API_KEY
OPENAI_API_KEY
ANTHROPIC_API_KEY
Optional model overrides:
BGPT_GEMINI_MODEL
BGPT_OPENAI_MODEL
BGPT_ANTHROPIC_MODEL
BGPT_LOCAL_MODEL
Preferred workflow: use bgpt --setup to pick provider + model and persist it in config.
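As a rough sketch of the lookup order described above (the function name and keyring service name are illustrative, not Bgpt's internals; `keyring.get_password` is the real API of the optional keyring package):

```python
import os

try:
    import keyring  # optional; the setup wizard may store keys in the system keyring
except ImportError:
    keyring = None

def resolve_api_key(provider: str):
    """Illustrative lookup order: environment variable first, then keyring."""
    key = os.environ.get(f"{provider.upper()}_API_KEY")
    if key:
        return key
    if keyring is not None:
        # Service/entry names here are assumptions, not Bgpt's actual schema.
        return keyring.get_password("bgpt", provider)
    return None
```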
Terminal Redesign And Full Customization
Bgpt now supports user-level terminal customization persisted in ~/.bgpt/config.json.
Profiles
default, sunset, matrix, midnight, minimal
Prompt styles
arrow (default), classic, minimal
UX toggles
- Compact mode on/off
- Timestamp display on/off
- Tips on/off
- Command preview lines (3-30)
Configure from CLI
bgpt config set --profile matrix --prompt-style classic
bgpt config set --compact --timestamps --tips
bgpt config set --preview-lines 20
bgpt config set --timeout 120
bgpt config set --provider openai --safety-level high
bgpt config set --model-provider openai --model gpt-4o-mini
bgpt config set --theme hacker
bgpt config set --agentic --hide-details --agentic-risk low
bgpt config show
Configure live inside chat
/profile matrix
/style classic
/theme hacker
/compact on
/timestamps off
/tips off
/preview 18
/provider gemini
/model gemini-2.5-flash
/model openai gpt-4o-mini
/agentic on
/details off
/agentic-risk low
/safety medium
/timeout 90
/config
Safety Model
Before execution, Bgpt performs:
- Syntax validation
- Command parsing and operation classification
- Safety scoring (low/medium/high)
- Hard-block checks for dangerous patterns
- Confirmation flow for risky commands
Commands can be blocked outright if they match critical destructive patterns.
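The scoring and hard-block steps can be pictured with a minimal pattern-based classifier. This is a sketch with a handful of illustrative patterns; Bgpt's real rule set and risk levels are more extensive:

```python
import re

# Illustrative patterns only; not Bgpt's actual rule set.
BLOCK_PATTERNS = [
    r"rm\s+-rf\s+/(\s|$)",         # recursive delete of the filesystem root
    r"mkfs\.",                     # formatting a filesystem
    r":\(\)\s*\{\s*:\|:&\s*\};:",  # classic fork bomb
]
RISKY_PATTERNS = [
    r"\brm\b", r"\bdd\b", r"\bchmod\b", r"\bsudo\b",
]

def classify(command: str) -> str:
    """Return 'blocked', 'high', or 'low' for a candidate command."""
    if any(re.search(p, command) for p in BLOCK_PATTERNS):
        return "blocked"
    if any(re.search(p, command) for p in RISKY_PATTERNS):
        return "high"
    return "low"
```

A "blocked" result would refuse execution outright, while "high" would trigger the confirmation flow.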
Configuration Reference
Config file path:
~/.bgpt/config.json
Example:
{
"provider": "gemini",
"theme": "default",
"safety_level": "medium",
"auto_execute": false,
"agentic_mode": false,
"show_command_details": true,
"agentic_auto_execute_max_risk": "low",
"save_history": true,
"command_timeout": 60,
"enabled_plugins": ["git", "mcp"],
"models": {
"gemini": "gemini-2.5-flash",
"openai": "gpt-4o-mini",
"anthropic": "claude-3-5-sonnet-latest",
"local": "tinyllama"
},
"ui": {
"profile": "default",
"prompt_style": "arrow",
"compact_mode": false,
"show_timestamps": true,
"show_tips": true,
"command_preview_lines": 12
}
}
Common Commands
# Chat mode
bgpt --chat
# One-shot generation and execution flow
bgpt "show top 10 processes by memory"
# Diagnostics
bgpt --doctor
# Show history
bgpt --history
# Setup local Ollama model
bgpt setup-local
Plugin Commands
Current plugin registry includes:
git, docker, system, mcp
Manage plugins:
bgpt plugins list
bgpt plugins install git
bgpt plugins enable git
bgpt plugins disable git
bgpt plugins uninstall git
Local Provider (Ollama)
To prepare local/offline usage:
bgpt setup-local
This checks Ollama availability and attempts to set up a lightweight model.
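An availability check along these lines might look as follows. This is a sketch, not Bgpt's actual implementation; `ollama list` is a real Ollama CLI command:

```python
import shutil
import subprocess

def ollama_available() -> bool:
    """Return True if the `ollama` binary is on PATH and responds to `list`."""
    if shutil.which("ollama") is None:
        return False
    try:
        subprocess.run(["ollama", "list"], capture_output=True, check=True, timeout=10)
        return True
    except (subprocess.SubprocessError, OSError):
        return False
```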
Troubleshooting
No command generated
- Run bgpt --doctor
- Verify API keys are configured
- Switch provider: bgpt config set --provider gemini
Command times out
- Increase the timeout: bgpt config set --timeout 180
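The timeout enforcement behind this setting can be sketched with Python's subprocess module (illustrative only; the function name and return convention are assumptions, not Bgpt's code):

```python
import subprocess

def run_with_timeout(command: str, timeout: int = 60) -> int:
    """Run a shell command; return its exit code, or -1 if it times out."""
    try:
        return subprocess.run(command, shell=True, timeout=timeout).returncode
    except subprocess.TimeoutExpired:
        return -1
```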
Provider initializes but returns nothing
- Set an explicit model override (BGPT_*_MODEL)
- Try another provider
TUI mode unavailable
- Install the textual dependency: pip install "textual>=0.50.0" (quote the requirement so the shell does not treat > as a redirect)
Development
Run editable install and checks:
pip install -e .
python -m bgpt.main --help
Suggested GitHub Topics
- ai-cli
- terminal-assistant
- natural-language-to-shell
- shell-automation
- command-generation
- devtools
- productivity
- llm-tools
- agentic-cli
- command-safety
- python-cli
- mcp
- ollama
- openai
- anthropic
- gemini
License Ideas
- MIT (current): simple permissive license, best for fast adoption
- Apache-2.0: permissive plus explicit patent grant
- GPL-3.0: strong copyleft for derivative work
- AGPL-3.0: copyleft including network/SaaS use
Recommended default for this project: MIT or Apache-2.0.
License
MIT
Project details
File details
Details for the file bgpt_primecodez-1.0.0.tar.gz.
File metadata
- Download URL: bgpt_primecodez-1.0.0.tar.gz
- Upload date:
- Size: 41.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.3
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | ca76f91bf7f15dafe12748e3643cfc19176230780fd774bb02af457791f2c3d7 |
| MD5 | 751b681fe5ca2402ab11a42ce26ee234 |
| BLAKE2b-256 | c834ca8a52059c2adf1fc23558caa882727daee53fa70bd50c24bf6d03437038 |
File details
Details for the file bgpt_primecodez-1.0.0-py3-none-any.whl.
File metadata
- Download URL: bgpt_primecodez-1.0.0-py3-none-any.whl
- Upload date:
- Size: 44.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.3
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 59ee5be4b57626d3a5060f79ec2938f8e44e24389301f26682d89872043e426d |
| MD5 | f3e52377d97ae2df5b05655f0fa1bad7 |
| BLAKE2b-256 | 64fbc50106635adf3275ff1e3cbe99e5207538b5a2b8389f47ec7bdadffb1c25 |