Terminal-based GPT CLI with memory, Markdown, and streaming support
BLACKCORTEX GPT CLI
A Conversational Assistant for the Terminal
A terminal-based GPT assistant powered by the OpenAI API, developed by Konijima and now maintained under the BlackCortex organization.
Features
- Persistent memory across sessions with summarization
- Streaming output support
- Command history and logging
- Configurable prompt, model, and temperature
- Secure, .env-based configuration
Installation
Requires Python 3.8+.
Using PyPI
pip install blackcortex-gpt-cli
Using pipx (recommended)
pipx install blackcortex-gpt-cli
From GitHub
pip install git+https://github.com/BlackCortexAgent/blackcortex-gpt-cli.git
# or with pipx
pipx install git+https://github.com/BlackCortexAgent/blackcortex-gpt-cli.git
Development Installation
git clone https://github.com/BlackCortexAgent/blackcortex-gpt-cli.git
cd blackcortex-gpt-cli
make install
Environment Setup
Create a .env file to configure your API and options:
mkdir -p ~/.gpt-cli && touch ~/.gpt-cli/.env
Sample .env
OPENAI_API_KEY=your-api-key-here
OPENAI_MODEL=gpt-4o
OPENAI_DEFAULT_PROMPT=You are a helpful CLI assistant.
OPENAI_LOGFILE=~/.gpt.log
OPENAI_TEMPERATURE=0.5
OPENAI_MAX_TOKENS=4096
OPENAI_MAX_SUMMARY_TOKENS=2048
OPENAI_MEMORY_PATH=~/.gpt_memory.json
OPENAI_STREAM_ENABLED=false
Usage
After installation, the gpt command is available globally.
gpt [-h] [--no-markdown] [--stream] [--reset] [--summary] [--env]
[--set-key [API_KEY]] [--ping] [--log] [--clear-log]
[--update] [--uninstall] [--version] [input_data ...]
Positional Arguments
input_data – One-shot prompt input. Example: gpt "Summarize the history of aviation"
Options
- -h, --help – Show help message and exit
- --no-markdown – Disable Markdown formatting in output
- --stream – Enable live token streaming during responses
- --reset – Reset memory and exit
- --summary – Display the current conversation summary
- --env – Edit the .env file
- --set-key [API_KEY] – Update your OpenAI API key
- --ping – Test the connection to the OpenAI API
- --log – Show the full conversation log
- --clear-log – Clear the conversation log file
- --update – Update GPT CLI to the latest version
- --uninstall – Uninstall GPT CLI completely
- --version – Display the current version
Environment Configuration
The GPT CLI loads settings from two locations:
- .env file in the current working directory (if present)
- ~/.gpt-cli/.env (default persistent configuration)
You can configure model behavior, memory, logging, and streaming options.
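The two-location lookup can be sketched as follows. This is an illustrative helper, not the CLI's actual code: the name `pick_env_file`, the injectable `exists` check, and the assumption that the working-directory file takes precedence (implied by the order of the list above) are all mine.

```python
from pathlib import Path
from typing import Callable, Optional

def pick_env_file(cwd: Path, home: Path,
                  exists: Callable[[Path], bool] = Path.is_file) -> Optional[Path]:
    """Return the first .env found, checking the working directory
    before the persistent ~/.gpt-cli location."""
    candidates = [cwd / ".env", home / ".gpt-cli" / ".env"]
    for path in candidates:
        if exists(path):
            return path
    return None
```

The `exists` parameter is injectable only so the lookup order can be exercised without touching the filesystem; in real use the `Path.is_file` default applies.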
Sample .env File
OPENAI_API_KEY=your-api-key-here # Required
OPENAI_MODEL=gpt-4o # Model ID (default: gpt-4o)
OPENAI_DEFAULT_PROMPT=You are a helpful assistant.
OPENAI_LOGFILE=~/.gpt.log # Log file location
OPENAI_TEMPERATURE=0.5 # Response randomness (default: 0.5)
OPENAI_MAX_TOKENS=4096 # Max response tokens
OPENAI_MAX_SUMMARY_TOKENS=2048 # Max tokens for memory summarization
OPENAI_MEMORY_PATH=~/.gpt_memory.json # Path to memory file
OPENAI_MEMORY_LIMIT=10 # Number of recent messages stored (default: 10)
OPENAI_STREAM_ENABLED=false # Enable token-by-token streaming (true/false)
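Every value in a .env file arrives as a string, so settings like OPENAI_TEMPERATURE and OPENAI_STREAM_ENABLED have to be coerced to their real types. A minimal sketch of that kind of parsing, with hypothetical helper names (not the project's API):

```python
import os

def env_bool(name: str, default: bool = False) -> bool:
    """Interpret true/false-style variables such as OPENAI_STREAM_ENABLED."""
    return os.environ.get(name, str(default)).strip().lower() in {"1", "true", "yes"}

def env_float(name: str, default: float) -> float:
    """Interpret numeric variables such as OPENAI_TEMPERATURE,
    falling back to the default when missing or malformed."""
    try:
        return float(os.environ[name])
    except (KeyError, ValueError):
        return default
```

Falling back to a default on malformed input (rather than crashing) matches the documented defaults in the sample file, though the CLI's exact behavior may differ.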
Use gpt --env to open and edit the .env file in your terminal editor.
Memory System
Memory includes:
- Rolling conversation summary
- The most recent messages, up to OPENAI_MEMORY_LIMIT (default: 10)
Older messages are summarized once the limit is reached. Use --reset to clear memory.
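Conceptually, the scheme can be sketched like this. It is a simplified illustration, not the actual implementation: `RollingMemory` is an invented name, and `summarize` stands in for the model-backed summarization call (which in the real CLI is bounded by OPENAI_MAX_SUMMARY_TOKENS).

```python
from collections import deque

class RollingMemory:
    """Sketch: keep a rolling summary plus the N most recent messages;
    anything older is folded into the summary."""

    def __init__(self, limit: int = 10):
        self.limit = limit
        self.summary = ""
        self.recent = deque()

    def add(self, message: str, summarize) -> None:
        self.recent.append(message)
        if len(self.recent) > self.limit:
            # Oldest message falls out of the window and is merged
            # into the summary via the supplied callback.
            overflow = self.recent.popleft()
            self.summary = summarize(self.summary, overflow)

    def reset(self) -> None:
        # Roughly what `gpt --reset` does: wipe summary and history.
        self.summary, self.recent = "", deque()
```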
Troubleshooting
- Missing API key: Check .env for OPENAI_API_KEY
- Client init failed: Verify internet connection and credentials
- Token limit exceeded: Reduce input size or use summarization
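A quick way to rule out the first failure mode is to inspect the key before making any API call. This standalone sketch (a hypothetical `preflight` helper, not part of the CLI; the "sk-" prefix check is only a heuristic) mirrors that first troubleshooting step:

```python
import os

def preflight() -> str:
    """Check for a missing or suspicious OPENAI_API_KEY before
    attempting a request."""
    key = os.environ.get("OPENAI_API_KEY", "").strip()
    if not key:
        return "Missing API key: set OPENAI_API_KEY in ~/.gpt-cli/.env"
    if not key.startswith("sk-"):
        return "Warning: value does not look like an OpenAI key ('sk-' prefix expected)"
    return "OK"
```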
Interactive Example Output
You: Tell me a joke about databases
GPT: Why did the database break up with the spreadsheet?
Because it couldn't handle the rows of emotions.
────────────────────────────────────────────────────────
Contributing
We welcome all contributions!
🚀 Quickstart for Development
git clone https://github.com/BlackCortexAgent/blackcortex-gpt-cli.git
cd blackcortex-gpt-cli
make install
Run tests:
make test # uses virtualenv (.venv)
# or use 'make ci-release' if running outside .venv
Lint and format:
make lint
make format
Use make check to lint, test, build, and validate in .venv.
Use make ci-release for system Python (e.g., CI/CD pipelines).
✅ Pre-commit Hook
We use pre-commit for consistent formatting:
pre-commit install
pre-commit run --all-files
📄 See CONTRIBUTING.md for full details.
License
This project is licensed under the MIT License, an OSI-approved open source license that permits the following:
- ✅ Free use for personal, academic, or commercial purposes
- ✅ Permission to modify, merge, publish, and distribute the software
- ✅ Usage with or without attribution (attribution encouraged but not required)
- ⚠️ No warranty is provided; use at your own risk
Credits
Originally created by Konijima, now maintained by the BlackCortex team.
Project details
Download files
File details
Details for the file blackcortex_gpt_cli-1.2.1.tar.gz.
File metadata
- Download URL: blackcortex_gpt_cli-1.2.1.tar.gz
- Upload date:
- Size: 14.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 73d82e528fe6c2f0e2e8a0a01c4a65c9d29c9e889ef0ebeff67c6670f317c6b2 |
| MD5 | c7bf7153480984a119036f420582f8cc |
| BLAKE2b-256 | d118f8e749d1ea3bc86e0024214719bf6c8d997720f0c6f8255512b96c181b41 |
Provenance
The following attestation bundles were made for blackcortex_gpt_cli-1.2.1.tar.gz:
Publisher: publish.yml on BlackCortexAgent/blackcortex-gpt-cli

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: blackcortex_gpt_cli-1.2.1.tar.gz
- Subject digest: 73d82e528fe6c2f0e2e8a0a01c4a65c9d29c9e889ef0ebeff67c6670f317c6b2
- Sigstore transparency entry: 199663077
- Sigstore integration time:
- Permalink: BlackCortexAgent/blackcortex-gpt-cli@70c3f02828dfd5054c2b870f283aebe0b501fe3a
- Branch / Tag: refs/tags/v1.2.1
- Owner: https://github.com/BlackCortexAgent
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@70c3f02828dfd5054c2b870f283aebe0b501fe3a
- Trigger Event: push
File details
Details for the file blackcortex_gpt_cli-1.2.1-py3-none-any.whl.
File metadata
- Download URL: blackcortex_gpt_cli-1.2.1-py3-none-any.whl
- Upload date:
- Size: 12.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 7ef8cebf35042c41201cc0c2434800b86633f00616dc6d2623f521ce1d781c4c |
| MD5 | 31f0555d80653c304234b67c47d23597 |
| BLAKE2b-256 | ddbb65f43e418ee6f2a2477bce88c50b3c63bc6770b6e438acf705a997eb79f4 |
Provenance
The following attestation bundles were made for blackcortex_gpt_cli-1.2.1-py3-none-any.whl:
Publisher: publish.yml on BlackCortexAgent/blackcortex-gpt-cli

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: blackcortex_gpt_cli-1.2.1-py3-none-any.whl
- Subject digest: 7ef8cebf35042c41201cc0c2434800b86633f00616dc6d2623f521ce1d781c4c
- Sigstore transparency entry: 199663079
- Sigstore integration time:
- Permalink: BlackCortexAgent/blackcortex-gpt-cli@70c3f02828dfd5054c2b870f283aebe0b501fe3a
- Branch / Tag: refs/tags/v1.2.1
- Owner: https://github.com/BlackCortexAgent
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@70c3f02828dfd5054c2b870f283aebe0b501fe3a
- Trigger Event: push