Terminal-based GPT CLI with memory, markdown, and stream support
Project description
BLACKCORTEX GPT CLI
A Conversational Assistant for the Terminal
A terminal-based GPT assistant powered by the OpenAI API, developed by Konijima and now maintained under the BlackCortex organization.
Features
- Persistent memory across sessions with summarization
- Streaming output support
- Command history and logging
- Configurable prompt, model, and temperature
- Secure .env-based configuration
Installation
Requires Python 3.8+.
Using PyPI
pip install blackcortex-gpt-cli
Using pipx (recommended)
pipx install blackcortex-gpt-cli
From GitHub
pip install git+https://github.com/BlackCortexAgent/blackcortex-gpt-cli.git
# or with pipx
pipx install git+https://github.com/BlackCortexAgent/blackcortex-gpt-cli.git
Development Installation
git clone https://github.com/BlackCortexAgent/blackcortex-gpt-cli.git
cd blackcortex-gpt-cli
make dev
CLI Usage
After installation, use the gpt command globally.
Positional Arguments
| Argument | Description |
|---|---|
| input_data | Input text for one-shot command processing. |
Options
-h, --help: Show this help message and exit
Session
-ch, --clear-history: Clear prompt history
-cl, --clear-log: Clear the conversation log
-cm, --clear-memory: Clear context memory
-l, --log: Display conversation log
Configuration
-e, --env: Open configuration file
-k [API_KEY], --set-key [API_KEY]: Set or update OpenAI API key (prompt if value omitted)
Output
-md {true,false}, --markdown {true,false}: Control Markdown formatting in responses ('true' to enable, 'false' to disable)
-s {true,false}, --stream {true,false}: Control streaming of assistant responses ('true' to enable, 'false' to disable)
System
-p, --ping: Test OpenAI API connectivity
-x, --uninstall: Uninstall the CLI tool
-u, --update: Update the CLI tool
General
-v, --version: Show version and exit
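For illustration, the flag set above could be modeled with Python's argparse. This is a hedged sketch of a subset of the documented interface, not the CLI's actual implementation (the real parser lives inside blackcortex-gpt-cli):

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Hypothetical reconstruction of some documented flags, for illustration only.
    p = argparse.ArgumentParser(prog="gpt")
    p.add_argument("input_data", nargs="?",
                   help="Input text for one-shot command processing")
    p.add_argument("-ch", "--clear-history", action="store_true",
                   help="Clear prompt history")
    p.add_argument("-cm", "--clear-memory", action="store_true",
                   help="Clear context memory")
    p.add_argument("-md", "--markdown", choices=["true", "false"],
                   help="Control Markdown formatting in responses")
    p.add_argument("-s", "--stream", choices=["true", "false"],
                   help="Control streaming of assistant responses")
    p.add_argument("-v", "--version", action="version", version="gpt 1.3.1")
    return p
```

With such a parser, `gpt -md true "Hello"` would yield `input_data="Hello"` and `markdown="true"`; omitting the positional argument drops the CLI into interactive mode in the real tool.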
Environment Configuration
The GPT CLI loads settings from two locations:
- .env file in the current working directory (if present)
- ~/.gpt-cli/.env (default persistent configuration)
You can configure model behavior, memory, logging, and streaming options.
Sample .env File
OPENAI_API_KEY=your-api-key-here
MODEL=gpt-4o
SUMMARY_MODEL=gpt-3.5-turbo
DEFAULT_PROMPT="You are a helpful assistant."
TEMPERATURE=0.7
MAX_TOKENS=4096
MEMORY_PATH=/custom/path/memory.json
HISTORY_PATH=/custom/path/history
MEMORY_LIMIT=10
MAX_SUMMARY_TOKENS=2048
LOG_FILE=/custom/path/gpt.log
LOG_LEVEL=INFO
LOG_TO_CONSOLE=true
MARKDOWN_ENABLED=true
STREAM_ENABLED=false
Use gpt --env to open and edit the .env file in your terminal editor.
Memory System
Memory includes:
- Rolling conversation summary
- The most recent messages (10 by default, configurable via MEMORY_LIMIT)
Older messages are folded into the rolling summary once the limit is reached. Use --clear-memory to reset it.
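The scheme above can be sketched as a bounded queue plus a running summary. This is a minimal illustration; the real CLI presumably sends evicted messages to the SUMMARY_MODEL for compression, whereas the placeholder here just truncates and appends:

```python
from collections import deque

class RollingMemory:
    """Sketch of the documented memory scheme: a rolling summary plus the
    N most recent messages; older messages fold into the summary."""

    def __init__(self, limit: int = 10):
        self.limit = limit
        self.recent: deque[str] = deque()
        self.summary = ""

    def add(self, message: str) -> None:
        self.recent.append(message)
        # Evict oldest messages into the summary once the limit is exceeded.
        while len(self.recent) > self.limit:
            evicted = self.recent.popleft()
            self.summary = self._summarize(self.summary, evicted)

    def _summarize(self, summary: str, message: str) -> str:
        # Placeholder: truncate-and-append instead of an LLM call.
        return (summary + " | " + message)[:500]

    def context(self) -> list[str]:
        # What would be sent as conversation context: summary, then recents.
        return ([self.summary] if self.summary else []) + list(self.recent)
```

Clearing memory (as --clear-memory does) would amount to resetting both the summary and the recent-message queue.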
Troubleshooting
- Missing API key: Check .env for OPENAI_API_KEY
- Client init failed: Verify internet connectivity and credentials
- Token limit exceeded: Reduce input size or rely on summarization
Interactive Example Output
You: Tell me a joke about databases
Assistant: Why did the database break up with the spreadsheet?
Because it couldn't handle the rows of emotions.
────────────────────────────────────────────────────────
Contributing
We welcome all contributions!
Quickstart for Development
git clone https://github.com/BlackCortexAgent/blackcortex-gpt-cli.git
cd blackcortex-gpt-cli
make dev
Run tests:
make test # uses virtualenv (.venv)
# or use 'make ci-release' if running outside .venv
Lint and format:
make lint
make format
Use make check to lint, test, build, and validate in .venv.
Use make ci-release for system Python (e.g., CI/CD pipelines).
See CONTRIBUTING.md for full details.
License
This project is licensed under the MIT License, an OSI-approved open source license. In brief:
- ✅ Free use for personal, academic, or commercial purposes
- ✅ Permission to modify, merge, publish, and distribute the software
- ✅ Usage with or without attribution (attribution encouraged but not required)
- ⚠️ No warranty is provided; use at your own risk
Credits
Originally created by Konijima, now maintained by the BlackCortex team.
Download files
Source Distribution
Built Distribution
File details
Details for the file blackcortex_gpt_cli-1.3.1.tar.gz.
File metadata
- Download URL: blackcortex_gpt_cli-1.3.1.tar.gz
- Upload date:
- Size: 23.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | dba4e40f560b662df36b746ca1e7182ba31d686e30cd9a121afdefb1e9e029c9 |
| MD5 | e7694bd2a12e4466d28d40eb42a88074 |
| BLAKE2b-256 | 5ed4f19cdeb3cae5a409d6f2a523a15e4d7c8574fffb320f462918a359115ae5 |
Provenance
The following attestation bundles were made for blackcortex_gpt_cli-1.3.1.tar.gz:
Publisher: publish.yml on BlackCortexAgent/blackcortex-gpt-cli
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: blackcortex_gpt_cli-1.3.1.tar.gz
- Subject digest: dba4e40f560b662df36b746ca1e7182ba31d686e30cd9a121afdefb1e9e029c9
- Sigstore transparency entry: 200334511
- Sigstore integration time:
- Permalink: BlackCortexAgent/blackcortex-gpt-cli@9bb7ec734f8c429b722b2802ca3ab3feb8f6aae3
- Branch / Tag: refs/tags/v1.3.1
- Owner: https://github.com/BlackCortexAgent
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@9bb7ec734f8c429b722b2802ca3ab3feb8f6aae3
- Trigger Event: push
File details
Details for the file blackcortex_gpt_cli-1.3.1-py3-none-any.whl.
File metadata
- Download URL: blackcortex_gpt_cli-1.3.1-py3-none-any.whl
- Upload date:
- Size: 30.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 310b4c827a861be90fb9880efd91d0c130626a4f6900dc3cca58d86b39eb6b73 |
| MD5 | 037f48941aa741482ab091898d3f898b |
| BLAKE2b-256 | 0450484fd44d8e7e5a61089db13112ce5bd1907f704999cb9901095828965db6 |
Provenance
The following attestation bundles were made for blackcortex_gpt_cli-1.3.1-py3-none-any.whl:
Publisher: publish.yml on BlackCortexAgent/blackcortex-gpt-cli
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: blackcortex_gpt_cli-1.3.1-py3-none-any.whl
- Subject digest: 310b4c827a861be90fb9880efd91d0c130626a4f6900dc3cca58d86b39eb6b73
- Sigstore transparency entry: 200334512
- Sigstore integration time:
- Permalink: BlackCortexAgent/blackcortex-gpt-cli@9bb7ec734f8c429b722b2802ca3ab3feb8f6aae3
- Branch / Tag: refs/tags/v1.3.1
- Owner: https://github.com/BlackCortexAgent
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@9bb7ec734f8c429b722b2802ca3ab3feb8f6aae3
- Trigger Event: push