# nGPT
A lightweight Python CLI and library for interacting with OpenAI-compatible APIs, supporting both official and self-hosted LLM endpoints.
## Quick Start

```bash
# Install
pip install ngpt

# Chat with default settings
ngpt "Tell me about quantum computing"

# Generate code
ngpt --code "function to calculate the Fibonacci sequence"

# Generate and execute shell commands
ngpt --shell "list all files in the current directory"
```
## Features

- ✅ Dual Mode: Use as a CLI tool or import as a Python library
- 🪶 Lightweight: Minimal dependencies (just `requests`)
- 🔄 API Flexibility: Works with OpenAI, Ollama, Groq, and any compatible endpoint
- 📊 Streaming Responses: Real-time output for a better user experience
- 🔍 Web Search: Integrated with compatible API endpoints
- ⚙️ Multiple Configurations: Cross-platform config system supporting different profiles
- 💻 Shell Command Generation: OS-aware command execution
- 🧩 Clean Code Generation: Output code without markdown or explanations
## Installation

```bash
pip install ngpt
```

Requires Python 3.8 or newer.
## Usage

### As a CLI Tool

```bash
# Basic chat (default mode)
ngpt "Hello, how are you?"

# Show version information
ngpt -v

# Show active configuration
ngpt --show-config

# Show all configurations
ngpt --show-config --all

# With custom options
ngpt --api-key your-key --base-url http://your-endpoint --model your-model "Hello"

# Enable web search (if your API endpoint supports it)
ngpt --web-search "What's the latest news about AI?"

# Generate and execute shell commands (using -s or --shell flag)
# OS-aware: generates appropriate commands for Windows, macOS, or Linux
ngpt -s "list all files in current directory"
# On Windows generates: dir
# On Linux/macOS generates: ls -la

# Generate clean code (using -c or --code flag)
# Returns only code without markdown formatting or explanations
ngpt -c "create a python function that calculates fibonacci numbers"
```
### As a Library

```python
from ngpt import NGPTClient, load_config

# Load the first configuration (index 0) from the config file
config = load_config(config_index=0)

# Initialize the client with config
client = NGPTClient(**config)

# Or initialize with custom parameters
client = NGPTClient(
    api_key="your-key",
    base_url="http://your-endpoint",
    provider="openai",
    model="o3-mini"
)

# Chat
response = client.chat("Hello, how are you?")

# Chat with web search (if your API endpoint supports it)
response = client.chat("What's the latest news about AI?", web_search=True)

# Generate shell command
command = client.generate_shell_command("list all files")

# Generate code
code = client.generate_code("create a python function that calculates fibonacci numbers")
```
### Advanced Library Usage

```python
import subprocess

# Stream responses
for chunk in client.chat("Write a poem about Python", stream=True):
    print(chunk, end="", flush=True)

# Customize system prompt
response = client.chat(
    "Explain quantum computing",
    system_prompt="You are a quantum physics professor. Explain complex concepts simply."
)

# OS-aware shell commands
# Automatically generates appropriate commands for the current OS
command = client.generate_shell_command("find large files")
result = subprocess.run(command, shell=True, capture_output=True, text=True)
print(result.stdout)

# Clean code generation
# Returns only code without markdown or explanations
code = client.generate_code("function that converts Celsius to Fahrenheit")
print(code)
```
## Configuration

### Command Line Options

You can configure the client using the following options:

| Option | Description |
|---|---|
| `--api-key` | API key for the service |
| `--base-url` | Base URL for the API |
| `--model` | Model to use |
| `--web-search` | Enable web search capability |
| `--config` | Path to a custom configuration file or, when used without a value, enters interactive configuration mode |
| `--config-index` | Index of the configuration to use (default: 0) |
| `--show-config` | Show configuration details and exit |
| `--all` | Used with `--show-config` to display all configurations |
| `-s, --shell` | Generate and execute shell commands |
| `-c, --code` | Generate clean code output |
| `-v, --version` | Show version information |
### Interactive Configuration

The `--config` option without arguments enters interactive configuration mode, allowing you to add or edit configurations:

```bash
# Add a new configuration
ngpt --config

# Edit an existing configuration at index 1
ngpt --config --config-index 1
```
In interactive mode:
- When editing an existing configuration, press Enter to keep the current values
- When creating a new configuration, press Enter to use default values
- For security, your API key is not displayed when editing configurations
### Configuration File

nGPT uses a configuration file stored in the standard user config directory for your operating system:

- Linux: `~/.config/ngpt/ngpt.conf` or `$XDG_CONFIG_HOME/ngpt/ngpt.conf`
- macOS: `~/Library/Application Support/ngpt/ngpt.conf`
- Windows: `%APPDATA%\ngpt\ngpt.conf`
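The per-OS mapping above follows common platform conventions, which can be sketched with nothing but the standard library. This is an illustration of the convention, not nGPT's actual implementation (the hypothetical helper `ngpt_config_path` is introduced here for the example):

```python
import os
import sys
from pathlib import Path

def ngpt_config_path() -> Path:
    """Illustrative sketch: resolve the per-OS nGPT config file location."""
    if sys.platform == "win32":
        # %APPDATA% on Windows (falls back to the usual Roaming directory)
        base = Path(os.environ.get("APPDATA", Path.home() / "AppData" / "Roaming"))
    elif sys.platform == "darwin":
        # macOS application support directory
        base = Path.home() / "Library" / "Application Support"
    else:
        # Linux and other Unix: honor $XDG_CONFIG_HOME when set
        base = Path(os.environ.get("XDG_CONFIG_HOME", str(Path.home() / ".config")))
    return base / "ngpt" / "ngpt.conf"

print(ngpt_config_path())
```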
The configuration file uses a JSON list format, allowing you to store multiple configurations. You can select which configuration to use with the --config-index argument (or by default, index 0 is used).
### Multiple Configurations Example (`ngpt.conf`)

```json
[
  {
    "api_key": "your-openai-api-key-here",
    "base_url": "https://api.openai.com/v1/",
    "provider": "OpenAI",
    "model": "gpt-4o"
  },
  {
    "api_key": "your-groq-api-key-here",
    "base_url": "https://api.groq.com/openai/v1/",
    "provider": "Groq",
    "model": "llama3-70b-8192"
  },
  {
    "api_key": "your-ollama-key-if-needed",
    "base_url": "http://localhost:11434/v1/",
    "provider": "Ollama-Local",
    "model": "llama3"
  }
]
```
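Because the file is a JSON list, each entry is addressable by its position, which is what `--config-index` (and `load_config(config_index=...)`) selects. The snippet below mimics that selection with the plain `json` module; it is illustrative only, and the real `load_config` may differ in details such as defaults and validation:

```python
import json

# Same shape as the example ngpt.conf above (abbreviated to two entries)
raw = """
[
  {"api_key": "your-openai-api-key-here", "base_url": "https://api.openai.com/v1/",
   "provider": "OpenAI", "model": "gpt-4o"},
  {"api_key": "your-groq-api-key-here", "base_url": "https://api.groq.com/openai/v1/",
   "provider": "Groq", "model": "llama3-70b-8192"}
]
"""

configs = json.loads(raw)
config = configs[1]        # what --config-index 1 would select
print(config["provider"])  # -> Groq
```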
### Configuration Priority

nGPT determines configuration values in the following order (highest priority first):

1. Command line arguments (`--api-key`, `--base-url`, `--model`)
2. Environment variables (`OPENAI_API_KEY`, `OPENAI_BASE_URL`, `OPENAI_MODEL`)
3. Configuration file (selected by `--config-index`, defaults to index 0)
4. Default values
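This precedence can be illustrated with a small stand-alone sketch. The `resolve` helper below is hypothetical (it is not nGPT's code); it only demonstrates the "first defined value wins" order described above:

```python
import os

def resolve(option, cli_args, file_config, defaults, env_var=None):
    """Return the first defined value: CLI > environment > config file > default."""
    if cli_args.get(option) is not None:
        return cli_args[option]
    if env_var and os.environ.get(env_var):
        return os.environ[env_var]
    if file_config.get(option) is not None:
        return file_config[option]
    return defaults.get(option)

# Example: no CLI flag given, env var set -> environment wins over the config file
os.environ["OPENAI_MODEL"] = "gpt-4o-mini"
model = resolve("model", cli_args={}, file_config={"model": "llama3"},
                defaults={"model": "gpt-3.5-turbo"}, env_var="OPENAI_MODEL")
print(model)  # -> gpt-4o-mini
```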
## Contributing

We welcome contributions to nGPT! Whether it's bug fixes, feature additions, or documentation improvements, your help is appreciated.

To contribute:

1. Fork the repository
2. Create a feature branch: `git checkout -b feature/your-feature-name`
3. Make your changes
4. Commit with clear messages following conventional commit guidelines
5. Push to your fork and submit a pull request
Please check the CONTRIBUTING.md file for detailed guidelines on code style, pull request process, and development setup.
## License

This project is licensed under the MIT License. See the LICENSE file for details.