# nGPT
A lightweight Python CLI and library for interacting with OpenAI-compatible APIs, supporting both official and self-hosted LLM endpoints.
## Features
- Dual mode: Use as a CLI tool or import as a library
- Minimal dependencies
- Customizable API endpoints and providers
- Streaming responses
- Web search capability (supported by compatible API endpoints)
- Cross-platform configuration system
- Experimental features:
  - Shell command generation and execution (OS-aware)
  - Code generation with clean output
## Installation

```bash
pip install ngpt
```
## Usage

### As a CLI Tool

```bash
# Basic chat (default mode)
ngpt "Hello, how are you?"

# Show version information
ngpt -v

# Show active configuration
ngpt --show-config

# Show all configurations
ngpt --show-config --all

# With custom options
ngpt --api-key your-key --base-url http://your-endpoint "Hello"

# Enable web search (if your API endpoint supports it)
ngpt --web-search "What's the latest news about AI?"

# Generate and execute shell commands (using -s or --shell flag)
ngpt -s "list all files in current directory"

# Generate code (using -c or --code flag)
ngpt -c "create a python function that calculates fibonacci numbers"
```
### As a Library

```python
from ngpt import NGPTClient, load_config

# Load the first configuration (index 0) from the config file
config = load_config(config_index=0)

# Initialize the client with the loaded config
client = NGPTClient(**config)

# Or initialize with custom parameters
client = NGPTClient(
    api_key="your-key",
    base_url="http://your-endpoint",
    provider="openai",
    model="o3-mini"
)

# Chat
response = client.chat("Hello, how are you?")

# Chat with web search (if your API endpoint supports it)
response = client.chat("What's the latest news about AI?", web_search=True)

# Generate a shell command
command = client.generate_shell_command("list all files")

# Generate code
code = client.generate_code("create a python function that calculates fibonacci numbers")
```
## Configuration

### Command Line Options

You can configure the client using the following options:
- `--api-key`: API key for the service
- `--base-url`: Base URL for the API
- `--model`: Model to use
- `--web-search`: Enable web search capability (your API endpoint must support this feature)
- `--config`: Path to a custom configuration file
- `--config-index`: Index of the configuration to use from the config file (default: 0)
- `--show-config`: Show configuration details and exit
- `--all`: Used with `--show-config` to display details for all configurations
### Configuration File

nGPT uses a configuration file stored in the standard user config directory for your operating system:

- Linux: `~/.config/ngpt/ngpt.conf` or `$XDG_CONFIG_HOME/ngpt/ngpt.conf`
- macOS: `~/Library/Application Support/ngpt/ngpt.conf`
- Windows: `%APPDATA%\ngpt\ngpt.conf`

The configuration file uses a JSON list format, allowing you to store multiple configurations. You can select which configuration to use with the `--config-index` argument (index 0 is used by default).
#### Multiple Configurations Example (`ngpt.conf`)

```json
[
  {
    "api_key": "your-openai-api-key-here",
    "base_url": "https://api.openai.com/v1/",
    "provider": "OpenAI",
    "model": "gpt-4o"
  },
  {
    "api_key": "your-groq-api-key-here",
    "base_url": "https://api.groq.com/openai/v1/",
    "provider": "Groq",
    "model": "llama3-70b-8192"
  },
  {
    "api_key": "your-ollama-key-if-needed",
    "base_url": "http://localhost:11434/v1/",
    "provider": "Ollama-Local",
    "model": "llama3"
  }
]
```
### Configuration Priority

nGPT determines configuration values in the following order (highest priority first):

1. Command line arguments (`--api-key`, `--base-url`, `--model`)
2. Environment variables (`OPENAI_API_KEY`, `OPENAI_BASE_URL`, `OPENAI_MODEL`)
3. Configuration file (selected by `--config-index`, defaults to index 0)
4. Default values
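The priority order above can be sketched as a simple resolution function. This is an illustrative example, not nGPT's actual implementation; the function name `resolve_setting` is hypothetical.

```python
import os

def resolve_setting(cli_value, env_var, file_value, default):
    """Return the first value found, in the priority order described
    above: CLI argument, environment variable, config file, default."""
    if cli_value is not None:
        return cli_value
    env_value = os.environ.get(env_var)
    if env_value:
        return env_value
    if file_value is not None:
        return file_value
    return default

# Example: no CLI flag given, OPENAI_MODEL set in the environment,
# so the environment variable wins over the config-file value.
os.environ["OPENAI_MODEL"] = "gpt-4o"
model = resolve_setting(None, "OPENAI_MODEL", "llama3", "gpt-3.5-turbo")
```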
## Special Features

### OS-Aware Shell Commands
Shell command generation is OS-aware, providing appropriate commands for your operating system (Windows, macOS, or Linux) and shell type (bash, powershell, etc.).
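One way to make command generation OS-aware is to detect the platform and shell before building the prompt. The sketch below is a hedged illustration of that idea, not nGPT's internals; `describe_environment` is a hypothetical helper.

```python
import os
import platform

def describe_environment():
    """Detect the OS and a likely shell, so the model can be asked
    for commands that match them (e.g. `dir` vs `ls`)."""
    system = platform.system()  # 'Linux', 'Darwin', or 'Windows'
    if system == "Windows":
        shell = "powershell"
    else:
        # On Unix-like systems, $SHELL usually names the login shell.
        shell = os.environ.get("SHELL", "/bin/sh").rsplit("/", 1)[-1]
    return {"os": system, "shell": shell}

env = describe_environment()
```

A prompt could then say "generate a single {shell} command for {os}", which is the gist of what OS-aware generation needs.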
### Clean Code Generation

Code generation uses a prompt tuned so that only the code itself is returned, without markdown formatting or surrounding explanation.
## Implementation Notes

This library uses direct HTTP requests instead of the official OpenAI client library, allowing it to work with custom API endpoints that accept additional parameters such as `provider` and `web_search`. All parameters are sent directly in the request body.
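The effect of sending parameters directly in the request body can be sketched with the standard library alone. This is an assumption-laden illustration, not nGPT's actual code: it builds (but does not send) a POST to an OpenAI-style `/chat/completions` path, with a nonstandard `web_search` field passed through untouched.

```python
import json
import urllib.request

def build_chat_request(base_url, api_key, payload):
    """Build a plain HTTP POST for a chat completion. Because the
    payload is serialized as-is, custom fields like 'provider' or
    'web_search' reach the endpoint without a client SDK filtering
    them out."""
    return urllib.request.Request(
        base_url.rstrip("/") + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

payload = {
    "model": "llama3",
    "messages": [{"role": "user", "content": "Hello"}],
    "web_search": True,  # custom field a compatible endpoint may accept
}
req = build_chat_request("http://localhost:11434/v1/", "your-key", payload)
# A caller would then pass `req` to urllib.request.urlopen(...)
```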
## License

This project is licensed under the MIT License. See the LICENSE file for details.