
nGPT


A lightweight Python CLI and library for interacting with OpenAI-compatible APIs, supporting both official and self-hosted LLM endpoints.

Quick Start

# Install
pip install ngpt

# Chat with default settings
ngpt "Tell me about quantum computing"

# Generate code
ngpt --code "function to calculate the Fibonacci sequence"

# Generate and execute shell commands
ngpt --shell "list all files in the current directory"

Features

  • Dual Mode: Use as a CLI tool or import as a Python library
  • 🪶 Lightweight: Minimal dependencies (just requests)
  • 🔄 API Flexibility: Works with OpenAI, Ollama, Groq, and any compatible endpoint
  • 📊 Streaming Responses: Real-time output for better user experience
  • 🔍 Web Search: Integrated with compatible API endpoints
  • ⚙️ Multiple Configurations: Cross-platform config system supporting different profiles
  • 💻 Shell Command Generation: OS-aware command execution
  • 🧩 Clean Code Generation: Output code without markdown or explanations

Installation

pip install ngpt

Requires Python 3.8 or newer.

Usage

As a CLI Tool

# Basic chat (default mode)
ngpt "Hello, how are you?"

# Show version information
ngpt -v

# Show active configuration
ngpt --show-config

# Show all configurations
ngpt --show-config --all

# With custom options
ngpt --api-key your-key --base-url http://your-endpoint --model your-model "Hello"

# Enable web search (if your API endpoint supports it)
ngpt --web-search "What's the latest news about AI?"

# Generate and execute shell commands (using -s or --shell flag)
# OS-aware: generates appropriate commands for Windows, macOS, or Linux
ngpt -s "list all files in current directory"
# On Windows generates: dir
# On Linux/macOS generates: ls -la

# Generate clean code (using -c or --code flag)
# Returns only code without markdown formatting or explanations
ngpt -c "create a python function that calculates fibonacci numbers"

As a Library

from ngpt import NGPTClient, load_config

# Load the first configuration (index 0) from config file
config = load_config(config_index=0)

# Initialize the client with config
client = NGPTClient(**config)

# Or initialize with custom parameters
client = NGPTClient(
    api_key="your-key",
    base_url="http://your-endpoint",
    provider="openai",
    model="o3-mini"
)

# Chat
response = client.chat("Hello, how are you?")

# Chat with web search (if your API endpoint supports it)
response = client.chat("What's the latest news about AI?", web_search=True)

# Generate shell command
command = client.generate_shell_command("list all files")

# Generate code
code = client.generate_code("create a python function that calculates fibonacci numbers")

Advanced Library Usage

# Stream responses
for chunk in client.chat("Write a poem about Python", stream=True):
    print(chunk, end="", flush=True)

# Customize system prompt
response = client.chat(
    "Explain quantum computing",
    system_prompt="You are a quantum physics professor. Explain complex concepts simply."
)

# OS-aware shell commands
# Automatically generates appropriate commands for the current OS
command = client.generate_shell_command("find large files")

# Note: shell=True runs the generated string directly in a shell;
# review the command before executing it.
import subprocess
result = subprocess.run(command, shell=True, capture_output=True, text=True)
print(result.stdout)

# Clean code generation
# Returns only code without markdown or explanations
code = client.generate_code("function that converts Celsius to Fahrenheit")
print(code)

Configuration

Command Line Options

You can configure the client using the following options:

  • --api-key: API key for the service
  • --base-url: Base URL for the API
  • --model: Model to use
  • --web-search: Enable web search capability
  • --config: Path to a custom configuration file
  • --config-index: Index of the configuration to use (default: 0)
  • --show-config: Show configuration details and exit
  • --all: Used with --show-config to display all configurations
  • -s, --shell: Generate and execute shell commands
  • -c, --code: Generate clean code output
  • -v, --version: Show version information

Configuration File

nGPT uses a configuration file stored in the standard user config directory for your operating system:

  • Linux: ~/.config/ngpt/ngpt.conf or $XDG_CONFIG_HOME/ngpt/ngpt.conf
  • macOS: ~/Library/Application Support/ngpt/ngpt.conf
  • Windows: %APPDATA%\ngpt\ngpt.conf

The configuration file uses a JSON list format, allowing you to store multiple configurations. Select which one to use with the --config-index argument (index 0 is used by default).
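As an illustrative sketch (not nGPT's actual implementation), the per-OS locations listed above can be computed like this:

```python
import os
import sys
from pathlib import Path

def ngpt_config_path() -> Path:
    """Return the expected ngpt.conf location for the current OS.

    Sketch mirroring the paths listed above; nGPT's own resolution
    logic may differ in detail.
    """
    if sys.platform == "win32":
        base = Path(os.environ.get("APPDATA", Path.home() / "AppData" / "Roaming"))
    elif sys.platform == "darwin":
        base = Path.home() / "Library" / "Application Support"
    else:  # Linux and other Unix: honor XDG_CONFIG_HOME if set
        base = Path(os.environ.get("XDG_CONFIG_HOME") or (Path.home() / ".config"))
    return base / "ngpt" / "ngpt.conf"
```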

Multiple Configurations Example (ngpt.conf)

[
  {
    "api_key": "your-openai-api-key-here",
    "base_url": "https://api.openai.com/v1/",
    "provider": "OpenAI",
    "model": "gpt-4o"
  },
  {
    "api_key": "your-groq-api-key-here",
    "base_url": "https://api.groq.com/openai/v1/",
    "provider": "Groq",
    "model": "llama3-70b-8192"
  },
  {
    "api_key": "your-ollama-key-if-needed",
    "base_url": "http://localhost:11434/v1/",
    "provider": "Ollama-Local",
    "model": "llama3"
  }
]
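Since the file is just a JSON list, selecting an entry by index works as you would expect. A quick sketch of what --config-index 1 amounts to (illustrative, not nGPT internals):

```python
import json

# A trimmed version of the example configuration file above
conf_text = """
[
  {"provider": "OpenAI", "model": "gpt-4o"},
  {"provider": "Groq", "model": "llama3-70b-8192"},
  {"provider": "Ollama-Local", "model": "llama3"}
]
"""

configs = json.loads(conf_text)
selected = configs[1]  # --config-index 1 selects the Groq entry
print(selected["provider"], selected["model"])
```

On the command line, the same selection is: ngpt --config-index 1 "Hello"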

Configuration Priority

nGPT determines configuration values in the following order (highest priority first):

  1. Command line arguments (--api-key, --base-url, --model)
  2. Environment variables (OPENAI_API_KEY, OPENAI_BASE_URL, OPENAI_MODEL)
  3. Configuration file (selected by --config-index, defaults to index 0)
  4. Default values
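The resolution order above can be sketched as a simple first-match chain (an illustration of the rules, not nGPT's actual code):

```python
import os

def resolve_setting(cli_value, env_var, file_value, default):
    """Return the first defined value in priority order:
    CLI argument > environment variable > config file > default."""
    if cli_value is not None:
        return cli_value
    env_value = os.environ.get(env_var)
    if env_value:
        return env_value
    if file_value is not None:
        return file_value
    return default

# A CLI flag always wins:
assert resolve_setting("gpt-4o", "OPENAI_MODEL", "llama3", "fallback") == "gpt-4o"

# With no flag and no env var set, the config-file value is used:
os.environ.pop("OPENAI_MODEL", None)
assert resolve_setting(None, "OPENAI_MODEL", "llama3", "fallback") == "llama3"
```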

License

This project is licensed under the MIT License. See the LICENSE file for details.
