
Project description

nGPT


A lightweight Python CLI and library for interacting with OpenAI-compatible APIs, supporting both official and self-hosted LLM endpoints.

Quick Start

# Install
pip install ngpt

# Chat with default settings
ngpt "Tell me about quantum computing"

# Start an interactive chat session with conversation memory
ngpt -i

# Return response without streaming
ngpt -n "Tell me about quantum computing"

# Generate code
ngpt --code "function to calculate the Fibonacci sequence"

# Generate code with syntax highlighting
ngpt --code --prettify "function to calculate the Fibonacci sequence"

# Generate and execute shell commands
ngpt --shell "list all files in the current directory"

# Display markdown responses with beautiful formatting
ngpt --prettify "Explain markdown syntax with examples"

# Use a specific markdown renderer
ngpt --prettify --renderer=rich "Create a markdown table"

# Use multiline editor for complex prompts
ngpt --text

# Use custom system prompt
ngpt --preprompt "You are a Linux expert" "How do I find large files?"

# Log your conversation to a file
ngpt --interactive --log conversation.log

For more examples and detailed usage, visit the CLI Usage Guide.

Features

  • Dual Mode: Use as a CLI tool or import as a Python library
  • 🪶 Lightweight: Minimal dependencies (just requests)
  • 🔄 API Flexibility: Works with OpenAI, Ollama, Groq, and any compatible endpoint
  • 💬 Interactive Chat: Continuous conversation with memory in a modern UI
  • 📊 Streaming Responses: Real-time output for better user experience
  • 🔍 Web Search: Integrated with compatible API endpoints
  • 🎨 Markdown Rendering: Beautiful formatting of markdown and code with syntax highlighting
  • ⚙️ Multiple Configurations: Cross-platform config system supporting different profiles
  • 💻 Shell Command Generation: OS-aware command execution
  • 🧩 Clean Code Generation: Output code without markdown or explanations
  • 📝 Rich Multiline Editor: Interactive multiline text input with syntax highlighting and intuitive controls
  • 🎭 System Prompts: Customize model behavior with custom system prompts
  • 📃 Conversation Logging: Save your conversations to text files for later reference

See the Feature Overview for more details.

Documentation

Comprehensive documentation, including API reference, usage guides, and examples, is available at:

https://nazdridoy.github.io/ngpt/

Installation

pip install ngpt

Requires Python 3.8 or newer.

For detailed installation instructions, see the Installation Guide.

Usage

As a CLI Tool

# Basic chat (default mode)
ngpt "Hello, how are you?"

# Interactive chat session with conversation history
ngpt -i

# Log conversation to a file
ngpt --interactive --log conversation.log

# Use custom system prompt to guide AI behavior
ngpt --preprompt "You are a Python programming tutor" "Explain decorators"

# Show version information
ngpt -v

# Show active configuration
ngpt --show-config

# Show all configurations
ngpt --show-config --all

# List available models for the active configuration
ngpt --list-models

# List models for a specific configuration
ngpt --list-models --config-index 1

# With custom options
ngpt --api-key your-key --base-url http://your-endpoint --model your-model "Hello"

# Enable web search (if your API endpoint supports it)
ngpt --web-search "What's the latest news about AI?"

# Generate and execute shell commands (using -s or --shell flag)
# OS-aware: generates appropriate commands for Windows, macOS, or Linux
ngpt -s "list all files in current directory"
# On Windows generates: dir
# On Linux/macOS generates: ls -la

# Generate clean code (using -c or --code flag)
# Returns only code without markdown formatting or explanations
ngpt -c "create a python function that calculates fibonacci numbers"

# Use multiline text editor for complex prompts (using -t or --text flag)
# Opens an interactive editor with syntax highlighting and intuitive controls
ngpt -t

For more CLI examples and detailed usage information, see the CLI Usage Guide.

As a Library

from ngpt import NGPTClient, load_config

# Load the first configuration (index 0) from config file
config = load_config(config_index=0)

# Initialize the client with config
client = NGPTClient(**config)

# Or initialize with custom parameters
client = NGPTClient(
    api_key="your-key",
    base_url="http://your-endpoint",
    provider="openai",
    model="o3-mini"
)

# Chat
response = client.chat("Hello, how are you?")

# Chat with web search (if your API endpoint supports it)
response = client.chat("What's the latest news about AI?", web_search=True)

# Generate shell command
command = client.generate_shell_command("list all files")

# Generate code
code = client.generate_code("create a python function that calculates fibonacci numbers")

For more library examples and advanced usage, see the Library Usage Guide.

Advanced Library Usage

# Stream responses
for chunk in client.chat("Write a poem about Python", stream=True):
    print(chunk, end="", flush=True)

# Customize system prompt
response = client.chat(
    "Explain quantum computing",
    system_prompt="You are a quantum physics professor. Explain complex concepts simply."
)

# OS-aware shell commands
# Automatically generates appropriate commands for the current OS
command = client.generate_shell_command("find large files")
import subprocess
result = subprocess.run(command, shell=True, capture_output=True, text=True)
print(result.stdout)

# Clean code generation
# Returns only code without markdown or explanations
code = client.generate_code("function that converts Celsius to Fahrenheit")
print(code)

For advanced usage patterns and integrations, check out the Advanced Examples.

Configuration

Command Line Options

You can configure the client using the following options:

  • --api-key: API key for the service
  • --base-url: Base URL for the API
  • --model: Model to use
  • --list-models: List all available models for the selected configuration (can be combined with --config-index)
  • --web-search: Enable web search capability
  • -n, --no-stream: Return the whole response without streaming
  • --temperature: Set temperature (controls randomness, default: 0.7)
  • --top_p: Set top_p (controls diversity, default: 1.0)
  • --max_tokens: Set the maximum response length in tokens
  • --preprompt: Set a custom system prompt to control AI behavior
  • --log: Set the filepath to log the conversation to (for interactive modes)
  • --prettify: Render markdown responses and code with syntax highlighting
  • --renderer: Select which markdown renderer to use with --prettify (auto, rich, or glow)
  • --list-renderers: Show available markdown renderers for use with --prettify
  • --config: Path to a custom configuration file; when used without a value, enters interactive configuration mode
  • --config-index: Index of the configuration to use (default: 0)
  • --provider: Provider name identifying the configuration to use (alternative to --config-index)
  • --remove: Remove the configuration at the specified index (requires --config and --config-index)
  • --show-config: Show configuration details and exit
  • --all: Used with --show-config to display all configurations
  • -i, --interactive: Start an interactive chat session with a stylish UI, conversation history, and special commands
  • -s, --shell: Generate and execute shell commands
  • -c, --code: Generate clean code output
  • -t, --text: Open an interactive multiline editor for complex prompts
  • -v, --version: Show version information

For a complete reference of all available options, see the CLI Usage Guide.

Interactive Configuration

The --config option without arguments enters interactive configuration mode, allowing you to add or edit configurations:

# Add a new configuration
ngpt --config

# Edit an existing configuration at index 1
ngpt --config --config-index 1

# Edit an existing configuration by provider name
ngpt --config --provider Gemini

# Remove a configuration at index 2
ngpt --config --remove --config-index 2

# Remove a configuration by provider name
ngpt --config --remove --provider Gemini

# Use a specific configuration by provider name
ngpt --provider OpenAI "Tell me about quantum computing"

In interactive mode:

  • When editing an existing configuration, press Enter to keep the current values
  • When creating a new configuration, press Enter to use default values
  • For security, your API key is not displayed when editing configurations
  • When removing a configuration, you'll be asked to confirm before deletion

For more details on configuring nGPT, see the Configuration Guide.

Configuration File

nGPT uses a configuration file stored in the standard user config directory for your operating system:

  • Linux: ~/.config/ngpt/ngpt.conf or $XDG_CONFIG_HOME/ngpt/ngpt.conf
  • macOS: ~/Library/Application Support/ngpt/ngpt.conf
  • Windows: %APPDATA%\ngpt\ngpt.conf
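
As an illustrative sketch only (not nGPT's actual lookup code), the locations above can be resolved programmatically like this, with default_config_path being a hypothetical helper name:

```python
import os
import sys

def default_config_path():
    """Return the platform-specific config path listed above.

    Illustrative sketch; nGPT's real resolution logic may differ.
    """
    if sys.platform == "win32":
        # Windows: %APPDATA%\ngpt\ngpt.conf
        base = os.environ.get("APPDATA", os.path.expanduser("~"))
    elif sys.platform == "darwin":
        # macOS: ~/Library/Application Support/ngpt/ngpt.conf
        base = os.path.expanduser("~/Library/Application Support")
    else:
        # Linux: $XDG_CONFIG_HOME/ngpt/ngpt.conf, falling back to ~/.config
        base = os.environ.get("XDG_CONFIG_HOME",
                              os.path.expanduser("~/.config"))
    return os.path.join(base, "ngpt", "ngpt.conf")
```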

The configuration file uses a JSON list format, allowing you to store multiple configurations. Select which one to use with the --config-index argument (index 0 is used by default).

Multiple Configurations Example (ngpt.conf)

[
  {
    "api_key": "your-openai-api-key-here",
    "base_url": "https://api.openai.com/v1/",
    "provider": "OpenAI",
    "model": "gpt-4o"
  },
  {
    "api_key": "your-groq-api-key-here",
    "base_url": "https://api.groq.com/openai/v1/",
    "provider": "Groq",
    "model": "llama3-70b-8192"
  },
  {
    "api_key": "your-ollama-key-if-needed",
    "base_url": "http://localhost:11434/v1/",
    "provider": "Ollama-Local",
    "model": "llama3"
  }
]
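
A minimal sketch of how index selection over such a list works, using only the standard library (the real load_config does more, such as locating the file and applying defaults; pick_config is a hypothetical name used here for illustration):

```python
import json

def pick_config(config_text, config_index=0):
    """Parse the JSON list and return the entry at config_index."""
    configs = json.loads(config_text)
    return configs[config_index]

# A trimmed-down version of the example file above
example = '''[
  {"provider": "OpenAI", "model": "gpt-4o"},
  {"provider": "Groq",   "model": "llama3-70b-8192"}
]'''

cfg = pick_config(example, config_index=1)
# cfg["provider"] == "Groq", matching the second entry
```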

For details on the configuration file format and structure, see the Configuration Guide.

Configuration Priority

nGPT determines configuration values in the following order (highest priority first):

  1. Command line arguments (--api-key, --base-url, --model)
  2. Environment variables (OPENAI_API_KEY, OPENAI_BASE_URL, OPENAI_MODEL)
  3. Configuration file (selected by --config-index, defaults to index 0)
  4. Default values
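
The resolution order above can be sketched as a small helper (illustrative only; resolve_setting is a hypothetical name, not part of nGPT's API):

```python
import os

def resolve_setting(cli_value, env_name, config_value, default):
    """Resolve one setting using nGPT's priority order:
    CLI flag > environment variable > config file > default."""
    if cli_value is not None:            # 1. command line argument
        return cli_value
    env_value = os.environ.get(env_name)
    if env_value:                        # 2. environment variable
        return env_value
    if config_value is not None:         # 3. configuration file entry
        return config_value
    return default                       # 4. built-in default

# e.g. a --model flag on the command line wins over everything else
model = resolve_setting("o3-mini", "OPENAI_MODEL", "gpt-4o", None)
```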

Contributing

We welcome contributions to nGPT! Whether it's bug fixes, feature additions, or documentation improvements, your help is appreciated.

To contribute:

  1. Fork the repository
  2. Create a feature branch: git checkout -b feature/your-feature-name
  3. Make your changes
  4. Commit with clear messages following conventional commit guidelines
  5. Push to your fork and submit a pull request

Please check the CONTRIBUTING.md file for detailed guidelines on code style, pull request process, and development setup.

License

This project is licensed under the MIT License. See the LICENSE file for details.

Project details



Download files

Download the file for your platform.

Source Distribution

ngpt-2.5.0.tar.gz (67.2 kB)

Uploaded Source

Built Distribution


ngpt-2.5.0-py3-none-any.whl (26.9 kB)

Uploaded Python 3

File details

Details for the file ngpt-2.5.0.tar.gz.

File metadata

  • Download URL: ngpt-2.5.0.tar.gz
  • Upload date:
  • Size: 67.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for ngpt-2.5.0.tar.gz:

  • SHA256: bed9d10caa2d0837555470e2429ed685f733f7ec82de4bebec0285f296fedcc7
  • MD5: da03ec5d043fd44b5f9a7f797e42d9e0
  • BLAKE2b-256: f72eaa6a30c28276c36d9c4a166366eea590eaaae928471e2d25e7428a5dc943


Provenance

The following attestation bundles were made for ngpt-2.5.0.tar.gz:

Publisher: python-publish.yml on nazdridoy/ngpt

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file ngpt-2.5.0-py3-none-any.whl.

File metadata

  • Download URL: ngpt-2.5.0-py3-none-any.whl
  • Upload date:
  • Size: 26.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for ngpt-2.5.0-py3-none-any.whl:

  • SHA256: b7ece3bab7bb1fbe8f373e3954ca638fca2fa2150958b589a304a23b1c26c259
  • MD5: 44d8e7cfd4d5961e1e5f4998b70d8202
  • BLAKE2b-256: c04605f17a09ce112142fe02246342f15d5d6f882bed31d5fa862a4b44bdd9c6


Provenance

The following attestation bundles were made for ngpt-2.5.0-py3-none-any.whl:

Publisher: python-publish.yml on nazdridoy/ngpt

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
