
nGPT

PyPI version AUR Version License: MIT Documentation

Linux Windows macOS Android

🤖 nGPT: A Swiss army knife for LLMs that combines a powerful CLI and an interactive chatbot in one package. Seamlessly work with OpenAI, Ollama, Groq, Claude, Gemini, or any OpenAI-compatible API to generate code, craft git commits, rewrite text, and execute shell commands. Fast, lightweight, and designed for both casual users and developers.


Features

  • Versatile: Powerful and easy-to-use CLI tool for various AI tasks
  • 🪶 Lightweight: Minimal dependencies with everything you need included
  • 🔄 API Flexibility: Works with OpenAI, Ollama, Groq, Claude, Gemini, and any OpenAI-compatible endpoint
  • 💬 Interactive Chat: Continuous conversation with memory in modern UI
  • 📊 Streaming Responses: Real-time output for better user experience
  • 🔍 Web Search: Enhance any model with contextual information from the web, using advanced content extraction to identify the most relevant information from web pages
  • 📥 Stdin Processing: Process piped content by using {} placeholder in prompts
  • 🎨 Markdown Rendering: Beautiful formatting of markdown and code with syntax highlighting
  • Real-time Markdown: Stream responses with live updating syntax highlighting and formatting
  • ⚙️ Multiple Configurations: Cross-platform config system supporting different profiles
  • 💻 Shell Command Generation: OS-aware command execution
  • 🧠 Text Rewriting: Improve text quality while maintaining original tone and meaning
  • 🧩 Clean Code Generation: Output code without markdown or explanations
  • 📝 Rich Multiline Editor: Interactive multiline text input with syntax highlighting and intuitive controls
  • 📑 Git Commit Messages: AI-powered generation of conventional, detailed commit messages from git diffs
  • 🎭 System Prompts: Customize model behavior with custom system prompts
  • 🤖 Custom Roles: Create and use reusable AI roles for specialized tasks
  • 📃 Conversation Logging: Save your conversations to text files for later reference
  • 🔌 Modular Architecture: Well-structured codebase with clean separation of concerns
  • 🔄 Provider Switching: Easily switch between different LLM providers with a single parameter
  • 🚀 Performance Optimized: Fast response times and minimal resource usage

See the Feature Overview for more details.


Installation

# Installation with pip
pip install ngpt

# Or install with uv (faster installation)
uv pip install ngpt

# Or install globally as a CLI tool (recommended for command-line usage)
uv tool install ngpt

# Arch Linux: install from AUR
paru -S ngpt

Requires Python 3.8 or newer.

For detailed installation instructions, see the Installation Guide.

Quick Start

# Chat with default settings
ngpt "Tell me about quantum computing"

# Alternatively, run as a Python module
python -m ngpt "Tell me about quantum computing"

# Start an interactive chat session with conversation memory
ngpt -i

# Return response without streaming
ngpt --no-stream "Tell me about quantum computing"

# Generate code
ngpt --code "function to calculate the Fibonacci sequence"

# Generate code with syntax highlighting
ngpt --code --prettify "function to calculate the Fibonacci sequence"

# Generate code with real-time syntax highlighting
ngpt --code --stream-prettify "function to calculate the Fibonacci sequence"

# Generate and execute shell commands
ngpt --shell "list all files in the current directory"

# Read from stdin and use the content in your prompt
echo "What is this text about?" | ngpt --pipe "Analyze the following text: {}"

# Using here-string (<<<) for quick single-line input 
ngpt --pipe {} <<< "What is the best way to learn shell redirects?"

# Using standard input redirection to process file contents
ngpt --pipe "summarise {}" < README.md

# Using here-document (<<EOF) for multiline input
ngpt --pipe {} << EOF                                              
What is the best way to learn Golang?
Provide simple hello world example.
EOF

# Create a custom role for specialized tasks
ngpt --role-config create json_generator

# Use a custom role for specific tasks
ngpt --role json_generator "Generate user data with name, email, and address" 

# Rewrite text to improve quality while preserving tone and meaning
echo "your text" | ngpt -r

# Rewrite text from a command-line argument
ngpt -r "your text to rewrite"

# Rewrite text from a file
cat file.txt | ngpt -r

# Generate AI-powered git commit messages for staged changes
ngpt -g

# Generate commit message from staged changes with a context directive
ngpt -g --preprompt "type:feat"

# Process large diffs in chunks with recursive analysis
ngpt -g --rec-chunk

# Process a diff file instead of staged changes
ngpt -g --diff /path/to/changes.diff

# Use piped diff content for commit message generation
git diff HEAD~1 | ngpt -g --pipe

# Generate a commit message with logging for debugging
ngpt -g --log commit_log.txt

# Use interactive multiline editor to enter text to rewrite
ngpt -r

# Display markdown responses with beautiful formatting
ngpt --prettify "Explain markdown syntax with examples"

# Display markdown responses with real-time formatting
ngpt --stream-prettify "Explain markdown syntax with examples"

# Use a specific markdown renderer
ngpt --prettify --renderer=rich "Create a markdown table"

# Use multiline editor for complex prompts
ngpt --text

# Use custom system prompt
ngpt --preprompt "You are a Linux expert" "How do I find large files?"

# Log your conversation to a file
ngpt --interactive --log conversation.log

# Create a temporary log file automatically
ngpt --log "Tell me about quantum computing"

# Process text from stdin using the {} placeholder
cat README.md | ngpt --pipe "Summarize this document: {}"

# Use different model providers by specifying the provider name
ngpt --provider Groq "Explain quantum computing"

# Compare outputs from different providers
ngpt --provider OpenAI "Explain quantum physics" > openai_response.txt
ngpt --provider Ollama "Explain quantum physics" > ollama_response.txt

# Show all API configurations
ngpt --show-config --all

# List available models for the active configuration
ngpt --list-models

# List models for a specific configuration (index)
ngpt --list-models --config-index 1

# List models for a specific configuration (provider)
ngpt --list-models --provider Gemini

# With custom options
ngpt --api-key your-key --base-url http://your-endpoint --model your-model "Hello"

# Enable web search capability to enhance prompts with web information
ngpt --web-search "What's the latest news about AI?"

# Generate and execute shell commands (using -s or --shell flag)
# OS-aware: generates appropriate commands for Windows, macOS, or Linux
ngpt -s "list all files in current directory"
# On Windows generates: dir
# On Linux/macOS generates: ls -la

# Generate code (using -c or --code flag)
ngpt -c "create a python function that calculates fibonacci numbers"

# Use multiline text editor for complex prompts (using -t or --text flag)
ngpt -t

For more examples and detailed usage, visit the CLI Usage Guide.
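Conceptually, `--pipe` reads stdin and substitutes it wherever `{}` appears in the prompt. A minimal Python sketch of that substitution (a hypothetical helper for illustration, not nGPT's actual code):

```python
def apply_pipe_placeholder(prompt, stdin_text):
    """Substitute piped stdin content into the prompt at the {}
    placeholder (hypothetical sketch, not nGPT's actual code)."""
    if "{}" in prompt:
        return prompt.replace("{}", stdin_text.strip())
    # Without a placeholder, append the piped content after the prompt.
    return f"{prompt}\n\n{stdin_text.strip()}"

# echo "hello" | ngpt --pipe "Analyze the following text: {}"
print(apply_pipe_placeholder("Analyze the following text: {}", "hello\n"))
# Analyze the following text: hello
```

The same substitution explains why `--pipe` composes with here-strings, here-documents, and file redirection: all of them just feed stdin.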

Usage

Command Line Options

❯ ngpt -h

usage: ngpt [-h] [-v] [--language LANGUAGE] [--config [CONFIG]] [--config-index CONFIG_INDEX] [--provider PROVIDER]
            [--remove] [--show-config] [--all] [--list-models] [--list-renderers] [--cli-config [COMMAND ...]]
            [--role-config [ACTION ...]] [--api-key API_KEY] [--base-url BASE_URL] [--model MODEL] [--web-search]
            [--pipe] [--temperature TEMPERATURE] [--top_p TOP_P] [--max_tokens MAX_TOKENS] [--log [FILE]]
            [--preprompt PREPROMPT | --role ROLE] [--no-stream | --prettify | --stream-prettify]
            [--renderer {auto,rich,glow}] [--rec-chunk] [--diff [FILE]] [--chunk-size CHUNK_SIZE]
            [--analyses-chunk-size ANALYSES_CHUNK_SIZE] [--max-msg-lines MAX_MSG_LINES]
            [--max-recursion-depth MAX_RECURSION_DEPTH] [-i | -s | -c | -t | -r | -g]
            [prompt]

nGPT - Interact with AI language models via OpenAI-compatible APIs

positional arguments:

[PROMPT]                            The prompt to send

options:

-h, --help                          show this help message and exit
-v, --version                       Show version information and exit
--language LANGUAGE                 Programming language to generate code in (for code mode)

Configuration Options:

--config [CONFIG]                   Path to a custom config file or, if no value provided, enter interactive configuration mode to create a new config
--config-index CONFIG_INDEX         Index of the configuration to use or edit (default: 0)
--provider PROVIDER                 Provider name to identify the configuration to use
--remove                            Remove the configuration at the specified index (requires --config and --config-index or --provider)
--show-config                       Show the current configuration(s) and exit
--all                               Show details for all configurations (requires --show-config)
--list-models                       List all available models for the current configuration and exit
--list-renderers                    Show available markdown renderers for use with --prettify
--cli-config [COMMAND ...]          Manage CLI configuration (set, get, unset, list, help)
--role-config [ACTION ...]          Manage custom roles (help, create, show, edit, list, remove) [role_name]

Global Options:

--api-key API_KEY                   API key for the service
--base-url BASE_URL                 Base URL for the API
--model MODEL                       Model to use
--web-search                        Enable web search capability using DuckDuckGo to enhance prompts with relevant information
--pipe                              Read from stdin and use content with prompt. Use {} in prompt as placeholder for stdin content. Can be used with any mode option except --text and --interactive
--temperature TEMPERATURE           Set temperature (controls randomness, default: 0.7)
--top_p TOP_P                       Set top_p (controls diversity, default: 1.0)
--max_tokens MAX_TOKENS             Set max response length in tokens
--log [FILE]                        Set filepath to log conversation to, or create a temporary log file if no path provided
--preprompt PREPROMPT               Set custom system prompt to control AI behavior
--role ROLE                         Use a predefined role to set system prompt (mutually exclusive with --preprompt)
--renderer {auto,rich,glow}         Select which markdown renderer to use with --prettify or --stream-prettify (auto, rich, or glow)

Output Display Options (mutually exclusive):

--no-stream                         Return the whole response without streaming or formatting
--prettify                          Render complete response with markdown and code formatting (non-streaming)
--stream-prettify                   Stream response with real-time markdown rendering (default)

Git Commit Message Options:

--rec-chunk                         Process large diffs in chunks with recursive analysis if needed
--diff [FILE]                       Use diff from specified file instead of staged changes. If used without a path, uses the path from CLI config.
--chunk-size CHUNK_SIZE             Number of lines per chunk when chunking is enabled (default: 200)
--analyses-chunk-size ANALYSES_CHUNK_SIZE Number of lines per chunk when recursively chunking analyses (default: 200)
--max-msg-lines MAX_MSG_LINES       Maximum number of lines in commit message before condensing (default: 20)
--max-recursion-depth MAX_RECURSION_DEPTH Maximum recursion depth for commit message condensing (default: 3)

Modes (mutually exclusive):

-i, --interactive                   Start an interactive chat session
-s, --shell                         Generate and execute shell commands
-c, --code                          Generate code
-t, --text                          Enter multi-line text input (submit with Ctrl+D)
-r, --rewrite                       Rewrite text from stdin to be more natural while preserving tone and meaning
-g, --gitcommsg                     Generate AI-powered git commit messages from staged changes or diff file

Note: For better visualization of conventional commit messages on GitHub, you can use the GitHub Commit Labels userscript, which adds colorful labels to your commits.

For a complete reference of all available options, detailed CLI examples and usage information, see the CLI Usage Guide.

Documentation

Comprehensive documentation, including usage guides and examples, is available at:

https://nazdridoy.github.io/ngpt/

Key documentation sections:

Configuration

API Key Setup

OpenAI API Key

  1. Create an account at OpenAI
  2. Navigate to API keys: https://platform.openai.com/api-keys
  3. Click "Create new secret key" and copy your API key
  4. Configure nGPT with your key:
    ngpt --config
    # Enter provider: OpenAI
    # Enter API key: your-openai-api-key
    # Enter base URL: https://api.openai.com/v1/
    # Enter model: gpt-3.5-turbo (or other model)
    

Google Gemini API Key

  1. Create or use an existing Google account
  2. Go to Google AI Studio
  3. Navigate to API keys in the left sidebar (or visit https://aistudio.google.com/app/apikey)
  4. Create an API key and copy it
  5. Configure nGPT with your key:
    ngpt --config
    # Enter provider: Gemini
    # Enter API key: your-gemini-api-key
    # Enter base URL: https://generativelanguage.googleapis.com/v1beta/openai
    # Enter model: gemini-2.0-flash
    

For more detailed information, refer to the API Key Setup documentation.
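Every provider above is addressed through the same OpenAI-compatible /chat/completions endpoint, which is why only the base URL, API key, and model name change between configurations. A rough sketch of how such a request could be assembled (illustrative only; nGPT's internals may differ):

```python
import json

def build_chat_request(base_url, model, prompt, system_prompt=None):
    """Assemble the URL and JSON payload for an OpenAI-compatible
    /chat/completions call (illustrative sketch)."""
    url = base_url.rstrip("/") + "/chat/completions"
    messages = []
    if system_prompt:
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": prompt})
    return url, {"model": model, "messages": messages, "stream": True}

url, payload = build_chat_request(
    "https://api.openai.com/v1/", "gpt-3.5-turbo", "Hello")
print(url)
print(json.dumps(payload, indent=2))
```

Swapping in the Gemini base URL and a `gemini-2.0-flash` model from the steps above would produce a request of the same shape against a different endpoint.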

CLI Configuration

nGPT offers a CLI configuration system that allows you to set default values for command-line options. This is especially useful when you:

  • Repeatedly use the same provider or model
  • Have preferred settings for specific tasks
  • Want to create different workflows based on context

For example, setting your preferred language for code generation or temperature value means you won't have to specify these parameters each time:

❯ ngpt --cli-config help

CLI Configuration Help:
  Command syntax:
    ngpt --cli-config help                - Show this help message
    ngpt --cli-config set OPTION VALUE    - Set a default value for OPTION
    ngpt --cli-config get OPTION          - Get the current value of OPTION
    ngpt --cli-config get                 - Show all CLI configuration settings
    ngpt --cli-config unset OPTION        - Remove OPTION from configuration
    ngpt --cli-config list                - List all available options

  Available options:
    General options (all modes):
      config-index - int (default: 0) [exclusive with: provider]
      log - str 
      max_tokens - int 
      no-stream - bool (default: False) [exclusive with: prettify, stream-prettify]
      preprompt - str 
      prettify - bool (default: False) [exclusive with: no-stream, stream-prettify]
      provider - str  [exclusive with: config-index]
      renderer - str (default: auto)
      stream-prettify - bool (default: False) [exclusive with: no-stream, prettify]
      temperature - float (default: 0.7)
      top_p - float (default: 1.0)
      web-search - bool (default: False)

    Options for Code generation mode:
      language - str (default: python)

    Options for Git commit message mode:
      analyses-chunk-size - int (default: 200)
      chunk-size - int (default: 200)
      diff - str 
      max-msg-lines - int (default: 20)
      max-recursion-depth - int (default: 3)
      rec-chunk - bool (default: False)

  Example usage:
    ngpt --cli-config set language java        - Set default language to java for code generation
    ngpt --cli-config set provider Gemini      - Set Gemini as your default provider
    ngpt --cli-config set temperature 0.9      - Set default temperature to 0.9
    ngpt --cli-config set no-stream true       - Disable streaming by default
    ngpt --cli-config set rec-chunk true       - Enable recursive chunking for git commit messages
    ngpt --cli-config set diff /path/to/file.diff - Set default diff file for git commit messages
    ngpt --cli-config get temperature          - Check the current temperature setting
    ngpt --cli-config get                      - Show all current CLI settings
    ngpt --cli-config unset language           - Remove language setting

  Notes:
    - CLI configuration is stored in:
      • Linux: ~/.config/ngpt/ngpt-cli.conf
      • macOS: ~/Library/Application Support/ngpt/ngpt-cli.conf
      • Windows: %APPDATA%\ngpt\ngpt-cli.conf
    - Settings are applied based on context (e.g., language only applies to code generation mode)
    - Command-line arguments always override CLI configuration
    - Some options are mutually exclusive and will not be applied together

For more details, see the CLI Configuration Guide.
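The per-OS lookup described in the notes above can be sketched in a few lines of Python (a hypothetical helper mirroring the documented paths, not nGPT's actual lookup code):

```python
import os
import sys
from pathlib import Path

def cli_config_path():
    """Return the per-OS location of ngpt-cli.conf, mirroring the
    paths listed above (sketch; not nGPT's actual lookup code)."""
    if sys.platform.startswith("win"):
        base = Path(os.environ.get("APPDATA", Path.home()))
    elif sys.platform == "darwin":
        base = Path.home() / "Library" / "Application Support"
    else:  # Linux and other Unix: honor XDG conventions
        base = Path(os.environ.get("XDG_CONFIG_HOME",
                                   str(Path.home() / ".config")))
    return base / "ngpt" / "ngpt-cli.conf"

print(cli_config_path())
```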

Interactive Configuration

The --config option without arguments enters interactive configuration mode, allowing you to add or edit configurations:

# Add a new configuration
ngpt --config

# Edit an existing configuration at index 1
ngpt --config --config-index 1

# Edit an existing configuration by provider name
ngpt --config --provider Gemini

# Remove a configuration at index 2
ngpt --config --remove --config-index 2

# Remove a configuration by provider name
ngpt --config --remove --provider Gemini

# Use a specific configuration by provider name
ngpt --provider OpenAI "Tell me about quantum computing"

In interactive mode:

  • When editing an existing configuration, press Enter to keep the current values
  • When creating a new configuration, press Enter to use default values
  • For security, your API key is not displayed when editing configurations
  • When removing a configuration, you'll be asked to confirm before deletion


For more details on configuring nGPT, see the Configuration Guide.

Configuration File

nGPT uses a configuration file stored in the standard user config directory for your operating system:

  • Linux: ~/.config/ngpt/ngpt.conf or $XDG_CONFIG_HOME/ngpt/ngpt.conf
  • macOS: ~/Library/Application Support/ngpt/ngpt.conf
  • Windows: %APPDATA%\ngpt\ngpt.conf

The configuration file uses a JSON list format, allowing you to store multiple configurations. You can select which configuration to use with the --config-index argument (index 0 is used by default) or by provider name with --provider.

Multiple Configurations Example (ngpt.conf)

[
  {
    "api_key": "your-openai-api-key-here",
    "base_url": "https://api.openai.com/v1/",
    "provider": "OpenAI",
    "model": "gpt-4o"
  },
  {
    "api_key": "your-groq-api-key-here",
    "base_url": "https://api.groq.com/openai/v1/",
    "provider": "Groq",
    "model": "llama3-70b-8192"
  },
  {
    "api_key": "your-ollama-key-if-needed",
    "base_url": "http://localhost:11434/v1/",
    "provider": "Ollama-Local",
    "model": "llama3"
  }
]

For details on the configuration file format and structure, see the Configuration Guide.
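Selecting one entry from that JSON list, by index or by provider name, could look like the following (a hypothetical sketch; details such as case sensitivity of --provider matching may differ in nGPT itself):

```python
import json

# A trimmed-down sample in the same JSON list format as ngpt.conf.
SAMPLE_CONF = """
[
  {"api_key": "k1", "base_url": "https://api.openai.com/v1/",
   "provider": "OpenAI", "model": "gpt-4o"},
  {"api_key": "k2", "base_url": "https://api.groq.com/openai/v1/",
   "provider": "Groq", "model": "llama3-70b-8192"}
]
"""

def select_config(configs, index=0, provider=None):
    """Pick one entry from the list: by provider name (case-insensitive)
    if given, otherwise by index. Illustrative sketch only."""
    if provider is not None:
        for cfg in configs:
            if cfg.get("provider", "").lower() == provider.lower():
                return cfg
        raise ValueError(f"no configuration for provider {provider!r}")
    return configs[index]

configs = json.loads(SAMPLE_CONF)
print(select_config(configs, provider="groq")["model"])   # llama3-70b-8192
print(select_config(configs)["provider"])                 # OpenAI
```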

Configuration Priority

nGPT determines configuration values in the following order (highest priority first):

  1. Command line arguments (--api-key, --base-url, --model, etc.)
  2. Environment variables (OPENAI_API_KEY, OPENAI_BASE_URL, OPENAI_MODEL)
  3. CLI configuration file (ngpt-cli.conf, managed with --cli-config)
  4. Main configuration file ngpt.conf or custom-config-file
  5. Default values
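That priority chain amounts to a first-non-empty lookup across the five sources. A small illustrative resolver (hypothetical helper, not nGPT code):

```python
import os

def resolve(option, cli_args, env_var, cli_conf, main_conf, default):
    """Resolve one setting using the priority order above:
    CLI argument > environment variable > ngpt-cli.conf > ngpt.conf > default.
    Hypothetical helper for illustration only."""
    if cli_args.get(option) is not None:
        return cli_args[option]
    if os.environ.get(env_var):
        return os.environ[env_var]
    if cli_conf.get(option) is not None:
        return cli_conf[option]
    if main_conf.get(option) is not None:
        return main_conf[option]
    return default

os.environ.pop("OPENAI_MODEL", None)  # keep the demo deterministic
model = resolve("model",
                cli_args={"model": None},       # no --model flag given
                env_var="OPENAI_MODEL",
                cli_conf={},                    # nothing in ngpt-cli.conf
                main_conf={"model": "gpt-4o"},  # from ngpt.conf
                default="gpt-3.5-turbo")
print(model)  # gpt-4o
```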

Real-World Demonstrations with nGPT

Let's see nGPT in action! Here are some practical ways you can use it every day:

Quick Q&A and Coding

# Get a quick explanation
ngpt "Explain the difference between threads and processes in Python"

# Generate code with real-time syntax highlighting
ngpt --code --stream-prettify "Write a Python function to reverse a linked list"

With the --code flag, nGPT gives you clean code without explanations or markdown, just what you need to copy and paste into your project. The --stream-prettify option shows real-time syntax highlighting as the code comes in.

Shell Command Generation (OS-Aware)

# Let nGPT generate the correct command for your OS
ngpt --shell "list all files in the current directory including hidden ones"
# On Linux/macOS: ls -la
# On Windows: dir /a

One of my favorite features! No more Googling obscure command flags: nGPT generates the right command for your operating system, and it will even execute it for you if you approve.
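The OS dispatch behind this can be pictured as a simple platform check (a toy sketch; in reality nGPT asks the model for an OS-appropriate command rather than using a lookup table):

```python
import sys

def list_files_command(platform):
    """Return a platform-appropriate 'list all files' command,
    mirroring the OS-aware behavior described above (toy sketch)."""
    if platform.startswith("win"):
        return "dir /a"
    return "ls -la"  # Linux and macOS

print(list_files_command(sys.platform))
```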


Text Rewriting and Summarization

# Pipe text to rewrite it (e.g., improve clarity)
echo "This is a rough draft of my email." | ngpt -r

# Summarize a file using the pipe placeholder
cat long-article.txt | ngpt --pipe "Summarize this document concisely: {}"

The text rewriting feature is perfect for quickly improving documentation, emails, or reports. And with pipe placeholders, you can feed in content from files or other commands.

Git Commit Message Generation

# Stage your changes
git add .

# Let nGPT generate a conventional commit message based on the diff
ngpt -g

# Generate git commit message from a diff file
ngpt -g --diff changes.diff

This is a huge time-saver. nGPT analyzes your git diff and generates a properly formatted conventional commit message that actually describes what you changed. No more staring at the blank commit message prompt!
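The conventional format that -g targets is easy to validate mechanically. A sketch using a regex for the type(scope): subject shape (the exact list of commit types here is my assumption, not nGPT's):

```python
import re

# Conventional-commit subject: type, optional (scope), optional "!",
# then ": description". The type list is an assumption for this demo.
CONVENTIONAL = re.compile(
    r"^(feat|fix|docs|style|refactor|perf|test|build|ci|chore|revert)"
    r"(\([\w\-]+\))?!?: .+")

def is_conventional(subject):
    """Check whether a commit subject line follows the Conventional
    Commits shape that nGPT's -g mode generates."""
    return bool(CONVENTIONAL.match(subject))

print(is_conventional("feat(cli): add --rec-chunk flag"))  # True
print(is_conventional("updated some files"))               # False
```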


Custom AI Roles

# Create a specialized role for JSON generation
ngpt --role-config create json_generator

# Use the custom role to generate structured data
ngpt --role json_generator "Generate random user profile data"
{
  "id": "a1b2c3d4-e5f6-7890-1234-567890abcdef",
  "firstName": "Aurora",
  "lastName": "Reynolds",
  "email": "aurora.reynolds@example.com",
  "phone": "+1-555-0101",
  "address": {
    "street": "123 Main St",
    "city": "Anytown",
    "state": "CA",
    "zipCode": "90210"
  },
  "birthDate": "1990-07-15",
  "registrationDate": "2022-01-20",
  "isActive": true,
  "roles": [
    "user",
    "premium"
  ]
}

Custom roles let you define specialized AI personas that you can reuse across different prompts, making it easy to get consistent responses for specific tasks.
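One way to picture role persistence is a small JSON file per role mapping the role name to its system prompt (a hypothetical storage layout; the real --role-config format may differ):

```python
import json
import tempfile
from pathlib import Path

def save_role(roles_dir, name, system_prompt):
    """Persist a custom role as a JSON file (hypothetical layout)."""
    roles_dir.mkdir(parents=True, exist_ok=True)
    path = roles_dir / f"{name}.json"
    path.write_text(json.dumps({"name": name,
                                "system_prompt": system_prompt}))
    return path

def load_role(roles_dir, name):
    """Read a role's system prompt back from its JSON file."""
    data = json.loads((roles_dir / f"{name}.json").read_text())
    return data["system_prompt"]

with tempfile.TemporaryDirectory() as tmp:
    d = Path(tmp)
    save_role(d, "json_generator",
              "You output only valid JSON, with no commentary.")
    print(load_role(d, "json_generator"))
```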

Web Search Integration

# Ask questions that require up-to-date information
ngpt --web-search "What's the latest news about AI regulation?"

The --web-search flag lets nGPT consult the web for recent information, making it useful for questions about current events or topics that might have changed since the AI's training data cutoff.
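Conceptually, web search works by extracting snippets from result pages and prepending them as context ahead of your question. A toy sketch of that prompt augmentation (not nGPT's actual pipeline):

```python
def augment_with_search(prompt, snippets):
    """Prepend extracted web snippets as numbered context before the
    user prompt (conceptual sketch of what --web-search does)."""
    context = "\n\n".join(f"[{i + 1}] {s}" for i, s in enumerate(snippets))
    return ("Use the following web results to answer.\n\n"
            f"{context}\n\nQuestion: {prompt}")

print(augment_with_search(
    "What's the latest news about AI regulation?",
    ["EU AI Act enforcement begins...",
     "New executive order on AI announced..."]))
```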


Real-World Integration Examples

Let's look at how nGPT can fit into your everyday workflow with some practical examples:

Developer Workflow

As a developer, I use nGPT throughout my day:

Morning code review:

# Get explanations of complex code
git show | ngpt --pipe "Explain what this code change does and any potential issues: {}"

Debugging help:

# Help understand a cryptic error message
npm run build 2>&1 | grep Error | ngpt --pipe "What does this error mean and how can I fix it: {}"

Documentation generation:

# Generate JSDoc comments for functions
cat src/utils.js | ngpt --pipe "Write proper JSDoc comments for these functions: {}"

Commit messages:

# After finishing a feature
git add .
ngpt -g

Writer's Assistant

For content creators and writers:

Overcoming writer's block:

ngpt "Give me 5 different angles to approach an article about sustainable technology"

Editing assistance:

cat draft.md | ngpt -r

Research summaries:

curl -s https://example.com/research-paper.html | ngpt --pipe "Summarize the key findings from this research: {}"

System Administrator

For sysadmins and DevOps folks:

Generating complex commands:

ngpt -s "find all log files larger than 100MB that haven't been modified in the last 30 days"

Creating configuration files:

ngpt --code "Create a Docker Compose file for a Redis, PostgreSQL, and Node.js application"

Troubleshooting systems:

dmesg | tail -50 | ngpt --pipe "Explain what might be causing the issues based on these system logs: {}"

Contributing

We welcome contributions to nGPT! Whether it's bug fixes, feature additions, or documentation improvements, your help is appreciated.

To contribute:

  1. Fork the repository
  2. Create a feature branch: git checkout -b feature/your-feature-name
  3. Make your changes
  4. Commit with clear messages following conventional commit guidelines
  5. Push to your fork and submit a pull request

Please check the CONTRIBUTING.md file for detailed guidelines on code style, pull request process, and development setup.

License

This project is licensed under the MIT License. See the LICENSE file for details.

Project details


Release history

This version

3.9.0

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ngpt-3.9.0.tar.gz (620.2 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

ngpt-3.9.0-py3-none-any.whl (94.9 kB)

Uploaded Python 3

File details

Details for the file ngpt-3.9.0.tar.gz.

File metadata

  • Download URL: ngpt-3.9.0.tar.gz
  • Upload date:
  • Size: 620.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for ngpt-3.9.0.tar.gz:

  • SHA256: 811bbedd989398bc9effc03bfdf0791710b050f548c6f0f12047fd55c6f1e4c6
  • MD5: 7be5edb5fe0e99f32f0d4ce395ff952a
  • BLAKE2b-256: fc1b7723ae9d008eb64acc257cc0670d4c721988ef9d1c9203be4921835bcebb

See more details on using hashes here.

Provenance

The following attestation bundles were made for ngpt-3.9.0.tar.gz:

Publisher: python-publish.yml on nazdridoy/ngpt

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file ngpt-3.9.0-py3-none-any.whl.

File metadata

  • Download URL: ngpt-3.9.0-py3-none-any.whl
  • Upload date:
  • Size: 94.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for ngpt-3.9.0-py3-none-any.whl:

  • SHA256: 8a6c8348e984cd8f2f4989a073e29ee1a2ab778f162c2fa37db00475f8ddded9
  • MD5: fb5225ed210789c531c1842489a01fcf
  • BLAKE2b-256: b0dec7b9dfc51229221cbb52fed43d1f0bab418d1076e84c16983346080f9697

See more details on using hashes here.

Provenance

The following attestation bundles were made for ngpt-3.9.0-py3-none-any.whl:

Publisher: python-publish.yml on nazdridoy/ngpt

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
