
AI-powered Git assistant with privacy-first design. Choose between local (Ollama/Offline) or cloud AI to automate commits, PRs, and changelogs.

Project description

GitWise: AI-Powered Git Workflow Assistant


Stop writing commit messages and PR descriptions by hand. Let AI do it for you.

GitWise transforms your Git workflow with intelligent AI assistance - from perfect commit messages to comprehensive PR descriptions, all while keeping your code private with local AI models.

✨ See the Difference

Before GitWise (Manual workflow):

git add .
git commit -m "fix stuff"  # 😬 Vague, unhelpful
git push
# Write PR description manually... takes 10+ minutes

After GitWise (Interactive AI workflow):

gitwise add .
# 🤖 Interactive: Shows changes → Generates commit → Pushes → Creates PR
# Complete workflow in one command with AI assistance at each step

Perfect commits and PRs in seconds, not minutes.

🚀 Quick Start

# 1. Install
pip install pygitwise

# 2. Initialize (one-time setup)
gitwise init

# 3. Use it like Git, but smarter
gitwise add .       # 🔄 Interactive: stage → commit → push → PR (full workflow)
gitwise commit      # 🤖 AI-generated Conventional Commits
gitwise merge       # 🧠 Smart merge with AI conflict resolution
gitwise pr          # 📝 Detailed PR with auto-labels & checklists

That's it! Your commits now follow Conventional Commits, your PRs have detailed descriptions, and everything is generated from your actual code changes.

🎯 Why GitWise?

🔄 Complete Workflow: One command does stage → commit → push → PR

⚡ Lightning Fast: 15-second full workflow vs 10+ minute manual process

🧠 Intelligent: Auto-groups commits, resolves conflicts, generates perfect PRs

🔒 Privacy-First: Local AI models (Ollama) - your code never leaves your machine

🛠️ Familiar: Works exactly like Git, just smarter

🤖 AI Backend Options

Backend                 Privacy        Quality      Speed        Best For
Ollama (Local)          🟢 Complete    🟢 High      🟢 Fast      Privacy-focused developers
Online (GPT-4/Claude)   🟡 API calls   🟢 Highest   🟢 Instant   Latest AI capabilities

Choose local for privacy, online for cutting-edge AI. Switch anytime with gitwise init.

📦 Installation

Option 1: Quick Install

pip install pygitwise
gitwise init

Option 2: Local AI (Recommended)

# Install Ollama for local AI
curl -fsSL https://ollama.com/install.sh | sh
ollama pull llama3

# Install GitWise
pip install pygitwise
gitwise init  # Select Ollama when prompted

Option 3: Virtual Environment (Best Practice)

python3 -m venv gitwise-env
source gitwise-env/bin/activate
pip install pygitwise
gitwise init

1. Ollama Mode (Local)

Best for: Privacy-focused developers who want high-quality AI while keeping their code on their own machine.

Features:

  • Runs 100% locally on your machine
  • No internet required after model download
  • Easy model switching (ollama pull codellama, ollama pull mistral)
  • High-quality models (Llama 3, Mistral, CodeLlama, etc.)
  • Zero cost after initial setup

Configuration:

export GITWISE_LLM_BACKEND=ollama
export OLLAMA_MODEL=llama3  # or codellama, mistral, etc.
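As a shell-level sketch (not part of GitWise itself, which handles fallback internally), you can pre-flight the Ollama server before a session and fall back to the offline backend when it is unreachable:

```shell
# Hypothetical pre-flight check (not a GitWise feature): probe the Ollama
# API and pick a backend accordingly.
OLLAMA_URL="${OLLAMA_URL:-http://localhost:11434}"

if curl -fsS --max-time 2 "$OLLAMA_URL/api/tags" >/dev/null 2>&1; then
  export GITWISE_LLM_BACKEND=ollama
else
  echo "Ollama not reachable at $OLLAMA_URL; falling back to offline" >&2
  export GITWISE_LLM_BACKEND=offline
fi
```

Drop this into your shell profile or a project script so `gitwise` always has a working backend.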

2. ๐Ÿ  Offline Mode

Best for: Maximum privacy, air-gapped environments, or when Ollama isn't available.

# Install with offline support
pip install "pygitwise[offline]"

# Configure GitWise
gitwise init
# Select: Offline (built-in model)

Features:

  • Runs 100% locally with bundled model
  • No external dependencies
  • Works in air-gapped environments
  • Smaller, faster models (TinyLlama by default)
  • Automatic fallback when Ollama unavailable

Configuration:

export GITWISE_LLM_BACKEND=offline
export GITWISE_OFFLINE_MODEL="TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # optional

3. ๐ŸŒ Online Mode (OpenRouter)

Best for: Access to cutting-edge models (GPT-4, Claude) and highest quality outputs.

# Get your API key from https://openrouter.ai/
export OPENROUTER_API_KEY="your_api_key"

# Configure GitWise
gitwise init
# Select: Online (OpenRouter API)
# Enter your API key when prompted

Features:

  • Access to latest AI models (GPT-4, Claude 3, etc.)
  • Highest quality outputs
  • No local GPU required
  • Pay-per-use pricing
  • Internet connection required

Configuration:

export GITWISE_LLM_BACKEND=online
export OPENROUTER_API_KEY="your_api_key"
export OPENROUTER_MODEL="anthropic/claude-3-haiku"  # optional

4. ⚡ Direct LLM Provider Mode

Best for: Using your preferred LLM provider (OpenAI, Anthropic, Google Gemini) directly with your own API keys.

GitWise now offers direct integration with major LLM providers, allowing you to use your existing accounts and preferred models.

Supported Providers:

  • OpenAI: Access models like GPT-4, GPT-3.5-turbo, etc.
  • Anthropic: Access Claude models like Claude 3 Opus, Sonnet, Haiku.
  • Google Gemini: Access Gemini models like Gemini Pro.

Configuration:

To use a direct provider, set the GITWISE_LLM_BACKEND environment variable to openai, anthropic, or google_gemini, and provide the respective API key.

OpenAI:

export GITWISE_LLM_BACKEND=openai
export OPENAI_API_KEY="your_openai_api_key"
export GITWISE_OPENAI_MODEL="gpt-4" # Optional, defaults to a recommended model

Anthropic:

export GITWISE_LLM_BACKEND=anthropic
export ANTHROPIC_API_KEY="your_anthropic_api_key"
export GITWISE_ANTHROPIC_MODEL="claude-3-opus-20240229" # Optional

Google Gemini:

export GITWISE_LLM_BACKEND=google_gemini
export GOOGLE_API_KEY="your_google_api_key"
export GITWISE_GEMINI_MODEL="gemini-2.0-flash" # Optional

You can also configure these during gitwise init by selecting the specific provider. GitWise will automatically install the required dependencies for your chosen provider during initialization.

Features:

  • Use your own API keys and billing with providers.
  • Access to a wide range of models from each provider.
  • Potentially more up-to-date model access than through aggregators.
  • Internet connection required.
  • Required dependencies are automatically installed when you select a provider.

Mode Comparison

Feature    Ollama            Offline     Online (OpenRouter)     Direct LLM (OpenAI, Anthropic, Gemini)
Privacy    🟢 Full           🟢 Full     🔴 API calls            🔴 API calls to provider
Internet   🟡 Initial only   🟢 Never    🔴 Always               🔴 Always
Quality    🟢 High           🟡 Good     🟢 Best                 🟢 Provider-dependent (best available)
Speed      🟢 Fast           🟢 Fast     🟡 Network-dependent    🟡 Network-dependent
Cost       🟢 Free           🟢 Free     🔴 Per use              🔴 Per use (provider billing)
Setup      🟡 Medium         🟢 Easy     🟢 Easy                 🟢 Easy (API key)

📖 Usage Examples

Basic Workflow

# 1. Initialize GitWise (first time only)
gitwise init

# 2. Make your code changes
echo "print('Hello, GitWise!')" > hello.py

# 3. Stage changes interactively
gitwise add .
# Shows summary of changes and prompts for next action

# 4. Generate AI-powered commit message
gitwise commit
# AI analyzes your diff and suggests: "feat: add hello world script"

# 5. Push and create PR
gitwise push
# Offers to create a PR with AI-generated description

# 6. Create PR with labels and checklist
gitwise pr --labels --checklist

Streamlined Workflow (Auto-Confirm Mode)

# Perfect for rapid development or CI/CD environments
# Make your code changes
echo "print('Hello, GitWise!')" > hello.py

# One command does it all: stage โ†’ commit โ†’ push โ†’ PR
gitwise add . --yes
# ✅ Stages files
# ✅ Auto-commits with AI-generated message and grouping
# ✅ Auto-pushes changes
# ✅ Auto-creates PR with labels and checklist
# 🛡️ Skips PR creation if on main/master branch

# Alternative short form
gitwise add . -y

Advanced Features

Group Complex Changes

# When you have multiple logical changes
gitwise commit --group
# AI suggests splitting into multiple commits:
# 1. "refactor: extract user validation logic"
# 2. "feat: add email verification"
# 3. "test: add user validation tests"

Smart Merge with AI Conflict Analysis

# AI-powered merge with conflict resolution assistance
gitwise merge feature/payment-system

# For conflicts, AI explains what's happening:
# ๐Ÿ” Analyzing merge: feature/payment-system
# โš ๏ธ 2 conflicts detected in config.py and requirements.txt
# ๐Ÿง  AI explains: "Both branches modified database config..."
# ๐Ÿ’ก AI suggests: "Combine both configurations..."
# ๐Ÿ› ๏ธ Manual resolution required - resolve conflicts then:
gitwise merge --continue

# Or abort if needed
gitwise merge --abort

Changelog Management

# Update changelog before release
gitwise changelog
# Suggests version based on commits (e.g., 1.2.0)
# Generates categorized changelog entries

# Auto-update changelog on every commit
gitwise setup-hooks

Git Command Passthrough

# Use any git command through gitwise
gitwise status
gitwise log --oneline -5
gitwise branch -a
gitwise stash list

🔧 Configuration

Environment Variables

# Core settings
export GITWISE_LLM_BACKEND=ollama  # ollama, offline, online, openai, anthropic, or google_gemini
export GITWISE_CONFIG_PATH=~/.gitwise/config.json  # custom config location

# Ollama settings
export OLLAMA_MODEL=llama3
export OLLAMA_URL=http://localhost:11434  # custom Ollama server

# Offline settings
export GITWISE_OFFLINE_MODEL="TinyLlama/TinyLlama-1.1B-Chat-v1.0"

# Online settings
export OPENROUTER_API_KEY="your_api_key"
export OPENROUTER_MODEL="anthropic/claude-3-haiku"

# Direct Provider Settings
# OpenAI
export GITWISE_LLM_BACKEND=openai
export OPENAI_API_KEY="your_openai_api_key"
export GITWISE_OPENAI_MODEL="gpt-4"
# Anthropic
export GITWISE_LLM_BACKEND=anthropic
export ANTHROPIC_API_KEY="your_anthropic_api_key"
export GITWISE_ANTHROPIC_MODEL="claude-3-opus-20240229"
# Google Gemini
export GITWISE_LLM_BACKEND=google_gemini
export GOOGLE_API_KEY="your_google_api_key"
export GITWISE_GEMINI_MODEL="gemini-2.0-flash"
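If you switch backends often, a small helper function can export the matching variables in one call. This is a hypothetical convenience wrapper (not shipped with GitWise), built only from the environment variables documented above:

```shell
# Hypothetical helper: select a GitWise backend and export its variables.
# Second argument optionally overrides the model where one applies.
use_gitwise_backend() {
  case "$1" in
    ollama)  export GITWISE_LLM_BACKEND=ollama OLLAMA_MODEL="${2:-llama3}" ;;
    offline) export GITWISE_LLM_BACKEND=offline ;;
    online)  export GITWISE_LLM_BACKEND=online OPENROUTER_MODEL="${2:-anthropic/claude-3-haiku}" ;;
    openai|anthropic|google_gemini)
             export GITWISE_LLM_BACKEND="$1" ;;
    *)       echo "unknown backend: $1" >&2; return 1 ;;
  esac
}

use_gitwise_backend ollama codellama  # sets GITWISE_LLM_BACKEND=ollama, OLLAMA_MODEL=codellama
```

API keys are left to your usual secret management; the helper only flips the backend selection.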

Configuration File

After running gitwise init, your settings are saved in ~/.gitwise/config.json:

{
  "llm_backend": "ollama",
  "ollama": {
    "model": "llama3",
    "url": "http://localhost:11434"
  },
  "offline": {
    "model": "TinyLlama/TinyLlama-1.1B-Chat-v1.0"
  },
  "online": {
    "api_key": "your_api_key",
    "model": "anthropic/claude-3-haiku"
  },
  "openai": {
    "api_key": "your_openai_api_key",
    "model": "gpt-4"
  },
  "anthropic": {
    "api_key": "your_anthropic_api_key",
    "model": "claude-3-opus-20240229"
  },
  "google_gemini": {
    "api_key": "your_google_api_key",
    "model": "gemini-2.0-flash"
  }
}
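If you ever want to tweak the saved config without re-running gitwise init, you can edit the JSON directly. A sketch using python3 (no extra tools needed), shown here against a sample file - point CONFIG at ~/.gitwise/config.json (or $GITWISE_CONFIG_PATH) for the real thing:

```shell
# Sketch: flip the backend in a GitWise-style config file.
# Uses a sample file for illustration; substitute your real config path.
CONFIG="./sample-config.json"
printf '{"llm_backend": "ollama", "ollama": {"model": "llama3"}}\n' > "$CONFIG"

python3 - "$CONFIG" <<'PY'
import json, sys

path = sys.argv[1]
with open(path) as f:
    cfg = json.load(f)

print("active backend:", cfg.get("llm_backend"))  # prints: active backend: ollama

cfg["llm_backend"] = "offline"  # switch backends in place
with open(path, "w") as f:
    json.dump(cfg, f, indent=2)
PY

python3 -c 'import json,sys; print(json.load(open(sys.argv[1]))["llm_backend"])' "$CONFIG"  # prints: offline
```

Editing the file by hand is equivalent to what `gitwise init` saves; unknown keys are simply whichever backends you have not configured yet.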

๐Ÿ› ๏ธ Troubleshooting

Ollama Issues

# Check if Ollama is running
curl http://localhost:11434/api/tags

# Start Ollama service
ollama serve

# List available models
ollama list

# Pull a new model
ollama pull codellama

Switching Backends

# Quick switch via environment variable
export GITWISE_LLM_BACKEND=ollama
gitwise commit  # Now using Ollama mode

# Or reconfigure
gitwise init

🔥 Key Features

  • 🔄 Interactive Workflow: gitwise add does everything - stage → commit → push → PR in one flow
  • 🤖 AI Commit Messages: Generate perfect Conventional Commits from your changes
  • 🧠 Smart Auto-Grouping: Automatically groups related changes into separate commits
  • 🔀 Intelligent Merges: AI-powered conflict analysis and resolution assistance
  • 📝 Smart PR Descriptions: Detailed descriptions with automated labels and checklists
  • 🔒 Privacy-First: Local AI models (Ollama) keep your code on your machine
  • ⚙️ Git Compatible: Use as a drop-in replacement for Git commands
  • 📊 Changelog Generation: Automated changelog updates
  • 🎯 Context Aware: Remembers branch context for better suggestions


๐Ÿค Contributing

Found a bug? Have a feature request? Contributions welcome!

📄 License

Dual licensed: AGPL-3.0 for open source projects, Commercial license available for proprietary use.


Ready to transform your Git workflow?

pip install pygitwise && gitwise init

Project details

Download files

Source distribution: pygitwise-0.2.0.tar.gz (128.6 kB)

  • Uploaded via: twine/6.1.0 CPython/3.11.13
  • Trusted Publishing: No

Hashes for pygitwise-0.2.0.tar.gz
Algorithm    Hash digest
SHA256       6ba8026ad8ef3a93ddbcb964fb47f7622d1d529579799895526f7c4237c0b502
MD5          698a0c252b2331b6ec3c94fc1dc0f563
BLAKE2b-256  acd0e278a3089aa6c41e41b80432c5b3a2ee9202b10331db3ef33577413e81ac

Built distribution: pygitwise-0.2.0-py3-none-any.whl (115.1 kB, Python 3)

  • Uploaded via: twine/6.1.0 CPython/3.11.13
  • Trusted Publishing: No

Hashes for pygitwise-0.2.0-py3-none-any.whl
Algorithm    Hash digest
SHA256       e21a9bf39b59607e2115360519d2f409f828b4e02640aba6673ce9a9e91b62a8
MD5          ed28df10f891a8bb4b2ae670d708de48
BLAKE2b-256  f57ead0b4d350cdd1b8ae6165f09ade03e39b6d08d2aa337e7a50b67a78ec1da
