
Windows support for copilot-ollama


🤖 Copilot-Ollama-Windows

Use GitHub Copilot with OpenRouter models in VSCode Agent Mode on Windows

License: MIT · Python 3.12+ · uv



Note

This repo was forked from bascodes/copilot-ollama

🎯 Problem

OpenRouter provides access to AI models from OpenAI, Anthropic, and other providers through a single API. GitHub Copilot's Agent Mode, however, requires models that support function calling (tools), and OpenRouter's API does not advertise tool support for its models.

As a result, powerful models such as Claude, GPT-4, and others cannot be used through OpenRouter with Copilot's Agent Mode features.

✨ Solution

copilot-ollama-windows creates a local proxy chain that:

  • 🔄 Forwards requests to OpenRouter while preserving tool support
  • 🛠️ Makes OpenRouter models compatible with Copilot's Ollama integration
  • 🚀 Enables Agent Mode with any OpenRouter model
  • 🔧 Uses LiteLLM for OpenAI-compatible proxying
  • 🔗 Uses oai2ollama for Ollama compatibility

🚀 Quick Start

Prerequisites

  • OpenRouter API key (create one at openrouter.ai)
  • VSCode with GitHub Copilot extension

Installation & Setup

  1. Set your OpenRouter API key

    [System.Environment]::SetEnvironmentVariable('OPENROUTER_API_KEY','your-openrouter-api-key-here', 'User')

    Setting the variable at the 'User' level persists it across sessions, but it does not appear in already-open shells, so open a new terminal before the next step.

  2. Create a config file. See config.yaml in the repository for an example.

  3. Start copilot-ollama-windows

    uvx copilot-ollama-windows your-config-file.yaml
    
  4. Configure VSCode

    • Open VSCode settings
    • Set github.copilot.chat.byok.ollamaEndpoint to http://localhost:11434
    • Click "Manage Models" → Select "Ollama"
  5. Start coding! 🎉 Your OpenRouter models are now available in Copilot Agent Mode.
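
Once everything is running, you can sanity-check the setup before switching to VSCode: confirm the API key is visible to new processes and that the local endpoint answers. A minimal illustrative sketch (not part of this package; the endpoint URL matches the default above):

```python
import os
import urllib.request


def check_setup(endpoint: str = "http://localhost:11434") -> list[str]:
    """Return a list of problems found with the local setup (empty = OK)."""
    problems = []
    if not os.environ.get("OPENROUTER_API_KEY"):
        problems.append("OPENROUTER_API_KEY is not set in this shell")
    try:
        # Ollama-compatible servers answer a plain GET on the root URL.
        with urllib.request.urlopen(endpoint, timeout=3) as resp:
            if resp.status != 200:
                problems.append(f"endpoint returned HTTP {resp.status}")
    except OSError as exc:  # includes urllib.error.URLError
        problems.append(f"endpoint not reachable: {exc}")
    return problems
```

Run `check_setup()` in a fresh terminal; an empty list means both the key and the proxy look healthy.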

⚙️ Configuration

Adding Models

Edit your config.yaml to add or modify available models:

```yaml
# This section defines the models that your local proxy will advertise
model_list:
  - model_name: kimi-k2  # Name that appears in VSCode
    litellm_params:
      model: openrouter/moonshotai/kimi-k2  # Actual OpenRouter model

  - model_name: claude-3-sonnet
    litellm_params:
      model: openrouter/anthropic/claude-3-sonnet

  - model_name: gpt-4-turbo
    litellm_params:
      model: openrouter/openai/gpt-4-turbo
```
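
The shape of this file can be sanity-checked before starting the proxy. A small illustrative validator (not part of the package) that ensures every entry names a model and routes through OpenRouter:

```python
def validate_model_list(model_list: list[dict]) -> list[str]:
    """Return human-readable errors for malformed entries (empty = OK)."""
    errors = []
    for i, entry in enumerate(model_list):
        name = entry.get("model_name")
        if not name:
            errors.append(f"entry {i}: missing model_name")
        target = entry.get("litellm_params", {}).get("model", "")
        if not target.startswith("openrouter/"):
            errors.append(f"entry {i} ({name}): model should start with openrouter/")
    return errors


config = {
    "model_list": [
        {"model_name": "kimi-k2",
         "litellm_params": {"model": "openrouter/moonshotai/kimi-k2"}},
    ]
}
print(validate_model_list(config["model_list"]))  # → []
```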

Popular OpenRouter Models

Here are some recommended models to add:

| Model Name | OpenRouter Path | Description |
|---|---|---|
| claude-3-sonnet | openrouter/anthropic/claude-3-sonnet | Excellent for code generation |
| gpt-4-turbo | openrouter/openai/gpt-4-turbo | Latest GPT-4 with improved performance |
| mixtral-8x7b | openrouter/mistralai/mixtral-8x7b-instruct | Fast and capable open-source model |
| llama-3-70b | openrouter/meta-llama/llama-3-70b-instruct | Meta's powerful open model |

🔧 How It Works

```mermaid
graph LR
    A[VSCode Copilot] --> B[oai2ollama<br/>:11434]
    B --> C[LiteLLM Proxy<br/>:4000]
    C --> D[OpenRouter API]
    D --> E[AI Models<br/>Claude, GPT-4, etc.]
```
  1. VSCode Copilot sends requests to what it thinks is an Ollama server
  2. oai2ollama translates Ollama API calls to OpenAI format
  3. LiteLLM proxies OpenAI-compatible requests to OpenRouter
  4. OpenRouter routes to the actual AI model providers
  5. Tool/function calling capabilities are preserved throughout the chain
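
The crucial step is the second one: the Ollama-style chat request must be rewritten into an OpenAI-style one without dropping the tool definitions. A simplified sketch of that translation (field names follow the two public APIs; the real oai2ollama implementation handles many more cases):

```python
def ollama_to_openai(body: dict) -> dict:
    """Translate an Ollama /api/chat request body into an
    OpenAI /v1/chat/completions body, preserving tool definitions."""
    openai_body = {
        "model": body["model"],
        "messages": body["messages"],
        "stream": body.get("stream", True),
    }
    # The key part: pass tool/function definitions through unchanged,
    # so Copilot's Agent Mode keeps working end to end.
    if "tools" in body:
        openai_body["tools"] = body["tools"]
    return openai_body
```

For these core fields the two APIs are close to each other, which is what makes a thin translation layer like this viable at all.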

🤝 Contributing

We welcome contributions! Here's how you can help:

  • 🐛 Report bugs by opening an issue
  • 💡 Suggest features or improvements
  • 📖 Improve documentation
  • 🔧 Submit pull requests

Development Setup

```powershell
# Clone the repo
git clone https://github.com/jm6271/copilot-ollama-windows.git
cd copilot-ollama-windows

# Install dependencies
uv sync

# Make your changes and test
[System.Environment]::SetEnvironmentVariable('OPENROUTER_API_KEY','your-openrouter-api-key-here', 'User')
uv run copilot-ollama-windows your-config-file.yaml
```

📝 License

This project is licensed under the MIT License - see the LICENSE file for details.

🙏 Acknowledgments

  • bascodes/copilot-ollama, the upstream project this repository was forked from

⭐ Star this repo if it helped you unlock Copilot Agent Mode with your favorite models!

Download files

Download the file for your platform.

Source Distribution

copilot_ollama_windows-1.0.2.tar.gz (4.2 kB)

Built Distribution

copilot_ollama_windows-1.0.2-py3-none-any.whl (5.4 kB)

File details

Details for the file copilot_ollama_windows-1.0.2.tar.gz.

File metadata

  • Download URL: copilot_ollama_windows-1.0.2.tar.gz
  • Upload date:
  • Size: 4.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for copilot_ollama_windows-1.0.2.tar.gz

| Algorithm | Hash digest |
|---|---|
| SHA256 | af04a355314bb9ece105c9b3f45703e997e29bb41d33221b58eef1fa755a05e7 |
| MD5 | 0ef5e39873a142317c758bbfd24b252f |
| BLAKE2b-256 | a4c69e148eccfb9558ffaeff6e3df343fd81c422a3888de0bd532fa65343169d |


Provenance

The following attestation bundles were made for copilot_ollama_windows-1.0.2.tar.gz:

Publisher: python-publish.yml on jm6271/copilot-ollama-windows

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file copilot_ollama_windows-1.0.2-py3-none-any.whl.

File hashes

Hashes for copilot_ollama_windows-1.0.2-py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | 58f305e3de961fb2a65f8172f7a75f8260b3822846e4eadc711e6e62302b886b |
| MD5 | f3e5814f5e842e31869ac5f541313bcb |
| BLAKE2b-256 | 2a598b525867319dd8de944feb8d794602846b23bf7e94cd996e3d523210eefb |


Provenance

The following attestation bundles were made for copilot_ollama_windows-1.0.2-py3-none-any.whl:

Publisher: python-publish.yml on jm6271/copilot-ollama-windows

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
