
🤖 Copilot-Ollama-Windows

Use GitHub Copilot with OpenRouter models in VSCode Agent Mode on Windows

License: MIT • Python 3.12+ • uv

Quick Start • Configuration • How It Works • Contributing


Note

This repo was forked from bascodes/copilot-ollama

🎯 Problem

OpenRouter provides access to AI models from OpenAI, Anthropic, and other providers. GitHub Copilot's Agent Mode, however, only offers models that advertise function-calling (tool) support, and OpenRouter's API does not announce tool support for its models.

As a result, powerful models like Claude, GPT-4, and others cannot normally be used through OpenRouter with Copilot's advanced Agent Mode features.

✨ Solution

copilot-ollama-windows creates a local proxy chain that:

  • 🔄 Forwards requests to OpenRouter while preserving tool support
  • 🛠️ Makes OpenRouter models compatible with Copilot's Ollama integration
  • 🚀 Enables Agent Mode with any OpenRouter model
  • 🔧 Uses LiteLLM for OpenAI-compatible proxying
  • 🔗 Uses oai2ollama for Ollama compatibility

🚀 Quick Start

Prerequisites

  • OpenRouter API key (create one at https://openrouter.ai)
  • uv (provides the uvx command used below)
  • VSCode with the GitHub Copilot extension

Installation & Setup

  1. Set your OpenRouter API key (then open a new terminal — user-scoped environment variables are not applied to already-running processes)

    [System.Environment]::SetEnvironmentVariable('OPENROUTER_API_KEY','your-openrouter-api-key-here', 'User')
    
  2. Create a config file. Check out config.yaml in the repo as an example.

  3. Start copilot-ollama-windows

    uvx copilot-ollama-windows your-config-file.yaml
    
  4. Configure VSCode

    • Open VSCode settings
    • Set github.copilot.chat.byok.ollamaEndpoint to http://localhost:11434
    • Click "Manage Models" → Select "Ollama"
  5. Start coding! 🎉 Your OpenRouter models are now available in Copilot Agent Mode.
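
The endpoint from step 4 can equivalently be set directly in VSCode's settings.json:

```json
{
  "github.copilot.chat.byok.ollamaEndpoint": "http://localhost:11434"
}
```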

⚙️ Configuration

Adding Models

Edit your config.yaml to add or modify available models:

```yaml
# This section defines the models that your local proxy will advertise
model_list:
  - model_name: kimi-k2  # Name that appears in VSCode
    litellm_params:
      model: openrouter/moonshotai/kimi-k2  # Actual OpenRouter model

  - model_name: claude-3-sonnet
    litellm_params:
      model: openrouter/anthropic/claude-3-sonnet

  - model_name: gpt-4-turbo
    litellm_params:
      model: openrouter/openai/gpt-4-turbo
```
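
Conceptually, each model_list entry is just a name-to-path mapping: the proxy advertises the model_name and forwards requests to the OpenRouter path under it. A toy sketch of that lookup (not LiteLLM's actual routing code):

```python
# Toy illustration of how a model_list config maps the name VSCode
# sees to the underlying OpenRouter model. Not LiteLLM's real code.
MODEL_LIST = [
    {"model_name": "kimi-k2",
     "litellm_params": {"model": "openrouter/moonshotai/kimi-k2"}},
    {"model_name": "claude-3-sonnet",
     "litellm_params": {"model": "openrouter/anthropic/claude-3-sonnet"}},
]

def resolve(model_name: str) -> str:
    """Return the OpenRouter path advertised under a VSCode-visible name."""
    for entry in MODEL_LIST:
        if entry["model_name"] == model_name:
            return entry["litellm_params"]["model"]
    raise KeyError(f"no model named {model_name!r} in model_list")

print(resolve("kimi-k2"))  # openrouter/moonshotai/kimi-k2
```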

Popular OpenRouter Models

Here are some recommended models to add:

| Model Name | OpenRouter Path | Description |
| --- | --- | --- |
| claude-3-sonnet | openrouter/anthropic/claude-3-sonnet | Excellent for code generation |
| gpt-4-turbo | openrouter/openai/gpt-4-turbo | Latest GPT-4 with improved performance |
| mixtral-8x7b | openrouter/mistralai/mixtral-8x7b-instruct | Fast and capable open-source model |
| llama-3-70b | openrouter/meta-llama/llama-3-70b-instruct | Meta's powerful open model |

🔧 How It Works

```mermaid
graph LR
    A[VSCode Copilot] --> B[oai2ollama<br/>:11434]
    B --> C[LiteLLM Proxy<br/>:4000]
    C --> D[OpenRouter API]
    D --> E[AI Models<br/>Claude, GPT-4, etc.]
```

  1. VSCode Copilot sends requests to what it thinks is an Ollama server
  2. oai2ollama translates Ollama API calls to OpenAI format
  3. LiteLLM proxies OpenAI-compatible requests to OpenRouter
  4. OpenRouter routes to the actual AI model providers
  5. Tool/function calling capabilities are preserved throughout the chain
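
Step 2's translation can be sketched as follows. This is a simplified illustration, not oai2ollama's actual code: the field names follow the public Ollama and OpenAI chat APIs, and the real proxy also handles streaming and response translation.

```python
# Simplified sketch of the request translation oai2ollama performs.
# Illustrative only -- the real project covers many more fields.

def ollama_to_openai(ollama_request: dict) -> dict:
    """Map an Ollama /api/chat body onto an OpenAI /v1/chat/completions body."""
    openai_request = {
        "model": ollama_request["model"],
        "messages": ollama_request["messages"],
        "stream": ollama_request.get("stream", False),
    }
    # Ollama nests sampling parameters under "options";
    # OpenAI keeps them at the top level.
    options = ollama_request.get("options", {})
    if "temperature" in options:
        openai_request["temperature"] = options["temperature"]
    # Tool definitions pass through unchanged -- preserving them is
    # what keeps Agent Mode's function calling working end to end.
    if "tools" in ollama_request:
        openai_request["tools"] = ollama_request["tools"]
    return openai_request

request = {
    "model": "kimi-k2",
    "messages": [{"role": "user", "content": "List files in this repo"}],
    "options": {"temperature": 0.2},
    "tools": [{"type": "function",
               "function": {"name": "read_file",
                            "parameters": {"type": "object",
                                           "properties": {}}}}],
}
translated = ollama_to_openai(request)
print(translated["model"], translated["temperature"])  # kimi-k2 0.2
```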

🤝 Contributing

We welcome contributions! Here's how you can help:

  • 🐛 Report bugs by opening an issue
  • 💡 Suggest features or improvements
  • 📖 Improve documentation
  • 🔧 Submit pull requests

Development Setup

```powershell
# Clone the repo
git clone https://github.com/jm6271/copilot-ollama-windows.git
cd copilot-ollama-windows

# Install dependencies
uv sync

# Set your API key, then make your changes and test
[System.Environment]::SetEnvironmentVariable('OPENROUTER_API_KEY','your-openrouter-api-key-here', 'User')
uv run copilot-ollama-windows your-config-file.yaml
```

📝 License

This project is licensed under the MIT License - see the LICENSE file for details.

🙏 Acknowledgments

  • bascodes/copilot-ollama, the original project this fork builds on
  • LiteLLM and oai2ollama, the proxies that make the chain work

⭐ Star this repo if it helped you unlock Copilot Agent Mode with your favorite models!
