Project description

🤖 Copilot-Ollama-Windows

Use GitHub Copilot with OpenRouter models in VSCode Agent Mode on Windows

License: MIT · Python 3.12+ · uv

Quick Start · Configuration · How It Works · Contributing


Note

This repo was forked from bascodes/copilot-ollama.

🎯 Problem

OpenRouter provides access to AI models from OpenAI, Anthropic, and other providers. GitHub Copilot's Agent Mode, however, requires models that support function calling (tools), and OpenRouter's API doesn't advertise tool support for its models.

This prevents using powerful models like Claude, GPT-4, and others through OpenRouter with Copilot's advanced Agent Mode features.

✨ Solution

copilot-ollama-windows creates a local proxy chain that:

  • 🔄 Forwards requests to OpenRouter while preserving tool support
  • 🛠️ Makes OpenRouter models compatible with Copilot's Ollama integration
  • 🚀 Enables Agent Mode with any OpenRouter model
  • 🔧 Uses LiteLLM for OpenAI-compatible proxying
  • 🔗 Uses oai2ollama for Ollama compatibility

🚀 Quick Start

Prerequisites

  • OpenRouter API key (get one at https://openrouter.ai/keys)
  • VSCode with GitHub Copilot extension

Installation & Setup

  1. Install the package

    uv tool install copilot-ollama-windows
    
  2. Set your OpenRouter API key (open a new terminal afterwards so the user-level change takes effect; a verification sketch follows these steps)

    [System.Environment]::SetEnvironmentVariable('OPENROUTER_API_KEY','your-openrouter-api-key-here', 'User')
    
  3. Create a config file. See config.yaml in the repo for an example.

  4. Start the proxy servers

    copilot-ollama-windows your-config-file.yaml
    
  5. Configure VSCode

    • Open VSCode settings
    • Set github.copilot.chat.byok.ollamaEndpoint to http://localhost:11434
    • Click "Manage Models" → Select "Ollama"
  6. Start coding! 🎉 Your OpenRouter models are now available in Copilot Agent Mode.
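
To confirm the setup, you can run a quick check from PowerShell. This is a minimal sketch: the first line reads back the user-level key you just set, and the second assumes the oai2ollama layer exposes Ollama's standard /api/tags route on port 11434 once the proxies are running.

# Read back the user-level environment variable (new terminals pick it up
# automatically; the session where you set it may not see it yet)
[System.Environment]::GetEnvironmentVariable('OPENROUTER_API_KEY', 'User')

# With the proxies running, list the models advertised to Copilot.
# Assumes oai2ollama serves Ollama's standard /api/tags endpoint on :11434.
(Invoke-RestMethod -Uri 'http://localhost:11434/api/tags').models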

⚙️ Configuration

Adding Models

Edit config.yaml to add or modify available models:

# This section defines the models that your local proxy will advertise
model_list:
  - model_name: kimi-k2  # Name that appears in VSCode
    litellm_params:
      model: openrouter/moonshotai/kimi-k2  # Actual OpenRouter model

  - model_name: claude-3-sonnet
    litellm_params:
      model: openrouter/anthropic/claude-3-sonnet

  - model_name: gpt-4-turbo
    litellm_params:
      model: openrouter/openai/gpt-4-turbo
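
If you'd rather not rely on the environment variable alone, LiteLLM's config format also accepts an explicit api_key reference per model, resolved from the environment when the proxy starts. A sketch, assuming this project passes config.yaml through to LiteLLM unchanged:

model_list:
  - model_name: kimi-k2
    litellm_params:
      model: openrouter/moonshotai/kimi-k2
      # LiteLLM resolves "os.environ/..." references at startup
      api_key: os.environ/OPENROUTER_API_KEY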

Popular OpenRouter Models

Here are some recommended models to add:

Model Name        OpenRouter Path                              Description
claude-3-sonnet   openrouter/anthropic/claude-3-sonnet         Excellent for code generation
gpt-4-turbo       openrouter/openai/gpt-4-turbo                Latest GPT-4 with improved performance
mixtral-8x7b      openrouter/mistralai/mixtral-8x7b-instruct   Fast and capable open-source model
llama-3-70b       openrouter/meta-llama/llama-3-70b-instruct   Meta's powerful open model

🔧 How It Works

graph LR
    A[VSCode Copilot] --> B[oai2ollama<br/>:11434]
    B --> C[LiteLLM Proxy<br/>:4000]
    C --> D[OpenRouter API]
    D --> E[AI Models<br/>Claude, GPT-4, etc.]
  1. VSCode Copilot sends requests to what it thinks is an Ollama server
  2. oai2ollama translates Ollama API calls to OpenAI format
  3. LiteLLM proxies OpenAI-compatible requests to OpenRouter
  4. OpenRouter routes to the actual AI model providers
  5. Tool/function calling capabilities are preserved throughout the chain
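
You can exercise the whole chain by sending a chat request straight to the local endpoint. A PowerShell sketch, assuming oai2ollama implements Ollama's standard /api/chat route and that kimi-k2 is one of the model_name entries in your config:

# Build an Ollama-style chat request; stream=$false asks for one JSON reply
$body = @{
    model    = 'kimi-k2'   # a model_name from your config.yaml
    messages = @(@{ role = 'user'; content = 'Say hello in five words.' })
    stream   = $false
} | ConvertTo-Json -Depth 4

# oai2ollama (:11434) -> LiteLLM proxy (:4000) -> OpenRouter -> model provider
Invoke-RestMethod -Uri 'http://localhost:11434/api/chat' -Method Post `
    -ContentType 'application/json' -Body $body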

🤝 Contributing

We welcome contributions! Here's how you can help:

  • 🐛 Report bugs by opening an issue
  • 💡 Suggest features or improvements
  • 📖 Improve documentation
  • 🔧 Submit pull requests

Development Setup

# Clone the repo
git clone https://github.com/jm6271/copilot-ollama-windows.git
cd copilot-ollama-windows

# Install dependencies
uv sync

# Make your changes and test
[System.Environment]::SetEnvironmentVariable('OPENROUTER_API_KEY','your-openrouter-api-key-here', 'User')
uv run copilot-ollama-windows your-config-file.yaml

📝 License

This project is licensed under the MIT License - see the LICENSE file for details.

🙏 Acknowledgments

  • bascodes/copilot-ollama, the original project this one was forked from
  • LiteLLM, which provides the OpenAI-compatible proxy layer
  • oai2ollama, which provides the Ollama-compatible API layer

⭐ Star this repo if it helped you unlock Copilot Agent Mode with your favorite models!

Download files

Download the file for your platform.

Source Distribution

copilot_ollama_windows-1.0.0.tar.gz (4.2 kB)


Built Distribution


copilot_ollama_windows-1.0.0-py3-none-any.whl (5.4 kB)


File details

Details for the file copilot_ollama_windows-1.0.0.tar.gz.

File metadata

  • Download URL: copilot_ollama_windows-1.0.0.tar.gz
  • Upload date:
  • Size: 4.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.7

File hashes

Hashes for copilot_ollama_windows-1.0.0.tar.gz

Algorithm     Hash digest
SHA256        8f1767aa790518ad96167c608424027f178f61c10690ce8751fc29d75e83203f
MD5           bd6760dfdee68eec8c007933b41edea2
BLAKE2b-256   1b7cb2a409a43336ff7442602049617bbd584cbeb7f3521ee25dc4960acb1fc1
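
To check a downloaded archive against the SHA256 digest above, a quick PowerShell comparison (string equality in PowerShell is case-insensitive, so the hex case doesn't matter):

# Returns True if the download matches the published digest
(Get-FileHash .\copilot_ollama_windows-1.0.0.tar.gz -Algorithm SHA256).Hash -eq `
    '8f1767aa790518ad96167c608424027f178f61c10690ce8751fc29d75e83203f'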


File details

Details for the file copilot_ollama_windows-1.0.0-py3-none-any.whl.

File hashes

Hashes for copilot_ollama_windows-1.0.0-py3-none-any.whl

Algorithm     Hash digest
SHA256        b10e2be3d19905d7ea1afd533bffdcbe0a021bd82c97bf5fd29e581b5a347b46
MD5           731846e3cbc8b7a40a73dfe6ab12bcbd
BLAKE2b-256   51180d744a2e8f9ba06610ed1256ef1ace4726b774f6c0123f2a676b791e3575

