# 🤖 Copilot-Ollama-Windows

**Use GitHub Copilot with OpenRouter models in VSCode Agent Mode on Windows**
> **Note:** This repo was forked from [bascodes/copilot-ollama](https://github.com/bascodes/copilot-ollama).
## 🎯 Problem

OpenRouter provides access to AI models from OpenAI, Anthropic, and others. However, GitHub Copilot's Agent Mode requires models that support function calling (tools), and OpenRouter's API does not advertise tool support for its models.

This prevents powerful models like Claude, GPT-4, and others from being used through OpenRouter with Copilot's advanced Agent Mode features.
## ✨ Solution

copilot-ollama creates a local proxy chain that:

- 🔄 Forwards requests to OpenRouter while preserving tool support
- 🛠️ Makes OpenRouter models compatible with Copilot's Ollama integration
- 🚀 Enables Agent Mode with any OpenRouter model
- 🔧 Uses LiteLLM for OpenAI-compatible proxying
- 🔗 Uses oai2ollama for Ollama compatibility
## 🚀 Quick Start

### Prerequisites

- An OpenRouter API key (available from your OpenRouter account)
- VSCode with the GitHub Copilot extension
### Installation & Setup

1. **Set your OpenRouter API key**

   ```powershell
   [System.Environment]::SetEnvironmentVariable('OPENROUTER_API_KEY', 'your-openrouter-api-key-here', 'User')
   ```

2. **Create a config file**

   See `config.yaml` for an example.

3. **Start copilot-ollama-windows**

   ```powershell
   uvx copilot-ollama-windows your-config-file.yaml
   ```

4. **Configure VSCode**

   - Open VSCode settings
   - Set `github.copilot.chat.byok.ollamaEndpoint` to `http://localhost:11434`
   - Click "Manage Models" → select "Ollama"

5. **Start coding!** 🎉 Your OpenRouter models are now available in Copilot Agent Mode.
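Once the proxy is running, you can sanity-check that the Ollama-compatible endpoint is reachable before pointing VSCode at it. Below is a minimal sketch using only the Python standard library; the `/api/tags` path is Ollama's model-listing endpoint, and the helper name is ours, not part of this package:

```python
import urllib.request
import urllib.error


def proxy_is_up(base_url: str = "http://localhost:11434", timeout: float = 2.0) -> bool:
    """Return True if an Ollama-compatible server answers at base_url."""
    try:
        # /api/tags is Ollama's model-listing endpoint; the proxy serves it too
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False


if __name__ == "__main__":
    print("proxy reachable:", proxy_is_up())
```

If this prints `False`, check that the proxy process is still running and that nothing else is bound to port 11434.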
## ⚙️ Configuration

### Adding Models

Edit your `config.yaml` to add or modify available models:

```yaml
# This section defines the models that your local proxy will advertise
model_list:
  - model_name: kimi-k2                       # Name that appears in VSCode
    litellm_params:
      model: openrouter/moonshotai/kimi-k2    # Actual OpenRouter model
  - model_name: claude-3-sonnet
    litellm_params:
      model: openrouter/anthropic/claude-3-sonnet
  - model_name: gpt-4-turbo
    litellm_params:
      model: openrouter/openai/gpt-4-turbo
```
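To double-check what a config will advertise, you can extract the VSCode-visible name of each entry and the OpenRouter model it maps to. This small sketch operates on the parsed form of the `model_list` schema shown above (load the YAML with any parser first; the helper is ours, not part of the package):

```python
def advertised_models(config: dict) -> dict:
    """Map each VSCode-visible model name to its OpenRouter model path."""
    mapping = {}
    for entry in config.get("model_list", []):
        mapping[entry["model_name"]] = entry["litellm_params"]["model"]
    return mapping


# Parsed equivalent of the YAML example above (first two entries)
config = {
    "model_list": [
        {"model_name": "kimi-k2",
         "litellm_params": {"model": "openrouter/moonshotai/kimi-k2"}},
        {"model_name": "claude-3-sonnet",
         "litellm_params": {"model": "openrouter/anthropic/claude-3-sonnet"}},
    ]
}

print(advertised_models(config))
# {'kimi-k2': 'openrouter/moonshotai/kimi-k2', 'claude-3-sonnet': 'openrouter/anthropic/claude-3-sonnet'}
```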
### Popular OpenRouter Models

Here are some recommended models to add:

| Model Name | OpenRouter Path | Description |
|---|---|---|
| `claude-3-sonnet` | `openrouter/anthropic/claude-3-sonnet` | Excellent for code generation |
| `gpt-4-turbo` | `openrouter/openai/gpt-4-turbo` | GPT-4 with improved performance |
| `mixtral-8x7b` | `openrouter/mistralai/mixtral-8x7b-instruct` | Fast and capable open-source model |
| `llama-3-70b` | `openrouter/meta-llama/llama-3-70b-instruct` | Meta's powerful open model |
## 🔧 How It Works

```mermaid
graph LR
    A[VSCode Copilot] --> B[oai2ollama<br/>:11434]
    B --> C[LiteLLM Proxy<br/>:4000]
    C --> D[OpenRouter API]
    D --> E[AI Models<br/>Claude, GPT-4, etc.]
```

1. **VSCode Copilot** sends requests to what it thinks is an Ollama server
2. **oai2ollama** translates Ollama API calls to OpenAI format
3. **LiteLLM** proxies OpenAI-compatible requests to OpenRouter
4. **OpenRouter** routes to the actual AI model providers
5. Tool/function calling capabilities are preserved throughout the chain
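The translation step can be pictured with a toy example. This is not oai2ollama's actual code, just a sketch of the kind of field mapping involved: an Ollama-style `/api/chat` request carries the same essential information (model, messages, tools) as an OpenAI-style `/v1/chat/completions` request, so a proxy can rewrite one into the other without dropping tool definitions:

```python
def ollama_chat_to_openai(req: dict) -> dict:
    """Rewrite an Ollama-style chat request into OpenAI chat-completions shape.

    Toy sketch only: real proxies also handle streaming chunks, options,
    images, and error mapping.
    """
    out = {
        "model": req["model"],
        "messages": req["messages"],   # both APIs use role/content messages
    }
    if "tools" in req:
        out["tools"] = req["tools"]    # the key point: tools pass through intact
    if not req.get("stream", True):    # Ollama streams by default
        out["stream"] = False
    return out


ollama_req = {
    "model": "kimi-k2",
    "messages": [{"role": "user", "content": "List files in src/"}],
    "tools": [{"type": "function",
               "function": {"name": "list_dir",
                            "parameters": {"type": "object",
                                           "properties": {"path": {"type": "string"}}}}}],
    "stream": False,
}

print(ollama_chat_to_openai(ollama_req)["tools"][0]["function"]["name"])  # list_dir
```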
## 🤝 Contributing

We welcome contributions! Here's how you can help:

- 🐛 Report bugs by opening an issue
- 💡 Suggest features or improvements
- 📖 Improve documentation
- 🔧 Submit pull requests
### Development Setup

```powershell
# Clone the repo
git clone https://github.com/jm6271/copilot-ollama-windows.git
cd copilot-ollama-windows

# Install dependencies
uv sync

# Set your API key, then make your changes and test
[System.Environment]::SetEnvironmentVariable('OPENROUTER_API_KEY', 'your-openrouter-api-key-here', 'User')
uv run copilot-ollama-windows your-config-file.yaml
```
## 📝 License

This project is licensed under the MIT License; see the LICENSE file for details.

## 🙏 Acknowledgments

- LiteLLM for the excellent proxy framework
- oai2ollama for Ollama compatibility
- OpenRouter for model access
- copilot-ollama for the original project
- The VSCode and GitHub Copilot teams
⭐ Star this repo if it helped you unlock Copilot Agent Mode with your favorite models!