
Oai2Ollama

Wrap any OpenAI-compatible API into an Ollama-compatible API.

This is a CLI tool that starts a server which wraps an OpenAI-compatible API and exposes an Ollama-compatible API. This is useful for providing custom models to coding agents that don't support custom OpenAI APIs but do support Ollama (such as GitHub Copilot for VS Code).

Usage

with Python

You can run directly via uvx (if you have uv installed) or pipx:

uvx oai2ollama --help
usage: oai2ollama [--api-key str] [--base-url HttpUrl] [--capabilities list[str]] [--models list[str]] [--host str]
options:
  --help, -h                    Show this help message and exit
  --api-key str                 API key for authentication (required)
  --base-url HttpUrl            Base URL for the OpenAI-compatible API (required)
  --capabilities, -c list[str]  Extra capabilities to mark the model as supporting
  --models, -m list[str]        Extra models to include in the /api/tags response
  --host str                    IP / hostname for the API server (default: localhost)
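For example, to start the server against your own endpoint (placeholder values shown; --api-key and --base-url are the only required options):

uvx oai2ollama --api-key your_api_key --base-url https://api.example.com/v1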

To mark the model as supporting certain capabilities, you can use the --capabilities (or -c) option with a list of strings. For example, the following two syntaxes are supported:

oai2ollama -c tools or oai2ollama --capabilities tools

oai2ollama -c tools -c vision or oai2ollama --capabilities tools,vision

To support models that are not returned by the /models endpoint, use the --models (or -m) option to add them to the /api/tags response:

oai2ollama -m model1 -m model2 or oai2ollama -m model1,model2
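Once the server is running, you can verify which models are advertised by querying the Ollama-style tags endpoint with any HTTP client (assuming the default localhost address and Ollama's standard port 11434, as used in the Docker examples below):

curl http://localhost:11434/api/tags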

Capabilities currently used by Ollama are: tools, insert, vision, embedding, thinking and completion. We always include completion.

Or you can use a .env file to set these options:

OPENAI_API_KEY=your_api_key
OPENAI_BASE_URL=your_base_url
HOST=0.0.0.0
CAPABILITIES=["vision","thinking"]
MODELS=["custom-model1","custom-model2"]
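With that file in the working directory, the server can then be started without passing any flags (a sketch, assuming the .env file is picked up automatically as described above):

uvx oai2ollama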

The option name capacities is deprecated. Use capabilities instead. The old name still works for now but will emit a deprecation warning.

with Docker

First, build the image:

docker build -t oai2ollama .

Then, run the container with your credentials:

docker run -p 11434:11434 \
  -e OPENAI_API_KEY="your_api_key" \
  -e OPENAI_BASE_URL="your_base_url" \
  oai2ollama
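If you already keep these settings in a .env file, Docker can load it directly instead of repeating each variable (a sketch using Docker's standard --env-file flag):

docker run -p 11434:11434 --env-file .env oai2ollama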

Or you can pass these as command line arguments:

docker run -p 11434:11434 oai2ollama --api-key your_api_key --base-url your_base_url

To have the server listen on a different host, such as all IPv6 interfaces, use the --host argument:

docker run -p 11434:11434 oai2ollama --host "::"
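Note that the required --api-key and --base-url options (or the corresponding environment variables) still apply, so a complete invocation might look like this (placeholder values):

docker run -p 11434:11434 \
  -e OPENAI_API_KEY="your_api_key" \
  -e OPENAI_BASE_URL="your_base_url" \
  oai2ollama --host "::"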


Download files


Source Distributions

No source distribution files are available for this release.

Built Distribution


oai2ollama-1.2.8.1-py3-none-any.whl (5.1 kB, Python 3)

File details

Details for the file oai2ollama-1.2.8.1-py3-none-any.whl.

File metadata

  • Download URL: oai2ollama-1.2.8.1-py3-none-any.whl
  • Upload date:
  • Size: 5.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.21 (Ubuntu 24.04)

File hashes

Hashes for oai2ollama-1.2.8.1-py3-none-any.whl

  • SHA256: 5fa7dd74fd351c908b8905bc08cebe41fd2bd862ee832efd23c7e5d726a839e7
  • MD5: fc35574b84b8176734d477f8da4c6b76
  • BLAKE2b-256: 75b705df81844d292ec8c6d61a9c4e70b306691d7b0a0a1ca38ddbcd3c94b9d7

