
Easy access to 100s of LLMs with a few lines of code (using OpenRouter).


irouter


irouter provides a simple interface to access 100s of LLMs with minimal lines of code.

Installation

  1. Install irouter from PyPI:
pip install irouter
  2. Create an account on OpenRouter and generate an API key.

3a. (recommended!) Set the OpenRouter API key as an environment variable:

export OPENROUTER_API_KEY=your_openrouter_api_key

This way, you can use irouter objects like Call and Chat without passing an API key explicitly.

from irouter import Call
c = Call("moonshotai/kimi-k2:free")
c("How are you?")

3b. Alternatively, pass api_key to irouter objects like Call and Chat.

from irouter import Call
c = Call("moonshotai/kimi-k2:free", api_key="your_openrouter_api_key")
c("How are you?")

Usage

Below are basic usage examples of functionality in irouter. For more detailed examples, check out the nbs folder.

Call

Call is the simplest interface for one-off interactions with one or more LLMs (without tool support).

For conversational interactions use Chat, which tracks message history, token usage, and supports tool calling.

Single LLM

from irouter import Call
c = Call("moonshotai/kimi-k2:free")
c("Who are you?")
# "I'm Kimi, your AI friend from Moonshot AI. I'm here to chat, answer your questions, and help you out whenever you need it."

Multiple LLMs

from irouter import Call
c = Call(["moonshotai/kimi-k2:free", "google/gemini-2.0-flash-exp:free"])
c("Who are you?")
# {'moonshotai/kimi-k2:free': "I'm Kimi, your AI friend from Moonshot AI. I'm here to chat, answer your questions, and help you out whenever you need it.",
#  'google/gemini-2.0-flash-exp:free': 'I am a large language model, trained by Google.\n'}
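Because a multi-model Call returns a plain dict keyed by model id, the responses can be post-processed with ordinary dict operations. A minimal sketch, using a stand-in result dict in place of a live API call (the response strings are hypothetical):

```python
# Stand-in for the dict a multi-model Call would return.
responses = {
    "moonshotai/kimi-k2:free": "I'm Kimi, your AI friend from Moonshot AI.",
    "google/gemini-2.0-flash-exp:free": "I am a large language model, trained by Google.\n",
}

# Iterate over (model, answer) pairs, e.g. to log or compare outputs.
for model, answer in responses.items():
    print(f"{model}: {answer.strip()}")
```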

Chat

Chat is an easy way to interface with one or more LLMs, while tracking message history, token usage, and supporting tool calling.

Single LLM

from irouter import Chat
c = Chat("moonshotai/kimi-k2:free")
c("Who are you?")
print(c.history) # {'moonshotai/kimi-k2:free': [...]}
print(c.usage) # {'moonshotai/kimi-k2:free': {'prompt_tokens': 8, 'completion_tokens': 8, 'total_tokens': 16}}

Multiple LLMs

from irouter import Chat
c = Chat(["moonshotai/kimi-k2:free", "google/gemini-2.0-flash-exp:free"])
c("Who are you?")
print(c.history) 
# {'moonshotai/kimi-k2:free': [...], 
# 'google/gemini-2.0-flash-exp:free': [...]}
print(c.usage) 
# {'moonshotai/kimi-k2:free': {'prompt_tokens': 8, 'completion_tokens': 8, 'total_tokens': 16}, 
# 'google/gemini-2.0-flash-exp:free': {'prompt_tokens': 8, 'completion_tokens': 10, 'total_tokens': 18}}
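Since usage is a plain nested dict, aggregating token counts across models is straightforward. A sketch using the usage values shown in the example output above:

```python
# Usage dict as reported above (values copied from the example output).
usage = {
    "moonshotai/kimi-k2:free": {"prompt_tokens": 8, "completion_tokens": 8, "total_tokens": 16},
    "google/gemini-2.0-flash-exp:free": {"prompt_tokens": 8, "completion_tokens": 10, "total_tokens": 18},
}

# Total tokens consumed across all models in this chat.
total = sum(u["total_tokens"] for u in usage.values())
print(total)  # 34
```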

Image

Both Call and Chat support image input from image URLs or local image files.

Adding images is as simple as providing a list of strings with:

  • text and/or
  • image URL(s) and/or
  • image path(s)

Make sure to select an LLM that supports image input, like gpt-4o-mini.
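irouter decides per list item whether it is a URL, a local file, or plain text; the exact rules are internal to the library, but the dispatch can be pictured with a simplified sketch (the helper below is illustrative only, not irouter's actual code):

```python
from pathlib import Path

def classify(item: str) -> str:
    """Illustrative only: guess whether a list item is a URL, a local file, or text."""
    if item.startswith(("http://", "https://")):
        return "url"
    if Path(item).suffix.lower() in {".jpg", ".jpeg", ".png", ".pdf", ".mp3", ".wav"}:
        return "file"
    return "text"

print(classify("https://example.com/puppy.jpg"))  # url
print(classify("../assets/puppy.jpg"))            # file
print(classify("What is in the image?"))          # text
```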

from irouter import Chat
ic = Chat("gpt-4o-mini")
# Image URL
ic(["https://www.petlandflorida.com/wp-content/uploads/2022/04/shutterstock_1290320698-1-scaled.jpg", 
    "What is in the image?"])
# or local image
# ic(["../assets/puppy.jpg", "What is in the image?"])
# Example output:
# The image shows a cute puppy, ..., The background is blurred, 
# with green hues suggesting an outdoors setting.

# Images are tracked in history
print(ic.history)
# [{'role': 'system', 'content': 'You are a helpful assistant.'}, 
#  {'role': 'user', 'content': [{'type': 'image_url', 'image_url':
#  {'url': '...'}}, {'type': 'text', 'text': 'What is in the image?'}]}, 
#  {'role': 'assistant', 'content': 'The image shows a cute puppy...'}]

For more information on Chat, check out the chat.ipynb notebook in the nbs folder.

PDF

Both Call and Chat support PDF processing from URLs or local files.

from irouter import Call
c = Call("moonshotai/kimi-k2:free")
c(["https://proceedings.neurips.cc/paper_files/paper/2017/file/3f5ee243547dee91fbd053c1c4a845aa-Paper.pdf", 
   "What is the main contribution of this paper?"])
# 'The main contribution of this paper is the introduction of the Transformer architecture...'

Audio

Some LLMs have native audio support. Simply pass a list of strings containing a local filepath to a .mp3 or .wav file together with an instruction.

from irouter import Call
c = Call("google/gemini-2.5-flash")
c(["../assets/bottles.mp3", "What do you hear?"])
# 'I hear the sound of a glass bottle being opened and closed...'

Multiple Modalities

Combine text, images, PDFs, and audio in a single request. Simply pass a list of strings containing URLs, filepaths and/or text.

from irouter import Call
c = Call("google/gemini-2.5-flash")
c(["../assets/bottles.mp3", "../assets/puppy.jpg", "What do you hear and see?"])
# 'I hear sounds of glass and see a small, fluffy dog...'

Tool Usage

Chat supports (multi-turn) tool calling, allowing LLMs to execute functions you provide. Simply pass a list of functions as the tools parameter. irouter will take care of the rest.

To ensure the best tool usage experience:

  • Use the reStructuredText convention for function docstrings with :param tags, like the function below, so the tool schema includes a description for each parameter.

  • Consider using type hints so the LLM knows what types to provide.

from datetime import datetime
from zoneinfo import ZoneInfo

from irouter import Chat

def get_time(fmt: str = "%Y-%m-%d %H:%M:%S", tz: str | None = None) -> str:
    """Returns the current time formatted as a string.

    :param fmt: Format string for strftime.
    :param tz: Optional timezone name (e.g., "UTC"). If given, uses that timezone.
    :returns: The formatted current time.
    """
    now = datetime.now(ZoneInfo(tz)) if tz else datetime.now()
    return now.strftime(fmt)

chat = Chat("gpt-4o-mini")
result = chat("What is the current time in New York City?", tools=[get_time])
# 'The current time in New York City is 7:45 AM on August 5, 2025.'
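The :param convention matters because the parameter descriptions end up in the tool schema sent to the LLM. irouter builds that schema internally; the sketch below only illustrates how :param descriptions can be pulled out of a reST-style docstring (the regex and helper are not irouter's actual code):

```python
import re

def param_descriptions(docstring: str) -> dict[str, str]:
    """Illustrative only: map each :param name to its description text."""
    return dict(re.findall(r":param (\w+): (.+)", docstring))

doc = """Returns the current time formatted as a string.

    :param fmt: Format string for strftime.
    :param tz: Optional timezone name (e.g., "UTC"). If given, uses that timezone.
    :returns: The formatted current time.
    """
print(param_descriptions(doc))
# {'fmt': 'Format string for strftime.',
#  'tz': 'Optional timezone name (e.g., "UTC"). If given, uses that timezone.'}
```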

Misc

get_all_models

You can easily get an overview of all 300+ models available using get_all_models.

Alternatively, browse OpenRouter's models page to view the models supported by irouter.

from irouter.base import get_all_models
get_all_models()
# ['llm_provider1/model1', ... 'llm_providerx/modelx']
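Model ids follow the provider/model pattern (with an optional :free suffix for free-tier variants), so the list can be filtered client-side with plain string operations. A sketch using a stand-in list in place of a live get_all_models() call:

```python
# Stand-in for the output of get_all_models() (example model ids).
models = [
    "moonshotai/kimi-k2:free",
    "google/gemini-2.0-flash-exp:free",
    "openai/gpt-4o-mini",
]

# Keep only free-tier models, or only those from a given provider.
free = [m for m in models if m.endswith(":free")]
google = [m for m in models if m.startswith("google/")]
print(free)    # ['moonshotai/kimi-k2:free', 'google/gemini-2.0-flash-exp:free']
print(google)  # ['google/gemini-2.0-flash-exp:free']
```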

Credits

This project is built on top of the OpenRouter API infrastructure, which provides access to LLMs through a unified interface.

This project is inspired by Answer.AI's projects like cosette and claudette.

irouter generalizes this idea to support 100s of LLMs, including models from OpenAI, Anthropic, and more, and adds further modalities and functionality to work with. This is possible thanks to OpenRouter's infrastructure.
