
Automatically route your prompts to the best OpenAI model

Project description

OpenAI Model Router

With the number of OpenAI models available, it can be tricky to decide which one to call for a given prompt. So we built the Model Router, which dynamically selects the most appropriate model for each prompt.

Supported models: gpt-3.5-turbo, gpt-3.5-turbo-0613, gpt-3.5-turbo-16k, gpt-3.5-turbo-16k-0613. Support for more models is coming soon ;)
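This page doesn't document the router's selection criteria, but as a rough mental model, a routing heuristic over these four models could look like the sketch below. It is illustrative only and not the package's actual logic: it assumes the -0613 variants are preferred when function calling is used, the -16k variants when the prompt is long, and uses tiktoken to estimate prompt size.

import tiktoken

def pick_model(messages, functions=None):
    # Estimate the prompt size to decide whether the 16k context window is needed.
    enc = tiktoken.encoding_for_model("gpt-3.5-turbo")
    prompt_tokens = sum(len(enc.encode(m["content"])) for m in messages)
    long_context = prompt_tokens > 3000  # rough threshold, assumed for illustration

    if functions:
        # The -0613 releases introduced function calling.
        return "gpt-3.5-turbo-16k-0613" if long_context else "gpt-3.5-turbo-0613"
    return "gpt-3.5-turbo-16k" if long_context else "gpt-3.5-turbo"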

Usage

Simply import from the Model Router instead of using OpenAI directly. There's no need to specify a model anymore!
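The package can be installed from PyPI; assuming the project name matches the distribution files listed further down this page, the install command is:

pip install openai-model-router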

Example 1: Simple Message

Before

import openai
openai.api_key = "sk-..."

chat_completion = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello world"}]
)

print(chat_completion.choices[0].message.content)

After

from model_router import openai # import from model_router instead
openai.api_key = "sk-..."

chat_completion = openai.ChatCompletion.create(
    # no model argument needed; the router selects one automatically
    messages=[{"role": "user", "content": "Hello world"}]
)

print(chat_completion.choices[0].message.content)
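The examples read choices[0].message.content from the response, so the router appears to return the standard Chat Completion response object. Assuming it passes the API response through unchanged, the response's model field should tell you which model the router actually picked:

print(chat_completion.model)  # e.g. "gpt-3.5-turbo-16k", whichever model was selected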

Example 2: Function Calling

Before

import openai
openai.api_key = "sk-..."

chat_completion = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[{"role": "user", "content": "What's the weather like in Boston?"}],
    functions=[...],
    function_call="auto",
)

print(chat_completion.choices[0].message.content)

After

from model_router import openai # import from model_router instead
openai.api_key = "sk-..."

chat_completion = openai.ChatCompletion.create(
    # no model argument needed; the router selects one automatically
    messages=[{"role": "user", "content": "What's the weather like in Boston?"}],
    functions=[...],
    function_call="auto",
)

print(chat_completion.choices[0].message.content)
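The functions argument is elided above. For reference, a function definition for this weather prompt would follow the standard Chat Completions function-calling schema; the get_current_weather definition below is a hypothetical example, not part of this package:

functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. Boston, MA",
                },
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    },
]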

Questions/Feedback

For any questions/feedback, please reach out to @sambuddha_basu

Project details


Release history

This version

0.1

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

openai-model-router-0.1.tar.gz (3.4 kB)

Uploaded Source

Built Distribution

openai_model_router-0.1-py3-none-any.whl (3.8 kB)

Uploaded Python 3

File details

Details for the file openai-model-router-0.1.tar.gz.

File metadata

  • Download URL: openai-model-router-0.1.tar.gz
  • Upload date:
  • Size: 3.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.4

File hashes

Hashes for openai-model-router-0.1.tar.gz
SHA256: bcb866fc47ad782e48eab15fa5ef39aaa7290681a403315cbbefd3543044ff15
MD5: b79e0b47a86363d7d54a0c9b7e79c2d8
BLAKE2b-256: 641e522b34738454805812a75d3cc6ff487932fb88e48c76690e1a96e22e2624


File details

Details for the file openai_model_router-0.1-py3-none-any.whl.

File metadata

File hashes

Hashes for openai_model_router-0.1-py3-none-any.whl
SHA256: 515f147578dddd91f203a043ff9ddeade9b6149273a9d8a9f31d2cf86330d983
MD5: 48d29939e5812ed3bb861062e1ec3187
BLAKE2b-256: 5d8217d71eb5d871e626d56dd48f304a2618c44cd0f7b02c5c85285e0e51adca

