Super fast semantic router for AI decision making

Project description

Semantic Router


Semantic Router is a superfast decision-making layer for your LLMs and agents. Rather than waiting for slow LLM generations to make tool-use decisions, we use the magic of semantic vector space to make those decisions — routing our requests using semantic meaning.

Read the Docs


Quickstart

To get started with semantic-router, install it like so:

pip install -qU semantic-router

❗️ If you want to use a fully local version of semantic router, you can use the HuggingFaceEncoder and LlamaCppLLM (pip install -qU "semantic-router[local]", see here). To use the HybridRouteLayer you must pip install -qU "semantic-router[hybrid]".
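For example, the local encoder can be spun up without any API keys (a minimal sketch, assuming the [local] extra above is installed):

from semantic_router.encoders import HuggingFaceEncoder

# runs entirely on your machine; downloads a small sentence-transformers
# embedding model on first use (no API key required)
encoder = HuggingFaceEncoder()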

We begin by defining a set of Route objects. These are the decision paths that the semantic router can decide to use. Let's try two simple routes for now — one for talk about politics and another for chitchat:

from semantic_router import Route

# we could use this as a guide for our chatbot to avoid political conversations
politics = Route(
    name="politics",
    utterances=[
        "isn't politics the best thing ever",
        "why don't you tell me about your political opinions",
        "don't you just love the president",
        "they're going to destroy this country!",
        "they will save the country!",
    ],
)

# this could be used as an indicator to our chatbot to switch to a more
# conversational prompt
chitchat = Route(
    name="chitchat",
    utterances=[
        "how's the weather today?",
        "how are things going?",
        "lovely weather today",
        "the weather is horrendous",
        "let's go to the chippy",
    ],
)

# we place both of our decisions together into a single list
routes = [politics, chitchat]

With our routes ready, we now initialize an embedding / encoder model. In this quickstart we use either a CohereEncoder or an OpenAIEncoder (other encoders, such as HuggingFaceEncoder, are also supported; see Integrations below). To initialize them we do:

import os
from semantic_router.encoders import CohereEncoder, OpenAIEncoder

# for Cohere
os.environ["COHERE_API_KEY"] = "<YOUR_API_KEY>"
encoder = CohereEncoder()

# or for OpenAI
os.environ["OPENAI_API_KEY"] = "<YOUR_API_KEY>"
encoder = OpenAIEncoder()

With our routes and encoder defined, we now create a SemanticRouter. The router handles our semantic decision making.

from semantic_router.routers import SemanticRouter

rl = SemanticRouter(encoder=encoder, routes=routes, auto_sync="local")

We can now use our router to make super fast decisions based on user queries. Let's try with two queries that should trigger our route decisions:

rl("don't you love politics?").name
[Out]: 'politics'

Correct decision, let's try another:

rl("how's the weather today?").name
[Out]: 'chitchat'

We get both decisions correct! Now let's try sending an unrelated query:

rl("I'm interested in learning about llama 2").name
[Out]:

In this case, no decision could be made as we had no matches, so our router returned None!
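Since the returned route choice exposes its name (None when nothing matched), a common pattern is to branch on it. A small usage sketch, not part of the quickstart:

choice = rl("I'm interested in learning about llama 2")

if choice.name is None:
    # no route matched -- fall back to the chatbot's default behaviour
    reply = "default conversational path"
elif choice.name == "politics":
    # steer the chatbot away from political conversations
    reply = "let's talk about something else"
else:
    # "chitchat" -- switch to a more conversational prompt
    reply = "switching to a chatty prompt"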

Integrations

The encoders of semantic router include easy-to-use integrations with Cohere, OpenAI, Hugging Face, FastEmbed, and more — we even support multi-modality!

Our utterance vector space also integrates with Pinecone and Qdrant!
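For example, Pinecone can hold the route utterance vectors in place of the default in-memory index. A rough sketch follows; the index name is hypothetical and the constructor arguments are assumptions, so check the docs for the exact signature:

import os
from semantic_router.index import PineconeIndex

os.environ["PINECONE_API_KEY"] = "<YOUR_API_KEY>"

# hypothetical index name; the router syncs route utterances into this index
index = PineconeIndex(index_name="semantic-router-demo")

# reuse the encoder and routes defined in the quickstart above
rl = SemanticRouter(encoder=encoder, routes=routes, index=index, auto_sync="local")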


📚 Resources

Docs

Notebook | Description
Introduction | Introduction to Semantic Router and static routes
Dynamic Routes | Dynamic routes for parameter generation and function calls
Save/Load Layers | How to save and load RouteLayer from file
LangChain Integration | How to integrate Semantic Router with LangChain Agents
Local Execution | Fully local Semantic Router with dynamic routes — local models such as Mistral 7B outperform GPT-3.5 in most tests
Route Optimization | How to train route layer thresholds to optimize performance
Multi-Modal Routes | Using multi-modal routes to identify Shrek vs. not-Shrek pictures

Online Course

Semantic Router Course

Community

Project details


Release history

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

semantic_router-0.1.12.tar.gz (93.4 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

semantic_router-0.1.12-py3-none-any.whl (126.2 kB)

Uploaded Python 3

File details

Details for the file semantic_router-0.1.12.tar.gz.

File metadata

  • Download URL: semantic_router-0.1.12.tar.gz
  • Upload date:
  • Size: 93.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.10 {"installer":{"name":"uv","version":"0.9.10"},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for semantic_router-0.1.12.tar.gz
Algorithm | Hash digest
SHA256 | b63fbb8b9127dcb1763efea17dfa74ab409e626e87c8695b589131af12ef3a65
MD5 | 0e9464afffa7a83490ac9587d39df60d
BLAKE2b-256 | 8dd788a1330f53a26eaea25249b21a5b776cbabfa333a6107ed88ce8b881d14f

See more details on using hashes here.

File details

Details for the file semantic_router-0.1.12-py3-none-any.whl.

File metadata

  • Download URL: semantic_router-0.1.12-py3-none-any.whl
  • Upload date:
  • Size: 126.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.10 {"installer":{"name":"uv","version":"0.9.10"},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for semantic_router-0.1.12-py3-none-any.whl
Algorithm | Hash digest
SHA256 | 94658545f89cc63d2eb7dff6f74bc713b61bbcfe91146b0e4353a383f6790804
MD5 | 6ef16e53e06df561d3275b8fe25167e9
BLAKE2b-256 | 18ad4816aabd264b6b677002bde0cd4784f7f7f553f98e2ec01b96fda4ce5215

See more details on using hashes here.
