
A bridge to use Langchain output as an OpenAI-compatible API.


LangChain OpenAI API Bridge


🚀 Expose a LangChain agent (LangGraph) as an OpenAI-compatible API 🚀

A FastAPI + LangChain / LangGraph extension that exposes agent results as an OpenAI-compatible API.

Use any OpenAI-compatible UI or UI framework with your custom LangChain Agent.

Supports:

OpenAI API features:

  • Chat Completions API
    • ✅ Invoke
    • ✅ Stream
  • Assistant API - Feature in progress
    • ✅ Run Stream
    • ✅ Threads
    • ✅ Messages
    • ✅ Run
    • ✅ Tools step stream
    • 🚧 Human In The Loop

Supported Vendors:

  • ✅ OpenAI
    • ✅ Stream
    • ✅ Multimodal
    • ✅ Auto tool choice
  • ✅ Anthropic
    • ✅ Stream
    • ✅ Multimodal
    • ✅ Auto tool choice
  • ✅ Groq
    • ✅ Stream
    • ❌ Multimodal
    • ✅ Auto tool choice
  • ✅ LlamaCpp local inference
    • ✅ Stream
    • ❌ Multimodal
    • ❌ Auto tool choice
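
The support matrix above can also be read as data; an illustrative sketch (not an API of this library):

```python
# Capability matrix distilled from the vendor list above (illustrative only).
SUPPORT = {
    "openai":    {"stream": True,  "multimodal": True,  "auto_tool_choice": True},
    "anthropic": {"stream": True,  "multimodal": True,  "auto_tool_choice": True},
    "groq":      {"stream": True,  "multimodal": False, "auto_tool_choice": True},
    "llamacpp":  {"stream": True,  "multimodal": False, "auto_tool_choice": False},
}


def supports(vendor: str, feature: str) -> bool:
    """Look up a vendor/feature pair; unknown vendors or features are False."""
    return SUPPORT.get(vendor.lower(), {}).get(feature, False)
```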

If you find this project useful, please give it a star ⭐!

Quick Install

pip
pip install langchain-openai-api-bridge
poetry
poetry add langchain-openai-api-bridge

Usage

OpenAI Assistant API Compatible

# Assistant Bridge as OpenAI Compatible API

from fastapi.middleware.cors import CORSMiddleware
from fastapi import FastAPI
from dotenv import load_dotenv, find_dotenv
import uvicorn

from langchain_openai_api_bridge.assistant import (
    InMemoryMessageRepository,
    InMemoryRunRepository,
    InMemoryThreadRepository,
)
from langchain_openai_api_bridge.fastapi.langchain_openai_api_bridge_fastapi import (
    LangchainOpenaiApiBridgeFastAPI,
)
from tests.test_functional.fastapi_assistant_agent_openai.my_agent_factory import (
    MyAgentFactory,
)

_ = load_dotenv(find_dotenv())


app = FastAPI(
    title="LangChain Agent OpenAI API Bridge",
    version="1.0",
    description="OpenAI API exposing langchain agent",
)

in_memory_thread_repository = InMemoryThreadRepository()
in_memory_message_repository = InMemoryMessageRepository()
in_memory_run_repository = InMemoryRunRepository()

bridge = LangchainOpenaiApiBridgeFastAPI(
    app=app, agent_factory_provider=lambda: MyAgentFactory()
)
bridge.bind_openai_assistant_api(
    thread_repository_provider=in_memory_thread_repository,
    message_repository_provider=in_memory_message_repository,
    run_repository_provider=in_memory_run_repository,
    prefix="/my-assistant",
)


if __name__ == "__main__":
    uvicorn.run(app, host="localhost")

# Agent Creation
# Imports for a self-contained example (exact module paths may vary by version):
from langchain_core.runnables import Runnable
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent

from langchain_openai_api_bridge.core import AgentFactory, CreateLLMDto


@tool
def magic_number_tool(input: int) -> int:
    """Applies a magic function to an input."""
    return input + 2


class MyAgentFactory(AgentFactory):

    def create_agent(self, dto: CreateLLMDto) -> Runnable:
        llm = self.create_llm(dto=dto)

        return create_react_agent(
            llm,
            [magic_number_tool],
            prompt="""You are a helpful assistant.""",
        )

    def create_llm(self, dto: CreateLLMDto) -> Runnable:
        return ChatOpenAI(
            model=dto.model,
            api_key=dto.api_key,
            streaming=True,
            temperature=dto.temperature,
        )

Full example:

OpenAI Chat Completion API Compatible

# Server
# Imports for a self-contained example:
import uvicorn
from fastapi import FastAPI

from langchain_openai_api_bridge.fastapi.langchain_openai_api_bridge_fastapi import (
    LangchainOpenaiApiBridgeFastAPI,
)

# MyAnthropicAgentFactory is an AgentFactory, defined like MyAgentFactory above.

app = FastAPI(
    title="LangChain Agent OpenAI API Bridge",
    version="1.0",
    description="OpenAI API exposing langchain agent",
)

bridge = LangchainOpenaiApiBridgeFastAPI(
    app=app, agent_factory_provider=lambda: MyAnthropicAgentFactory()
)
bridge.bind_openai_chat_completion(prefix="/my-custom-path/anthropic")

if __name__ == "__main__":
    uvicorn.run(app, host="localhost")
# Client
from openai import OpenAI

openai_client = OpenAI(
    base_url="http://my-server/my-custom-path/anthropic/openai/v1",
)

chat_completion = openai_client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {
            "role": "user",
            "content": 'Say "This is a test"',
        }
    ],
)
print(chat_completion.choices[0].message.content)
#> "This is a test"
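
The base_url above follows the convention <prefix>/openai/v1, where the prefix is the value passed to bind_openai_chat_completion. A tiny helper (hypothetical, not part of this library) makes the convention explicit:

```python
def bridge_base_url(server: str, prefix: str) -> str:
    """Join the server origin, the bridge prefix, and the OpenAI-style suffix."""
    return f"{server.rstrip('/')}/{prefix.strip('/')}/openai/v1"


print(bridge_base_url("http://localhost:8000", "/my-custom-path/anthropic"))
#> http://localhost:8000/my-custom-path/anthropic/openai/v1
```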

Full example:

// Vercel AI sdk - example
// ************************
// app/api/my-chat/route.ts
import { NextRequest } from "next/server";
import { z } from "zod";
import { type CoreMessage, streamText } from "ai";
import { createOpenAI } from "@ai-sdk/openai";

export const ChatMessageSchema = z.object({
  id: z.string(),
  role: z.string(),
  createdAt: z.date().optional(),
  content: z.string(),
});

const BodySchema = z.object({
  messages: z.array(ChatMessageSchema),
});

export type AssistantStreamBody = z.infer<typeof BodySchema>;

const langchain = createOpenAI({
  //baseURL: "https://my-project/my-custom-path/openai/v1",
  baseURL: "http://localhost:8000/my-custom-path/openai/v1",
});

export async function POST(request: NextRequest) {
  const { messages }: { messages: CoreMessage[] } = await request.json();

  const result = await streamText({
    model: langchain("gpt-4o"),
    messages,
  });

  return result.toAIStreamResponse();
}

More Examples

More examples can be found in the tests/test_functional directory. The project is not limited to OpenAI models: some examples use Anthropic's language models, and any LangChain-supported vendor works with this library.

⚠️ Setup to run examples

Define OPENAI_API_KEY or ANTHROPIC_API_KEY on your system. The examples read the key from environment variables or from a .env file at the root of the project.
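
A minimal sketch of that lookup order (environment variable first, then a naive .env parse; the actual examples use python-dotenv):

```python
import os


def resolve_api_key(env_file: str = ".env") -> "str | None":
    """Return the first key found: environment first, then a naive .env parse."""
    names = ("OPENAI_API_KEY", "ANTHROPIC_API_KEY")
    for name in names:
        if os.environ.get(name):
            return os.environ[name]
    if os.path.exists(env_file):
        with open(env_file) as f:
            for line in f:
                key, _, value = line.strip().partition("=")
                if key in names and value:
                    return value
    return None
```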

Contributing

If you want to contribute to this project, follow these guidelines:

  1. Fork this project
  2. Create a new branch
  3. Implement your feature or bug fix
  4. Send a pull request

Installation

poetry install
poetry env use ./.venv/bin/python

Commands

  • Run tests: poetry run pytest

Limitations

  • Chat Completions tools

    • Tool/function definitions cannot be passed through the OpenAI API. Every function needs to be defined as a LangChain tool on the server. Usage Example
  • LLM usage info

    • Returned usage info is inaccurate. This is due to a LangChain/LangGraph limitation: usage info isn't available when calling a LangGraph agent.
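
In other words, tools live server-side: a tools field in the client request has no effect, and only tools bound to the agent at creation time can run. A plain-Python sketch of the idea (not the bridge's actual code):

```python
# Tools are registered when the agent is created, not per request.
SERVER_TOOLS = {"magic_number_tool": lambda x: x + 2}


def handle_request(payload: dict) -> int:
    """Run the server-side tool; any client-supplied 'tools' field is ignored."""
    payload.pop("tools", None)  # silently dropped, as the bridge effectively does
    return SERVER_TOOLS["magic_number_tool"](payload["input"])
```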
