An adapter to integrate OpenAI Agents as LLM providers for LiveKit.

LiveKit OpenAI Agents Adapter

This library provides an adapter to integrate agents built with the openai-agents library (specifically, its agents module) as LLM providers within the LiveKit Agents framework.

It allows you to plug in your openai-agents based orchestrators, leveraging their capabilities like conversation history management, tool usage, and handoffs, while still utilizing the real-time audio and agent lifecycle management features of LiveKit Agents.

Features

  • Integrates agents built with the openai-agents library into LiveKit.
  • Facilitates the use of openai-agents features (such as handoffs and structured output, if your agent defines them) within LiveKit.
  • Configurable streaming: Choose between the original non-streaming approach (default) or real-time streaming responses.
  • Includes a utility to extract the last user message from the chat context for the agent.

Installation

pip install livekit_openai_agents

Alternatively, if you have this project cloned:

pip install .

Dependencies

Based on pyproject.toml:

  • livekit-agents[openai,elevenlabs,silero,turn-detector] == 1.0.17
  • livekit-plugins-noise-cancellation~=0.2
  • openai-agents >= 0.0.14
  • pyee >= 9.0.0

Refer to pyproject.toml for the authoritative, up-to-date list of dependencies.
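Under Poetry, those constraints correspond to a dependency table along these lines (a sketch: the table name follows Poetry's pyproject.toml layout, the `python` constraint is an assumption, and the exact file in the repository may differ):

```toml
[tool.poetry.dependencies]
python = ">=3.9"
livekit-agents = { version = "1.0.17", extras = ["openai", "elevenlabs", "silero", "turn-detector"] }
livekit-plugins-noise-cancellation = "~0.2"
openai-agents = ">=0.0.14"
pyee = ">=9.0.0"
```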

Usage

1. Define Your openai-agents Agent

First, create your agent(s) using the agents library. For example:

# your_openai_agents.py
from agents import Agent
from pydantic import BaseModel

class MathResponse(BaseModel):
    explanation: str
    answer: float

math_tutor_agent = Agent(
    name="MathTutor",
    handoff_description="A specialized agent that helps with math problems and provides explanations.",
    instructions="You are a math tutor. Explain your reasoning step-by-step and provide the final answer.",
    output_type=MathResponse, # Example of structured output
    # You can add handoffs, tools, etc. as per openai-agents documentation
)

# You might have other agents or a more complex setup, e.g., a triage agent
# from agents import Runner
# result = await Runner.run(math_tutor_agent, "What is 2+2?")
# print(result.final_output) # Output: MathResponse(explanation='...', answer=4.0)
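The `MathResponse` output type above is an ordinary Pydantic model; independently of the agent, it validates and coerces values like this (requires pydantic):

```python
from pydantic import BaseModel

class MathResponse(BaseModel):
    explanation: str
    answer: float

# When output_type=MathResponse is set, the agent's final_output is an
# instance of this model. It can also be constructed directly; note the
# int 4 is coerced to the declared float type.
resp = MathResponse(explanation="2 + 2 = 4", answer=4)
print(resp.answer)  # -> 4.0
```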

2. Use the Adapter in Your LiveKit Agent

Import OpenAIAgentAdapter from this library and your openai-agents agent. Then, initialize the adapter and use it in your LiveKit AgentSession.

# your_livekit_app.py
import asyncio
from dotenv import load_dotenv, find_dotenv

from livekit import agents
from livekit.agents import Agent as LiveKitAgent, AgentSession  # renamed to avoid clashing with openai-agents' Agent
from livekit.plugins import openai, silero # or your preferred STT/TTS/VAD

# Import the adapter and your openai-agent
from livekit_openai_agents.adapter import OpenAIAgentAdapter # Assuming 'livekit_openai_agents' is the package name
from your_openai_agents import math_tutor_agent # The agent defined in step 1

# Load .env for API keys if necessary
# load_dotenv(find_dotenv())

class MyLiveKitAssistant(LiveKitAgent): # Renamed to avoid clash with openai-agents' Agent
    def __init__(self) -> None:
        super().__init__(instructions="You are a helpful voice AI assistant that can call specialized tutors.")

async def entrypoint(ctx: agents.JobContext):
    await ctx.connect()

    # 1. Your openai-agents agent (math_tutor_agent) is already instantiated in step 1.
    # 2. Initialize the adapter with your openai-agent instance
    # streaming=False (default): Uses the original non-streaming approach
    # streaming=True: Enables real-time streaming responses
    openai_agent_llm_adapter = OpenAIAgentAdapter(
        orchestrator=math_tutor_agent,
        streaming=False  # Default behavior - uses original non-streaming approach
    )

    # To enable streaming, you can set streaming=True:
    # openai_agent_llm_adapter = OpenAIAgentAdapter(
    #     orchestrator=math_tutor_agent,
    #     streaming=True  # Enable streaming for real-time responses
    # )

    # You can also change streaming mode dynamically:
    # openai_agent_llm_adapter.set_streaming(True)   # Enable streaming
    # openai_agent_llm_adapter.set_streaming(False)  # Disable streaming (back to original approach)
    
    # Check current streaming status:
    # is_streaming = openai_agent_llm_adapter.is_streaming_enabled()

    # 3. Set up the AgentSession
    session = AgentSession(
        stt=openai.STT(), # Replace with your STT
        llm=openai_agent_llm_adapter, # Use the adapter here
        tts=openai.TTS(), # Replace with your TTS
        vad=silero.VAD.load(),
    )

    await session.start(
        room=ctx.room,
        agent=MyLiveKitAssistant(),
        # ... other options
    )

    # Example: Generate an initial greeting
    await session.generate_reply(
        instructions="Greet the user and ask how you can help."
    )

    print("Agent is ready and listening.")

if __name__ == "__main__":
    # Load .env for API keys (e.g., OPENAI_API_KEY, LIVEKIT_URL, LIVEKIT_API_KEY)
    dotenv_path = find_dotenv()
    if dotenv_path:
        print(f"Loading .env file from: {dotenv_path}")
        load_dotenv(dotenv_path)
    else:
        print("No .env file found. Ensure API keys and LiveKit connection info are set as environment variables.")
    
    agents.cli.run_app(agents.WorkerOptions(entrypoint_fnc=entrypoint))

3. Streaming Configuration

The OpenAIAgentAdapter supports both the original non-streaming approach and an optional streaming mode:

Non-Streaming Mode (Default)

# Use original non-streaming approach (default behavior)
adapter = OpenAIAgentAdapter(orchestrator=your_agent, streaming=False)
  • Uses the original implementation approach
  • Responses are delivered as complete messages
  • Maintains backward compatibility
  • Uses Runner.run() internally

Streaming Mode (Optional)

# Enable streaming responses
adapter = OpenAIAgentAdapter(orchestrator=your_agent, streaming=True)
  • Responses are delivered in real-time as they are generated
  • Users see text appearing progressively
  • Better user experience for longer responses
  • Uses Runner.run_streamed() internally

Dynamic Streaming Control

adapter = OpenAIAgentAdapter(orchestrator=your_agent)  # Default: streaming=False

# Change streaming mode at runtime
adapter.set_streaming(True)   # Enable streaming
adapter.set_streaming(False)  # Disable streaming (back to original approach)

# Check current streaming status
if adapter.is_streaming_enabled():
    print("Streaming is enabled")
else:
    print("Using original non-streaming approach")
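The behavioral difference can be illustrated with a small stand-in (this is not the adapter itself; `run_streamed` and `run` below only mimic the shapes of `Runner.run_streamed()` and `Runner.run()`): the streaming path yields chunks as they arrive, while the non-streaming path returns nothing until the complete reply is ready:

```python
import asyncio
from typing import AsyncIterator

CHUNKS = ["2 + 2 ", "equals ", "4."]

async def run_streamed() -> AsyncIterator[str]:
    # Stand-in for the streaming path: chunks become available over time.
    for chunk in CHUNKS:
        await asyncio.sleep(0)  # simulate network latency
        yield chunk

async def run() -> str:
    # Stand-in for the non-streaming path: one complete response.
    return "".join(CHUNKS)

async def main() -> None:
    # Streaming: each chunk can be forwarded to TTS as soon as it arrives.
    parts = [chunk async for chunk in run_streamed()]
    # Non-streaming: the caller waits for the full reply.
    full = await run()
    assert "".join(parts) == full
    print(full)

asyncio.run(main())
```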

See the examples/ directory (e.g., examples/tutors/adapter_example.py) for a more detailed, runnable example.

API Reference

OpenAIAgentAdapter

Constructor Parameters

  • orchestrator: The OpenAI Agents Agent instance to adapt
  • guardrail_handler: Optional function to handle guardrail trips
  • context: Optional context to provide to the agent
  • streaming: Whether to enable streaming responses (default: False)

Methods

  • set_streaming(streaming: bool): Sets the streaming mode for future chat calls
  • is_streaming_enabled() -> bool: Returns whether streaming is currently enabled
  • chat(...): Creates a chat stream (uses streaming setting)
  • generate(...): Generates a response string (always non-streaming)
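The toggle contract can be shown without installing the package. The stand-in class below mirrors only the documented `set_streaming`/`is_streaming_enabled` behavior (default off, affecting future chat calls); it is an illustration, not the adapter's source:

```python
class StreamingToggle:
    """Stand-in mirroring the adapter's documented streaming toggle."""

    def __init__(self, streaming: bool = False) -> None:
        self._streaming = streaming  # default matches the adapter: off

    def set_streaming(self, streaming: bool) -> None:
        # Per the docs, this affects future chat calls.
        self._streaming = streaming

    def is_streaming_enabled(self) -> bool:
        return self._streaming

adapter = StreamingToggle()        # streaming defaults to False
assert not adapter.is_streaming_enabled()
adapter.set_streaming(True)        # subsequent chat calls would stream
assert adapter.is_streaming_enabled()
```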

Development

To set up for development:

git clone https://github.com/anilaltuner/livekit-openai-agents.git
cd livekit-openai-agents
pip install -e .

License

This project is licensed under the MIT License - see the LICENSE file for details.
