
Freeplay integration for LangGraph and LangChain

This project has been archived by its maintainers; no new releases are expected.


Freeplay Python LangGraph

Freeplay integration for LangGraph and LangChain, providing observability and prompt management for your AI applications.

Installation

pip install freeplay-python-langgraph

Features

  • 🔍 Automatic Observability: OpenTelemetry instrumentation for LangChain and LangGraph applications
  • 📝 Prompt Management: Call Freeplay-hosted prompts with version control and environment management
  • 🤖 Auto-Model Instantiation: Automatically create LangChain models based on Freeplay's configuration
  • 💬 Conversation History: Native support for multi-turn conversations with LangGraph MessagesState
  • 🛠️ Tool Support: Seamless integration with LangChain tools
  • 🧪 Test Execution Tracking: Track test runs and test cases for evaluation workflows
  • 🎯 Multi-Provider Support: Works with OpenAI, Anthropic, Vertex AI, and more

Quick Start

Configuration

Set up your environment variables:

export FREEPLAY_API_URL="https://app.freeplay.ai/api"
export FREEPLAY_API_KEY="fp-..."
export FREEPLAY_PROJECT_ID="..."

Or pass them directly when initializing:

from freeplay_python_langgraph import FreeplayLangGraph

freeplay = FreeplayLangGraph(
    freeplay_api_url="https://api.freeplay.ai",
    freeplay_api_key="fp_...",
    project_id="proj_...",
)

Usage

Prompt Management with Auto-Model Instantiation

Call a Freeplay-hosted prompt and let the SDK automatically instantiate the correct model:

from freeplay_python_langgraph import FreeplayLangGraph

freeplay = FreeplayLangGraph()

# Invoke a prompt - model is automatically created based on Freeplay's config
response = freeplay.invoke(
    prompt_name="weather-assistant",
    variables={"city": "San Francisco"},
    environment="production"
)

Using Custom Models

You can also provide your own pre-configured model:

from langchain_openai import ChatOpenAI
from freeplay_python_langgraph import FreeplayLangGraph

freeplay = FreeplayLangGraph()
model = ChatOpenAI(model="gpt-4", temperature=0.7)

response = freeplay.invoke(
    prompt_name="weather-assistant",
    variables={"city": "New York"},
    model=model
)

Conversation History (Multi-turn Chat)

Maintain conversation context with history:

from langchain_core.messages import HumanMessage, AIMessage
from freeplay_python_langgraph import FreeplayLangGraph

freeplay = FreeplayLangGraph()

# Build conversation history
history = [
    HumanMessage(content="What's the weather in Paris?"),
    AIMessage(content="It's sunny and 22°C in Paris."),
    HumanMessage(content="What about in winter?")
]

response = freeplay.invoke(
    prompt_name="weather-assistant",
    variables={"city": "Paris"},
    history=history
)

Tool Calling

Bind LangChain tools to your prompts:

from langchain_core.tools import tool
from freeplay_python_langgraph import FreeplayLangGraph

@tool
def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    # Your weather API logic here
    return f"Weather in {city}: Sunny, 22°C"

freeplay = FreeplayLangGraph()

response = freeplay.invoke(
    prompt_name="weather-assistant",
    variables={"city": "London"},
    tools=[get_weather]
)

Test Execution Tracking

Track test runs for evaluation workflows:

from freeplay_python_langgraph import FreeplayLangGraph

freeplay = FreeplayLangGraph()

response = freeplay.invoke(
    prompt_name="my-prompt",
    variables={"input": "test input"},
    test_run_id="test_run_123",
    test_case_id="test_case_456"
)
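To run a whole suite, you might loop over your test cases and pass the identifiers through on each call. The helper below is a hypothetical sketch: the case dictionary shape (`"id"`, `"variables"`) is an assumption about how you store test cases, not part of the SDK.

```python
def invoke_kwargs_for_case(prompt_name, test_run_id, case):
    """Assemble the keyword arguments for one freeplay.invoke(...) call
    within a test run. `case` is assumed to look like
    {"id": "...", "variables": {...}}; adapt to your own storage."""
    return {
        "prompt_name": prompt_name,
        "variables": case["variables"],
        "test_run_id": test_run_id,
        "test_case_id": case["id"],
    }
```

A driver loop would then call `freeplay.invoke(**invoke_kwargs_for_case(...))` once per case, so every result is attributed to the right test case in Freeplay.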

Observability

The SDK automatically instruments your LangChain and LangGraph applications with OpenTelemetry. All traces are sent to Freeplay.

Provider Support

The SDK supports automatic model instantiation for the following providers:

  • OpenAI: Requires langchain-openai package
  • Anthropic: Requires langchain-anthropic package
  • Vertex AI: Requires langchain-google-vertexai package

Install the required provider package:

pip install langchain-openai
# or
pip install langchain-anthropic
# or
pip install langchain-google-vertexai
