
OpenPipe Python SDK: Fine-Tuning, Inference, and Metrics for Production Apps

Project description

OpenPipe Python Client

This client allows you to automatically report your OpenAI calls to OpenPipe.

Installation

pip install openpipe

Usage

  1. Create a project at https://app.openpipe.ai
  2. Find your project's API key at https://app.openpipe.ai/settings
  3. Configure the OpenPipe client as shown below.
from openpipe import OpenAI
import os

client = OpenAI(
    # defaults to os.environ.get("OPENAI_API_KEY")
    api_key="My API Key",
    openpipe={
        # Set the OpenPipe API key you got in step (2) above.
        # If you have the `OPENPIPE_API_KEY` environment variable set we'll read from it by default
        "api_key": "My OpenPipe API Key",
    }
)

You can now use your new OpenAI client, which functions identically to the generic OpenAI client while also reporting calls to your OpenPipe instance.
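For example, here is a minimal sketch of a request made through the wrapped client (the model and prompt below are placeholders); the call uses the standard chat.completions.create API and is reported to OpenPipe automatically:

completion = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Say hello"}],
)

# The response object is the standard OpenAI completion type
print(completion.choices[0].message.content)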

Special Features

Metadata Tagging

OpenPipe follows OpenAI’s concept of metadata tagging for requests. Tagging is useful for grouping related completions together: the tags let you find all the input/output pairs associated with a given prompt and fine-tune a model to replace it. Here's how to use the tagging feature:

completion = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "system", "content": "count to 10"}],
    metadata={"prompt_id": "counting"},
    store=True,
)

Should I Wait to Enable Logging?

We recommend keeping request logging turned on from the beginning. If you change your prompt, simply set a new prompt_id tag so you can select only the latest version when you're ready to create a dataset.
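As a minimal sketch (the tag values here are arbitrary), each prompt revision can get its own prompt_id so the versions stay separable:

# Illustrative sketch: bump the prompt_id tag whenever the prompt changes so
# completions from different prompt versions can be filtered apart later.
completion = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "system", "content": "count to 10 in French"}],
    metadata={"prompt_id": "counting_v2"},
    store=True,
)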

Usage with langchain

Assuming you have created a project and have your OpenPipe API key:

from openpipe.langchain_llm import ChatOpenAI
from langchain.prompts import ChatPromptTemplate
from langchain.schema.runnable import RunnableSequence

prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "Classify user query into positive, negative or neutral.",
        ),
        ("human", "{query}"),
    ]
)
llm = ChatOpenAI(model="gpt-4o")\
    .with_tags(chain_name="classify", any_key="some")

# To provide the openpipe key explicitly
# llm = ChatOpenAI(model="gpt-4o", openpipe_kwargs={"api_key": "My OpenPipe API Key"})\
#     .with_tags(chain_name="classify", any_key="some")

chain: RunnableSequence = prompt | llm
res = chain.invoke(
    {"query": "this is good"}
)
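
The chain returns a standard LangChain chat message, so (assuming the default AIMessage return type of ChatOpenAI) the classification text can be read from its content attribute:

# res is a LangChain AIMessage; its text lives on .content
print(res.content)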


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

openpipe-5.0.0.tar.gz (98.9 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

openpipe-5.0.0-py3-none-any.whl (440.0 kB)

Uploaded Python 3

File details

Details for the file openpipe-5.0.0.tar.gz.

File metadata

  • Download URL: openpipe-5.0.0.tar.gz
  • Upload date:
  • Size: 98.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.7.1 CPython/3.11.0 Darwin/24.5.0

File hashes

Hashes for openpipe-5.0.0.tar.gz

  • SHA256: 040acc526fece42ba505fcedd8cd584f42482c9bd01f16b2538c9ea9c82882f4
  • MD5: 32e29e575826763aaee1eab7fab5300a
  • BLAKE2b-256: 7c34b487bc0ff60d3ed634e6f7bc34b5138f04e6ae319cc6578001822df93901

See more details on using hashes here.

File details

Details for the file openpipe-5.0.0-py3-none-any.whl.

File metadata

  • Download URL: openpipe-5.0.0-py3-none-any.whl
  • Upload date:
  • Size: 440.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.7.1 CPython/3.11.0 Darwin/24.5.0

File hashes

Hashes for openpipe-5.0.0-py3-none-any.whl

  • SHA256: c04af7afb4d9bcd52e1250757dd93d0e0ed19c9ff4b524f131dd94aadf4c1a9b
  • MD5: 2f62a0fb26bdbd03e23894747636bdff
  • BLAKE2b-256: 7a5e516010c25a32884a87e1f8303a292f3981fa382cc7570a9ed88fb28681d5

See more details on using hashes here.
