OpenPipe Python Client
Python client library for the OpenPipe service.
This client allows you to automatically report your OpenAI calls to OpenPipe.
Installation
pip install openpipe
Usage
1. Create a project at https://app.openpipe.ai
2. Find your project's API key at https://app.openpipe.ai/settings
3. Configure the OpenPipe client as shown below.
from openpipe import OpenAI
import os
client = OpenAI(
    # defaults to os.environ.get("OPENAI_API_KEY")
    api_key="My API Key",
    openpipe={
        # Set the OpenPipe API key you got in step 2 above.
        # If the `OPENPIPE_API_KEY` environment variable is set, it is read by default.
        "api_key": "My OpenPipe API Key",
    },
)
You can now use your new OpenAI client, which functions identically to the generic OpenAI client while also reporting calls to your OpenPipe instance.
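As a sketch of that drop-in behavior, a standard chat completion call works unchanged. The request below is built as a plain dict for illustration, and the call itself is left commented out because it requires valid OpenAI and OpenPipe API keys; the model and message are illustrative only:

```python
# The OpenPipe client is a drop-in replacement for the OpenAI client,
# so an ordinary chat completion request needs no changes.
request = dict(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello!"}],
)
# With a configured client and valid keys, the call would be:
# completion = client.chat.completions.create(**request)
# print(completion.choices[0].message.content)
```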
Special Features
Tagging
OpenPipe supports "tagging," which is useful for grouping related completions. When building a dataset for fine-tuning, you can select all the prompts that match a given set of tags. Here's how to use the tagging feature:
completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "system", "content": "count to 10"}],
    openpipe={
        "tags": {"prompt_id": "counting"},
        "log_request": True,  # Enable/disable data collection. Defaults to True.
    },
)
Should I Wait to Enable Logging?
We recommend keeping request logging turned on from the beginning. If you change your prompt, just set a new prompt_id tag so you can select only the latest version when you're ready to create a dataset.
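That versioning pattern can be sketched with a small helper. The helper name and the version format below are illustrative, not part of the OpenPipe API; tags are plain string key/value pairs passed via the `openpipe` argument, as in the example above:

```python
# Hypothetical helper (not part of the OpenPipe API): build a versioned
# prompt_id tag so each prompt revision can be selected separately later.
def versioned_tags(prompt_id: str, version: int) -> dict:
    return {"prompt_id": f"{prompt_id}-v{version}"}

tags = versioned_tags("counting", 2)
# Pass as: client.chat.completions.create(..., openpipe={"tags": tags})
print(tags)  # {'prompt_id': 'counting-v2'}
```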
Usage with LangChain
This assumes you have already created a project and have your OpenPipe API key.
from openpipe.langchain_llm import ChatOpenAI
from langchain.prompts import ChatPromptTemplate
from langchain.schema.runnable import RunnableSequence
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "Classify user query into positive, negative or neutral."),
        ("human", "{query}"),
    ]
)
llm = ChatOpenAI(model="gpt-3.5-turbo").with_tags(chain_name="classify", any_key="some")

# To provide the OpenPipe API key explicitly:
# llm = ChatOpenAI(model="gpt-3.5-turbo", openpipe_kwargs={"api_key": "My OpenPipe API Key"})\
#     .with_tags(chain_name="classify", any_key="some")

chain: RunnableSequence = prompt | llm
res = chain.invoke({"query": "this is good"})
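A note on `.with_tags`: based on the completions example earlier, a reasonable reading is that its keyword arguments become the same key/value tag pairs reported to OpenPipe. The stand-in below is purely illustrative (it is not the library's implementation) and only shows the shape of the resulting tag payload:

```python
# Hypothetical stand-in (not the OpenPipe implementation): illustrates
# how .with_tags keyword arguments map to a tag dictionary of strings.
def with_tags_sketch(**tags: str) -> dict:
    return {str(k): str(v) for k, v in tags.items()}

print(with_tags_sketch(chain_name="classify", any_key="some"))
# {'chain_name': 'classify', 'any_key': 'some'}
```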
File details
Details for the file openpipe-4.33.0.tar.gz

File metadata
- Download URL: openpipe-4.33.0.tar.gz
- Upload date:
- Size: 62.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.7.1 CPython/3.12.2 Darwin/23.3.0

File hashes
Algorithm | Hash digest
---|---
SHA256 | 01b711ecdbc6b0a24d20bdb8fa464106af96c0a60cb546c0d8785b073e8e2c70
MD5 | 8e3a0bb59849877b3690a4f4435db478
BLAKE2b-256 | 68b20915f5199d87a0e41077a34ab779d35cdfa0155640888045e27a1b470794
File details
Details for the file openpipe-4.33.0-py3-none-any.whl

File metadata
- Download URL: openpipe-4.33.0-py3-none-any.whl
- Upload date:
- Size: 222.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.7.1 CPython/3.12.2 Darwin/23.3.0

File hashes
Algorithm | Hash digest
---|---
SHA256 | 55efad83d31e83d381359174f10704da9bf4a08f3a204677890fdec5d7e2be61
MD5 | 868979a70d15f56915a8a666bdf225f2
BLAKE2b-256 | 00b422287008cce0ed24857d9c65c9aeb2059fa981119265c5bf3d34b758a6f9