Opper Python SDK
Install
pip install opperai
Configuration
Environment variables

- OPPER_API_KEY is read by the SDK if no api_key is provided to the Client object.
- OPPER_PROJECT is attached to traces and can be used for filtering in the Opper UI.
- OPPER_DEFAULT_MODEL defines the model used by functions created with the fn decorator.

When using the fn decorator, the SDK client is automatically initialized with the OPPER_API_KEY environment variable.
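For example, the variables can be set in the process environment before the client is created (all values below are placeholders, not real keys or model names):

```python
import os

# Placeholder values; replace with your own key, project name and model.
os.environ["OPPER_API_KEY"] = "op-your-key"
os.environ["OPPER_PROJECT"] = "my-project"
os.environ["OPPER_DEFAULT_MODEL"] = "my-default-model"

# The SDK reads OPPER_API_KEY when no api_key argument is passed to Client.
print(os.environ["OPPER_PROJECT"])
```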
Using the fn decorator

```python
from opperai import fn

@fn()
def translate(text: str, target_language: str) -> str:
    """Translate text to a target language."""

print(translate("Hello", "fr"))
# >>> "Bonjour"
```
The fn decorator automatically creates an Opper function that is ready to be called like any other function in your code.
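Conceptually, the decorator captures the function's signature and docstring and routes calls through the Opper API instead of a local body. A toy, offline sketch of that mechanism (toy_fn and fake_backend are illustrative names, not part of the SDK):

```python
import functools
import inspect

def toy_fn(call_backend):
    """Toy stand-in for @fn: forwards the docstring and bound arguments to a backend."""
    def decorator(func):
        sig = inspect.signature(func)

        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            bound = sig.bind(*args, **kwargs)
            # The real SDK would send this to the Opper API; here we call a stub.
            return call_backend(func.__doc__, dict(bound.arguments))

        return wrapper
    return decorator

def fake_backend(instruction, inputs):
    # Stub backend so the sketch runs without network access.
    return f"{instruction!r} called with {inputs}"

@toy_fn(fake_backend)
def translate(text: str, target_language: str) -> str:
    """Translate text to a target language."""

print(translate("Hello", "fr"))
```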
Calling functions
To call a function you created at https://platform.opper.ai, you can use the following code:

```python
from opperai import Client
from opperai.types import ChatPayload, Message

client = Client(api_key="your-api-key")

response = client.functions.chat(
    "your-function-path",
    ChatPayload(messages=[Message(role="user", content="hello")]),
)
print(response)
```
Async function calling

```python
import asyncio

from opperai import AsyncClient
from opperai.types import ChatPayload, Message

opper = AsyncClient(api_key="your-api-key")

async def main():
    message = ""
    async for response in await opper.functions.chat(
        "your-function-path",
        ChatPayload(messages=[Message(role="user", content="Hello, world!")]),
        stream=True,
    ):
        if response.delta is not None:
            message += response.delta
    print(message)

if __name__ == "__main__":
    asyncio.run(main())
```
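The delta-accumulation pattern above can be exercised without network access by substituting a local async generator for the SDK call (Chunk and fake_stream are illustrative stand-ins, not SDK types):

```python
import asyncio
from dataclasses import dataclass
from typing import Optional

@dataclass
class Chunk:
    # Stand-in for the SDK's streamed response objects, which expose .delta.
    delta: Optional[str]

async def fake_stream():
    # Yields partial tokens the way a streaming chat call would; None deltas
    # model chunks that carry no text.
    for part in ["Hello", ",", " world", None, "!"]:
        yield Chunk(part)

async def main() -> str:
    message = ""
    async for response in fake_stream():
        if response.delta is not None:
            message += response.delta
    return message

print(asyncio.run(main()))
# → Hello, world!
```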
Streaming responses

```python
from opperai import Client
from opperai.types import ChatPayload, Message

client = Client(api_key="op-xxxx")

response = client.functions.chat(
    "joch/test",
    ChatPayload(
        messages=[Message(role="user", content="tell me a story.")],
    ),
    stream=True,
)

for data in response:
    print(data.delta, end="", flush=True)
```
Async streaming responses

```python
import asyncio

from opperai import AsyncClient
from opperai.types import ChatPayload, Message

client = AsyncClient(api_key="your-api-key")

async def main():
    async for response in await client.functions.chat(
        "your-function-path",
        ChatPayload(
            messages=[Message(role="user", content="tell me a story.")],
        ),
        stream=True,
    ):
        print(response.delta, end="", flush=True)

if __name__ == "__main__":
    asyncio.run(main())
```
Retrieval

```python
# Retrieve the top 3 results for a query from index 42.
client.indexes.retrieve(42, "Who is the president of the USA?", 3)
```
Examples
See examples in our documentation