
Opper Python SDK

This is the Opper Python SDK. See below to get started, and the docs for more information. The SDK also ships with built-in documentation and examples in its function docstrings, which your code editor should surface as you use the functions.

Install

pip install opperai

Configuration

Environment variables

  • OPPER_API_KEY environment variable is read by the SDK if no api_key is provided to the Client object.
  • OPPER_PROJECT is attached to traces and can be used for filtering in the Opper UI.
  • OPPER_DEFAULT_MODEL defines the model used by functions created with the fn decorator.

When using the fn decorator the SDK client is automatically initialized with the OPPER_API_KEY environment variable.
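
A minimal sketch of the two ways to supply the key (the explicit Client construction below is an assumption based on the api_key parameter mentioned above; the key value is a placeholder):

from opperai import Opper

# Option 1: rely on OPPER_API_KEY from the environment
# (for example `export OPPER_API_KEY=op-...` in your shell).
opper = Opper()

# Option 2 (assumption: the api_key parameter of the Client object mentioned above):
# from opperai import Client
# opper = Opper(client=Client(api_key="op-..."))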

Using the fn decorator

from opperai import fn

@fn()
def translate(text: str, target_language: str) -> str:
    """Translate text to a target language."""


print(translate("Hello", "fr"))

>>> "Bonjour"

The fn decorator automatically creates an Opper function that you can call like any other function in your code.
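
Because the return type annotation drives the output, an fn-decorated function can also return a Pydantic model. A minimal sketch (the Translation model and its field names are illustrative assumptions):

from typing import List

from pydantic import BaseModel

from opperai import fn

class Translation(BaseModel):
    text: str
    alternatives: List[str]

@fn()
def translate_structured(text: str, target_language: str) -> Translation:
    """Translate text to a target language and suggest a few alternatives."""

result = translate_structured("Hello", "fr")
print(result.text)          # e.g. "Bonjour"
print(result.alternatives)  # e.g. ["Salut"]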

Using the fn decorator with images as inputs

from opperai import fn
from opperai.types import ImageContent
from pydantic import BaseModel
from typing import List

class Word(BaseModel):
    letters: List[str]

@fn(model="openai/gpt-4o")
def extract_letters(image: ImageContent) -> Word:
    """given an image extract the word it represents"""

letters = extract_letters(
    ImageContent.from_path("tests/fixtures/images/letters.png"),
)

print(letters)

Note: you need to select a model that can handle images as input; see models.

Calling functions

To create and call a function (equivalent to one created at https://platform.opper.ai), you can use the following code:

from opperai import Opper
from opperai.types import Message

opper = Opper()

function = opper.functions.create(
    "jokes/tell", 
    instructions="given a topic tell a joke",
)

response = function.chat(
    messages=[Message(role="user", content="topic: python")]
)

print(response)

Async function calling

import asyncio
from opperai import AsyncOpper
from opperai.types import Message

async def main():
    opper = AsyncOpper()

    function = await opper.functions.create(
        "jokes/tell", 
        instructions="given a topic tell a joke",
    )
    
    response = await function.chat(
        messages=[Message(role="user", content="topic: python")],
    )

    print(response)

if __name__ == "__main__":
    asyncio.run(main())

Streaming responses

from opperai import Opper
from opperai.types import Message

opper = Opper()

function = opper.functions.create(
    "jokes/tell", 
    instructions="given a topic tell a joke",
    description="tell a joke",
)

response = function.chat(
    messages=[Message(role="user", content="topic: python")],
    stream=True
)

for delta in response.deltas:
    print(delta, end="", flush=True)

Async streaming responses

import asyncio
from opperai import AsyncOpper
from opperai.types import Message


async def main():
    opper = AsyncOpper()
    
    function = await opper.functions.create(
        "jokes/tell", 
        instructions="given a topic tell a joke",
        description="tell a joke",
    )

    response = await function.chat(
        messages=[Message(role="user", content="topic: python")],
        stream=True,
    )

    async for delta in response.deltas:
        print(delta, end="", flush=True)


if __name__ == "__main__":
    asyncio.run(main())

Indexes

from opperai import Opper
from opperai.types import Document, Filter

opper = Opper()

index = opper.indexes.create("my-index")

index.upload_file("file.txt")

index.add(Document(key="key1", content="Hello world 1", metadata={"score": 0}))
index.add(Document(key="key1", content="Hello world 1", metadata={"score": 1}))
index.add(Document(key="key2", content="Hello world 3", metadata={"score": 0}))

response = index.query("Hello")
print(response)

response = index.query("Hello", filters=[Filter(key="score", operation="=", value="1")])
print(response)

Examples

See examples in our documentation
