
Python client library for the Portkey API

Project description


Control Panel for AI Apps

pip install portkey-ai

Features

AI Gateway

Unified API Signature
If you've used OpenAI, you already know how to use Portkey with any other provider.
Interoperability
Write once, run with any provider: switch seamlessly between models from any provider.
Automated Fallbacks & Retries
Ensure your application remains functional even if a primary service fails.
Load Balancing
Efficiently distribute incoming requests among multiple models.
Semantic Caching
Reduce costs and latency by intelligently caching results.
Virtual Keys
Secure your LLM API keys by storing them in Portkey vault and using disposable virtual keys.
Request Timeouts
Manage unpredictable LLM latencies by setting custom timeouts on your requests.
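The routing features above (fallbacks, load balancing, caching, timeouts) are driven by a gateway config object. The sketch below is illustrative only: the virtual key names such as "openai-prod" are placeholders, and the exact schema should be verified against the Portkey config documentation.

```python
# Illustrative gateway configs (a sketch, not a verified schema).

# Fallback: try the primary target first, fall back to the next on failure.
fallback_config = {
    "strategy": {"mode": "fallback"},
    "targets": [
        {"virtual_key": "openai-prod"},       # placeholder virtual key
        {"virtual_key": "anthropic-backup"},  # placeholder virtual key
    ],
}

# Load balancing: split traffic across targets by weight.
loadbalance_config = {
    "strategy": {"mode": "loadbalance"},
    "targets": [
        {"virtual_key": "openai-prod", "weight": 0.7},
        {"virtual_key": "openai-secondary", "weight": 0.3},
    ],
}

# Semantic caching and a request timeout (in milliseconds) can be layered in.
cached_config = {
    "cache": {"mode": "semantic", "max_age": 3600},
    "request_timeout": 10000,
    "virtual_key": "openai-prod",
}
```

A config like these would typically be supplied when constructing the client, e.g. Portkey(api_key="...", config=fallback_config).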

Observability

Logging
Keep track of all requests for monitoring and debugging.
Requests Tracing
Understand the journey of each request for optimization.
Custom Metadata
Segment and categorize requests for better insights.
Feedback
Collect and analyze weighted feedback on requests from users.
Analytics
Track your app & LLM's performance with 40+ production-critical metrics in a single place.
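As a sketch of how tracing, metadata, and feedback fit together: a trace ID groups related requests, metadata keys segment them in analytics, and weighted feedback is attached back to the trace. The field names below are illustrative assumptions, not verified SDK parameters; check the Portkey observability docs for the exact schema.

```python
# Sketch of observability annotations (field names are assumptions).
trace_id = "signup-flow-7f3a"  # groups all requests in one user journey

# Custom metadata for segmenting requests in analytics.
metadata = {
    "_user": "user_42",            # hypothetical user identifier key
    "environment": "staging",
    "feature": "onboarding-chat",
}

# Weighted feedback attached to the same trace: "value" encodes the rating,
# "weight" its importance relative to other feedback.
feedback = {"trace_id": trace_id, "value": 1, "weight": 0.8}
```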

Usage

Prerequisites

  1. Sign up on Portkey and grab your Portkey API Key
  2. Add your OpenAI key to Portkey's Virtual Keys page and keep it handy
# Installing the SDK

$ pip install portkey-ai
$ export PORTKEY_API_KEY=PORTKEY_API_KEY

Making a Request to OpenAI

  • Portkey fully adheres to the OpenAI SDK signature. You can instantly switch to Portkey and start using our production features right out of the box.
  • Just replace from openai import OpenAI with from portkey_ai import Portkey:
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",
    virtual_key="VIRTUAL_KEY"
)

chat_completion = portkey.chat.completions.create(
    messages=[{"role": "user", "content": "Say this is a test"}],
    model="gpt-4"
)

print(chat_completion)

Async Usage

  • Use AsyncPortkey instead of Portkey with await:
import asyncio
from portkey_ai import AsyncPortkey

portkey = AsyncPortkey(
    api_key="PORTKEY_API_KEY",
    virtual_key="VIRTUAL_KEY"
)

async def main():
    chat_completion = await portkey.chat.completions.create(
        messages=[{'role': 'user', 'content': 'Say this is a test'}],
        model='gpt-4'
    )

    print(chat_completion)

asyncio.run(main())

Check out the Portkey docs for the full list of supported providers.

Follow us on Twitter | Join our Discord

Contributing

Get started by checking out the GitHub issues. Email us at support@portkey.ai or ping us on Discord to chat.

Project details



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

portkey_ai-1.2.2-py3-none-any.whl (60.7 kB, Python 3)

File details

Details for the file portkey_ai-1.2.2-py3-none-any.whl.

File metadata

  • Download URL: portkey_ai-1.2.2-py3-none-any.whl
  • Upload date:
  • Size: 60.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.6

File hashes

Hashes for portkey_ai-1.2.2-py3-none-any.whl
Algorithm Hash digest
SHA256 25813f9305085ba996e0411b090473fab71a6cdf0728ad4c7aa68838aa553c8e
MD5 a38d5d42b7e32304e4fd4fd3c866d503
BLAKE2b-256 c0ce25b5b69f87d72a3c0c45df51e68f046337c2fcf8be9b4c48e623f2f95fa4

See more details on using hashes.
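To verify a downloaded wheel against the SHA256 digest published above, a small standard-library sketch like the following works (the helper name is ours, not part of the SDK):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Return the hex SHA-256 digest of a file, computed in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare the result against the SHA256 value published for the wheel, e.g.:
# sha256_of("portkey_ai-1.2.2-py3-none-any.whl") == "25813f93..."
```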
