
Vellum Python Library


The Vellum Python SDK provides access to the Vellum API from Python.

API Docs

You can find Vellum's complete API docs at docs.vellum.ai.

Installation

pip install --upgrade vellum-ai

Usage

Below is how you would invoke a deployed Prompt from the Vellum API. For a complete list of all APIs that Vellum supports, check out our API Reference.

from vellum import (
    PromptDeploymentInputRequest_String,
)
from vellum.client import Vellum

client = Vellum(
    api_key="YOUR_API_KEY",
)

def execute() -> str:
    result = client.execute_prompt(
        prompt_deployment_name="<example-deployment-name>",
        release_tag="LATEST",
        inputs=[
            PromptDeploymentInputRequest_String(
                name="input_a",
                type="STRING",
                value="Hello, world!",
            )
        ],
    )
    
    if result.state == "REJECTED":
        raise Exception(result.error.message)

    return result.outputs[0].value

if __name__ == "__main__":
    print(execute())

[!TIP] You can set the system environment variable VELLUM_API_KEY to avoid writing your API key in your code. To do so, add export VELLUM_API_KEY=<your-api-token> to your ~/.zshrc or ~/.bashrc, open a new terminal, and any code calling vellum.Vellum() will read this key.
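The environment-variable pattern in the tip above can be sketched as follows. This is a minimal illustration, not part of the SDK; the helper name load_vellum_api_key is our own:

```python
import os


def load_vellum_api_key() -> str:
    # Read the key from the environment instead of hard-coding it in source.
    key = os.environ.get("VELLUM_API_KEY")
    if not key:
        raise RuntimeError(
            "VELLUM_API_KEY is not set; add `export VELLUM_API_KEY=<your-api-token>` "
            "to your ~/.zshrc or ~/.bashrc and open a new terminal."
        )
    return key
```

You could then pass the result to the client explicitly, e.g. Vellum(api_key=load_vellum_api_key()), or rely on the client reading the variable itself as the tip describes.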

Async Client

This SDK has an async version. Here's how to use it:

import asyncio

import vellum
from vellum.client import AsyncVellum

client = AsyncVellum(api_key="YOUR_API_KEY")

async def execute() -> str:
    result = await client.execute_prompt(
        prompt_deployment_name="<example-deployment-name>",
        release_tag="LATEST",
        inputs=[
            vellum.PromptDeploymentInputRequest_String(
                name="input_a",
                value="Hello, world!",
            )
        ],
    )

    if result.state == "REJECTED":
        raise Exception(result.error.message)
    
    return result.outputs[0].value

if __name__ == "__main__":
    print(asyncio.run(execute()))
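One reason to reach for the async client is fanning out several prompt executions concurrently with asyncio.gather. Here is a minimal sketch of that pattern; a stand-in coroutine replaces client.execute_prompt so the shape of the code is the point, not the API call:

```python
import asyncio


async def fake_execute_prompt(deployment_name: str) -> str:
    # Stand-in for `await client.execute_prompt(...)`; yields control, then echoes.
    await asyncio.sleep(0)
    return f"result for {deployment_name}"


async def main() -> list[str]:
    # asyncio.gather schedules the coroutines concurrently and
    # returns their results in the order they were passed in.
    return await asyncio.gather(
        fake_execute_prompt("deployment-a"),
        fake_execute_prompt("deployment-b"),
    )


results = asyncio.run(main())
```

Swapping the stand-in for real client.execute_prompt calls keeps the structure the same: build the coroutines, gather them, and collect the outputs in order.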

Contributing

While we value open-source contributions to this SDK, most of this library is generated programmatically.

Please feel free to make contributions to any of the directories or files below:

examples/*
src/vellum/lib/*
tests/*
README.md

Any additions to files outside the directories listed above would need to be moved into our generation code (found in the separate vellum-client-generator repo); otherwise they would be overwritten by the next generated release. Feel free to open a PR as a proof of concept, but know that we will not be able to merge it as-is. We suggest opening an issue first to discuss with us!

Release history

This version: 0.7.9

Download files

Download the file for your platform.

Source Distribution

vellum_ai-0.7.9.tar.gz (147.7 kB)

Uploaded Source

Built Distribution

vellum_ai-0.7.9-py3-none-any.whl (445.2 kB)

Uploaded Python 3

File details

Details for the file vellum_ai-0.7.9.tar.gz.

File metadata

  • Download URL: vellum_ai-0.7.9.tar.gz
  • Upload date:
  • Size: 147.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.5.1 CPython/3.8.18 Linux/5.15.0-1068-azure

File hashes

Hashes for vellum_ai-0.7.9.tar.gz:

  • SHA256: 54b3d0ec8d9f682f928e40d76843c32c02e1a7c9f5c65b940cf2dbaa10dfa8e3
  • MD5: fdbe23f81be7347d049aaab4dcc79cc1
  • BLAKE2b-256: 8bf1e77cfee8cd80a0071dcea546531a66aa86b5d93b34328b90a779b4170f02
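To verify a downloaded archive against the published hashes, you can compute its digest locally. A minimal sketch using Python's standard hashlib (the file path is whatever you downloaded, e.g. the tarball above):

```python
import hashlib


def sha256_of(path: str) -> str:
    # Hash the file in chunks so large archives don't need to fit in memory.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

Compare the returned hex string against the SHA256 value listed above; a mismatch means the download is corrupt or has been tampered with.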


File details

Details for the file vellum_ai-0.7.9-py3-none-any.whl.

File metadata

  • Download URL: vellum_ai-0.7.9-py3-none-any.whl
  • Upload date:
  • Size: 445.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.5.1 CPython/3.8.18 Linux/5.15.0-1068-azure

File hashes

Hashes for vellum_ai-0.7.9-py3-none-any.whl:

  • SHA256: 77b1771a4a24076e431c140f097bc15b3a651e4cdc95fc9d2a04aa3132a47ea5
  • MD5: 446a9b8d1fa3358ac1fd5417dbed01a4
  • BLAKE2b-256: 4f3bb97fa4a83313a07bdc267cb1ff350e63987655379eba597c38a6a9b7a5dd

