# Vellum Python Library
The Vellum Python SDK provides access to the Vellum API from Python.
## API Docs
You can find Vellum's complete API docs at docs.vellum.ai.
## Installation

```sh
pip install --upgrade vellum-ai
```
## Usage
Below is how you would invoke a deployed Prompt from the Vellum API. For a complete list of all APIs that Vellum supports, check out our API Reference.
```python
from vellum import (
    PromptDeploymentInputRequest_String,
)
from vellum.client import Vellum

client = Vellum(
    api_key="YOUR_API_KEY",
)

def execute() -> str:
    result = client.execute_prompt(
        prompt_deployment_name="<example-deployment-name>",
        release_tag="LATEST",
        inputs=[
            PromptDeploymentInputRequest_String(
                name="input_a",
                type="STRING",
                value="Hello, world!",
            )
        ],
    )

    if result.state == "REJECTED":
        raise Exception(result.error.message)

    return result.outputs[0].value

if __name__ == "__main__":
    print(execute())
```
> [!TIP]
> You can set a system environment variable `VELLUM_API_KEY` to avoid writing your API key within your code. To do so, add `export VELLUM_API_KEY=<your-api-token>` to your `~/.zshrc` or `~/.bashrc`, open a new terminal, and any code calling `vellum.Vellum()` will then read this key.
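If you prefer to handle the lookup yourself, the same environment variable can be read explicitly before constructing the client. A minimal sketch using only the standard library (the fallback placeholder is illustrative, not part of the SDK):

```python
import os

# Read the API key from the environment rather than hard-coding it.
api_key = os.environ.get("VELLUM_API_KEY", "")
if not api_key:
    # Illustrative fallback; in practice you would raise an error
    # or prompt the user for a key instead.
    api_key = "YOUR_API_KEY"

# The key can then be passed explicitly: Vellum(api_key=api_key)
print(bool(api_key))
```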
## Async Client
This SDK has an async version. Here's how to use it:
```python
import asyncio

import vellum
from vellum.client import AsyncVellum

client = AsyncVellum(api_key="YOUR_API_KEY")

async def execute() -> str:
    result = await client.execute_prompt(
        prompt_deployment_name="<example-deployment-name>",
        release_tag="LATEST",
        inputs=[
            vellum.PromptDeploymentInputRequest_String(
                name="input_a",
                type="STRING",
                value="Hello, world!",
            )
        ],
    )

    if result.state == "REJECTED":
        raise Exception(result.error.message)

    return result.outputs[0].value

if __name__ == "__main__":
    print(asyncio.run(execute()))
```
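Because the async client's calls are awaitables, several prompt executions can run concurrently with `asyncio.gather`. The sketch below uses stand-in coroutines in place of the real `client.execute_prompt(...)` calls, so it runs without credentials:

```python
import asyncio

async def fake_execute(name: str) -> str:
    # Stand-in for `await client.execute_prompt(...)`; simulates I/O latency.
    await asyncio.sleep(0.01)
    return f"result for {name}"

async def main() -> list:
    # Launch both executions concurrently; gather preserves argument order.
    return await asyncio.gather(
        fake_execute("deployment-a"),
        fake_execute("deployment-b"),
    )

results = asyncio.run(main())
print(results)
```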
## Contributing
While we value open-source contributions to this SDK, most of this library is generated programmatically.
Please feel free to make contributions to any of the directories or files below:

- `examples/*`
- `src/vellum/lib/*`
- `tests/*`
- `README.md`
Any additions made to files beyond the directories and files above would have to be moved into our generation code (found in the separate vellum-client-generator repo); otherwise they would be overwritten upon the next generated release. Feel free to open a PR as a proof of concept, but know that we will not be able to merge it as-is. We suggest opening an issue first to discuss it with us!
## File details

### vellum_ai-0.6.7.tar.gz

File metadata:
- Download URL: vellum_ai-0.6.7.tar.gz
- Upload date:
- Size: 113.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.5.1 CPython/3.8.18 Linux/5.15.0-1064-azure
File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | b637fac6523fb0ade48c6e8805d2503fd1c322b4bbea4d28198d5b31329d91c4 |
| MD5 | 84483d9a0a787e291fd63a2dbe434840 |
| BLAKE2b-256 | f00fcb9e7015670aad693ef20abebf94f68c537f3d715d645328c37af036dfdd |
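A downloaded archive can be checked against the published digests locally; a minimal sketch with Python's `hashlib` (the file path in the usage comment is illustrative):

```python
import hashlib

def sha256_of(path: str) -> str:
    # Stream the file in chunks so large archives are not read into memory at once.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Usage (illustrative path; compare against the SHA256 value in the table above):
# sha256_of("vellum_ai-0.6.7.tar.gz")
```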
### vellum_ai-0.6.7-py3-none-any.whl

File metadata:
- Download URL: vellum_ai-0.6.7-py3-none-any.whl
- Upload date:
- Size: 352.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.5.1 CPython/3.8.18 Linux/5.15.0-1064-azure
File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | 2b57c30ba0d8026a1ed8f6f0adaba92bc4bbc38b5d2077a765a23cd740ec5d22 |
| MD5 | e07176bb8aca0b1ef85cb499276a9fe2 |
| BLAKE2b-256 | 31ac0453b7e1adc29c031becd86ef7f0a942b658eb61bb79e9422093769a08ea |