# Promptic Python SDK

Python SDK and CLI for the Promptic platform — tracing, prompt optimization, and experiment management.
## Installation

```shell
pip install promptic-sdk
```
### Optional LLM instrumentation

Install extras to auto-instrument specific LLM providers:

```shell
pip install promptic-sdk[openai]     # OpenAI
pip install promptic-sdk[anthropic]  # Anthropic
pip install promptic-sdk[langchain]  # LangChain
pip install promptic-sdk[all]        # All providers
```
## Quick start

### 1. Authenticate

Log in via browser (recommended for local development):

```shell
promptic login
```

This opens your browser for authentication, then auto-selects your workspace. Credentials are saved to `~/.promptic/config.toml`.

For CI/CD or headless environments, use an API key instead:

```shell
promptic configure
# or set the environment variable:
export PROMPTIC_API_KEY="pk_..."
```
### 2. Send traces

```python
import promptic_sdk
from openai import OpenAI

# Initialize tracing (auto-instruments installed LLM libraries)
promptic_sdk.init()

client = OpenAI()

# Tag traces with an AI Component name
with promptic_sdk.ai_component("customer-support-agent"):
    response = client.chat.completions.create(
        model="gpt-4.1-nano",
        messages=[{"role": "user", "content": "Hello!"}],
    )
```
### 3. Use the API client

```python
from promptic_sdk import PrompticClient

with PrompticClient() as client:
    # List traces
    traces = client.list_traces(limit=10)

    # Get workspace info
    workspace = client.get_workspace()

    # Manage experiments
    experiment = client.create_experiment(
        ai_component_id="comp_...",
        target_model="gpt-4.1-nano",
        task_type="classification",
        initial_prompt="Classify the following text.",
    )

    # Deploy the best prompt
    client.deploy(component_id="comp_...", experiment_id="exp_...")

    # Fetch a deployed prompt at runtime
    prompt = client.get_deployed_prompt("comp_...")
```
## Tracing

`promptic_sdk.init()` sets up OpenTelemetry to export spans to the Promptic platform.
| Parameter | Description | Default |
|---|---|---|
| `api_key` | Promptic API key (falls back to `PROMPTIC_API_KEY`) | — |
| `endpoint` | Platform URL (falls back to `PROMPTIC_ENDPOINT`) | `https://promptic.eu` |
| `auto_instrument` | Auto-detect and instrument LLM client libraries | `True` |
| `service_name` | OpenTelemetry `service.name` resource attribute | — |
Auto-detected instrumentors: OpenAI, Anthropic, Google Generative AI, LangChain, Cohere.
### Using other OpenTelemetry instrumentors

Since Promptic uses standard OpenTelemetry under the hood, you can add any OTel-compatible instrumentor alongside the auto-detected ones. Just call `promptic_sdk.init()` first, then instrument manually:
```python
import promptic_sdk
from opentelemetry.instrumentation.requests import RequestsInstrumentor
from opentelemetry.instrumentation.sqlalchemy import SQLAlchemyInstrumentor

promptic_sdk.init()

# Add any OpenTelemetry instrumentor — spans will be exported to Promptic
RequestsInstrumentor().instrument()
SQLAlchemyInstrumentor().instrument(engine=engine)  # engine: an existing SQLAlchemy Engine
```
This works with any package from the opentelemetry-python-contrib ecosystem (HTTP clients, databases, web frameworks, etc.). All spans are exported to the Promptic platform as long as init() has been called.
## AI Components

Use `ai_component()` to tag spans with a component name. The platform links traces to the matching AI Component in your workspace:
```python
with promptic_sdk.ai_component("my-component"):
    # All LLM calls here are tagged
    ...
```
## API client

Both a sync (`PrompticClient`) and async (`AsyncPrompticClient`) client are available. They share the same method signatures and return types.
```python
from promptic_sdk import PrompticClient

with PrompticClient() as client:
    traces = client.list_traces(limit=10)
```

```python
from promptic_sdk import AsyncPrompticClient

async with AsyncPrompticClient() as client:
    traces = await client.list_traces(limit=10)
```
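A mirrored sync/async surface is typically achieved by having both clients delegate to one shared request builder, so only the transport differs. A minimal sketch of that pattern (hypothetical names, not `promptic_sdk` source):

```python
import asyncio

def _build_list_traces_request(limit):
    # Shared request description used by both clients (illustrative)
    return {"method": "GET", "path": "/traces", "params": {"limit": limit}}

class SyncClient:
    def list_traces(self, limit=10):
        # A real sync client would send this over a blocking transport
        return _build_list_traces_request(limit)

class AsyncClient:
    async def list_traces(self, limit=10):
        # A real async client would await a non-blocking transport here
        return _build_list_traces_request(limit)

sync_result = SyncClient().list_traces(limit=10)
async_result = asyncio.run(AsyncClient().list_traces(limit=10))
assert sync_result == async_result  # same signature, same result shape
```

This is why the two clients can promise identical return types: the request and response handling live in one place.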
Both clients provide typed methods for the full Promptic REST API:
| Resource | Methods |
|---|---|
| Workspace | get_workspace |
| Traces | list_traces, get_trace, get_stats |
| Components | list_components, get_component, create_component, delete_component |
| Experiments | list_experiments, get_experiment, create_experiment, update_experiment, delete_experiment, start_experiment |
| Observations | list_observations, create_observations, update_observation, delete_observation |
| Evaluators | list_evaluators, create_evaluators, update_evaluator, delete_evaluator |
| Iterations | list_iterations, get_iteration, get_best_iteration |
| Deployments | get_deployment, deploy, undeploy, get_deployed_prompt |
The client reads `PROMPTIC_API_KEY` and `PROMPTIC_ENDPOINT` from the environment, or accepts them as constructor arguments.
## CLI

The `promptic` CLI mirrors the API client and supports both human-readable tables and `--json` output.

```shell
promptic [command] [subcommand] [options]
```
### Commands

| Command | Description |
|---|---|
| `promptic login` | Authenticate via browser (device flow) |
| `promptic logout` | Clear saved credentials |
| `promptic configure` | Save API key and endpoint (CI/CD) |
| `promptic workspace list` | List accessible workspaces |
| `promptic workspace select <id>` | Select a workspace |
| `promptic workspace show` | Show workspace info |
| `promptic traces list` | List recent traces |
| `promptic traces get <id>` | Get a trace with spans |
| `promptic traces stats` | Show aggregated tracing stats |
| `promptic components list` | List AI components |
| `promptic components create` | Create a component |
| `promptic components get <id>` | Get component details |
| `promptic components delete <id>` | Delete a component |
| `promptic experiments list` | List experiments |
| `promptic experiments create` | Create an experiment (interactive) |
| `promptic experiments get <id>` | Get experiment details |
| `promptic experiments start <id>` | Start an experiment |
| `promptic observations list` | List observations for an experiment |
| `promptic evaluators list` | List evaluators for an experiment |
| `promptic iterations list` | List iterations for an experiment |
| `promptic deployments show` | Show deployment for a component |
| `promptic deployments deploy` | Deploy an experiment |
| `promptic deployments undeploy` | Remove a deployment |
All list commands support `--json` for machine-readable output.
## Configuration

The SDK and CLI resolve configuration in this order:

1. Explicit arguments (`api_key=`, `endpoint=`)
2. Environment variables (`PROMPTIC_API_KEY`, `PROMPTIC_ENDPOINT`)
3. Config file (`~/.promptic/config.toml`, written by `promptic login` or `promptic configure`)
| Variable | Description | Default |
|---|---|---|
| `PROMPTIC_API_KEY` | API key (for tracing & CI/CD) | — |
| `PROMPTIC_ENDPOINT` | Platform URL | `https://promptic.eu` |
## Development

Requires Python 3.11+ and `uv`.

```shell
# Install dependencies
uv sync

# Run tests
uv run pytest

# Lint
uv run ruff check .
uv run ruff format .
```
## License

MIT — see LICENSE for details.