
Promptic Python SDK

Python SDK and CLI for the Promptic platform — tracing, prompt optimization, and experiment management.

Installation

pip install promptic-sdk

Optional LLM instrumentation

Install extras to auto-instrument specific LLM providers:

pip install promptic-sdk[openai]       # OpenAI
pip install promptic-sdk[anthropic]    # Anthropic
pip install promptic-sdk[langchain]    # LangChain
pip install promptic-sdk[all]          # All providers

Quick start

1. Authenticate

Log in via browser (recommended for local development):

promptic login

This opens your browser for authentication, then auto-selects your workspace. Credentials are saved to ~/.promptic/config.toml.

For CI/CD or headless environments, use an API key instead:

promptic configure
# or set the environment variable:
export PROMPTIC_API_KEY="pk_..."
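In CI, the key typically comes from the platform's secret store rather than a shell profile. A minimal sketch for GitHub Actions, assuming a repository secret named PROMPTIC_API_KEY (the secret name and workflow layout are illustrative, not prescribed by the SDK):

```yaml
# .github/workflows/ci.yml (sketch)
jobs:
  test:
    runs-on: ubuntu-latest
    env:
      # Exposes the key to every step, so promptic and the SDK pick it up
      PROMPTIC_API_KEY: ${{ secrets.PROMPTIC_API_KEY }}
    steps:
      - uses: actions/checkout@v4
      - run: pip install promptic-sdk
      - run: pytest
```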

2. Send traces

import promptic_sdk
from openai import OpenAI

# Initialize tracing (auto-instruments installed LLM libraries)
promptic_sdk.init()

client = OpenAI()

# Tag traces with an AI Component name
with promptic_sdk.ai_component("customer-support-agent"):
    response = client.chat.completions.create(
        model="gpt-4.1-nano",
        messages=[{"role": "user", "content": "Hello!"}],
    )

3. Use the API client

from promptic_sdk import PrompticClient

with PrompticClient() as client:
    # List traces
    traces = client.list_traces(limit=10)

    # Get workspace info
    workspace = client.get_workspace()

    # Manage experiments
    experiment = client.create_experiment(
        ai_component_id="comp_...",
        target_model="gpt-4.1-nano",
        task_type="classification",
        initial_prompt="Classify the following text.",
    )

    # Deploy the best prompt
    client.deploy(component_id="comp_...", experiment_id="exp_...")

    # Fetch a deployed prompt at runtime
    prompt = client.get_deployed_prompt("comp_...")

Tracing

promptic_sdk.init() sets up OpenTelemetry to export spans to the Promptic platform.

Parameter         Description                                          Default
api_key           Promptic API key (falls back to PROMPTIC_API_KEY)    (none)
endpoint          Platform URL (falls back to PROMPTIC_ENDPOINT)       https://promptic.eu
auto_instrument   Auto-detect and instrument LLM client libraries      True
service_name      OpenTelemetry service.name resource attribute        (none)

Auto-detected instrumentors: OpenAI, Anthropic, Google Generative AI, LangChain, Cohere.

Using other OpenTelemetry instrumentors

Since Promptic uses standard OpenTelemetry under the hood, you can add any OTel-compatible instrumentor alongside the auto-detected ones. Just call promptic_sdk.init() first, then instrument manually:

import promptic_sdk
from opentelemetry.instrumentation.requests import RequestsInstrumentor
from opentelemetry.instrumentation.sqlalchemy import SQLAlchemyInstrumentor

promptic_sdk.init()

# Add any OpenTelemetry instrumentor — spans will be exported to Promptic
RequestsInstrumentor().instrument()
SQLAlchemyInstrumentor().instrument(engine=engine)

This works with any package from the opentelemetry-python-contrib ecosystem (HTTP clients, databases, web frameworks, etc.). All spans are exported to the Promptic platform as long as init() has been called.

AI Components

Use ai_component() to tag spans with a component name. The platform links traces to the matching AI Component in your workspace:

with promptic_sdk.ai_component("my-component"):
    # All LLM calls here are tagged
    ...

API client

Both a synchronous client (PrompticClient) and an asynchronous client (AsyncPrompticClient) are available. They share the same method signatures and return types.

from promptic_sdk import PrompticClient

with PrompticClient() as client:
    traces = client.list_traces(limit=10)

from promptic_sdk import AsyncPrompticClient

async with AsyncPrompticClient() as client:
    traces = await client.list_traces(limit=10)

Both clients provide typed methods for the full Promptic REST API:

Resource       Methods
Workspace      get_workspace
Traces         list_traces, get_trace, get_stats
Components     list_components, get_component, create_component, delete_component
Experiments    list_experiments, get_experiment, create_experiment, update_experiment, delete_experiment, start_experiment
Observations   list_observations, create_observations, update_observation, delete_observation
Evaluators     list_evaluators, create_evaluators, update_evaluator, delete_evaluator
Iterations     list_iterations, get_iteration, get_best_iteration
Deployments    get_deployment, deploy, undeploy, get_deployed_prompt

The client reads PROMPTIC_API_KEY and PROMPTIC_ENDPOINT from the environment, or accepts them as constructor arguments.

CLI

The promptic CLI mirrors the API client and supports both human-readable tables and --json output.

promptic [command] [subcommand] [options]

Commands

Command                              Description
promptic login                       Authenticate via browser (device flow)
promptic logout                      Clear saved credentials
promptic configure                   Save API key and endpoint (CI/CD)
promptic workspace list              List accessible workspaces
promptic workspace select <id>       Select a workspace
promptic workspace info              Show workspace info
promptic traces list                 List recent traces
promptic traces get <id>             Get a trace with spans
promptic traces stats                Show aggregated tracing stats
promptic components list             List AI components
promptic components create           Create a component
promptic components get <id>         Get component details
promptic components delete <id>      Delete a component
promptic experiments list            List experiments
promptic experiments create          Create an experiment (interactive)
promptic experiments get <id>        Get experiment details
promptic experiments update <id>     Update an experiment
promptic experiments delete <id>     Delete an experiment
promptic experiments start <id>      Start an experiment
promptic observations list           List observations for an experiment
promptic observations add            Add an observation
promptic observations delete <id>    Delete an observation
promptic evaluators list             List evaluators for an experiment
promptic evaluators add              Add an evaluator
promptic evaluators delete <id>      Delete an evaluator
promptic iterations list             List iterations for an experiment
promptic iterations get <id>         Get iteration details
promptic iterations best             Get the best iteration
promptic deployments status <id>     Show deployment for a component
promptic deployments deploy          Deploy an experiment
promptic deployments prompt <id>     Show the deployed prompt
promptic deployments undeploy <id>   Remove a deployment
promptic datasets create             Create a dataset
promptic datasets list               List datasets
promptic datasets get <id>           Get dataset details
promptic datasets delete <id>        Delete a dataset
promptic runs create                 Create a run
promptic runs list                   List runs
promptic runs get <id>               Get run details
promptic runs delete <id>            Delete a run
promptic annotations create          Create an annotation
promptic annotations list            List annotations
promptic annotations delete <id>     Delete an annotation
promptic evaluations run             Run an evaluation
promptic evaluations list            List evaluations
promptic evaluations get <id>        Get evaluation details

All list commands support --json for machine-readable output.
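The --json output is plain JSON, so it can be piped into scripts or jq. A small sketch of filtering a hypothetical traces payload; the field names (id, duration_ms) are illustrative assumptions, not the documented schema:

```python
import json

# Stand-in for: promptic traces list --json
# (hypothetical payload shape; actual field names may differ)
raw = '[{"id": "tr_123", "duration_ms": 840}, {"id": "tr_456", "duration_ms": 120}]'

traces = json.loads(raw)

# Keep only the IDs of traces slower than 500 ms
slow = [t["id"] for t in traces if t["duration_ms"] > 500]
print(slow)
```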

Configuration

The SDK and CLI resolve configuration in this order:

  1. Explicit arguments (api_key=, endpoint=)
  2. Environment variables (PROMPTIC_API_KEY, PROMPTIC_ENDPOINT)
  3. Config file (~/.promptic/config.toml, written by promptic login or promptic configure)

Variable            Description                     Default
PROMPTIC_API_KEY    API key (for tracing & CI/CD)   (none)
PROMPTIC_ENDPOINT   Platform URL                    https://promptic.eu

Development

Requires Python 3.11+ and uv.

# Install dependencies
uv sync

# Run tests
uv run pytest

# Lint
uv run ruff check .
uv run ruff format .

License

MIT — see LICENSE for details.
