Promptic Python SDK

Python SDK and CLI for the Promptic platform — tracing, prompt optimization, and experiment management.

Installation

pip install promptic-sdk

Optional LLM instrumentation

Install extras to auto-instrument specific LLM providers:

pip install promptic-sdk[openai]       # OpenAI
pip install promptic-sdk[anthropic]    # Anthropic
pip install promptic-sdk[langchain]    # LangChain
pip install promptic-sdk[all]          # All providers

Quick start

1. Authenticate

Log in via browser (recommended for local development):

promptic login

This opens your browser for authentication, then auto-selects your workspace. Credentials are saved to ~/.promptic/config.toml.

For CI/CD or headless environments, use an API key instead:

promptic configure
# or set the environment variable:
export PROMPTIC_API_KEY="pk_..."

2. Send traces

import promptic_sdk
from openai import OpenAI

# Initialize tracing (auto-instruments installed LLM libraries)
promptic_sdk.init()

client = OpenAI()

# Tag traces with an AI Component name
with promptic_sdk.ai_component("customer-support-agent"):
    response = client.chat.completions.create(
        model="gpt-4.1-nano",
        messages=[{"role": "user", "content": "Hello!"}],
    )

3. Use the API client

from promptic_sdk import PrompticClient

with PrompticClient() as client:
    # List traces
    traces = client.list_traces(limit=10)

    # Get workspace info
    workspace = client.get_workspace()

    # Manage experiments
    experiment = client.create_experiment(
        ai_component_id="comp_...",
        target_model="gpt-4.1-nano",
        task_type="classification",
        initial_prompt="Classify the following text.",
    )

    # Deploy the best prompt
    client.deploy(component_id="comp_...", experiment_id="exp_...")

    # Fetch a deployed prompt at runtime
    prompt = client.get_deployed_prompt("comp_...")

Tracing

promptic_sdk.init() sets up OpenTelemetry to export spans to the Promptic platform.

Parameter        Description                                        Default
api_key          Promptic API key (falls back to PROMPTIC_API_KEY)
endpoint         Platform URL (falls back to PROMPTIC_ENDPOINT)     https://promptic.eu
auto_instrument  Auto-detect and instrument LLM client libraries    True
service_name     OpenTelemetry service.name resource attribute

Auto-detected instrumentors: OpenAI, Anthropic, Google Generative AI, LangChain, Cohere.

Using other OpenTelemetry instrumentors

Since Promptic uses standard OpenTelemetry under the hood, you can add any OTel-compatible instrumentor alongside the auto-detected ones. Just call promptic_sdk.init() first, then instrument manually:

import promptic_sdk
from opentelemetry.instrumentation.requests import RequestsInstrumentor
from opentelemetry.instrumentation.sqlalchemy import SQLAlchemyInstrumentor

promptic_sdk.init()

# Add any OpenTelemetry instrumentor — spans will be exported to Promptic
RequestsInstrumentor().instrument()
SQLAlchemyInstrumentor().instrument(engine=engine)

This works with any package from the opentelemetry-python-contrib ecosystem (HTTP clients, databases, web frameworks, etc.). All spans are exported to the Promptic platform as long as init() has been called.

AI Components

Use ai_component() to tag spans with a component name. The platform links traces to the matching AI Component in your workspace:

with promptic_sdk.ai_component("my-component"):
    # All LLM calls here are tagged
    ...

API client

Both a sync (PrompticClient) and async (AsyncPrompticClient) client are available. They share the same method signatures and return types.

from promptic_sdk import PrompticClient

with PrompticClient() as client:
    traces = client.list_traces(limit=10)

from promptic_sdk import AsyncPrompticClient

async with AsyncPrompticClient() as client:
    traces = await client.list_traces(limit=10)

Both clients provide typed methods for the full Promptic REST API:

Resource      Methods
Workspace     get_workspace
Traces        list_traces, get_trace, get_stats
Components    list_components, get_component, create_component, delete_component
Experiments   list_experiments, get_experiment, create_experiment, update_experiment, delete_experiment, start_experiment
Observations  list_observations, create_observations, update_observation, delete_observation
Evaluators    list_evaluators, create_evaluators, update_evaluator, delete_evaluator
Iterations    list_iterations, get_iteration, get_best_iteration
Deployments   get_deployment, deploy, undeploy, get_deployed_prompt

The client reads PROMPTIC_API_KEY and PROMPTIC_ENDPOINT from the environment, or accepts them as constructor arguments.

CLI

The promptic CLI mirrors the API client and supports both human-readable tables and --json output.

promptic [command] [subcommand] [options]

Commands

Command                          Description
promptic login                   Authenticate via browser (device flow)
promptic logout                  Clear saved credentials
promptic configure               Save API key and endpoint (CI/CD)
promptic workspace list          List accessible workspaces
promptic workspace select <id>   Select a workspace
promptic workspace show          Show workspace info
promptic traces list             List recent traces
promptic traces get <id>         Get a trace with spans
promptic traces stats            Show aggregated tracing stats
promptic components list         List AI components
promptic components create       Create a component
promptic components get <id>     Get component details
promptic components delete <id>  Delete a component
promptic experiments list        List experiments
promptic experiments create      Create an experiment (interactive)
promptic experiments get <id>    Get experiment details
promptic experiments start <id>  Start an experiment
promptic observations list       List observations for an experiment
promptic evaluators list         List evaluators for an experiment
promptic iterations list         List iterations for an experiment
promptic deployments show        Show deployment for a component
promptic deployments deploy      Deploy an experiment
promptic deployments undeploy    Remove a deployment

All list commands support --json for machine-readable output.
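The --json flag makes list output easy to post-process in scripts. As a sketch, a CI step might filter traces with Python's json module; note that the payload and field names below (id, status) are illustrative placeholders, not a documented schema:

```python
import json

# Stand-in for what `promptic traces list --json` might emit.
# The `id` and `status` field names are hypothetical, for illustration only.
sample = '[{"id": "tr_1", "status": "ok"}, {"id": "tr_2", "status": "error"}]'

traces = json.loads(sample)
failed = [t["id"] for t in traces if t["status"] == "error"]
print(failed)  # -> ['tr_2']
```

In a real pipeline the JSON would come from the CLI's stdout (e.g. via subprocess) rather than a literal string.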

Configuration

The SDK and CLI resolve configuration in this order:

  1. Explicit arguments (api_key=, endpoint=)
  2. Environment variables (PROMPTIC_API_KEY, PROMPTIC_ENDPOINT)
  3. Config file (~/.promptic/config.toml, written by promptic login or promptic configure)

Variable           Description                    Default
PROMPTIC_API_KEY   API key (for tracing & CI/CD)
PROMPTIC_ENDPOINT  Platform URL                   https://promptic.eu

Development

Requires Python 3.11+ and uv.

# Install dependencies
uv sync

# Run tests
uv run pytest

# Lint
uv run ruff check .
uv run ruff format .

License

MIT — see LICENSE for details.

Download files

Source Distribution

promptic_sdk-0.7.0.tar.gz (29.7 kB)

Built Distribution

promptic_sdk-0.7.0-py3-none-any.whl (41.8 kB)

File details

Details for the file promptic_sdk-0.7.0.tar.gz.

File metadata

  • Download URL: promptic_sdk-0.7.0.tar.gz
  • Upload date:
  • Size: 29.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for promptic_sdk-0.7.0.tar.gz
Algorithm    Hash digest
SHA256       4324dcf51bac5fa2e7442d2c880d4e93fc9ef39c905f06f942e9559676e00a36
MD5          b9943cca1da88bc5429e1e1fcf185ab3
BLAKE2b-256  4a1fae0466dbb133a3193c2d6f4cc3a383a71512c002e38dfb13866b8eda0c87
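To verify a downloaded archive against the published SHA256 digest, the standard library's hashlib is enough. The sketch below reads the file in chunks and compares hex digests; it assumes the sdist sits in the current directory:

```python
import hashlib
from pathlib import Path

def sha256_of(path: str) -> str:
    """Return the hex SHA256 digest of a file, read in 8 KiB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Published digest for promptic_sdk-0.7.0.tar.gz (from the table above):
expected = "4324dcf51bac5fa2e7442d2c880d4e93fc9ef39c905f06f942e9559676e00a36"
if Path("promptic_sdk-0.7.0.tar.gz").exists():
    assert sha256_of("promptic_sdk-0.7.0.tar.gz") == expected
```

Note that pip already verifies hashes when you pin them in a requirements file, so manual checking like this is mainly useful for audits.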

Provenance

The following attestation bundles were made for promptic_sdk-0.7.0.tar.gz:

Publisher: release.yml on prompticeu/promptic-python-sdk

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file promptic_sdk-0.7.0-py3-none-any.whl.

File metadata

  • Download URL: promptic_sdk-0.7.0-py3-none-any.whl
  • Upload date:
  • Size: 41.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for promptic_sdk-0.7.0-py3-none-any.whl
Algorithm    Hash digest
SHA256       71d471228b0d9ed2a00ad0a62787b2e38dff3bfb827ba73f555fdf1d32c1e020
MD5          95cdb27d2a6f9aef3267dc667a98d1da
BLAKE2b-256  8de89e77344efc88fd64b88daae75cc4ca684d0dde64c05ea8b0cc0f320e9664

Provenance

The following attestation bundles were made for promptic_sdk-0.7.0-py3-none-any.whl:

Publisher: release.yml on prompticeu/promptic-python-sdk

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
