Basalt SDK for Python

Project description

Basalt SDK

Basalt is a tool for managing AI prompts and their release workflows, and for monitoring AI applications. This SDK is the official Python package for working with your Basalt prompts and monitoring your AI applications.

Installation

Install the Basalt SDK via pip:

pip install basalt-sdk

Optional Instrumentation Dependencies

The SDK includes optional OpenTelemetry instrumentation packages for various LLM providers, vector databases, and frameworks. You can install only the instrumentations you need:

LLM Provider Instrumentations

# Individual providers (7 available)
pip install basalt-sdk[openai]
pip install basalt-sdk[anthropic]
pip install basalt-sdk[google-generativeai]  # Google Gemini
pip install basalt-sdk[bedrock]
pip install basalt-sdk[vertex-ai]
pip install basalt-sdk[mistralai]
pip install basalt-sdk[ollama]

# Multiple providers
pip install basalt-sdk[openai,anthropic]

# All LLM providers
pip install basalt-sdk[llm-all]

Note: Instrumentation for the new Google GenAI SDK (google-genai) is not yet available on PyPI. Use google-generativeai for the existing Gemini SDK.

Vector Database Instrumentations

# Individual vector databases
pip install basalt-sdk[chromadb]
pip install basalt-sdk[pinecone]
pip install basalt-sdk[qdrant]

# All vector databases
pip install basalt-sdk[vector-all]

Framework Instrumentations

# Individual frameworks
pip install basalt-sdk[langchain]
pip install basalt-sdk[llamaindex]

# All frameworks
pip install basalt-sdk[framework-all]

Install Everything

# Install all available instrumentations
pip install basalt-sdk[all]

Note: These instrumentation packages are automatically activated when you enable telemetry in the Basalt SDK. They provide automatic tracing for your LLM provider calls, vector database operations, and framework usage.

Usage

Importing and Initializing the SDK

To get started, import the Basalt class and initialize it with your API key. Telemetry is enabled by default via OpenTelemetry, but can be configured or disabled:

from basalt import Basalt, TelemetryConfig

# Basic initialization with API key
basalt = Basalt(api_key="my-dev-api-key")

# Disable all telemetry
basalt = Basalt(api_key="my-dev-api-key", enable_telemetry=False)

# Advanced telemetry configuration
telemetry = TelemetryConfig(
    service_name="my-app",
    environment="staging",
    enabled_providers=["openai", "anthropic"],  # Optional: selective instrumentation
)
basalt = Basalt(api_key="my-dev-api-key", telemetry_config=telemetry)

# Or use client-level parameters (simpler)
basalt = Basalt(
    api_key="my-dev-api-key",
    enabled_instruments=["openai", "anthropic"]
)

# Configure global metadata when constructing the client
basalt = Basalt(
    api_key="my-dev-api-key",
    observability_metadata={"env": "staging"},
)

# Don't forget to shutdown the client when done
# This flushes any pending telemetry data
basalt.shutdown()

See examples/async_observe_example.py for a more complete walkthrough covering decorators, context managers, and observability patterns.

Telemetry & Observability

The SDK includes comprehensive OpenTelemetry integration for observability:

  • TelemetryConfig centralizes all observability options including:
    • Service name/version and deployment environment
    • Custom exporter configuration
    • Lightweight tracing wrappers for Basalt API calls (bring your own HTTP instrumentation if you need transport-level spans)
    • LLM provider instrumentation with fine-grained control over which providers to instrument
  • Quick disable via enable_telemetry=False bypasses all instrumentation without touching application code.
  • Built-in decorators and context managers simplify manual span creation:
    • Root Spans: @start_observe - Creates trace entry point with identity and experiment tracking
    • Nested Spans: @observe with kind parameter - For generation, retrieval, tool, event, function spans
from basalt.observability import observe, start_observe


# Root span with identity tracking
@start_observe(
    feature_slug="dataset-processing",
    name="process_workflow",
    identity={
        "organization": {"id": "123", "name": "ACME"},
        "user": {"id": "456", "name": "John Doe"}
    },
    metadata={"environment": "production"}
)
def process_dataset(slug: str, user_id: str) -> str:
    # Identity automatically propagates to child spans
    observe.set_input({"slug": slug})
    result = f"processed:{slug}"
    observe.set_output({"result": result})
    return result


# Nested LLM span
@observe(kind="generation", name="llm.generate")
def generate_summary(model: str, prompt: str) -> dict:
    # Your LLM call here
    return {"choices": [{"message": {"content": "Summary"}}]}

Supported environment variables:

  • BASALT_API_KEY: API key for authentication (can also be passed to the Basalt() constructor).
  • BASALT_API_URL: Overrides the Basalt API base URL (default: https://api.getbasalt.ai, or http://localhost:3001 in development).
  • BASALT_TELEMETRY_ENABLED: Master switch to enable/disable telemetry (default: true).
  • BASALT_SERVICE_NAME: Overrides the OTEL service.name.
  • BASALT_ENVIRONMENT: Sets deployment.environment.
  • BASALT_OTEL_EXPORTER_OTLP_ENDPOINT: Custom OTLP endpoint for traces; overrides the default Basalt OTEL collector endpoint.
  • BASALT_BUILD: SDK build mode; set to development for local OTEL collector testing (default: production).
  • BASALT_SAMPLE_RATE: Global default sampling rate for trace-level evaluation (0.0-1.0, default: 0.0).
  • TRACELOOP_TRACE_CONTENT: Controls whether prompts/completions are logged. Set automatically by TelemetryConfig.trace_content; you typically don't need to set it manually.
  • OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT: Controls message content capture for Google GenAI instrumentation. Set automatically by TelemetryConfig.trace_content.
  • BASALT_ENABLED_INSTRUMENTS: Comma-separated list of instruments to enable (e.g., openai,anthropic).
  • BASALT_DISABLED_INSTRUMENTS: Comma-separated list of instruments to disable (e.g., langchain,llamaindex).
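
For example, a deployment might configure the SDK entirely through the environment; the values below are illustrative, not recommended defaults:

```shell
# Illustrative environment configuration for the Basalt SDK
export BASALT_API_KEY="my-dev-api-key"
export BASALT_SERVICE_NAME="my-app"
export BASALT_ENVIRONMENT="staging"
export BASALT_ENABLED_INSTRUMENTS="openai,anthropic"
export BASALT_SAMPLE_RATE="0.1"
```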

Default OTLP Exporter:

By default, the SDK automatically sends traces to Basalt's OTEL collector:

  • Production: https://grpc.otel.getbasalt.ai (gRPC)
  • Development: http://127.0.0.1:4317 (gRPC, when BASALT_BUILD=development)

You can override this by:

  1. Providing a custom exporter in TelemetryConfig
  2. Setting the BASALT_OTEL_EXPORTER_OTLP_ENDPOINT environment variable
  3. Disabling telemetry with enable_telemetry=False

Prompt SDK

The Prompt SDK allows you to interact with your Basalt prompts using an exception-based API for clear error handling.

For a complete working example, check out:

Available Methods

Prompts

Your Basalt instance exposes a prompts property for interacting with your Basalt prompts:

  • List Prompts

    Retrieve all available prompts.

    Example Usage:

    from basalt import Basalt
    from basalt.types.exceptions import BasaltAPIError, UnauthorizedError
    
    basalt = Basalt(api_key="your-api-key")
    
    try:
        prompts = basalt.prompts.list_sync()
        for prompt in prompts:
            print(f"{prompt.slug} - {prompt.name}")
    except UnauthorizedError:
        print("Invalid API key")
    except BasaltAPIError as e:
        print(f"API error: {e}")
    
  • Get a Prompt

    Retrieve a specific prompt by slug, with optional tag and version filters. Without a tag or version, the production version of your prompt is selected by default.

    Example Usage:

    from basalt import Basalt
    from basalt.types.exceptions import NotFoundError, BasaltAPIError
    
    basalt = Basalt(api_key="your-api-key")
    
    try:
        # Get the production version
        prompt = basalt.prompts.get_sync('prompt-slug')
        print(prompt.text)
    
        # With optional tag or version parameters
        prompt = basalt.prompts.get_sync(slug='prompt-slug', tag='latest')
        prompt = basalt.prompts.get_sync(slug='prompt-slug', version='1.0.0')
    
        # If your prompt has variables, pass them when fetching
        prompt = basalt.prompts.get_sync(
            slug='prompt-slug',
            variables={'name': 'John Doe', 'role': 'engineer'}
        )
    
        # Use the prompt with your AI provider of choice
        # Example: OpenAI
        import openai
        client = openai.OpenAI()
        
        response = client.chat.completions.create(
            model='gpt-4',
            messages=[{'role': 'user', 'content': prompt.text}]
        )
        print(response.choices[0].message.content)
    
    except NotFoundError:
        print('Prompt not found')
    except BasaltAPIError as e:
        print(f'API error: {e}')
    finally:
        basalt.shutdown()
    
  • Context Managers for Observability (Recommended)

    Use prompts as context managers to automatically nest LLM calls under a prompt span for better trace organization and observability:

    Sync Example:

    from basalt import Basalt
    import openai
    
    basalt = Basalt(api_key="your-api-key")
    client = openai.OpenAI()
    
    # Use context manager for automatic span nesting
    with basalt.prompts.get_sync('summary-prompt', tag='production') as prompt:
        response = client.chat.completions.create(
            model=prompt.model.model,
            messages=[{'role': 'user', 'content': prompt.text}]
        )
        print(response.choices[0].message.content)
    
    basalt.shutdown()
    

    Async Example:

    import asyncio
    from basalt import Basalt
    import openai
    
    async def generate():
        basalt = Basalt(api_key="your-api-key")
        client = openai.AsyncOpenAI()
        
        async with await basalt.prompts.get('summary-prompt', tag='production') as prompt:
            response = await client.chat.completions.create(
                model=prompt.model.model,
                messages=[{'role': 'user', 'content': prompt.text}]
            )
            print(response.choices[0].message.content)
        
        basalt.shutdown()
    
    asyncio.run(generate())
    

See the Basalt documentation for complete details.

  • Describe a Prompt

    Get metadata about a prompt including available versions and tags.

    Example Usage:

    from basalt.types.exceptions import NotFoundError
    
    try:
        description = basalt.prompts.describe_sync('prompt-slug')
        print(f"Available versions: {description.available_versions}")
        print(f"Available tags: {description.available_tags}")
    except NotFoundError:
        print('Prompt not found')
    
  • Async Operations

    All methods have async variants (no suffix) and sync variants with _sync suffix:

    import asyncio
    
    async def fetch_prompts():
        basalt = Basalt(api_key="your-api-key")
        
        try:
            # List prompts asynchronously
            prompts = await basalt.prompts.list()
            
            # Get a specific prompt asynchronously
            prompt = await basalt.prompts.get('prompt-slug')
            
            # Describe a prompt asynchronously
            description = await basalt.prompts.describe('prompt-slug')
            
        finally:
            basalt.shutdown()
    
    asyncio.run(fetch_prompts())
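
    This async-first design with _sync wrappers follows a common pattern; a simplified, hypothetical illustration (not the SDK's actual implementation):

    ```python
    import asyncio


    class PromptClient:
        """Toy client showing the async-first / _sync-wrapper convention."""

        async def get(self, slug: str) -> dict:
            # Stand-in for a real async HTTP call
            await asyncio.sleep(0)
            return {"slug": slug, "text": f"prompt body for {slug}"}

        def get_sync(self, slug: str) -> dict:
            # Sync variant drives the coroutine to completion
            return asyncio.run(self.get(slug))


    client = PromptClient()
    print(client.get_sync("prompt-slug")["slug"])
    ```

    Prefer the async variants inside an existing event loop; the _sync variants are for plain scripts.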
    

Dataset SDK

The Dataset SDK allows you to interact with your Basalt datasets using an exception-based API for clear error handling.

For a complete working example, check out:

Available Methods

Datasets

Your Basalt instance exposes a datasets property for interacting with your Basalt datasets:

  • List Datasets

    Retrieve all available datasets.

    Example Usage:

    from basalt import Basalt
    from basalt.types.exceptions import BasaltAPIError
    
    basalt = Basalt(api_key="your-api-key")
    
    try:
        datasets = basalt.datasets.list_sync()
        for dataset in datasets:
            print(f"{dataset.slug} - {dataset.name}")
            print(f"Columns: {dataset.columns}")
    except BasaltAPIError as e:
        print(f"API error: {e}")
    
  • Get a Dataset

    Retrieve a specific dataset by slug.

    Example Usage:

    from basalt.types.exceptions import NotFoundError
    
    try:
        dataset = basalt.datasets.get_sync('dataset-slug')
        print(f"Dataset: {dataset.name}")
        print(f"Rows: {len(dataset.rows)}")
        
        # Access dataset rows
        for row in dataset.rows:
            print(row)
            
    except NotFoundError:
        print('Dataset not found')
    
  • Async Operations

    All methods have async variants (no suffix) and sync variants with _sync suffix:

    import asyncio
    
    async def fetch_datasets():
        basalt = Basalt(api_key="your-api-key")
        
        try:
            # List datasets asynchronously
            datasets = await basalt.datasets.list()
            
            # Get a specific dataset asynchronously
            dataset = await basalt.datasets.get('dataset-slug')
            
        finally:
            basalt.shutdown()
    
    asyncio.run(fetch_datasets())
    

Error Handling

The SDK uses exception-based error handling for clear, Pythonic control flow:

from basalt import Basalt
from basalt.types.exceptions import (
    BasaltAPIError,      # Base exception for all API errors
    BadRequestError,     # Invalid request (400)
    UnauthorizedError,   # Authentication failed (401)
    ForbiddenError,      # Permission denied (403)
    NotFoundError,       # Resource not found (404)
    NetworkError,        # Network/connection errors
    FileUploadError,     # File upload to S3 failed
    FileValidationError, # File validation failed before upload
)

basalt = Basalt(api_key="your-api-key")

try:
    prompt = basalt.prompts.get_sync('my-prompt')
    # Use the prompt
except NotFoundError:
    print("Prompt doesn't exist")
except UnauthorizedError:
    print("Check your API key")
except ForbiddenError:
    print("You don't have permission to access this resource")
except NetworkError:
    print("Network connection failed")
except BasaltAPIError as e:
    print(f"Other API error: {e}")
finally:
    basalt.shutdown()
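
Transient failures such as NetworkError are often worth retrying. A generic backoff sketch (the retry policy and delays here are illustrative, not part of the SDK):

```python
import time
from typing import Callable, Tuple, Type, TypeVar

T = TypeVar("T")


def with_retries(
    fn: Callable[[], T],
    retriable: Tuple[Type[Exception], ...],
    attempts: int = 3,
    base_delay: float = 0.5,
) -> T:
    """Call fn, retrying with exponential backoff on the given exceptions."""
    for attempt in range(attempts):
        try:
            return fn()
        except retriable:
            if attempt == attempts - 1:
                raise  # Out of attempts: re-raise the last error
            time.sleep(base_delay * (2 ** attempt))
    raise RuntimeError("unreachable")
```

With the SDK you might call with_retries(lambda: basalt.prompts.get_sync('my-prompt'), retriable=(NetworkError,)); non-transient errors such as NotFoundError propagate immediately.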

License

MIT

Project details


Download files

Download the file for your platform.

Source Distribution

basalt_sdk-1.1.9.tar.gz (117.8 kB)

Uploaded Source

Built Distribution


basalt_sdk-1.1.9-py3-none-any.whl (88.1 kB)

Uploaded Python 3

File details

Details for the file basalt_sdk-1.1.9.tar.gz.

File metadata

  • Download URL: basalt_sdk-1.1.9.tar.gz
  • Size: 117.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: Hatch/1.16.3 cpython/3.14.2 HTTPX/0.28.1

File hashes

Hashes for basalt_sdk-1.1.9.tar.gz:

  • SHA256: 5be964bd0276df5a6d6b0df538e2eb5da8ade86543954848e2b16daeae2d1f90
  • MD5: 56933a9b47f8f267f2e02d21fa9bd7bc
  • BLAKE2b-256: 95c4c0fc1ebc00fca4cb75b6f7dcd4b6d5015591783d977cda6b145512a8fd2b

See more details on using hashes here.

File details

Details for the file basalt_sdk-1.1.9-py3-none-any.whl.

File metadata

  • Download URL: basalt_sdk-1.1.9-py3-none-any.whl
  • Size: 88.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: Hatch/1.16.3 cpython/3.14.2 HTTPX/0.28.1

File hashes

Hashes for basalt_sdk-1.1.9-py3-none-any.whl:

  • SHA256: b9b98fee11c540d837d6a9dcd1b791f5fa4c52223800bfde4ab83b088ccbff22
  • MD5: d8206d6f83ee672d375ac95fca05dad1
  • BLAKE2b-256: 0889c24d115d2752cf60b490531662a3a0aafca8b4313513fb6afb93cafc532d

See more details on using hashes here.
