
# trajectory-sdk

Import agent traces from LangSmith (and other providers) into a standardized Trajectory format.

## Install

```shell
pip install trajectory-sdk
```

## Quick Start

### Individual conversation import

```python
import trajectory_sdk as tj

tj.init(provider="langsmith", api_key="lsv2_pt_...", project_id="...")

# List available conversations
conversations = tj.list_conversations()

# Import and save all conversations
trajectories = tj.import_conversations(conversations)
tj.save(trajectories, "./exports")
```

### Bulk export (E2E)

Export all conversations from a LangSmith project, parse them into Trajectories, and upload to GCS + BigQuery in three calls:

```python
import trajectory_sdk as tj

tj.init(
    provider="langsmith",
    api_key="lsv2_pt_...",
    project_id="...",
    workspace_id="...",
    destination_id="...",
)
trajectories = tj.import_conversations(bulk=True)
tj.upload(trajectories, dataset="my_dataset")
```

This automatically discovers all trace IDs, triggers a LangSmith bulk export, downloads the parquet from GCS, and parses it into Trajectory objects.
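The last of those steps is the interesting one: the bulk export is a flat table of runs that must be grouped back into per-conversation sequences. A minimal sketch of that grouping, assuming a hypothetical row schema with `conversation_id` and `timestamp` keys (the SDK's actual export schema is not shown here):

```python
from itertools import groupby
from operator import itemgetter

def group_runs(rows):
    """Group flat export rows into per-conversation run lists.

    Assumes each row is a dict with a "conversation_id" key and a
    sortable "timestamp" key -- a stand-in for the real export schema.
    """
    ordered = sorted(rows, key=itemgetter("conversation_id", "timestamp"))
    return {
        cid: list(runs)
        for cid, runs in groupby(ordered, key=itemgetter("conversation_id"))
    }

rows = [
    {"conversation_id": "cc_b", "timestamp": 2, "output": "..."},
    {"conversation_id": "cc_a", "timestamp": 1, "output": "..."},
    {"conversation_id": "cc_b", "timestamp": 1, "output": "..."},
]
grouped = group_runs(rows)
print(sorted(grouped))        # ['cc_a', 'cc_b']
print(len(grouped["cc_b"]))   # 2
```

Each grouped run list would then be turned into one Trajectory object, mirroring the one-Trajectory-per-conversation contract of `import_conversations`.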

## API

### `tj.init(*, provider, api_key, project_id, workspace_id, destination_id, storage_dir, debug)`

Configure the SDK. Call once before other functions.

```python
tj.init(
    provider="langsmith",        # trace provider (default: "langsmith")
    api_key="lsv2_pt_...",       # provider API key (or set LANGSMITH_API_KEY env var)
    project_id="...",            # provider project/session ID
    workspace_id="...",          # LangSmith workspace/tenant ID (required for bulk export)
    destination_id="...",        # bulk export destination ID (required for bulk export)
    storage_dir="~/.trajectory", # local staging directory (default)
    debug=False,                 # enable debug logging (default: False)
)
```

### `tj.list_conversations(*, limit) -> list[ConversationSummary]`

List available conversations from the configured provider.

```python
conversations = tj.list_conversations(limit=100)
for c in conversations:
    print(c.conversation_id, c.num_turns)
```
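Because `list_conversations` returns lightweight summaries rather than full payloads, it is cheap to filter before importing. A sketch, using a stand-in for `ConversationSummary` that carries only the two attributes shown above:

```python
from dataclasses import dataclass

# Stand-in for the SDK's ConversationSummary; only the two attributes
# used above (conversation_id, num_turns) are assumed to exist.
@dataclass
class ConversationSummary:
    conversation_id: str
    num_turns: int

def multi_turn_only(conversations, min_turns=2):
    """Keep only conversations with at least `min_turns` turns."""
    return [c for c in conversations if c.num_turns >= min_turns]

conversations = [
    ConversationSummary("cc_abc123", 1),
    ConversationSummary("cc_def456", 5),
]
selected = multi_turn_only(conversations)
print([c.conversation_id for c in selected])  # ['cc_def456']
```

The filtered list can be passed straight to `tj.import_conversations`, which accepts `ConversationSummary` objects.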

### `tj.import_conversations(conversations=None, *, bulk, source, stage, redactor) -> list[Trajectory]`

Import conversations and return one Trajectory per conversation. Accepts a list of conversation ID strings or ConversationSummary objects; with `bulk=True`, it imports from a bulk export instead.

```python
# By ID
trajectories = tj.import_conversations(["cc_abc123", "cc_def456"])

# By ConversationSummary (from list_conversations)
conversations = tj.list_conversations()
trajectories = tj.import_conversations(conversations)

# Bulk export from a local parquet file
trajectories = tj.import_conversations(bulk=True, source="export.parquet")

# Live bulk export (triggers export, downloads, parses)
trajectories = tj.import_conversations(bulk=True)

# With optional PII redaction
trajectories = tj.import_conversations(["cc_abc123"], redactor=my_redactor)

# Without local staging
trajectories = tj.import_conversations(["cc_abc123"], stage=False)
```
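The `redactor` interface is not documented here, so the sketch below assumes it is a plain callable mapping text to redacted text. A minimal example that masks email addresses before trajectories leave the machine:

```python
import re

# Hypothetical redactor: assumes the SDK accepts any str -> str callable.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def my_redactor(text: str) -> str:
    """Replace email addresses with a placeholder token."""
    return EMAIL.sub("[EMAIL]", text)

print(my_redactor("contact alice@example.com"))  # contact [EMAIL]
```

A real redactor would typically chain several such patterns (emails, phone numbers, API keys) before anything is staged or uploaded.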

### `tj.upload(trajectories, dataset)`

Upload trajectories to GCS and BigQuery.

```python
tj.upload(trajectories, dataset="my_dataset")
```

### `tj.save(trajectories, output_dir)`

Save trajectories to local JSON files. Each trajectory is written to `{output_dir}/{conversation_id}.json`.

```python
# Save all
tj.save(trajectories, "./exports")

# Save a single trajectory
tj.save(trajectories[0], "./exports")
```
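Because `save` writes one `{conversation_id}.json` file per trajectory, exports can be reloaded with nothing but the standard library. A sketch, using a stand-in payload since the Trajectory JSON schema is not shown here:

```python
import json
import tempfile
from pathlib import Path

def load_exports(output_dir):
    """Load every {conversation_id}.json file written by tj.save."""
    return {
        path.stem: json.loads(path.read_text())
        for path in Path(output_dir).glob("*.json")
    }

# Round-trip demo with a stand-in payload (the real Trajectory schema
# may differ).
with tempfile.TemporaryDirectory() as d:
    Path(d, "cc_abc123.json").write_text(json.dumps({"steps": []}))
    exports = load_exports(d)
    print(list(exports))  # ['cc_abc123']
```

This is handy for inspecting or post-processing exports without re-fetching anything from the provider.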


## Full Example

```python
import trajectory_sdk as tj

tj.init(
    provider="langsmith",
    api_key="lsv2_pt_...",
    project_id="...",
    workspace_id="...",
    destination_id="...",
)

# Bulk export everything and upload
trajectories = tj.import_conversations(bulk=True)
tj.upload(trajectories, dataset="production_traces")

print(f"Exported {len(trajectories)} trajectories")
for t in trajectories:
    print(f"  {t.task.conversation_id}: {t.task.num_turns} turns, {len(t.steps)} steps")
```
