
# trajectory-sdk

Import agent traces from LangSmith (and other providers) into a standardized Trajectory format.

## Install

```shell
pip install trajectory-sdk
```

## Quick Start

### Individual conversation import

```python
import trajectory_sdk as tj

tj.init(provider="langsmith", api_key="lsv2_pt_...", project_id="...")

# List available conversations
conversations = tj.list_conversations()

# Import and save all conversations
trajectories = tj.import_conversations(conversations)
tj.save(trajectories, "./exports")
```

### Bulk export (E2E)

Export all conversations from a LangSmith project, parse them into Trajectories, and upload to GCS + BigQuery in three calls:

```python
import trajectory_sdk as tj

tj.init(
    provider="langsmith",
    api_key="lsv2_pt_...",
    project_id="...",
    workspace_id="...",
    destination_id="...",
)
trajectories = tj.import_conversations(bulk=True)
tj.upload(trajectories, dataset="my_dataset")
```

This automatically discovers all trace IDs, triggers a LangSmith bulk export, downloads the resulting Parquet file from GCS, and parses it into Trajectory objects.

## API

### tj.init(*, provider, api_key, project_id, workspace_id, destination_id, storage_dir, debug)

Configure the SDK. Call it once before any other function.

```python
tj.init(
    provider="langsmith",        # trace provider (default: "langsmith")
    api_key="lsv2_pt_...",       # provider API key (or set LANGSMITH_API_KEY env var)
    project_id="...",            # provider project/session ID
    workspace_id="...",          # LangSmith workspace/tenant ID (required for bulk export)
    destination_id="...",        # bulk export destination ID (required for bulk export)
    storage_dir="~/.trajectory", # local staging directory (default)
    debug=False,                 # enable debug logging (default: False)
)
```
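Per the comment above, `api_key` falls back to the `LANGSMITH_API_KEY` environment variable. A minimal sketch of that fallback logic (the helper name `resolve_api_key` is illustrative, not part of the SDK), useful when wiring credentials through CI secrets:

```python
import os

def resolve_api_key(explicit=None):
    """Mimic the documented fallback: an explicit key wins, else LANGSMITH_API_KEY."""
    key = explicit or os.environ.get("LANGSMITH_API_KEY")
    if not key:
        raise RuntimeError("No API key: pass api_key= or set LANGSMITH_API_KEY")
    return key
```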

### tj.list_conversations(*, limit) -> list[ConversationSummary]

List available conversations from the configured provider.

```python
conversations = tj.list_conversations(limit=100)
for c in conversations:
    print(c.conversation_id, c.num_turns)

### tj.import_conversations(conversations, *, bulk, source, stage, redactor) -> list[Trajectory]

Import conversations and return one Trajectory per conversation. Accepts a list of conversation ID strings or ConversationSummary objects; with `bulk=True` it exports the whole project instead.

```python
# By ID
trajectories = tj.import_conversations(["cc_abc123", "cc_def456"])

# By ConversationSummary (from list_conversations)
conversations = tj.list_conversations()
trajectories = tj.import_conversations(conversations)

# Bulk export from a local parquet file
trajectories = tj.import_conversations(bulk=True, source="export.parquet")

# Live bulk export (triggers export, downloads, parses)
trajectories = tj.import_conversations(bulk=True)

# With optional PII redaction
trajectories = tj.import_conversations(["cc_abc123"], redactor=my_redactor)

# Without local staging
trajectories = tj.import_conversations(["cc_abc123"], stage=False)
```
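The exact `redactor` interface isn't specified above; assuming it is a plain callable applied to message text, `my_redactor` could be as simple as a regex pass (a sketch, not the SDK's own redaction):

```python
import re

# Matches common e-mail address shapes; tighten as needed for your data.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def my_redactor(text):
    """Replace e-mail addresses with a placeholder before import."""
    return EMAIL_RE.sub("[REDACTED_EMAIL]", text)
```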

### tj.upload(trajectories, dataset)

Upload trajectories to GCS and BigQuery.

```python
tj.upload(trajectories, dataset="my_dataset")
```

### tj.save(trajectories, output_dir)

Save trajectories to local JSON files. Each trajectory is written to `{output_dir}/{conversation_id}.json`.

```python
# Save all
tj.save(trajectories, "./exports")

# Save a single trajectory
tj.save(trajectories[0], "./exports")
```
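Because exports follow the `{output_dir}/{conversation_id}.json` layout above, reading them back needs only the standard library. A sketch (the helper `load_exports` is illustrative; the JSON schema itself isn't specified here):

```python
import json
from pathlib import Path

def load_exports(output_dir):
    """Load saved trajectories keyed by conversation_id (taken from the filename)."""
    exports = {}
    for path in Path(output_dir).glob("*.json"):
        exports[path.stem] = json.loads(path.read_text())
    return exports
```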


## Full Example

```python
import trajectory_sdk as tj

tj.init(
    provider="langsmith",
    api_key="lsv2_pt_...",
    project_id="...",
    workspace_id="...",
    destination_id="...",
)

# Bulk export everything and upload
trajectories = tj.import_conversations(bulk=True)
tj.upload(trajectories, dataset="production_traces")

print(f"Exported {len(trajectories)} trajectories")
for t in trajectories:
    print(f"  {t.task.conversation_id}: {t.task.num_turns} turns, {len(t.steps)} steps")
```
