# trajectory-sdk

Import agent traces from LangSmith (and other providers) into a standardized Trajectory format.

## Install

```bash
pip install trajectory-sdk
```

## Quick Start

### Individual conversation import

```python
import trajectory_sdk as tj

tj.init(provider="langsmith", api_key="lsv2_pt_...", project_id="...")

# List available conversations
conversations = tj.list_conversations()

# Import and save all conversations
trajectories = tj.import_conversations(conversations)
tj.save(trajectories, "./exports")
```

### Bulk export (E2E)

Export all conversations from a LangSmith project, parse them into Trajectories, and upload to GCS + BigQuery with three calls:

```python
import trajectory_sdk as tj

tj.init(
    provider="langsmith",
    api_key="lsv2_pt_...",
    project_id="...",
    workspace_id="...",
    destination_id="...",
)
trajectories = tj.import_conversations(bulk=True)
tj.upload(trajectories, dataset="my_dataset")
```

This automatically discovers all trace IDs, triggers a LangSmith bulk export, downloads the parquet from GCS, and parses it into Trajectory objects.
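The parquet schema isn't documented on this page, but conceptually the parse step turns flat run rows into one object per conversation. A minimal sketch of that grouping, using a hypothetical row layout:

```python
from collections import defaultdict

# Hypothetical rows as they might appear in a bulk-export parquet;
# the real LangSmith schema has many more columns.
rows = [
    {"conversation_id": "cc_abc123", "run_id": "r1", "role": "user"},
    {"conversation_id": "cc_abc123", "run_id": "r2", "role": "assistant"},
    {"conversation_id": "cc_def456", "run_id": "r3", "role": "user"},
]

# Group flat rows into one bucket per conversation — roughly the first
# thing parsing into Trajectory objects has to do.
by_conversation: dict[str, list[dict]] = defaultdict(list)
for row in rows:
    by_conversation[row["conversation_id"]].append(row)
```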

## API

### tj.init(*, provider, api_key, project_id, workspace_id, destination_id, storage_dir, debug)

Configure the SDK. Call once before any other function.

```python
tj.init(
    provider="langsmith",        # trace provider (default: "langsmith")
    api_key="lsv2_pt_...",       # provider API key (or set LANGSMITH_API_KEY env var)
    project_id="...",            # provider project/session ID
    workspace_id="...",          # LangSmith workspace/tenant ID (required for bulk export)
    destination_id="...",        # bulk export destination ID (required for bulk export)
    storage_dir="~/.trajectory", # local staging directory (default)
    debug=False,                 # enable debug logging (default: False)
)
```

### tj.list_conversations(*, limit) -> list[ConversationSummary]

List available conversations from the configured provider.

```python
conversations = tj.list_conversations(limit=100)
for c in conversations:
    print(c.conversation_id, c.num_turns)
```

### tj.import_conversations(conversations=None, *, bulk, source, stage, redactor) -> list[Trajectory]

Import conversations and return one Trajectory per conversation. Accepts a list of conversation ID strings or ConversationSummary objects; pass bulk=True to run a bulk export instead.

```python
# By ID
trajectories = tj.import_conversations(["cc_abc123", "cc_def456"])

# By ConversationSummary (from list_conversations)
conversations = tj.list_conversations()
trajectories = tj.import_conversations(conversations)

# Bulk export from a local parquet file
trajectories = tj.import_conversations(bulk=True, source="export.parquet")

# Live bulk export (triggers export, downloads, parses)
trajectories = tj.import_conversations(bulk=True)

# With optional PII redaction
trajectories = tj.import_conversations(["cc_abc123"], redactor=my_redactor)

# Without local staging
trajectories = tj.import_conversations(["cc_abc123"], stage=False)
```
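The redactor interface isn't specified on this page; assuming it is a plain callable that maps message text to redacted text, a minimal email redactor might look like this (`my_redactor` matches the name used above):

```python
import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def my_redactor(text: str) -> str:
    # Replace email addresses with a placeholder before the
    # conversation is staged or uploaded.
    return EMAIL_RE.sub("[REDACTED_EMAIL]", text)
```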

### tj.upload(trajectories, dataset)

Upload trajectories to GCS and BigQuery.

```python
tj.upload(trajectories, dataset="my_dataset")
```

### tj.save(trajectories, output_dir)

Save trajectories to local JSON files. Each trajectory is written to `{output_dir}/{conversation_id}.json`.

```python
# Save all
tj.save(trajectories, "./exports")

# Save a single trajectory
tj.save(trajectories[0], "./exports")
```
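Since each file is plain JSON named after its conversation, saved trajectories can be re-read with the standard library. A sketch using an illustrative payload (the real Trajectory fields come from the SDK, not this dict):

```python
import json
import tempfile
from pathlib import Path

# Illustrative payload; tj.save writes the real Trajectory fields.
trajectory = {"conversation_id": "cc_abc123", "steps": []}

# Mirror the documented naming: {output_dir}/{conversation_id}.json
output_dir = Path(tempfile.mkdtemp())
path = output_dir / f"{trajectory['conversation_id']}.json"
path.write_text(json.dumps(trajectory))

loaded = json.loads(path.read_text())
```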


## Full Example

```python
import trajectory_sdk as tj

tj.init(
    provider="langsmith",
    api_key="lsv2_pt_...",
    project_id="...",
    workspace_id="...",
    destination_id="...",
)

# Bulk export everything and upload
trajectories = tj.import_conversations(bulk=True)
tj.upload(trajectories, dataset="production_traces")

print(f"Exported {len(trajectories)} trajectories")
for t in trajectories:
    print(f"  {t.task.conversation_id}: {t.task.num_turns} turns, {len(t.steps)} steps")
```
