
Tensorlake SDK for Document Ingestion API and Serverless Applications

Project description


Build agents with sandboxes and serverless orchestration runtime


Tensorlake is a compute infrastructure platform for building agentic applications with sandboxes.

The Sandbox API creates MicroVM sandboxes that you can use to run agents, or as isolated environments for running tools or LLM-generated code.

In addition to stateful VMs, you can add long-running orchestration capabilities to agents using a serverless function runtime with fan-out support.

Sandboxes

Tensorlake Sandboxes are stateful Firecracker MicroVMs built as instant execution environments for AI agents: spin up millions of VMs with near-SSD filesystem performance.

Key capabilities

  • Fastest Filesystem I/O — Block-based storage achieving near-SSD speeds inside virtual machines. In SQLite benchmarks (2 vCPUs, 4 GB RAM), Tensorlake completes in 2.45s vs Vercel 3.00s (1.2×), E2B 3.92s (1.6×), Modal 4.66s (1.9×), and Daytona 5.51s (2.2×).
  • Fast startup — Sandboxes created in under a second via Lattice, a dynamic cluster scheduler.
  • Snapshots & cloning — Snapshot at any point to create durable memory and filesystem checkpoints; clone running sandboxes instantaneously across machines.
  • Auto suspend/resume — Sandboxes suspend when idle and resume in under a second without losing any memory or filesystem state.
  • Live migration — Sandboxes automatically move between machines during updates with only a brief pause of a few seconds.
  • Scale — Supports up to 5 million sandboxes in a single project.

Installation

pip install tensorlake

Setup

Sign up at cloud.tensorlake.ai and get your API key.

export TENSORLAKE_API_KEY="your-api-key"
tensorlake login

Create Your First Sandbox (CLI)

Create a sandbox, run a command, and clean up:

# Create a sandbox
tensorlake sbx create --image python:3.11-slim

# Run a command inside it
tensorlake sbx exec <sandbox-id> -- python -c "print('Hello from the sandbox!')"

# Copy a file into the sandbox
tensorlake sbx cp ./my_script.py <sandbox-id>:/tmp/my_script.py

# Open an interactive terminal
tensorlake sbx ssh <sandbox-id>

# Terminate when done
tensorlake sbx terminate <sandbox-id>
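The CLI workflow above can also be scripted. Below is a minimal sketch using Python's subprocess module; the subcommands mirror the examples above, while the helper names (sbx_cmd, run_in_sandbox) are illustrative, not part of the SDK:

```python
import subprocess

def sbx_cmd(*args: str) -> list[str]:
    # Build a `tensorlake sbx ...` invocation from the documented subcommands
    return ["tensorlake", "sbx", *args]

def run_in_sandbox(sandbox_id: str, code: str) -> str:
    # Equivalent to: tensorlake sbx exec <sandbox-id> -- python -c "<code>"
    result = subprocess.run(
        sbx_cmd("exec", sandbox_id, "--", "python", "-c", code),
        capture_output=True, text=True, check=True,
    )
    return result.stdout
```

For example, run_in_sandbox(sandbox_id, "print('Hello from the sandbox!')") returns the command's stdout, and sbx_cmd("terminate", sandbox_id) builds the cleanup call.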

Create a Sandbox Programmatically

from tensorlake.sandbox import SandboxClient

client = SandboxClient.for_cloud(api_key="your-api-key")

# Create a sandbox and connect to it
with client.create_and_connect(image="python:3.11-slim") as sandbox:
    # Run a command
    result = sandbox.run("python", ["-c", "print('Hello from the sandbox!')"])
    print(result.stdout)  # "Hello from the sandbox!"

    # Write and read files
    sandbox.write_file("/tmp/data.txt", b"some data")
    content = sandbox.read_file("/tmp/data.txt")

    # Start a long-running process
    proc = sandbox.start_process("python", ["-m", "http.server", "8080"])
    print(proc.pid)

# Sandbox is automatically terminated when the context manager exits

Snapshots

Save the state of a sandbox and restore it later:

# Snapshot a running sandbox (sandbox_id identifies an existing sandbox)
snapshot = client.snapshot_and_wait(sandbox_id)

# Later, create a new sandbox from the snapshot
with client.create_and_connect(snapshot_id=snapshot.snapshot_id) as sandbox:
    # Picks up right where you left off
    result = sandbox.run("ls", ["/tmp"])
    print(result.stdout)

Sandbox Pools

Pre-warm containers for fast startup:

# Create a pool with warm containers
pool = client.create_pool(
    image="python:3.11-slim",
    warm_containers=3,
)

# Claim a sandbox instantly from the pool
resp = client.claim(pool.pool_id)
sandbox = client.connect(resp.sandbox_id)

Orchestrate

Create orchestration APIs on a distributed runtime with automatic scaling, fan-out capabilities, and built-in tracking. Orchestration APIs can be invoked via HTTP requests or the Python SDK.

Quickstart

Decorate your entrypoint with @application() and functions with @function(). Each function runs in its own isolated sandbox.

Example: City guide using OpenAI Agents with web search and code execution:

from agents import Agent, Runner
from agents.tool import WebSearchTool, function_tool
from tensorlake.applications import application, function, Image

# Define the image with necessary dependencies
FUNCTION_CONTAINER_IMAGE = Image(base_image="python:3.11-slim", name="city_guide_image").run(
    "pip install openai openai-agents"
)

@function_tool
@function(
    description="Gets the weather for a city using an OpenAI Agent with web search",
    secrets=["OPENAI_API_KEY"],
    image=FUNCTION_CONTAINER_IMAGE,
)
def get_weather_tool(city: str) -> str:
    """Uses an OpenAI Agent with WebSearchTool to find current weather."""
    agent = Agent(
        name="Weather Reporter",
        instructions="Use web search to find current weather in Fahrenheit for the city.",
        tools=[WebSearchTool()],  # Agent can search the web
    )
    result = Runner.run_sync(agent, f"City: {city}")
    return result.final_output.strip()

@application(tags={"type": "example", "use_case": "city_guide"})
@function(
    description="Creates a guide with temperature conversion using function_tool",
    secrets=["OPENAI_API_KEY"],
    image=FUNCTION_CONTAINER_IMAGE,
)
def city_guide_app(city: str) -> str:
    """Uses an OpenAI Agent with function_tool to run Python code for conversion."""

    @function_tool
    def convert_to_celsius_tool(python_code: str) -> float:
        """Converts Fahrenheit to Celsius - runs agent-supplied Python code."""
        # Caution: eval() executes arbitrary code; acceptable here only because
        # each function runs in its own isolated sandbox.
        return float(eval(python_code))

    agent = Agent(
        name="Guide Creator",
        instructions="Using the appropriate tools, get the weather for the guide. If the city uses Celsius, call convert_to_celsius_tool to convert the temperature, passing in the code needed to convert it to Celsius. Create a friendly guide that references the city's temperature in Celsius if the city typically uses Celsius, otherwise in Fahrenheit. Reference only Celsius or Fahrenheit, not both.",
        tools=[get_weather_tool, convert_to_celsius_tool],  # Agent can execute this Python function
    )
    result = Runner.run_sync(agent, f"City: {city}")
    return result.final_output.strip()
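For reference, the expression the agent passes to convert_to_celsius_tool is plain Python arithmetic. A minimal sketch of the Fahrenheit-to-Celsius formula that the eval() call ends up computing (the helper name below is illustrative):

```python
def fahrenheit_to_celsius(temp_f: float) -> float:
    # C = (F - 32) * 5 / 9
    return (temp_f - 32) * 5 / 9

# Equivalent to an expression string the agent might pass, e.g. "(68 - 32) * 5 / 9"
print(fahrenheit_to_celsius(68.0))       # 20.0
print(float(eval("(68 - 32) * 5 / 9")))  # 20.0
```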

Deploy to Tensorlake

  1. Set your API keys:
export TENSORLAKE_API_KEY="your-api-key"
tl secrets set OPENAI_API_KEY "your-openai-key"
  2. Deploy:
tl deploy examples/readme_example/city_guide.py

Call via HTTP

# Invoke the application
curl https://api.tensorlake.ai/applications/city_guide_app \
  -H "Authorization: Bearer $TENSORLAKE_API_KEY" \
  --json '"San Francisco"'
# Returns: {"request_id": "beae8736ece31ef9"}

# Get the result
curl https://api.tensorlake.ai/applications/city_guide_app/requests/{request_id}/output \
  -H "Authorization: Bearer $TENSORLAKE_API_KEY"

# Stream results with SSE
curl https://api.tensorlake.ai/applications/city_guide_app \
  -H "Authorization: Bearer $TENSORLAKE_API_KEY" \
  -H "Accept: text/event-stream" \
  --json '"San Francisco"'


Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

tensorlake-0.4.35.tar.gz (2.2 MB)

Uploaded: Source

Built Distributions

If you're not sure about the file name format, learn more about wheel file names.

tensorlake-0.4.35-py3-none-win_amd64.whl (14.0 MB)

Uploaded: Python 3, Windows x86-64

tensorlake-0.4.35-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (13.5 MB)

Uploaded: Python 3, manylinux: glibc 2.17+ x86-64

tensorlake-0.4.35-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (13.1 MB)

Uploaded: Python 3, manylinux: glibc 2.17+ ARM64

tensorlake-0.4.35-py3-none-macosx_11_0_arm64.whl (12.6 MB)

Uploaded: Python 3, macOS 11.0+ ARM64

File details

Details for the file tensorlake-0.4.35.tar.gz.

File metadata

  • Download URL: tensorlake-0.4.35.tar.gz
  • Upload date:
  • Size: 2.2 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for tensorlake-0.4.35.tar.gz
Algorithm Hash digest
SHA256 6a0a94ba56dada764e6ce32eda3e4bd38aa0b15068aa13a4a92da29cfb7a151e
MD5 dcd729e1fc2abd232a1e7cdfc22ce0db
BLAKE2b-256 2de2aeca68b914493610bad605d88423e06b7ad6c3b8b12caebc817768a73de4

See more details on using hashes here.

Provenance

The following attestation bundles were made for tensorlake-0.4.35.tar.gz:

Publisher: publish_pypi.yaml on tensorlakeai/tensorlake

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file tensorlake-0.4.35-py3-none-win_amd64.whl.

File metadata

  • Download URL: tensorlake-0.4.35-py3-none-win_amd64.whl
  • Upload date:
  • Size: 14.0 MB
  • Tags: Python 3, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for tensorlake-0.4.35-py3-none-win_amd64.whl
Algorithm Hash digest
SHA256 9e2a5165f7bee797683d355bdd92111d83fcbff30bd4065c43ac01b1daa7bc89
MD5 b4101f4e7bc8c4559907b36870b0ffd9
BLAKE2b-256 4af7c3bc565ccbbf6c3ccc8a6dfde36c15c42bb4927b3f4ebc2a52f55217c363


Provenance

The following attestation bundles were made for tensorlake-0.4.35-py3-none-win_amd64.whl:

Publisher: publish_pypi.yaml on tensorlakeai/tensorlake


File details

Details for the file tensorlake-0.4.35-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for tensorlake-0.4.35-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 b56e7ea296a108d6944a5be513057c17f2759ab27ae64a23711e03743d7daed6
MD5 db6a4cf1217d931aa63cc54de0d7175a
BLAKE2b-256 51e50f0c6ac08853dfc7768051dcd3674af12e383da04d68764199c48ad1ee8e


Provenance

The following attestation bundles were made for tensorlake-0.4.35-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:

Publisher: publish_pypi.yaml on tensorlakeai/tensorlake


File details

Details for the file tensorlake-0.4.35-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File metadata

File hashes

Hashes for tensorlake-0.4.35-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
Algorithm Hash digest
SHA256 201867184c7e255434f52447940583213b6a6297a46428320d3767c4c543c5dc
MD5 c24d7e9470b1a657fdf6be6bcc568d36
BLAKE2b-256 65e88d4a1f85d9b924ab4c1e8aca8c748fccd7e944e3d2cc858aefab735e464d


Provenance

The following attestation bundles were made for tensorlake-0.4.35-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl:

Publisher: publish_pypi.yaml on tensorlakeai/tensorlake


File details

Details for the file tensorlake-0.4.35-py3-none-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for tensorlake-0.4.35-py3-none-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 602be5477c46a412e9387a5d8567242f367636380e4e2ee4dc9dd4aabc8f927d
MD5 673474a4f7e08c3c891af432a840ecd8
BLAKE2b-256 365a9956ffe3fe8e0f16205d1adc682492da88dff8b38bdab3d4e21487ca3b9b


Provenance

The following attestation bundles were made for tensorlake-0.4.35-py3-none-macosx_11_0_arm64.whl:

Publisher: publish_pypi.yaml on tensorlakeai/tensorlake

