Tensorlake SDK for Document Ingestion API and Serverless Applications
Build agents with sandboxes and serverless orchestration runtime
Tensorlake is a compute infrastructure platform for building agentic applications with sandboxes.
The Sandbox API creates MicroVM sandboxes that you can use to run agents, or as isolated environments for running tools or LLM-generated code.
In addition to stateful VMs, you can add long-running orchestration capabilities to agents using a serverless function runtime with fan-out support.
Sandboxes
Tensorlake Sandboxes are stateful Firecracker MicroVMs that provide instant execution environments for AI agents — spin up millions of VMs with near-SSD filesystem performance.
Key capabilities
- Fastest Filesystem I/O — Block-based storage achieving near-SSD speeds inside virtual machines. In SQLite benchmarks (2 vCPUs, 4 GB RAM), Tensorlake completes in 2.45s vs Vercel 3.00s (1.2×), E2B 3.92s (1.6×), Modal 4.66s (1.9×), and Daytona 5.51s (2.2×).
- Fast startup — Sandboxes created in under a second via Lattice, a dynamic cluster scheduler.
- Snapshots & cloning — Snapshot at any point to create durable memory and filesystem checkpoints; clone running sandboxes instantaneously across machines.
- Auto suspend/resume — Sandboxes suspend when idle and resume in under a second without losing any memory or filesystem state.
- Live migration — Sandboxes automatically move between machines during updates with only a brief pause of a few seconds.
- Scale — Supports up to 5 million sandboxes in a single project.
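The speedup multiples quoted in the SQLite benchmark above are each platform's wall-clock time divided by Tensorlake's 2.45 s. A quick sanity check:

```python
# Wall-clock times (seconds) from the SQLite benchmark above (2 vCPUs, 4 GB RAM)
times = {"Tensorlake": 2.45, "Vercel": 3.00, "E2B": 3.92, "Modal": 4.66, "Daytona": 5.51}

# Speedup of Tensorlake relative to each platform, rounded to one decimal place
speedups = {name: round(t / times["Tensorlake"], 1) for name, t in times.items()}
print(speedups)  # → {'Tensorlake': 1.0, 'Vercel': 1.2, 'E2B': 1.6, 'Modal': 1.9, 'Daytona': 2.2}
```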
Installation
pip install tensorlake
Setup
Sign up at cloud.tensorlake.ai and get your API key.
export TENSORLAKE_API_KEY="your-api-key"
tensorlake login
Create Your First Sandbox (CLI)
Create a sandbox, run a command, and clean up:
# Create a sandbox
tensorlake sbx create --image tensorlake/ubuntu-minimal
# Run a command inside it
tensorlake sbx exec <sandbox-id> -- sh -lc "printf 'Hello from the sandbox!\n'"
# Copy a file into the sandbox
tensorlake sbx cp ./my_script.py <sandbox-id>:/tmp/my_script.py
# Open an interactive terminal
tensorlake sbx ssh <sandbox-id>
# Terminate when done
tensorlake sbx terminate <sandbox-id>
--image expects a built-in sandbox image name such as tensorlake/ubuntu-minimal or the name of a registered Sandbox Image, not an arbitrary Docker image reference.
Create a Sandbox Programmatically
from tensorlake.sandbox import SandboxClient

client = SandboxClient.for_cloud(api_key="your-api-key")

# Create a sandbox and connect to it
with client.create_and_connect(image="tensorlake/ubuntu-minimal") as sandbox:
    # Run a command
    result = sandbox.run("sh", ["-lc", "printf 'Hello from the sandbox!\\n'"])
    print(result.stdout)  # "Hello from the sandbox!"

    # Write and read files
    sandbox.write_file("/tmp/data.txt", b"some data")
    content = sandbox.read_file("/tmp/data.txt")

    # Start a long-running process
    proc = sandbox.start_process("sleep", ["300"])
    print(proc.pid)

# Sandbox is automatically terminated when the context manager exits
Snapshots
Save the state of a sandbox and restore it later:
# Snapshot a running sandbox
snapshot = client.snapshot_and_wait(sandbox_id)

# Later, create a new sandbox from the snapshot
with client.create_and_connect(snapshot_id=snapshot.snapshot_id) as sandbox:
    # Picks up right where you left off
    result = sandbox.run("ls", ["/tmp"])
    print(result.stdout)
Sandbox Pools
Pre-warm containers for fast startup:
# Create a pool with warm containers
pool = client.create_pool(
    image="tensorlake/ubuntu-minimal",
    warm_containers=3,
)

# Claim a sandbox instantly from the pool
resp = client.claim(pool.pool_id)
sandbox = client.connect(resp.sandbox_id)

# Named sandboxes can be reconnected later by name
named = client.create(image="tensorlake/ubuntu-minimal", name="stable-name")
sandbox = client.connect("stable-name")
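The claim/connect/clean-up pattern above can be wrapped in a small context manager so a claimed sandbox is always released. This is a sketch: `claim` and `connect` are the client methods shown above, while `terminate(sandbox_id)` is an assumed SDK counterpart to the CLI's `tensorlake sbx terminate` (check the actual client for the exact teardown method).

```python
from contextlib import contextmanager

@contextmanager
def claimed_sandbox(client, pool_id):
    """Claim a sandbox from a pool, yield a connection, and tear it down on exit.

    Assumes the client exposes claim(pool_id), connect(sandbox_id), and a
    terminate(sandbox_id) method mirroring the CLI's `sbx terminate` command.
    """
    resp = client.claim(pool_id)
    sandbox = client.connect(resp.sandbox_id)
    try:
        yield sandbox
    finally:
        # Release the sandbox even if the body raised
        client.terminate(resp.sandbox_id)
```

Used as `with claimed_sandbox(client, pool.pool_id) as sandbox: ...`, the sandbox is terminated whether the body succeeds or raises.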
Orchestrate
Create orchestration APIs on a distributed runtime with automatic scaling, fan-out, and built-in request tracking. Orchestration APIs can be invoked over HTTP or through the Python SDK.
Quickstart
Decorate your entrypoint with @application() and functions with @function(). Each function runs in its own isolated sandbox.
Example: City guide using OpenAI Agents with web search and code execution:
from agents import Agent, Runner
from agents.tool import WebSearchTool, function_tool
from tensorlake.applications import application, function, Image

# Define the image with the necessary dependencies
FUNCTION_CONTAINER_IMAGE = Image(base_image="python:3.11-slim", name="city_guide_image").run(
    "pip install openai openai-agents"
)

@function_tool
@function(
    description="Gets the weather for a city using an OpenAI Agent with web search",
    secrets=["OPENAI_API_KEY"],
    image=FUNCTION_CONTAINER_IMAGE,
)
def get_weather_tool(city: str) -> str:
    """Uses an OpenAI Agent with WebSearchTool to find current weather."""
    agent = Agent(
        name="Weather Reporter",
        instructions="Use web search to find current weather in Fahrenheit for the city.",
        tools=[WebSearchTool()],  # Agent can search the web
    )
    result = Runner.run_sync(agent, f"City: {city}")
    return result.final_output.strip()

@application(tags={"type": "example", "use_case": "city_guide"})
@function(
    description="Creates a guide with temperature conversion using function_tool",
    secrets=["OPENAI_API_KEY"],
    image=FUNCTION_CONTAINER_IMAGE,
)
def city_guide_app(city: str) -> str:
    """Uses an OpenAI Agent with function_tool to run Python code for conversion."""

    @function_tool
    def convert_to_celsius_tool(python_code: str) -> float:
        """Converts Fahrenheit to Celsius - runs as Python code via Agent."""
        return float(eval(python_code))

    agent = Agent(
        name="Guide Creator",
        instructions=(
            "Using the appropriate tools, get the weather for the purposes of the guide. "
            "If the city uses Celsius, call convert_to_celsius_tool to convert the temperature, "
            "passing in the code needed to convert the temperature to Celsius. "
            "Create a friendly guide that references the temperature of the city in Celsius "
            "if the city typically uses Celsius, otherwise reference the temperature in Fahrenheit. "
            "Only reference Celsius or Fahrenheit, not both."
        ),
        tools=[get_weather_tool, convert_to_celsius_tool],  # Agent can call these tools
    )
    result = Runner.run_sync(agent, f"City: {city}")
    return result.final_output.strip()
Deploy to Tensorlake
- Set your API keys:
export TENSORLAKE_API_KEY="your-api-key"
tl secrets set OPENAI_API_KEY "your-openai-key"
- Deploy:
tl deploy examples/readme_example/city_guide.py
Call via HTTP
# Invoke the application
curl https://api.tensorlake.ai/applications/city_guide_app \
-H "Authorization: Bearer $TENSORLAKE_API_KEY" \
--json '"San Francisco"'
# Returns: {"request_id": "beae8736ece31ef9"}
# Get the result
curl https://api.tensorlake.ai/applications/city_guide_app/requests/{request_id}/output \
-H "Authorization: Bearer $TENSORLAKE_API_KEY"
# Stream results with SSE
curl https://api.tensorlake.ai/applications/city_guide_app \
-H "Authorization: Bearer $TENSORLAKE_API_KEY" \
-H "Accept: text/event-stream" \
--json '"San Francisco"'
File details
Details for the file tensorlake-0.4.46.tar.gz.
File metadata
- Download URL: tensorlake-0.4.46.tar.gz
- Upload date:
- Size: 2.2 MB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | da6859a27a41b8e8179a33cf98cf95e8900067a43c0246a99b556612348e82f4 |
| MD5 | 944763596bf2a276cd158fd6bbd08a27 |
| BLAKE2b-256 | ebdd8a0f3a4a739ab488ade84bedeb71971a80db23602d9294000a10a2149702 |
Provenance
The following attestation bundles were made for tensorlake-0.4.46.tar.gz:
Publisher: publish_pypi.yaml on tensorlakeai/tensorlake
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: tensorlake-0.4.46.tar.gz
- Subject digest: da6859a27a41b8e8179a33cf98cf95e8900067a43c0246a99b556612348e82f4
- Sigstore transparency entry: 1310682906
- Sigstore integration time:
- Permalink: tensorlakeai/tensorlake@157ad750b69fff52824d9f8f313d526e6e511791
- Branch / Tag: refs/heads/main
- Owner: https://github.com/tensorlakeai
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish_pypi.yaml@157ad750b69fff52824d9f8f313d526e6e511791
- Trigger Event: workflow_dispatch
File details
Details for the file tensorlake-0.4.46-py3-none-win_amd64.whl.
File metadata
- Download URL: tensorlake-0.4.46-py3-none-win_amd64.whl
- Upload date:
- Size: 14.3 MB
- Tags: Python 3, Windows x86-64
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 3bf2dd6e8f54f49b980fdc2403c4f9f9a026c0f39c8c5813f0b2945b013b1bc5 |
| MD5 | 2f5b7cc0148afb2154ddbeaa38b90eb7 |
| BLAKE2b-256 | 3cfbdd91f6f8e4f785f35e5f7493b6d86f57a78e447420e4eb88e39295365f3d |
Provenance
The following attestation bundles were made for tensorlake-0.4.46-py3-none-win_amd64.whl:
Publisher: publish_pypi.yaml on tensorlakeai/tensorlake
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: tensorlake-0.4.46-py3-none-win_amd64.whl
- Subject digest: 3bf2dd6e8f54f49b980fdc2403c4f9f9a026c0f39c8c5813f0b2945b013b1bc5
- Sigstore transparency entry: 1310683168
- Sigstore integration time:
- Permalink: tensorlakeai/tensorlake@157ad750b69fff52824d9f8f313d526e6e511791
- Branch / Tag: refs/heads/main
- Owner: https://github.com/tensorlakeai
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish_pypi.yaml@157ad750b69fff52824d9f8f313d526e6e511791
- Trigger Event: workflow_dispatch
File details
Details for the file tensorlake-0.4.46-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.
File metadata
- Download URL: tensorlake-0.4.46-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- Upload date:
- Size: 13.8 MB
- Tags: Python 3, manylinux: glibc 2.17+ x86-64
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 909c1d8430ba32903d5d38b5b7271140d50b77bcc6a45c06ebd4bc43ec60b1a7 |
| MD5 | 76ea9aaee722eae0607e2a330ede1e38 |
| BLAKE2b-256 | b0945bbf4a3114522ef15d5a253ba186017fb4df987d77d51296e2953825371e |
Provenance
The following attestation bundles were made for tensorlake-0.4.46-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:
Publisher: publish_pypi.yaml on tensorlakeai/tensorlake
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: tensorlake-0.4.46-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- Subject digest: 909c1d8430ba32903d5d38b5b7271140d50b77bcc6a45c06ebd4bc43ec60b1a7
- Sigstore transparency entry: 1310683101
- Sigstore integration time:
- Permalink: tensorlakeai/tensorlake@157ad750b69fff52824d9f8f313d526e6e511791
- Branch / Tag: refs/heads/main
- Owner: https://github.com/tensorlakeai
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish_pypi.yaml@157ad750b69fff52824d9f8f313d526e6e511791
- Trigger Event: workflow_dispatch
File details
Details for the file tensorlake-0.4.46-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.
File metadata
- Download URL: tensorlake-0.4.46-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
- Upload date:
- Size: 13.3 MB
- Tags: Python 3, manylinux: glibc 2.17+ ARM64
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | c917670643c31a1bd6844d5a4e15b3bf55e1afee7df789032dba2ac2d4b25dd3 |
| MD5 | 5fa938d65b34e4f697ab7fb12b703232 |
| BLAKE2b-256 | 33f336e07e041c8bf5fb21a2f081353390ed2b2a065f3af22f94467a437fee9c |
Provenance
The following attestation bundles were made for tensorlake-0.4.46-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl:
Publisher: publish_pypi.yaml on tensorlakeai/tensorlake
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: tensorlake-0.4.46-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
- Subject digest: c917670643c31a1bd6844d5a4e15b3bf55e1afee7df789032dba2ac2d4b25dd3
- Sigstore transparency entry: 1310683030
- Sigstore integration time:
- Permalink: tensorlakeai/tensorlake@157ad750b69fff52824d9f8f313d526e6e511791
- Branch / Tag: refs/heads/main
- Owner: https://github.com/tensorlakeai
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish_pypi.yaml@157ad750b69fff52824d9f8f313d526e6e511791
- Trigger Event: workflow_dispatch
File details
Details for the file tensorlake-0.4.46-py3-none-macosx_11_0_arm64.whl.
File metadata
- Download URL: tensorlake-0.4.46-py3-none-macosx_11_0_arm64.whl
- Upload date:
- Size: 12.8 MB
- Tags: Python 3, macOS 11.0+ ARM64
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | f560761d345dcb2fa3caae3028a0c26d9e55c6a2a9447d88a5ddb656155f2b0c |
| MD5 | a718418275aba3304df939dfd242f815 |
| BLAKE2b-256 | 9f86f92e200e62fc189abdeca67e9b2322986e955321e898a623229d982cea4e |
Provenance
The following attestation bundles were made for tensorlake-0.4.46-py3-none-macosx_11_0_arm64.whl:
Publisher: publish_pypi.yaml on tensorlakeai/tensorlake
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: tensorlake-0.4.46-py3-none-macosx_11_0_arm64.whl
- Subject digest: f560761d345dcb2fa3caae3028a0c26d9e55c6a2a9447d88a5ddb656155f2b0c
- Sigstore transparency entry: 1310682970
- Sigstore integration time:
- Permalink: tensorlakeai/tensorlake@157ad750b69fff52824d9f8f313d526e6e511791
- Branch / Tag: refs/heads/main
- Owner: https://github.com/tensorlakeai
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish_pypi.yaml@157ad750b69fff52824d9f8f313d526e6e511791
- Trigger Event: workflow_dispatch