
Tensorlake SDK for Document Ingestion API and Serverless Applications

Project description


Secure cloud sandboxes and durable serverless applications for AI agents


Products

  • Sandboxes — Secure, isolated cloud environments for running code. Spin up a sandbox in seconds, execute commands, transfer files, and manage processes — from the CLI or Python SDK.

  • Applications — Deploy durable, serverless agentic applications and workflows with automatic scaling and fault tolerance.


Sandboxes

Sandboxes are secure, isolated cloud environments for running arbitrary code. Each sandbox is a lightweight container with its own filesystem, network, and process space. Use them to give your AI agents a safe place to execute code, run tools, or interact with the outside world.

Installation

pip install tensorlake

Setup

Sign up at cloud.tensorlake.ai and get your API key.

export TENSORLAKE_API_KEY="your-api-key"
tensorlake login

Create Your First Sandbox (CLI)

Create a sandbox, run a command, and clean up:

# Create a sandbox
tensorlake sbx create --image python:3.11-slim

# Run a command inside it
tensorlake sbx exec <sandbox-id> -- python -c "print('Hello from the sandbox!')"

# Copy a file into the sandbox
tensorlake sbx cp ./my_script.py <sandbox-id>:/tmp/my_script.py

# Open an interactive terminal
tensorlake sbx ssh <sandbox-id>

# Terminate when done
tensorlake sbx terminate <sandbox-id>

Create a Sandbox Programmatically (Python SDK)

from tensorlake.sandbox import SandboxClient

client = SandboxClient.for_cloud(api_key="your-api-key")

# Create a sandbox and connect to it
with client.create_and_connect(image="python:3.11-slim") as sandbox:
    # Run a command
    result = sandbox.run("python", ["-c", "print('Hello from the sandbox!')"])
    print(result.stdout)  # "Hello from the sandbox!"

    # Write and read files
    sandbox.write_file("/tmp/data.txt", b"some data")
    content = sandbox.read_file("/tmp/data.txt")

    # Start a long-running process
    proc = sandbox.start_process("python", ["-m", "http.server", "8080"])
    print(proc.pid)

# Sandbox is automatically terminated when the context manager exits

Snapshots

Save the state of a sandbox and restore it later:

# Snapshot a running sandbox
snapshot = client.snapshot_and_wait(sandbox_id)

# Later, create a new sandbox from the snapshot
with client.create_and_connect(snapshot_id=snapshot.snapshot_id) as sandbox:
    # Picks up right where you left off
    result = sandbox.run("ls", ["/tmp"])
    print(result.stdout)

Sandbox Pools

Pre-warm containers for fast startup:

# Create a pool with warm containers
pool = client.create_pool(
    image="python:3.11-slim",
    warm_containers=3,
)

# Claim a sandbox instantly from the pool
resp = client.claim(pool.pool_id)
sandbox = client.connect(resp.sandbox_id)

Applications

Deploy agentic applications on a distributed runtime with automatic scaling and durable execution — if an application crashes, it automatically resumes from where it left off. You can build with any Python framework. Agents are exposed as HTTP APIs, just like web applications.

  • No Queues: We manage state and orchestration
  • Zero Infra: Write Python, deploy to Tensorlake
  • Progress Updates: Applications can run for any amount of time and stream updates to users.
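
The progress updates in the last bullet arrive over HTTP as server-sent events (the standard `text/event-stream` format, shown in the curl examples below). A minimal client-side parser — generic SSE handling, not a Tensorlake SDK call — might look like:

```python
def parse_sse(stream_lines):
    """Parse server-sent events from an iterable of text lines.

    Yields the `data:` payload of each event. The payloads themselves
    are whatever your application streams; this only handles framing.
    """
    data_parts = []
    for line in stream_lines:
        line = line.rstrip("\n")
        if line.startswith("data:"):
            data_parts.append(line[5:].lstrip())
        elif line == "" and data_parts:
            # A blank line terminates an event
            yield "\n".join(data_parts)
            data_parts = []
    if data_parts:
        yield "\n".join(data_parts)

events = list(parse_sse(['data: {"progress": 10}', "", "data: done", ""]))
```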

Quickstart

Decorate your entrypoint with @application() and functions with @function() for checkpointing and sandboxed execution. Each function runs in its own isolated sandbox.

Example: City guide using OpenAI Agents with web search and code execution:

from agents import Agent, Runner
from agents.tool import WebSearchTool, function_tool
from tensorlake.applications import application, function, Image

# Define the image with necessary dependencies
FUNCTION_CONTAINER_IMAGE = Image(base_image="python:3.11-slim", name="city_guide_image").run(
    "pip install openai openai-agents"
)

@function_tool
@function(
    description="Gets the weather for a city using an OpenAI Agent with web search",
    secrets=["OPENAI_API_KEY"],
    image=FUNCTION_CONTAINER_IMAGE,
)
def get_weather_tool(city: str) -> str:
    """Uses an OpenAI Agent with WebSearchTool to find current weather."""
    agent = Agent(
        name="Weather Reporter",
        instructions="Use web search to find current weather in Fahrenheit for the city.",
        tools=[WebSearchTool()],  # Agent can search the web
    )
    result = Runner.run_sync(agent, f"City: {city}")
    return result.final_output.strip()

@application(tags={"type": "example", "use_case": "city_guide"})
@function(
    description="Creates a guide with temperature conversion using function_tool",
    secrets=["OPENAI_API_KEY"],
    image=FUNCTION_CONTAINER_IMAGE,
)
def city_guide_app(city: str) -> str:
    """Uses an OpenAI Agent with function_tool to run Python code for conversion."""

    @function_tool
    def convert_to_celsius_tool(python_code: str) -> float:
        """Converts Fahrenheit to Celsius - runs as Python code via Agent."""
        return float(eval(python_code))

    agent = Agent(
        name="Guide Creator",
        instructions="Using the appropriate tools, get the weather for the purposes of the guide. If the city uses Celsius, call convert_to_celsius_tool to convert the temperature, passing in the code needed to convert the temperature to Celsius. Create a friendly guide that references the temperature of the city in Celsius if the city typically uses Celsius, otherwise reference the temperature in Fahrenheit. Only reference Celsius or Fahrenheit, not both.",
        tools=[get_weather_tool, convert_to_celsius_tool],  # Agent can execute this Python function
    )
    result = Runner.run_sync(agent, f"City: {city}")
    return result.final_output.strip()

Note: This is a simplified version. See the complete example at examples/readme_example/city_guide.py for the full implementation including activity suggestions and agent orchestration.
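
To make the convert_to_celsius_tool step concrete: the code string the agent passes is ordinary arithmetic, so for a hypothetical 72 °F reading it might generate something like the following (running eval on agent-generated code is acceptable here only because each function executes inside an isolated sandbox):

```python
# Hypothetical code string the agent might generate for a 72 °F reading
python_code = "(72 - 32) * 5 / 9"

# What convert_to_celsius_tool does with it
celsius = float(eval(python_code))
print(round(celsius, 1))  # 22.2
```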

Deploy to Tensorlake Cloud

  1. Set your API keys:

export TENSORLAKE_API_KEY="your-api-key"
tensorlake secrets set OPENAI_API_KEY "your-openai-key"

  2. Deploy:

tensorlake deploy examples/readme_example/city_guide.py

Call via HTTP

# Invoke the application
curl https://api.tensorlake.ai/applications/city_guide_app \
  -H "Authorization: Bearer $TENSORLAKE_API_KEY" \
  --json '"San Francisco"'
# Returns: {"request_id": "beae8736ece31ef9"}

# Get the result
curl https://api.tensorlake.ai/applications/city_guide_app/requests/{request_id}/output \
  -H "Authorization: Bearer $TENSORLAKE_API_KEY"

# Stream results with SSE
curl https://api.tensorlake.ai/applications/city_guide_app \
  -H "Authorization: Bearer $TENSORLAKE_API_KEY" \
  -H "Accept: text/event-stream" \
  --json '"San Francisco"'

# Send files
curl https://api.tensorlake.ai/applications/my_pdf_processor \
  -H "Authorization: Bearer $TENSORLAKE_API_KEY" \
  -H "Content-Type: application/pdf" \
  --data-binary @document.pdf
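
The same calls can be made from Python using only the standard library. A sketch mirroring the invoke step of the curl commands above — the helper name is illustrative, not an SDK function:

```python
import json
import urllib.request

BASE = "https://api.tensorlake.ai/applications"

def build_invoke_request(app_name: str, payload, api_key: str) -> urllib.request.Request:
    """Build the HTTP request equivalent to the first curl command above.

    Hypothetical helper for illustration; not part of the tensorlake SDK.
    """
    return urllib.request.Request(
        f"{BASE}/{app_name}",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_invoke_request("city_guide_app", "San Francisco", "your-api-key")
# Sending it requires a real API key:
#   with urllib.request.urlopen(req) as resp:
#       request_id = json.load(resp)["request_id"]
```

Fetching the result then follows the same pattern against the `/requests/{request_id}/output` endpoint shown above.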
