Gravix Layer Python SDK — agent runtimes and templates (Alpha; API may evolve). See docs.gravixlayer.ai.

Gravix Layer Python SDK

Python 3.9+ · License: Apache 2.0

Official Python client for Gravix Layer — create and manage cloud agent runtimes and templates for your workloads.


New to this SDK?

1. Install: pip install gravixlayer
2. Set your API key: export GRAVIXLAYER_API_KEY="your-api-key" (from the Gravix Layer console)
3. Run the quick start below, or open examples/ for runnable scripts (runtimes, templates)

Docs: docs.gravixlayer.ai · Examples index: examples/README.md


Install

pip install gravixlayer

Configure

export GRAVIXLAYER_API_KEY="your-api-key"
export GRAVIXLAYER_CLOUD="azure"       # default
export GRAVIXLAYER_REGION="eastus2"    # default
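If it helps to see the precedence, here is a minimal sketch of how explicit arguments, environment variables, and defaults might be resolved (an illustration of the documented behavior, not the SDK's actual internals):

```python
import os

# Hypothetical illustration of config resolution: an explicit argument wins,
# then the environment variable, then the documented default.
def resolve(explicit, env_var, default):
    if explicit is not None:
        return explicit
    return os.environ.get(env_var, default)

os.environ["GRAVIXLAYER_CLOUD"] = "azure"
os.environ.pop("GRAVIXLAYER_REGION", None)

print(resolve(None, "GRAVIXLAYER_CLOUD", "azure"))     # taken from the env var
print(resolve("aws", "GRAVIXLAYER_CLOUD", "azure"))    # explicit argument wins
print(resolve(None, "GRAVIXLAYER_REGION", "eastus2"))  # falls back to the default
```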

Or pass options to the client:

from gravixlayer import GravixLayer

client = GravixLayer(
    api_key="your-api-key",
    base_url="https://api.gravixlayer.ai",
    cloud="azure",
    region="eastus2",
)

Quick start

client.runtime.create(...) returns a bound Runtime handle, so you can call runtime.run_code(...), runtime.run_cmd(...), runtime.file.*, runtime.git.*, runtime.kill() directly — no need to pass runtime_id to every call.

from gravixlayer import GravixLayer

client = GravixLayer()
runtime = client.runtime.create(template="python-3.14-base-small")

# Run Python code
result = runtime.run_code(code="print('Hello from Gravix Layer')")
print(result.text)

# Run a shell command — two equivalent forms:
runtime.run_cmd(command="pip install pandas --quiet")               # single string
runtime.run_cmd(command="pip", args=["install", "pandas", "--quiet"])  # command + args

runtime.kill()

runtime.run_cmd(command=...) accepts either a single shell string (auto-wrapped in /bin/sh -c when it contains shell metacharacters like spaces, ;, |, >, <, &, $, backticks) or a command + explicit args list (no shell interpretation).
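As a rough illustration of that dispatch rule, the heuristic might look like the sketch below (an assumption about the behavior described above, not the SDK's actual implementation):

```python
# Metacharacters that trigger wrapping a single command string in /bin/sh -c,
# per the rule described above; the SDK's exact set may differ.
SHELL_METACHARS = set(" ;|><&$`")

def build_argv(command, args=None):
    if args is not None:
        # command + explicit args: passed through with no shell interpretation
        return [command, *args]
    if any(ch in SHELL_METACHARS for ch in command):
        # single string with shell syntax: wrapped for the shell to interpret
        return ["/bin/sh", "-c", command]
    return [command]

print(build_argv("pip install pandas --quiet"))
print(build_argv("pip", ["install", "pandas", "--quiet"]))
print(build_argv("ls"))
```

Note that the args form never touches a shell, which makes it the safer choice when arguments come from untrusted input.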

File operations

Use runtime.file for filesystem ops on the bound handle: read, write, delete, list, upload (multipart bytes), write_many (batch multipart), plus create_directory, get_info, set_permissions. The same methods are also available resource-style via client.runtime.file.*(runtime_id, …).

runtime = client.runtime.create(template="python-3.14-base-small")
runtime.file.write("/workspace/note.txt", "hello\n")
text = runtime.file.read("/workspace/note.txt").content
runtime.kill()

See examples/runtimes/07_file_operations.py for a full walkthrough.

Examples (runnable)

  • examples/runtimes/ (16 scripts): create runtimes, run code & shell, file operations, metrics, SSH, context manager, Git
  • examples/templates/ (6 scripts): build custom templates (Docker image, Git, Dockerfile)

Start here: examples/README.md (task table + quick reference).

Performance note (connections and HTTP/2)

The client uses HTTP/1.1 by default for predictable latency on typical API usage.

  • Warm the connection before creating many runtimes: call client.warmup() once (or use warmup_on_init=True when constructing the client). That pays TCP, TLS, and protocol setup up front so the first real request is cheaper.
  • HTTP/2: pass http2=True to the GravixLayer client constructor if you want multiplexing over a single established connection after TLS (useful for high concurrency). Requires the httpx[http2] extra (already declared by this package).

Sync:

from gravixlayer import GravixLayer

client = GravixLayer(http2=True)
client.warmup()
# or: GravixLayer(http2=True, warmup_on_init=True)

Async: pass http2=True to AsyncGravixLayer and call await client.warmup() before heavy traffic.

from gravixlayer import AsyncGravixLayer

async with AsyncGravixLayer(http2=True) as client:
    await client.warmup()

Async

import asyncio
from gravixlayer import AsyncGravixLayer

async def main():
    async with AsyncGravixLayer() as client:
        runtime = await client.runtime.create(template="python-3.14-base-small")
        await client.runtime.kill(runtime.runtime_id)

asyncio.run(main())
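When you need several runtimes at once, the usual pattern is asyncio.gather. The sketch below uses a stub coroutine in place of client.runtime.create so it runs standalone; substitute the real SDK call in practice:

```python
import asyncio

# Stub standing in for await client.runtime.create(template=...); the real
# call would return a bound runtime handle instead of a string.
async def create_runtime(template):
    await asyncio.sleep(0.01)  # simulated API latency
    return f"runtime:{template}"

async def main():
    templates = ["python-3.14-base-small"] * 3
    # Launch all three create calls concurrently rather than sequentially.
    runtimes = await asyncio.gather(*(create_runtime(t) for t in templates))
    return runtimes

print(asyncio.run(main()))
```

Combined with client.warmup() from the performance note above, this keeps the per-runtime setup cost close to a single round trip.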

Development

For local development and CI, run the unit tests (HTTP mocked; no API key required):

pip install -e ".[test]"
pytest tests/unit_tests

Test layout (tests/unit_tests vs tests/integration_tests), integration runs, and markers are documented in tests/README.md so this file stays focused on SDK usage.

Documentation and support

Support: support@gravixlayer.ai

Feedback: gravixlayer/gravixlayer-feedback — bugs, features, and product feedback.

License

Apache License 2.0 — see LICENSE.
Copyright 2026 Gravix Layer.
