
Joblet Python SDK

The official Python SDK for Joblet - a distributed job orchestration system with GPU support.

Installation

pip install joblet-sdk-python

Quick Start

from joblet import JobletClient

# Connect to your Joblet server
with JobletClient(
    host="your-joblet-server.com",
    port=50051,
    ca_cert_path="ca.pem",
    client_cert_path="client.pem",
    client_key_path="client.key"
) as client:
    # Run a simple job
    job = client.jobs.run_job(
        command="echo",
        args=["Hello, Joblet!"],
        name="my-first-job"
    )
    print(f"Job started: {job['job_uuid']}")

Configuration

Create ~/.rnx/rnx-config.yml with the server address and TLS certificates for each named node:

version: "3.0"
nodes:
  default:
    address: "your-joblet-server:50051"
    cert: |
      -----BEGIN CERTIFICATE-----
      # Your client certificate
      -----END CERTIFICATE-----
    key: |
      -----BEGIN PRIVATE KEY-----
      # Your client private key
      -----END PRIVATE KEY-----
    ca: |
      -----BEGIN CERTIFICATE-----
      # Your CA certificate
      -----END CERTIFICATE-----
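
If you want to reuse this file when constructing a client yourself, one option is to parse it and pass the values to JobletClient. A minimal sketch, assuming PyYAML is installed; because JobletClient takes certificate file paths, the inline PEM blocks are written to temporary files first. The parsing and temp-file handling are illustrative assumptions, not documented SDK behavior:

import tempfile
from pathlib import Path

import yaml  # pip install pyyaml

from joblet import JobletClient

config = yaml.safe_load(Path("~/.rnx/rnx-config.yml").expanduser().read_text())
node = config["nodes"]["default"]
host, port = node["address"].rsplit(":", 1)

# Write the inline PEM blocks to temp files, since the client takes paths
paths = {}
for key in ("ca", "cert", "key"):
    pem = tempfile.NamedTemporaryFile("w", suffix=".pem", delete=False)
    pem.write(node[key])
    pem.close()
    paths[key] = pem.name

with JobletClient(
    host=host,
    port=int(port),
    ca_cert_path=paths["ca"],
    client_cert_path=paths["cert"],
    client_key_path=paths["key"]
) as client:
    job = client.jobs.run_job(command="echo", args=["configured"], name="config-check")
    print(f"Job started: {job['job_uuid']}")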

GPU Support

# Run GPU-accelerated job
job = client.jobs.run_job(
    command="nvidia-smi",
    name="gpu-job",
    gpu_count=1,
    gpu_memory_mb=4096,
    runtime="python-3.11-ml"
)

What You Can Do

Run Jobs Anywhere

# Run compute-intensive tasks on remote servers
job = client.jobs.run_job(
    command="python",
    args=["train_model.py"],
    max_cpu=800,  # 8 cores
    max_memory=16384,  # 16GB
    gpu_count=2
)

Stream Logs in Real-Time

# Get complete logs from any job (running or completed)
for chunk in client.jobs.get_job_logs(job['job_uuid']):
    print(chunk.decode('utf-8'), end='', flush=True)

Query Historical Data

# Analyze past job performance
for metric in client.persist.query_metrics(job_id=job["job_uuid"]):
    print(f"CPU: {metric['data']['cpu_usage']:.2f}%")
    print(f"Memory: {metric['data']['memory_usage'] / 1e9:.2f} GB")

Build Workflows

# Chain multiple jobs with dependencies
workflow = client.jobs.run_workflow(
    workflow="data-pipeline.yml",
    yaml_content="""
    jobs:
      preprocess:
        command: python preprocess.py
      train:
        command: python train.py
        depends_on: [preprocess]
      evaluate:
        command: python evaluate.py
        depends_on: [train]
    """
)

Manage Resources

# Create isolated networks and persistent storage
network = client.networks.create_network("ml-net", "10.0.1.0/24")
volume = client.volumes.create_volume("data", "100GB")

# Use in jobs
job = client.jobs.run_job(
    command="python",
    args=["process_data.py"],
    network="ml-net",
    volumes=["data:/data"]
)

Monitor System Health

# Get real-time system metrics
for metrics in client.monitoring.stream_system_metrics(interval_seconds=5):
    cpu = metrics['cpu']['usage_percent']
    memory = metrics['memory']['usage_percent']
    print(f"System: CPU {cpu:.1f}%, Memory {memory:.1f}%")

API Reference

Jobs

  • client.jobs.run_job() - Execute a job
  • client.jobs.cancel_job() - Cancel a scheduled job
  • client.jobs.stop_job() - Stop a running job
  • client.jobs.get_job_status() - Get job status
  • client.jobs.get_job_logs() - Smart log streaming (historical + live)
  • client.jobs.stream_live_logs() - Live-only log streaming
  • client.jobs.run_workflow() - Execute a workflow
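
A short sketch combining a few of the calls above: start a long-running job, check on it, then stop it without waiting for completion. Beyond run_job's job_uuid field shown earlier, the return shapes here are assumptions:

# Start a long-running job
job = client.jobs.run_job(command="sleep", args=["3600"], name="long-task")

# Inspect it (the structure of the returned status is an assumption)
status = client.jobs.get_job_status(job["job_uuid"])
print(status)

# Stop it before it finishes
client.jobs.stop_job(job["job_uuid"])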

Historical Data

  • client.persist.query_logs() - Query historical logs with filtering
  • client.persist.query_metrics() - Query historical metrics data
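
query_logs() follows the same pattern as the query_metrics() example earlier; a minimal sketch, assuming the same job_id filter parameter (the shape of each returned record is an assumption):

# Fetch historical log records for a completed job
for record in client.persist.query_logs(job_id=job["job_uuid"]):
    print(record)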

Resources

  • client.networks - Network management
  • client.volumes - Storage management
  • client.monitoring - System monitoring
  • client.runtimes - Runtime environments

For complete API documentation, see docs/API_REFERENCE.md

Development

Setup

# Clone and setup
git clone https://github.com/ehsaniara/joblet-sdk-python.git
cd joblet-sdk-python

# Install development dependencies (editable mode)
make dev

# Or manually:
pip install -e ".[dev]"
pre-commit install

Testing

# Run tests with coverage
make test

# Run linting (exactly what CI runs)
make lint

# IMPORTANT: Test package installation before release (CI-like)
make test-package

Why make test-package is Important

Problem: Editable installs (pip install -e .) can mask packaging issues: your local tests may pass while a clean install in CI or production fails.

Solution: Before committing or releasing, run:

make test-package

This command:

  1. Uninstalls the editable version
  2. Builds a clean package
  3. Installs it like CI and end-users will
  4. Runs all tests against the installed package
  5. Catches issues like missing __init__.py, incorrect package structure, etc.

After testing, restore editable install:

pip install -e ".[dev]"

Other Commands

# Build distribution packages
make build

# Regenerate protobuf files
make proto

# Clean build artifacts
make clean

Examples

See the examples/ directory for more detailed usage examples:

  • basic_job.py - Simple job execution
  • gpu_example.py - GPU-accelerated workloads
  • workflow_example.py - Complex workflows

License

MIT License - see LICENSE file for details.
