NUCL

Minimal ML experiment platform. 2,000 lines of code.

NUCL wraps Azure ML, Vertex AI, and Vercel Sandbox behind a unified CLI and web dashboard. No servers to manage, no databases to maintain, no collectors to deploy. Every feature delegates to a managed service.

nucl run --name "vision/resnet-50-v2" --script train.py --gpu-type t4
nucl ps vision/
nucl log vision/resnet-50-v2 -f
nucl pull vision/resnet-50-v2

Architecture

graph LR
    subgraph Clients
        CLI["CLI (Python)"]
        Web["Web Dashboard"]
    end

    subgraph Vercel
        API["Next.js API Routes"]
        Auth["Clerk (auth + API keys)"]
    end

    subgraph Sandbox["Vercel Sandbox (Python 3.13)"]
        AzureSDK["azure-ai-ml SDK"]
        VertexSDK["Vertex AI SDK"]
    end

    subgraph Platforms
        AzureML["Azure ML"]
        VertexAI["Vertex AI"]
        SandboxRun["Sandbox (CPU)"]
    end

    CLI -- HTTP --> API
    Web -- HTTP --> API
    API --> Auth
    API -- "job submission" --> Sandbox
    API -. "read ops (list, logs, cancel)" .-> Platforms
    AzureSDK --> AzureML
    VertexSDK --> VertexAI
    Sandbox --> SandboxRun

Job submission spins up a short-lived Vercel Sandbox with Python 3.13 and uses the official cloud SDKs to upload code and create training jobs. Read operations (list, logs, cancel) use direct REST API calls.

Three platforms:

Platform    GPU        Use case
Azure ML    Yes        Production training on Azure
Vertex AI   Yes        Production training on GCP
Sandbox     No (CPU)   Quick tests, no cloud account needed

Getting Started

Install the CLI

uv tool install nucl
nucl --help

For users: join a team and run jobs

Your team admin will have already configured cloud credentials. You just need to log in and start running jobs.

# 1. Log in (opens browser)
nucl auth login

# 2. See your teams and pick one
nucl team list
nucl team set <org-id>

# 3. Check what's configured
nucl team show

# 4. Run a job
nucl run --name "my-project/first-test" --script train.py --gpu-type t4

# 5. Monitor it
nucl ps
nucl log <job-id> -f
nucl pull <job-id> ./outputs

For admins: set up a team

You need the az and/or gcloud CLI installed and authenticated.

# 1. Log in
nucl auth login

# 2. Pick your team
nucl team list
nucl team set <org-id>

# 3. Run the interactive setup wizard
nucl team setup

The wizard will:

  • List your Azure subscriptions and ML workspaces (or GCP projects)
  • Create a service principal (Azure) or service account (Vertex) for NUCL
  • If you lack Owner permissions for role assignment, it prints the exact command for an admin to run
  • Save encrypted credentials to NUCL (they never touch anyone's local machine)

You can also configure credentials manually:

nucl team config azure
nucl team config vertex

Running a sample job

Create a train.py:

import time

print("Starting training...")
for epoch in range(5):
    loss = 1.0 / (epoch + 1)
    print(f"Epoch {epoch}: loss={loss:.4f}")
    time.sleep(1)
print("Done!")

Run it on Sandbox (no GPU, no cloud account needed):

nucl run --name "test/hello-world" --script train.py
nucl ps
nucl log <job-id> -f

Run it on Azure ML with a T4 GPU:

nucl run --name "test/gpu-test" --script train.py --gpu-type t4

Experiment naming

Use / to organize experiments into folders:

lung-cancer/detection/yolov9-baseline
lung-cancer/detection/yolov9-augmented
breast-cancer/screening/resnet-50

Filter by prefix: nucl ps lung-cancer/detection/
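Prefix filtering is plain string matching against the full experiment name, so any folder level works as a filter. A minimal sketch (hypothetical helper, not NUCL's internals):

```python
# Minimal sketch of prefix filtering over experiment names (hypothetical helper).
def filter_by_prefix(names: list[str], prefix: str) -> list[str]:
    """Keep names that start with the given folder prefix."""
    return [n for n in names if n.startswith(prefix)]


jobs = [
    "lung-cancer/detection/yolov9-baseline",
    "lung-cancer/detection/yolov9-augmented",
    "breast-cancer/screening/resnet-50",
]
print(filter_by_prefix(jobs, "lung-cancer/detection/"))
# → ['lung-cancer/detection/yolov9-baseline', 'lung-cancer/detection/yolov9-augmented']
```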

In-job logging

NUCL does not ship a custom SDK. Use MLflow directly:

import mlflow

mlflow.log_param("learning_rate", 0.001)
mlflow.log_metric("accuracy", 0.95)
mlflow.log_artifact("model.pth")

Both Azure ML and Vertex AI natively support MLflow.

CLI Reference

nucl auth login|logout|status       Auth
nucl team list|show|set|setup       Teams
nucl team config azure|vertex       Manual credential entry
nucl run --name --script [--gpu-type]  Submit job
nucl ps [prefix]                    List jobs
nucl log <id> [-f]                  Stream logs
nucl stop <id>                      Cancel job
nucl pull <id> [target]             Download outputs
nucl model ls|pull                  Models
nucl hpo run <config.yaml>          HPO sweeps
nucl mcp serve                      MCP server for AI agents

MCP Server for AI Agents

NUCL ships an MCP server so AI agents (Claude, Cursor, etc.) can submit jobs, check status, and pull results.

Quick setup

npx add-mcp nucl -- nucl mcp serve

This registers nucl mcp serve as an MCP server in your editor. Make sure you're logged in (nucl auth login) and have a team set (nucl team set <org-id>) before starting.

Manual setup

Add to your MCP config (e.g. .claude/settings.json, .cursor/mcp.json):

{
  "mcpServers": {
    "nucl": {
      "command": "nucl",
      "args": ["mcp", "serve"]
    }
  }
}

Available tools

The MCP server exposes all CLI operations: nucl_auth_status, nucl_team_list, nucl_team_show, nucl_team_set, nucl_run, nucl_ps, nucl_log, nucl_stop, nucl_pull, nucl_model_ls, nucl_model_pull, nucl_hpo_run, and team config tools.
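The tool names mirror the CLI one-to-one, so each tool call can be thought of as building the equivalent `nucl` invocation. A hypothetical sketch of that mapping (the tool argument schemas are assumed here, not taken from NUCL's source):

```python
# Hypothetical mapping from an MCP tool name + arguments to a CLI argv
# (illustrative only; NUCL's real tools may pass positional args differently).
def tool_to_argv(tool: str, args: dict) -> list[str]:
    # "nucl_team_list" -> ["nucl", "team", "list"]
    argv = ["nucl", *tool.removeprefix("nucl_").split("_")]
    for key, value in args.items():
        argv += [f"--{key.replace('_', '-')}", str(value)]
    return argv


print(tool_to_argv("nucl_run", {"name": "test/hello", "script": "train.py"}))
# → ['nucl', 'run', '--name', 'test/hello', '--script', 'train.py']
```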

Tech Stack

Layer               Technology
CLI                 Python 3.11+, Click, httpx
Web                 Next.js 16, React 19, TypeScript 6
UI                  shadcn, Tailwind CSS 4, TanStack Table
Data fetching       TanStack Query 5
Auth                Clerk 7 (Organizations, API keys)
Job submission      Vercel Sandbox (Python 3.13)
Encryption          AES-256-GCM
Package management  uv (Python), Bun (JS)

Deploying the Web Dashboard

cd web
bun install
bun dev

Environment variables (set in Vercel):

NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY=pk_...
CLERK_SECRET_KEY=sk_...
NEXT_PUBLIC_CLERK_SIGN_IN_URL=/sign-in
NEXT_PUBLIC_CLERK_SIGN_UP_URL=/sign-up
ENCRYPTION_KEY=<openssl rand -hex 32>
VERCEL_TOKEN=...
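ENCRYPTION_KEY must be 32 random bytes hex-encoded (AES-256 uses a 256-bit key). If openssl isn't handy, Python's standard-library `secrets` module produces an equivalent value:

```python
# Equivalent of `openssl rand -hex 32`: 32 random bytes as 64 hex characters.
import secrets

key = secrets.token_hex(32)
print(len(key))  # → 64
```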

License

Internal use only.
