# NUCL

Minimal ML experiment platform wrapping Azure ML and Vertex AI. 2,000 lines of code.
NUCL wraps Azure ML and Vertex AI behind a unified CLI and web dashboard. No servers to manage, no databases to maintain, no collectors to deploy. Every feature delegates to a managed service.
```bash
nucl run --name "vision/resnet-50-v2" --script train.py --gpu-type t4
nucl ps vision/
nucl log vision/resnet-50-v2 -f
nucl pull vision/resnet-50-v2
```
## Architecture

```mermaid
graph LR
  subgraph Clients
    CLI["CLI (Python)"]
    Web["Web Dashboard"]
  end
  subgraph Vercel
    API["Next.js API Routes"]
    Auth["Clerk (auth + API keys)"]
  end
  subgraph Sandbox["Vercel Sandbox (Python 3.13)"]
    AzureSDK["azure-ai-ml SDK"]
    VertexSDK["Vertex AI SDK"]
  end
  subgraph Platforms
    AzureML["Azure ML"]
    VertexAI["Vertex AI"]
  end
  CLI -- HTTP --> API
  Web -- HTTP --> API
  API --> Auth
  API -- "job submission" --> Sandbox
  API -. "read ops (list, logs, cancel)" .-> Platforms
  AzureSDK --> AzureML
  VertexSDK --> VertexAI
```
Job submission spins up a short-lived Vercel Sandbox with Python 3.13 and uses the official cloud SDKs to upload code and create training jobs. Read operations (list, logs, cancel) use direct REST API calls.
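This split can be sketched as a small dispatcher. The operation names come from this README; the dispatcher itself, and its names, are an illustration, not the actual API-route code:

```python
from enum import Enum

class Op(str, Enum):
    SUBMIT = "submit"   # heavy: needs the cloud SDKs, runs in a sandbox
    LIST = "list"       # light: direct REST call to the platform
    LOGS = "logs"
    CANCEL = "cancel"

# Operations that require the full azure-ai-ml / Vertex AI SDKs and
# therefore a short-lived Python 3.13 sandbox; everything else is REST.
SANDBOX_OPS = {Op.SUBMIT}

def route(op: Op) -> str:
    """Return which execution path an operation takes."""
    return "sandbox" if op in SANDBOX_OPS else "rest"
```

The point of the split is cost and latency: only submission pays the sandbox cold-start, while list/logs/cancel stay as cheap stateless REST calls.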
Two platforms:
| Platform | GPU | Use case |
|---|---|---|
| Azure ML | Yes | Production training on Azure |
| Vertex AI | Yes | Production training on GCP |
## Getting Started

### Install the CLI

```bash
uv tool install nucl
nucl --help
```
### For users: join a team and run jobs
Your team admin will have already configured cloud credentials. You just need to log in and start running jobs.
```bash
# 1. Log in (opens browser)
nucl auth login

# 2. See your teams and pick one
nucl team list
nucl team set <org-id>

# 3. Check what's configured
nucl team show

# 4. Run a job
nucl run --name "my-project/first-test" --script train.py --gpu-type t4

# 5. Monitor it
nucl ps --platform all
nucl log <job-id> -f
nucl pull <job-id> ./outputs
```
### For admins: set up a team
You need the `az` and/or `gcloud` CLI installed and authenticated.
```bash
# 1. Log in
nucl auth login

# 2. Pick your team
nucl team list
nucl team set <org-id>

# 3. Run the interactive setup wizard
nucl team setup
```
The wizard will:
- List your Azure subscriptions and ML workspaces (or GCP projects)
- Create a service principal (Azure) or service account (Vertex) for NUCL
- If you lack Owner permissions for role assignment, it prints the exact command for an admin to run
- Save encrypted credentials to NUCL (they never touch anyone's local machine)
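The last step relies on the AES-256-GCM encryption listed in the tech stack. A minimal sketch of what encryption at rest could look like, using the `cryptography` package; the actual storage format, key handling, and helper names here are assumptions, not NUCL's implementation:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_secret(key: bytes, plaintext: bytes) -> bytes:
    """Encrypt with AES-256-GCM; prepend the random 96-bit nonce."""
    nonce = os.urandom(12)
    return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

def decrypt_secret(key: bytes, blob: bytes) -> bytes:
    """Split off the nonce and decrypt; raises if the blob was tampered with."""
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

# A 256-bit key, equivalent to what `openssl rand -hex 32` produces
key = AESGCM.generate_key(bit_length=256)
blob = encrypt_secret(key, b'{"client_secret": "..."}')
assert decrypt_secret(key, blob) == b'{"client_secret": "..."}'
```

GCM authenticates as well as encrypts, so a modified blob fails to decrypt rather than yielding garbage credentials.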
### Cloud permissions
NUCL's cloud identities must be able to submit jobs, not just read config.
- **Azure ML:** the NUCL service principal must be able to create jobs on the target workspace. In practice, if submissions fail with `AuthorizationFailed` on `.../workspaces/jobs/write`, grant the principal `Contributor` on the Azure ML workspace or the containing resource group.
- **Vertex AI:** the NUCL service account must be allowed to create `CustomJob` resources in the target project and region. If your org requires an explicit staging bucket, configure one during team setup or manual configuration and make sure the service account can write to it.
You can also configure credentials manually:

```bash
nucl team config azure
nucl team config vertex
```
### Running a sample job

Create a `train.py`:

```python
import time

print("Starting training...")
for epoch in range(5):
    loss = 1.0 / (epoch + 1)
    print(f"Epoch {epoch}: loss={loss:.4f}")
    time.sleep(1)
print("Done!")
```
Run it on Azure ML with a T4 GPU:

```bash
nucl run --name "test/gpu-test" --script train.py --gpu-type t4
nucl ps --platform azure
nucl log <job-id> -f
```
## Experiment naming

Use `/` to organize experiments into folders:

```
lung-cancer/detection/yolov9-baseline
lung-cancer/detection/yolov9-augmented
breast-cancer/screening/resnet-50
```
Filter by prefix: `nucl ps lung-cancer/detection/`
You can also filter by platform:

```bash
nucl ps --platform all
nucl ps --platform vertex
nucl ps --platform azure
```
The web dashboard lists all experiments by default. Use the CLI `--platform` flag when you want to narrow the view to a specific backend.
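The two filters above compose: prefix filtering is a plain string-prefix match on the experiment name, and platform filtering is an equality check. A sketch (the job-record shape here is an assumption, not NUCL's actual data model):

```python
def filter_jobs(jobs: list[dict], prefix: str = "", platform: str = "all") -> list[dict]:
    """Keep jobs whose name starts with `prefix`, optionally on one platform."""
    return [
        j for j in jobs
        if j["name"].startswith(prefix)
        and (platform == "all" or j["platform"] == platform)
    ]

jobs = [
    {"name": "lung-cancer/detection/yolov9-baseline", "platform": "azure"},
    {"name": "lung-cancer/detection/yolov9-augmented", "platform": "vertex"},
    {"name": "breast-cancer/screening/resnet-50", "platform": "azure"},
]
# `nucl ps lung-cancer/detection/` keeps the first two entries
assert len(filter_jobs(jobs, prefix="lung-cancer/detection/")) == 2
# `nucl ps --platform vertex` keeps only the Vertex job
assert len(filter_jobs(jobs, platform="vertex")) == 1
```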
## In-job logging

NUCL does not ship a custom SDK. Use MLflow directly:

```python
import mlflow

mlflow.log_param("learning_rate", 0.001)
mlflow.log_metric("accuracy", 0.95)
mlflow.log_artifact("model.pth")
```
Both Azure ML and Vertex AI natively support MLflow.
## CLI Reference

```
nucl auth login|logout|status          Auth
nucl team list|show|set|setup          Teams
nucl team config azure|vertex          Manual credential entry
nucl run --name --script [--gpu-type]  Submit job
nucl ps [prefix] [--platform ...]      List jobs
nucl log <id> [-f]                     Stream logs
nucl stop <id>                         Cancel job
nucl pull <id> [target]                Download outputs
nucl mcp serve                         MCP server for AI agents
```
## MCP Server for AI Agents
NUCL ships an MCP server so AI agents (Claude, Cursor, etc.) can submit jobs, check status, and pull results.
### Quick setup

```bash
bunx add-mcp "nucl mcp serve" --name nucl
```
This detects your installed agents (Claude Code, Cursor, etc.) and registers NUCL as an MCP server. Make sure you're logged in (`nucl auth login`) and have a team set (`nucl team set <org-id>`) first.
### Available tools

The MCP server exposes all CLI operations: `nucl_auth_status`, `nucl_team_list`, `nucl_team_show`, `nucl_team_set`, `nucl_run`, `nucl_ps`, `nucl_log`, `nucl_stop`, `nucl_pull`, and team config tools.
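Since every tool mirrors a CLI operation, the mapping can be pictured as a simple table from tool name to argv. The tool names are from this README; the mapping code itself is a hypothetical illustration, not NUCL's MCP server:

```python
# Hypothetical mapping from MCP tool names to the CLI commands they mirror.
TOOL_TO_CLI: dict[str, list[str]] = {
    "nucl_auth_status": ["nucl", "auth", "status"],
    "nucl_team_list": ["nucl", "team", "list"],
    "nucl_team_show": ["nucl", "team", "show"],
    "nucl_run": ["nucl", "run"],
    "nucl_ps": ["nucl", "ps"],
    "nucl_log": ["nucl", "log"],
    "nucl_stop": ["nucl", "stop"],
    "nucl_pull": ["nucl", "pull"],
}

def to_cli(tool: str, *args: str) -> list[str]:
    """Translate an MCP tool invocation into an argv for the CLI."""
    return TOOL_TO_CLI[tool] + list(args)

assert to_cli("nucl_log", "abc123", "-f") == ["nucl", "log", "abc123", "-f"]
```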
## Tech Stack
| Layer | Technology |
|---|---|
| CLI | Python 3.11+, Click, httpx |
| Web | Next.js 16, React 19, TypeScript 6 |
| UI | shadcn, Tailwind CSS 4, TanStack Table |
| Data fetching | TanStack Query 5 |
| Auth | Clerk 7 (Organizations, API keys) |
| Job submission | Vercel Sandbox (Python 3.13) |
| Encryption | AES-256-GCM |
| Package management | uv (Python), Bun (JS) |
## Deploying the Web Dashboard

```bash
cd web
bun install
bun dev
```
Environment variables (set in Vercel):
```bash
NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY=pk_...
CLERK_SECRET_KEY=sk_...
NEXT_PUBLIC_CLERK_SIGN_IN_URL=/sign-in
NEXT_PUBLIC_CLERK_SIGN_UP_URL=/sign-up
ENCRYPTION_KEY=<openssl rand -hex 32>
VERCEL_TOKEN=...
```
## License
Internal use only.