
ALCF AI Inference Services SDK

This package provides Python client and CLI tools to facilitate usage of the ALCF AI Inference services.

Command Line Usage

Quick Start

# Log in with Globus:
uvx alcf-ai auth login

# Chat with a model
# The default --model is meta-llama/Llama-4-Scout-17B-16E-Instruct
uvx alcf-ai chat "How do I know Pi is irrational? Be concise."

Auth

# Login for Inference Service only:
uvx alcf-ai auth login

# Login for Inference+Globus data transfers
# (append :data_access only if required for your collection)
SOURCE_COLLECTION="your globus collection UUID"
uvx alcf-ai auth login --authorize-transfers $SOURCE_COLLECTION:data_access

# Get an access token to use externally:
token=$(uvx alcf-ai auth get-access-token)
curl -H "Authorization: Bearer $token" https://inference-api.alcf.anl.gov/resource_server/list-endpoints | jq

Discovering Models

To list the models and corresponding API endpoints that are currently available, use:

uvx alcf-ai ls-endpoints

To view the status of models that are currently hot or starting up on a cluster, use:

# Replace "sophia" with "metis" to query that cluster
uvx alcf-ai ls-jobs sophia

Chat with an LLM

# See detailed options:
uvx alcf-ai chat --help

# For example:
uvx alcf-ai chat --model google/gemma-4-31B-it --stream --temp 0.3 --max-tokens 100 "What is KL divergence? Answer in less than 75 words."

Segment images with SAM3

You can segment your images with the Meta SAM3 model.

Send a single image URI plus a text prompt for segmentation:

uvx alcf-ai sam3 submit-image \
  https://raw.githubusercontent.com/masalim2/sam3-service/refs/heads/main/examples/images/groceries.jpg \
  "Baguette" \
  --save-preview ~/test-baguettes.png

Batch Processing

For high-throughput, preprocess and bundle your images and prompts in the WebDataset format using the built-in CLI tool:

# Bundle all .tiff files in a directory with 3 prompts. Creates WebDataset
# .tar files in --output-dir, with 100 images per shard.
uvx alcf-ai sam3 create-webdataset \
    /path/to/tiff-stack \
    .tiff \
    "Phloem Fibers" "Hydrated Xylem vessels" "Air-based Pith cells" \
    --output-dir test-wds --shard-size=100 --num-workers=4
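If you want to inspect or build shards yourself: a WebDataset shard is an ordinary tar archive in which files sharing a basename ("key") form one sample, and the file extension names each field. The keys and field names below are illustrative; the exact layout written by create-webdataset may differ.

```python
import io
import tarfile

# Two illustrative samples: an image plus a text field per key.
samples = {
    "000000": {"tiff": b"<image bytes>", "txt": "Phloem Fibers"},
    "000001": {"tiff": b"<image bytes>", "txt": "Hydrated Xylem vessels"},
}

# Write the shard: one tar member per field, named "<key>.<ext>".
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w") as tar:
    for key, fields in samples.items():
        for ext, value in fields.items():
            data = value if isinstance(value, bytes) else value.encode()
            info = tarfile.TarInfo(name=f"{key}.{ext}")
            info.size = len(data)
            tar.addfile(info, io.BytesIO(data))

# Reading the shard back, members group naturally by key:
buf.seek(0)
with tarfile.open(fileobj=buf, mode="r") as tar:
    names = sorted(tar.getnames())
print(names)
```

Because shards are plain tar files, you can sanity-check them with `tar tf shard-00000.tar` before submitting.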

If the dataset is on a Globus Collection, you can authorize the CLI to transfer it to the inference service:

# Look up the UUID of your collection:
SOURCE_COLLECTION="your globus collection UUID"

# Append ":data_access" if this scope is required:
uvx alcf-ai auth login --authorize-transfers $SOURCE_COLLECTION:data_access

Then use the tool to drive data staging and batch inference:

SAM3_FINETUNE=/eagle/inference_service/sam3-service/weights/synaps-i
SECONDS=0

for f in test-wds/*.tar; do
    uvx alcf-ai sam3 submit-batch $SOURCE_COLLECTION $f --weights-dir-override $SAM3_FINETUNE >> batch-inference.log 2>&1 &
done
wait
echo "Completed in $SECONDS seconds."
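The shell loop above fans out one background submission per shard. The same pattern can be written in Python with a thread pool; here `submit` is a stand-in for `client.sam3.submit_batch(...)`, which is not called so the sketch runs offline:

```python
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def submit(shard: Path) -> str:
    # Stand-in for client.sam3.submit_batch(collection_id, shard, ...),
    # which returns a response carrying a task id.
    return f"submitted:{shard.name}"

# Illustrative shard names matching create-webdataset's output directory.
shards = [Path(f"test-wds/shard-{i:05d}.tar") for i in range(3)]

# Submit shards concurrently, collecting results in input order.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(submit, shards))
print(results)
```

Capping `max_workers` plays the same role as the shell script's implicit limit on background jobs: it keeps the number of in-flight submissions bounded.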

You can preview the segmentation results in a batch by passing the paths to the input and result tar files:

uvx alcf-ai sam3 preview-batch-results shard-00004.tar shard-00004.results.tar

Installing the latest client version

You can force an install of the latest version and verify your local version using:

uvx alcf-ai@latest version

SDK Usage

You can use pip install alcf-ai or uv run --with alcf-ai python to add the SDK to your environment:

uv run --with alcf-ai python

OpenAI Client

Use alcf_ai.InferenceClient to construct an OpenAI client for any ALCF-backed cluster. This reuses your auth and ensures that requests are sent to the right URL:

from alcf_ai import InferenceClient
from rich import print

# Automatically uses cached refresh tokens from previous login:
client = InferenceClient()

# Programmatically discover endpoints:
print(client.list_endpoints()["clusters"]["sophia"])

# Get an OpenAI API client for an ALCF cluster:
oai = client.clusters("sophia").openai
print(
    oai.chat.completions.create(
        model="openai/gpt-oss-120b",
        messages=[{"role": "user", "content": "Hello there!"}],
    )
)

Data Movement and SAM3

You can use the same InferenceClient to move data in and out of a Globus Guest Collection that's managed by the service. Your data is stored in an ephemeral staging subdirectory, with ACLs that grant only your Globus identity read/write access to it.

from pathlib import Path

from alcf_ai import InferenceClient
from alcf_ai.auth import STAGING_COLLECTION_ROOT

client = InferenceClient()

dataset_path = Path("/path/to/my-dataset.tar")
collection_id = "globus collection uuid"

# Stage in data:
stagein = client.stage_in(collection_id, dataset_path, dataset_path.name)

# Submit SAM3 inference:
resp = client.sam3.submit_batch(
    STAGING_COLLECTION_ROOT + str(stagein.destination_path)
)

# Wait for inference:
result = client.sam3.poll_task_result(resp.task_id)

# Copy results back:
client.stage_out(
    collection_id,
    Path(result.result_path).name,
    dataset_path.with_suffix(".results.tar"),
)

Using an alternate service URL

Both the Python client and the CLI default to the ALCF Inference Service production base URL, https://inference-api.alcf.anl.gov/resource_server/. This can be changed in a few ways:

  1. By exporting the inference_base_url environment variable.
  2. From the CLI, by passing --base-url to the alcf-ai subcommand.
  3. From the Python client, by passing InferenceClient(base_url="...").
