ALCF Inference Gateway SDK

ALCF AI Inference Services SDK

This package provides Python client and CLI tools to facilitate usage of the ALCF AI Inference services.

Command Line Usage

Quick Start

# Log in with Globus:
uvx alcf-ai auth login

# Chat with a model
# The default --model is meta-llama/Llama-4-Scout-17B-16E-Instruct
uvx alcf-ai chat "How do I know Pi is irrational? Be concise."

Auth

# Login for Inference Service only:
uvx alcf-ai auth login

# Login for Inference+Globus data transfers
# (append :data_access only if required for your collection)
SOURCE_COLLECTION="your globus collection UUID"
uvx alcf-ai auth login --authorize-transfers $SOURCE_COLLECTION:data_access

# Get an access token to use externally:
token=$(uvx alcf-ai auth get-access-token)
curl -H "Authorization: Bearer $token" https://inference-api.alcf.anl.gov/resource_server/list-endpoints | jq

Discovering Models

To list the models and corresponding API endpoints that are currently available, use:

uvx alcf-ai ls-endpoints

To view the status of models that are currently hot or starting up on a cluster, use:

# Can substitute "sophia" with "metis"
uvx alcf-ai ls-jobs sophia

Chat with an LLM

# See detailed options:
uvx alcf-ai chat --help

# For example:
uvx alcf-ai chat --model google/gemma-4-31B-it --stream --temp 0.3 --max-tokens 100 "What is KL divergence? Answer in less than 75 words."

Segment images with SAM3

You can segment your images with the Meta SAM3 model.

Submit a single image URI and a prompt for segmentation:

uvx alcf-ai sam3 submit-image \
  https://raw.githubusercontent.com/masalim2/sam3-service/refs/heads/main/examples/images/groceries.jpg \
  "Baguette" \
  --save-preview ~/test-baguettes.png

Batch Processing

For high-throughput workloads, preprocess and bundle your images and prompts into the WebDataset format using the built-in CLI tool:

# Bundle all .tiff files in the directory with 3 prompts. Creates WebDataset
# .tar files in --output-dir, with 100 images per shard.
uvx alcf-ai sam3 create-webdataset \
   /path/to/tiff-stack \
   .tiff \
    "Phloem Fibers" "Hydrated Xylem vessels" "Air-based Pith cells" \
    --output-dir test-wds --shard-size=100 --num-workers=4
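Conceptually, each shard is a tar archive pairing every image with its prompt list. A standard-library Python sketch of that layout (the JSON sidecar key and the file naming here are illustrative assumptions, not the tool's exact output format):

```python
import json
import tarfile
from io import BytesIO
from pathlib import Path

def bundle_shards(image_dir: str, suffix: str, prompts: list[str],
                  output_dir: str, shard_size: int = 100) -> list[Path]:
    """Pack images into tar shards, each paired with a JSON prompt sidecar.

    Illustrative sketch of the WebDataset layout; the real CLI tool may use
    different key names and metadata.
    """
    images = sorted(Path(image_dir).glob(f"*{suffix}"))
    out = Path(output_dir)
    out.mkdir(parents=True, exist_ok=True)
    shards = []
    for start in range(0, len(images), shard_size):
        shard_path = out / f"shard-{start // shard_size:05d}.tar"
        with tarfile.open(shard_path, "w") as tar:
            for img in images[start:start + shard_size]:
                tar.add(img, arcname=img.name)  # the image itself
                meta = json.dumps({"prompts": prompts}).encode()
                info = tarfile.TarInfo(name=img.stem + ".json")
                info.size = len(meta)
                tar.addfile(info, BytesIO(meta))  # JSON sidecar with prompts
        shards.append(shard_path)
    return shards
```

With `--shard-size=100`, a directory of 250 images would yield three shards, the last holding the 50-image remainder.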

If the dataset is on a Globus Collection, you can authorize the CLI to transfer it to the inference service:

# Look up the UUID of your collection:
SOURCE_COLLECTION="your globus collection UUID"

# Append ":data_access" if this scope is required:
uvx alcf-ai auth login --authorize-transfers $SOURCE_COLLECTION:data_access

Then use the tool to drive data staging and batch inference:

SAM3_FINETUNE=/eagle/inference_service/sam3-service/weights/synaps-i
SECONDS=0

for f in test-wds/*.tar
do
    uvx alcf-ai sam3 submit-batch $f \
        --from-collection-id $SOURCE_COLLECTION \
        --weights-dir-override $SAM3_FINETUNE >> batch-inference.log 2>&1 &
done
wait
echo "Completed in $SECONDS seconds."

You can preview the segmentation results in a batch by passing the paths to the input and result tar files:

uvx alcf-ai sam3 preview-batch-results shard-00004.tar shard-00004.results.tar

Installing the latest client version

You can force an install of the latest version and verify your local version using:

uvx alcf-ai@latest version

SDK Usage

You can use pip install alcf-ai or uv run --with alcf-ai python to add the SDK to your environment:

uv run --with alcf-ai python

OpenAI Client

Use alcf_ai.InferenceClient to construct an OpenAI client for any ALCF-backed cluster. This reuses your cached auth and ensures that requests are sent to the correct URL:

from alcf_ai import InferenceClient
from rich import print

# Automatically uses cached refresh tokens from previous login:
client = InferenceClient()

# Programmatically discover endpoints:
print(client.list_endpoints()["clusters"]["sophia"])

# Get an OpenAI API client for an ALCF cluster:
oai = client.clusters("sophia").openai
print(
    oai.chat.completions.create(
        model="openai/gpt-oss-120b",
        messages=[{"role": "user", "content": "Hello there!"}],
    )
)

Data Movement and SAM3

You can use the same InferenceClient to move data in and out of a Globus Guest Collection that's managed by the service. Your data is stored in an ephemeral staging subdirectory, with ACLs that grant only your Globus identity read/write access to it.

from pathlib import Path

from alcf_ai import InferenceClient
from alcf_ai.auth import STAGING_COLLECTION_ROOT

client = InferenceClient()

dataset_path = Path("/path/to/my-dataset.tar")
collection_id = "globus collection uuid"

# Stage in data:
stagein = client.stage_in(collection_id, dataset_path, dataset_path.name)

# Submit SAM3 inference:
resp = client.sam3.submit_batch(
    STAGING_COLLECTION_ROOT + str(stagein.destination_path)
)

# Wait for inference:
result = client.sam3.poll_task_result(resp.task_id)

# Copy results back:
client.stage_out(
    collection_id,
    Path(result.result_path).name,
    dataset_path.with_suffix(".results.tar"),
)

Using an alternate service URL

Both the CLI and the Python client default to the ALCF Inference Service production base URL, https://inference-api.alcf.anl.gov/resource_server/. This can be overridden in a few ways:

  1. By exporting the inference_base_url environment variable.
  2. From the CLI, by passing --base-url to the alcf-ai subcommand.
  3. From the Python client, by passing the kwarg InferenceClient(base_url="...").

Download files

Download the file for your platform.

Source Distribution

alcf_ai-0.4.0.tar.gz (14.9 kB)

Uploaded Source

Built Distribution


alcf_ai-0.4.0-py3-none-any.whl (18.8 kB)

Uploaded Python 3

File details

Details for the file alcf_ai-0.4.0.tar.gz.

File metadata

  • Download URL: alcf_ai-0.4.0.tar.gz
  • Size: 14.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.11.6 (publish) on macOS

File hashes

Hashes for alcf_ai-0.4.0.tar.gz:

  • SHA256: 5d830b38800a25246ed5290f248a80e6af88115eb321430167d578ce02d73263
  • MD5: 3bd17a3d3a06c71a19850359940aeafd
  • BLAKE2b-256: 5c65c8918625c0d417e3d8ac7157b09c626d718fcbbba16e316c39f4c9f5ba11


File details

Details for the file alcf_ai-0.4.0-py3-none-any.whl.

File metadata

  • Download URL: alcf_ai-0.4.0-py3-none-any.whl
  • Size: 18.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.11.6 (publish) on macOS

File hashes

Hashes for alcf_ai-0.4.0-py3-none-any.whl:

  • SHA256: 334c06c5f5c8455317f1e4974d6306580a92038af4d598feae7724845cc62012
  • MD5: 69bd029443805e0c7c72bb60286c6d4d
  • BLAKE2b-256: 766e58c42c2d68c7dafa5569d6dbb2c01ae9025eafb25e243cae7e71deeafb45

