
ALCF Inference Gateway SDK


ALCF AI Inference Services SDK

This package provides Python client and CLI tools to facilitate usage of the ALCF AI Inference services.

Command Line Usage

Quick Start

# Log in with Globus:
uvx alcf-ai auth login

# Chat with a model
# The default --model is meta-llama/Llama-4-Scout-17B-16E-Instruct
uvx alcf-ai chat "How do I know Pi is irrational? Be concise."

Auth

# Login for Inference Service only:
uvx alcf-ai auth login

# Login for Inference+Globus data transfers
# (append :data_access only if required for your collection)
SOURCE_COLLECTION="your globus collection UUID"
uvx alcf-ai auth login --authorize-transfers $SOURCE_COLLECTION:data_access

# Get an access token to use externally:
token=$(uvx alcf-ai auth get-access-token)
curl -H "Authorization: Bearer $token" https://inference-api.alcf.anl.gov/resource_server/list-endpoints | jq
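The printed token works from any HTTP client. Below is a minimal Python sketch using only the standard library; the URL is the same one passed to curl above, and the actual network call is left commented out since it needs a valid token:

```python
# Call the same endpoint from Python using the token printed by
# `alcf-ai auth get-access-token`.
import json
import urllib.request

API_URL = "https://inference-api.alcf.anl.gov/resource_server/list-endpoints"

def build_request(token: str) -> urllib.request.Request:
    """Build an authenticated GET request for the endpoint listing."""
    return urllib.request.Request(
        API_URL,
        headers={"Authorization": f"Bearer {token}"},
    )

# req = build_request(token)
# with urllib.request.urlopen(req) as resp:
#     endpoints = json.load(resp)
```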

Discovering Models

To list the models and corresponding API endpoints that are currently available, use:

uvx alcf-ai ls-endpoints

To view the status of models that are currently hot or starting up on a cluster, use:

# Replace "sophia" with "metis" to query the other cluster
uvx alcf-ai ls-jobs sophia

Chat with an LLM

# See detailed options:
uvx alcf-ai chat --help

# For example:
uvx alcf-ai chat --model google/gemma-4-31B-it --stream --temp 0.3 --max-tokens 100 "What is KL divergence? Answer in less than 75 words."

Segment images with SAM3

You can segment your images with the Meta SAM3 model.

Send a single image URI plus a text prompt for segmentation:

uvx alcf-ai sam3 submit-image \
  https://raw.githubusercontent.com/masalim2/sam3-service/refs/heads/main/examples/images/groceries.jpg \
  "Baguette" \
  --save-preview ~/test-baguettes.png

Batch Processing

For high throughput, preprocess and bundle your images and prompts into the WebDataset format using the built-in CLI tool:

# Bundle all .tiff files in a directory with 3 prompts. Creates WebDataset
# tar files in --output-dir, with 100 images per .tar:
uvx alcf-ai sam3 create-webdataset \
    /path/to/tiff-stack \
    .tiff \
    "Phloem Fibers" "Hydrated Xylem vessels" "Air-based Pith cells" \
    --output-dir test-wds --shard-size=100 --num-workers=4
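The output shards are ordinary tar files following the WebDataset convention: every entry belonging to one sample shares a basename, and the extension distinguishes the files within a sample. A stdlib-only sketch for inspecting a shard (the sample keys and extensions here are illustrative, not necessarily what create-webdataset emits):

```python
# Group a WebDataset shard's members by sample key (shared basename).
import tarfile
from collections import defaultdict

def list_samples(shard_path: str) -> dict[str, list[str]]:
    """Map each sample key to the extensions bundled for it."""
    samples = defaultdict(list)
    with tarfile.open(shard_path) as tar:
        for member in tar.getmembers():
            key, _, ext = member.name.partition(".")
            samples[key].append(ext)
    return dict(samples)
```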

If the dataset is on a Globus Collection, you can authorize the CLI to transfer it to the inference service:

# Look up the UUID of your collection:
SOURCE_COLLECTION="your globus collection UUID"

# Append ":data_access" if this scope is required:
uvx alcf-ai auth login --authorize-transfers $SOURCE_COLLECTION:data_access

Then use the tool to drive data staging and batch inference:

SAM3_FINETUNE=/eagle/inference_service/sam3-service/weights/synaps-i
SECONDS=0

for f in test-wds/*.tar
do
    uvx alcf-ai sam3 submit-batch $SOURCE_COLLECTION $f --weights-dir-override $SAM3_FINETUNE >> batch-inference.log 2>&1 &
done
wait
echo "Completed in $SECONDS seconds."
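The same fan-out can be written in Python if you prefer it over a shell loop. In this sketch, submit_fn is a hypothetical stand-in for whatever submits one shard (a subprocess call to the CLI, or the SDK's client.sam3.submit_batch shown below); it is injected so the pattern itself is service-independent:

```python
# Fan out one submission per shard and wait for all of them,
# mirroring the shell loop above.
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path
from typing import Callable

def submit_all(shards: list[Path], submit_fn: Callable[[Path], str],
               max_workers: int = 8) -> dict[Path, str]:
    """Submit every shard concurrently; return per-shard results."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {shard: pool.submit(submit_fn, shard) for shard in shards}
        return {shard: fut.result() for shard, fut in futures.items()}
```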

You can preview the segmentation results in a batch by passing the paths to the input and result tar files:

uvx alcf-ai sam3 preview-batch-results shard-00004.tar shard-00004.results.tar

SDK Usage

You can use pip install alcf-ai or uv run --with alcf-ai python to add the SDK to your environment:

uv run --with alcf-ai python

OpenAI Client

Use alcf_ai.InferenceClient to construct an OpenAI client for any ALCF-backed cluster. This reuses your auth and ensures that requests are sent to the right URL:

from alcf_ai import InferenceClient
from rich import print

# Automatically uses cached refresh tokens from previous login:
client = InferenceClient()

# Programmatically discover endpoints:
print(client.list_endpoints()["clusters"]["sophia"])

# Get an OpenAI API client for an ALCF cluster:
oai = client.clusters("sophia").openai
print(
    oai.chat.completions.create(
        model="openai/gpt-oss-120b",
        messages=[{"role": "user", "content": "Hello there!"}],
    )
)
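Beyond printing one cluster's entry, you may want to work with the listing programmatically. The helper below assumes only the top-level "clusters" key used in the example above; everything deeper in the response is service-defined:

```python
# Pick out cluster names from a list_endpoints() response.
def cluster_names(endpoints: dict) -> list[str]:
    """Return the ALCF cluster names present in an endpoint listing."""
    return sorted(endpoints.get("clusters", {}))

# e.g. cluster_names(client.list_endpoints()) might include "sophia"
```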

Data Movement and SAM3

You can use the same InferenceClient to move data in and out of a Globus Guest Collection that's managed by the service. Your data is stored in an ephemeral staging subdirectory, with ACLs that grant only your Globus identity read/write access to it.

from pathlib import Path

from alcf_ai import InferenceClient
from alcf_ai.auth import STAGING_COLLECTION_ROOT

client = InferenceClient()

dataset_path = Path("/path/to/my-dataset.tar")
collection_id = "globus collection uuid"

# Stage in data:
stagein = client.stage_in(collection_id, dataset_path, dataset_path.name)

# Submit SAM3 inference:
resp = client.sam3.submit_batch(
    STAGING_COLLECTION_ROOT + str(stagein.destination_path)
)

# Wait for inference:
result = client.sam3.poll_task_result(resp.task_id)

# Copy results back:
client.stage_out(
    collection_id,
    Path(result.result_path).name,
    dataset_path.with_suffix(".results.tar"),
)
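poll_task_result blocks until the task finishes. If you want an explicit deadline around any polling-style call, a generic helper like the one below can wrap it; check_fn is a hypothetical one-shot status check that returns None while the task is still running:

```python
# Generic bounded polling: call check_fn until it returns a result
# or the deadline passes.
import time
from typing import Callable, Optional, TypeVar

T = TypeVar("T")

def poll_until(check_fn: Callable[[], Optional[T]], timeout_s: float,
               interval_s: float = 1.0) -> T:
    """Poll check_fn every interval_s seconds, up to timeout_s total."""
    deadline = time.monotonic() + timeout_s
    while True:
        result = check_fn()
        if result is not None:
            return result
        if time.monotonic() >= deadline:
            raise TimeoutError("polling deadline exceeded")
        time.sleep(interval_s)
```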
