SageMaker Shim for Grand Challenge
This repo contains a library that adapts algorithms that implement the Grand Challenge inference API for running in SageMaker.
The application contains:

- A click CLI client with options to launch a web server
- A fastapi web server that implements the SageMaker endpoints
- pydantic models that handle the interface with S3 and run the original inference jobs
The application is compiled on Python 3.12 using pyinstaller, and then distributed as a statically linked binary using staticx. It is able to adapt any container, including ones based on scratch or alpine images.
Usage
The binary is designed to be added to an existing container image that implements the Grand Challenge API. On Grand Challenge this happens automatically by using crane to add the binary, directories and environment variables to each container image. The binary itself will:
- Download the input files from the provided locations on S3 to /input, optionally decompressing the inputs.
- Execute the original container program in a subprocess. This is found by inspecting the following environment variables (see the sketch after this list):
  - GRAND_CHALLENGE_COMPONENT_CMD_B64J: the original cmd of the container, JSON encoded as a base64 string.
  - GRAND_CHALLENGE_COMPONENT_ENTRYPOINT_B64J: the original entrypoint of the container, JSON encoded as a base64 string.
- Upload the contents of /output to the given output S3 bucket and prefix.
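For illustration only, here is a minimal sketch of how a container's original cmd and entrypoint could be encoded into the B64J format described above; the example values are hypothetical and not taken from any real image.

```python
import base64
import json

# Hypothetical original image configuration (placeholder values)
original_cmd = ["python", "-m", "my_algorithm"]
original_entrypoint = ["/usr/bin/tini", "--"]


def to_b64j(value):
    """JSON encode a value and wrap it in a base64 string."""
    return base64.b64encode(json.dumps(value).encode("utf-8")).decode("utf-8")


# The environment variables sagemaker-shim inspects to find the original program
env_vars = {
    "GRAND_CHALLENGE_COMPONENT_CMD_B64J": to_b64j(original_cmd),
    "GRAND_CHALLENGE_COMPONENT_ENTRYPOINT_B64J": to_b64j(original_entrypoint),
}

print(env_vars)
```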
Logging
CloudWatch does not offer separation of stdout and stderr by default. sagemaker-shim includes a logging filter and formatter that creates structured logs from the application and the subprocess. This allows Grand Challenge to separate out internal, external, stdout and stderr streams.
These structured logs are JSON objects with the format:
{
"log": "", // The original log message
"level": "CRITICAL" | "ERROR" | "WARNING" | "INFO" | "DEBUG" | "NOTSET", // The severity level of the log
"source": "stdout" | "stderr", // The source stream
"internal": true | false, // Whether the source of the log is from sagemaker shim or the subprocess
"task": "" | null, // The ID of the task
}
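sagemaker-shim emits these records itself; purely as an illustration, a consumer of the logs could split the streams like this. The sample log lines below are made up.

```python
import json

# Made-up structured log lines in the format described above
raw_lines = [
    '{"log": "downloading inputs", "level": "INFO", "source": "stdout", "internal": true, "task": "unique-test-id"}',
    '{"log": "model warning", "level": "WARNING", "source": "stderr", "internal": false, "task": "unique-test-id"}',
]

for raw in raw_lines:
    record = json.loads(raw)
    origin = "sagemaker-shim" if record["internal"] else "subprocess"
    print(f'[{record["level"]}] {origin}/{record["source"]}: {record["log"]}')
```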
sagemaker-shim serve
This starts the webserver on http://0.0.0.0:8080 which implements the SageMaker API. There are three endpoints:
- /ping (GET): returns an empty 200 response if the container is healthy
- /execution-parameters (GET): returns the preferred execution parameters for AWS SageMaker Batch Inference
- /invocations (POST): SageMaker can make POST requests to this endpoint. The body contains the JSON encoded data required to run a single inference task:

  {
      "pk": "unique-test-id",
      "inputs": [
          {
              "relative_path": "interface/path",
              "bucket_name": "name-of-input-bucket",
              "bucket_key": "/path/to/input/file/in/bucket",
              "decompress": false
          },
          ...
      ],
      "output_bucket_name": "name-of-output-bucket",
      "output_prefix": "/prefix/of/output/files"
  }
The endpoint will return an object containing the return code of the subprocess in response["return_code"], and any outputs will be placed in the output bucket at the output prefix. A file with the inference outputs will also be located at s3://<output_bucket_name>/<output_prefix>/.sagemaker_shim/inference_result.json.
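As a sketch of how a client could call this endpoint against a locally running sagemaker-shim serve, using only the standard library; the bucket names, keys and prefixes are placeholders.

```python
import json
import urllib.request

# Placeholder task definition matching the body format shown above
task = {
    "pk": "unique-test-id",
    "inputs": [
        {
            "relative_path": "interface/path",
            "bucket_name": "name-of-input-bucket",
            "bucket_key": "/path/to/input/file/in/bucket",
            "decompress": False,
        }
    ],
    "output_bucket_name": "name-of-output-bucket",
    "output_prefix": "/prefix/of/output/files",
}

request = urllib.request.Request(
    url="http://0.0.0.0:8080/invocations",
    data=json.dumps(task).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

with urllib.request.urlopen(request) as response:
    result = json.loads(response.read())

# The subprocess return code is reported in the response body
print("return code:", result["return_code"])
```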
sagemaker-shim invoke
This will invoke the model directly given the arguments. You can specify either:

- -f / --file: the S3 URI of a JSON file containing a list of task definitions, e.g. s3://my-bucket/invocations.json
- -t / --tasks: a JSON string of task definitions

In both cases the contents of the file or string will be an array of task objects:
[
{
"pk": "unique-test-id-1",
"inputs": [
...
],
"output_bucket_name": "name-of-output-bucket",
"output_prefix": "/prefix/of/output/files-1",
},
{
"pk": "unique-test-id-2",
"inputs": [
...
],
"output_bucket_name": "name-of-output-bucket",
"output_prefix": "/prefix/of/output/files-2",
}
]
A file with the inference outputs will be located at s3://<output_bucket_name>/<output_prefix>/.sagemaker_shim/inference_result.json.
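As a sketch of preparing a task file for the -f option, assuming boto3 is installed and credentials for a bucket you control are configured; all names below are placeholders.

```python
import json

import boto3

# Placeholder task definitions in the array format shown above
tasks = [
    {
        "pk": "unique-test-id-1",
        "inputs": [],
        "output_bucket_name": "name-of-output-bucket",
        "output_prefix": "/prefix/of/output/files-1",
    },
    {
        "pk": "unique-test-id-2",
        "inputs": [],
        "output_bucket_name": "name-of-output-bucket",
        "output_prefix": "/prefix/of/output/files-2",
    },
]

# Upload the task definitions so they can be passed as
#   sagemaker-shim invoke -f s3://my-bucket/invocations.json
s3 = boto3.client("s3")
s3.put_object(
    Bucket="my-bucket",
    Key="invocations.json",
    Body=json.dumps(tasks).encode("utf-8"),
)
```

Alternatively, the same array can be passed inline as a string with -t.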
Patching an Existing Container
To patch an existing container image in a registry, see the example in tests/utils.py. First you will need to get the original cmd and entrypoint using get_new_env_vars and get_image_config. Then you can add the binary, set the new cmd, entrypoint, and environment variables with mutate_image.
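The helpers in tests/utils.py are the reference for this flow, and their exact signatures are not reproduced here. Purely as an illustrative sketch of the first step, the snippet below shells out to crane config to read an image's configuration and derives the B64J environment variables from it; the image reference is a placeholder and error handling is omitted.

```python
import base64
import json
import subprocess

IMAGE = "registry.example.com/my-algorithm:latest"  # placeholder image reference


def b64j(value):
    """JSON encode a value and wrap it in a base64 string."""
    return base64.b64encode(json.dumps(value).encode("utf-8")).decode("utf-8")


# Read the original image configuration with crane
config = json.loads(
    subprocess.run(
        ["crane", "config", IMAGE], check=True, capture_output=True
    ).stdout
)

original_cmd = config["config"].get("Cmd") or []
original_entrypoint = config["config"].get("Entrypoint") or []

# These are among the environment variables the patched image needs so that
# sagemaker-shim can start the original program
for key, value in {
    "GRAND_CHALLENGE_COMPONENT_CMD_B64J": b64j(original_cmd),
    "GRAND_CHALLENGE_COMPONENT_ENTRYPOINT_B64J": b64j(original_entrypoint),
}.items():
    print(f"{key}={value}")
```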
File details
Details for the file sagemaker_shim-0.3.5.tar.gz.
File metadata
- Download URL: sagemaker_shim-0.3.5.tar.gz
- Upload date:
- Size: 20.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.3 CPython/3.12.4 Linux/6.5.0-1025-azure
File hashes
Algorithm | Hash digest
---|---
SHA256 | 4749a31bb28d80d5d337b3c5021fd62c0cc31dd4131e2f4a13cdd02716632912
MD5 | e2b4bb6ab1265c7e925d27773cf5e6b6
BLAKE2b-256 | dcc99d22a6cf41b286249e4c25ebc14fc140fa6f2637ec8d141198c288078292
File details
Details for the file sagemaker_shim-0.3.5-py3-none-any.whl.
File metadata
- Download URL: sagemaker_shim-0.3.5-py3-none-any.whl
- Upload date:
- Size: 21.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.3 CPython/3.12.4 Linux/6.5.0-1025-azure
File hashes
Algorithm | Hash digest
---|---
SHA256 | ccbfbd1d668f5c493583281353b2ed899150dd47cf41684599e48aae1af7cbd6
MD5 | 0fdb80f9ba7d734f0c924220c665edd4
BLAKE2b-256 | 9b1c52d6651e810592c81fbc0a747f333bd562e10be1adb294474b5afcb77528