SageMaker Shim for Grand Challenge

This repo contains a library that adapts algorithms that implement the Grand Challenge inference API for running in SageMaker.

The application contains:

  • A click CLI client with options to launch a web server
  • A fastapi web server that implements the SageMaker endpoints
  • pydantic models that interface with S3 and run the original inference jobs (see the sketch below)
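
For illustration only, here is a minimal sketch of what such pydantic models could look like, mirroring the /invocations payload documented below; the actual class and field names inside the package may differ:

    # Hypothetical sketch: field names mirror the /invocations payload shown
    # later in this document; the real models in sagemaker_shim may differ.
    from pydantic import BaseModel


    class InferenceInput(BaseModel):
        relative_path: str        # path of the file below /input
        bucket_name: str          # S3 bucket holding the input file
        bucket_key: str           # key of the input file within the bucket
        decompress: bool = False  # whether to decompress after download


    class InferenceTask(BaseModel):
        pk: str                       # unique id of the inference task
        inputs: list[InferenceInput]  # files to download before running the job
        output_bucket_name: str       # S3 bucket for the contents of /output
        output_prefix: str            # key prefix for the uploaded outputs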

The application is compiled on Python 3.10 using pyinstaller, and then distributed as a statically linked binary using staticx. It is able to adapt any container, including ones based on scratch or alpine images.

Usage

The binary is designed to be added to an existing container image that implements the Grand Challenge API. On Grand Challenge this happens automatically: crane is used to add the binary, directories, and environment variables to each container image. The binary itself will:

  1. Download the input files from the provided locations on S3 to /input, optionally decompressing the inputs.
  2. Execute the original container program in a subprocess. The program is found by inspecting the following environment variables (a decoding sketch follows this list):
    • GRAND_CHALLENGE_COMPONENT_CMD_B64J: the original cmd of the container, JSON encoded as a base64 string.
    • GRAND_CHALLENGE_COMPONENT_ENTRYPOINT_B64J: the original entrypoint of the container, JSON encoded as a base64 string.
  3. Upload the contents of /output to the given output S3 bucket and prefix.
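
To illustrate step 2, the sketch below shows how the original cmd and entrypoint could be recovered from those variables, following the documented encoding (base64 wrapping JSON). The helper name is made up for the example and is not the shim's internal API:

    # Illustrative only: decode an environment variable that holds
    # base64-encoded JSON, as described for the *_B64J variables above.
    import base64
    import json
    import os


    def decode_b64j(name: str) -> list[str] | None:
        value = os.environ.get(name)
        if value is None:
            return None
        return json.loads(base64.b64decode(value))


    cmd = decode_b64j("GRAND_CHALLENGE_COMPONENT_CMD_B64J")
    entrypoint = decode_b64j("GRAND_CHALLENGE_COMPONENT_ENTRYPOINT_B64J")
    print(entrypoint, cmd)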

sagemaker-shim serve

This starts the web server on http://0.0.0.0:8080, which implements the SageMaker API. There are three endpoints:

  • /ping (GET): returns an empty 200 response if the container is healthy

  • /execution-parameters (GET): returns the preferred execution parameters for AWS SageMaker Batch Inference

  • /invocations (POST): SageMaker makes POST requests to this endpoint. The body contains the JSON encoded data required to run a single inference task:

      {
          "pk": "unique-test-id",
          "inputs": [
              {
                  "relative_path": "interface/path",
                  "bucket_name": "name-of-input-bucket",
                  "bucket_key": "/path/to/input/file/in/bucket",
                  "decompress": false
              },
              ...
          ],
          "output_bucket_name": "name-of-output-bucket",
          "output_prefix": "/prefix/of/output/files"
      }
    

    The endpoint will return an object containing the return code of the subprocess in response["return_code"], and any outputs will be placed in the output bucket at the output prefix.
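
For example, a single task could be submitted to a locally running shim as follows; the host and port are those given above, and the bucket names, keys, and pk are placeholder values:

    # Example client for the endpoints exposed by `sagemaker-shim serve`.
    # Bucket names, keys, and the pk below are placeholder values.
    import requests

    task = {
        "pk": "unique-test-id",
        "inputs": [
            {
                "relative_path": "interface/path",
                "bucket_name": "name-of-input-bucket",
                "bucket_key": "/path/to/input/file/in/bucket",
                "decompress": False,
            }
        ],
        "output_bucket_name": "name-of-output-bucket",
        "output_prefix": "/prefix/of/output/files",
    }

    # Health check: an empty 200 response means the container is healthy.
    requests.get("http://0.0.0.0:8080/ping").raise_for_status()

    # Run a single inference task and inspect the subprocess return code.
    response = requests.post("http://0.0.0.0:8080/invocations", json=task)
    print(response.json()["return_code"])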

Patching an Existing Container

To patch an existing container image in a registry, see the example in tests/utils.py. First, get the original cmd and entrypoint using get_new_env_vars and get_image_config. Then add the binary and set the new cmd, entrypoint, and environment variables with mutate_image.
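
A rough sketch of that flow is shown below; the call signatures are assumptions made for illustration, so check tests/utils.py for the real interface:

    # Illustrative sketch only: the signatures of these helpers are assumed,
    # not taken from tests/utils.py. The image reference is a placeholder.
    from tests.utils import get_image_config, get_new_env_vars, mutate_image

    repo_tag = "registry.example.com/my-algorithm:latest"  # placeholder

    # Recover the original cmd and entrypoint from the image config and turn
    # them into the *_B64J environment variables the shim expects.
    config = get_image_config(repo_tag)
    env_vars = get_new_env_vars(config)

    # Add the sagemaker-shim binary and apply the new cmd, entrypoint, and
    # environment variables, pushing the patched image back to the registry.
    mutate_image(repo_tag, env_vars)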
