Project description

inference-server

Deploy your AI/ML model to Amazon SageMaker for Real-Time Inference and Batch Transform using your own Docker container image.

📘 Documentation: https://inference-server.readthedocs.io

Installing

Using a package manager such as pip or Poetry:

python -m pip install inference-server

Using a package manager such as conda or Mamba:

conda install conda-forge::inference-server

See https://github.com/conda-forge/inference-server-feedstock for details.
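
Once installed, serving a model comes down to implementing the SageMaker-style hosting functions. The sketch below is illustrative only: the hook names (model_fn, input_fn, predict_fn, output_fn) follow the SageMaker hosting convention, the toy JSON "model" is hypothetical, and the mechanism for registering these hooks with inference-server (its plugin system) is described in the documentation.

import json

def model_fn(model_dir):
    """Load the model once at startup from the model directory."""
    # Hypothetical: replace with your framework's deserialization.
    with open(f"{model_dir}/model.json") as f:
        return json.load(f)

def input_fn(input_data, content_type):
    """Deserialize the request payload."""
    assert content_type == "application/json"
    return json.loads(input_data)

def predict_fn(data, model):
    """Run inference: scale each feature by a stored weight (toy example)."""
    return [x * model["weight"] for x in data["features"]]

def output_fn(prediction, accept):
    """Serialize the prediction into the response payload."""
    assert accept == "application/json"
    return json.dumps({"prediction": prediction})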

Developing

To set up a scratch/development virtual environment (under .venv/), first install Tox, then run:

tox -e dev

The inference-server package is installed in editable mode inside the .venv/ environment.

Run tests by simply calling tox.

Install code quality Git hooks using pre-commit install --install-hooks.
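
Putting the above together, a typical first-time setup might look like this (assuming Tox and pre-commit are installed from PyPI):

python -m pip install tox pre-commit
tox -e dev
pre-commit install --install-hooks
tox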

Terms & Conditions

Copyright 2023 J.P. Morgan Chase & Co.

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

Contributing

See CONTRIBUTING.md

Download files

Download the file for your platform.

Source Distribution

inference_server-1.3.1.tar.gz (31.6 kB)

Uploaded Source

Built Distribution

inference_server-1.3.1-py3-none-any.whl (19.3 kB)

Uploaded Python 3

File details

Details for the file inference_server-1.3.1.tar.gz.

File metadata

  • Download URL: inference_server-1.3.1.tar.gz
  • Upload date:
  • Size: 31.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for inference_server-1.3.1.tar.gz
  • SHA256: d4842ea8692d5396ac31a53ee3358f17485291767f9069ca2fcc3ffa250b60ba
  • MD5: d9fb39fbe91af165a020554c65bdb5c4
  • BLAKE2b-256: f1d04db3ff80b91fa7fefcbecc27bac75a2832479e45b89ea56f58e985b7cc49

See the PyPI documentation for more details on using file hashes.
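
As a quick illustration, the SHA256 digest above can be verified locally with Python's standard hashlib module (assuming the sdist has been downloaded to the current directory):

import hashlib

EXPECTED_SHA256 = "d4842ea8692d5396ac31a53ee3358f17485291767f9069ca2fcc3ffa250b60ba"

with open("inference_server-1.3.1.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

assert digest == EXPECTED_SHA256, f"Hash mismatch: {digest}"
print("SHA256 verified")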

File details

Details for the file inference_server-1.3.1-py3-none-any.whl.

File hashes

Hashes for inference_server-1.3.1-py3-none-any.whl
  • SHA256: e85e611ac29c44615a22abcdd08b2fc73d1dfe4659ee602c193e51d95045fd45
  • MD5: 9adff7ba367651a7b51969bc756c037c
  • BLAKE2b-256: 75bbd7fa5a45cd7032682705b2dd8541a6c2d757f0e918726a73cfefbc16d8dc

See the PyPI documentation for more details on using file hashes.
