

inference-server

Deploy your AI/ML model to Amazon SageMaker for real-time inference using your own Docker container image.

:blue_book: Documentation: https://inference-server.readthedocs.io
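
A container built with inference-server responds to the HTTP contract that SageMaker requires of bring-your-own inference containers: a GET /ping health check and a POST /invocations inference endpoint, served on port 8080. The following sketch smoke-tests a locally running container using requests; the URL assumes the standard SageMaker port mapping, and the JSON payload is a hypothetical placeholder for whatever input format your model actually accepts.

import requests

# SageMaker calls GET /ping to verify the container is healthy
assert requests.get("http://localhost:8080/ping").status_code == 200

# SageMaker forwards invocation payloads to POST /invocations
response = requests.post(
    "http://localhost:8080/invocations",
    data='{"features": [1.0, 2.0, 3.0]}',  # hypothetical payload
    headers={"Content-Type": "application/json"},
)
print(response.text)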

Installing

python -m pip install inference-server
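
Once the package is installed in your container image and the model is deployed behind a SageMaker endpoint, clients send real-time inference requests through the SageMaker runtime API rather than calling the container directly. A minimal sketch using boto3 follows; the endpoint name and JSON payload are hypothetical placeholders.

import boto3

runtime = boto3.client("sagemaker-runtime")

# SageMaker routes this request to the container's /invocations endpoint
response = runtime.invoke_endpoint(
    EndpointName="my-model",  # hypothetical endpoint name
    ContentType="application/json",
    Body='{"features": [1.0, 2.0, 3.0]}',  # hypothetical payload
)
print(response["Body"].read().decode("utf-8"))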

Developing

To set up a scratch/development virtual environment (under .venv/), first install Tox. Then run:

tox -e dev

The inference-server package is installed in editable mode inside the .venv/ environment.

Run tests by simply calling tox.

Install the code-quality Git hooks using pre-commit install --install-hooks.

Terms & Conditions

Copyright 2023 J.P. Morgan Chase & Co.

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

Contributing

See CONTRIBUTING.md


