Project description
inference-server
Deploy your AI/ML model to Amazon SageMaker for real-time inference using your own Docker container image.
📘 Documentation: https://inference-server.readthedocs.io
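Once a model container built with inference-server is deployed behind a SageMaker real-time endpoint, clients invoke it through the SageMaker runtime API. Below is a minimal sketch using boto3; the endpoint name, content type, and payload are hypothetical placeholders, not part of this package:

```python
import boto3

# SageMaker runtime client; region and credentials come from the usual boto3 configuration.
runtime = boto3.client("sagemaker-runtime")

# Hypothetical endpoint name and JSON payload; substitute your own deployment details.
response = runtime.invoke_endpoint(
    EndpointName="my-model-endpoint",
    ContentType="application/json",
    Body=b'{"features": [1.0, 2.0, 3.0]}',
)

# The response body is a streaming object; read and decode it to get the prediction.
print(response["Body"].read().decode())
```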
Installing
python -m pip install inference-server
Developing
To set up a scratch/development virtual environment (under .venv/), first install Tox. Then run:
tox -e dev
The inference-server package is installed in editable mode inside the .venv/ environment.
Run tests by simply calling tox.
Install code quality Git hooks using pre-commit install --install-hooks.
Terms & Conditions
Copyright 2023 J.P. Morgan Chase & Co.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
Contributing
See CONTRIBUTING.md
Project details
Release history
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution: inference-server-1.0.6.tar.gz (27.2 kB)
Built Distribution: inference_server-1.0.6-py3-none-any.whl (17.1 kB)
File details
Details for the file inference-server-1.0.6.tar.gz.
File metadata
- Download URL: inference-server-1.0.6.tar.gz
- Upload date:
- Size: 27.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.5
File hashes
Algorithm | Hash digest
---|---
SHA256 | b71f21041a053232ed67bfdb54136b53f6c5122ed203c8d1aee6f06c0dd537ba
MD5 | e62c98e6369770f70c2c8acbc4484104
BLAKE2b-256 | ebbd5eb49761d5e540d1835c78df151ae681f7fe5994c550ab229a19aac3029d
File details
Details for the file inference_server-1.0.6-py3-none-any.whl.
File metadata
- Download URL: inference_server-1.0.6-py3-none-any.whl
- Upload date:
- Size: 17.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.5
File hashes
Algorithm | Hash digest
---|---
SHA256 | 63a61cd330809cc86a89a45530cd7784efe0cce31a0cb923439ac57df8bb0b1b
MD5 | ed32ca1e38df8631d9a318ec1a4bffe6
BLAKE2b-256 | 511a346e652f35cd1d71108db7c004752f4ee4db16695d97a2b9c7420c3b68ac
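As a quick integrity check, the SHA256 digests listed above can be verified against a locally downloaded file using Python's hashlib. A small sketch, assuming the source distribution has already been downloaded to the current directory:

```python
import hashlib

# Expected SHA256 for inference-server-1.0.6.tar.gz, copied from the table above.
EXPECTED = "b71f21041a053232ed67bfdb54136b53f6c5122ed203c8d1aee6f06c0dd537ba"

# Path to the locally downloaded source distribution (adjust as needed).
with open("inference-server-1.0.6.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == EXPECTED else f"Mismatch: {digest}")
```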