Minimalistic & easy deployment of PyTorch models on AWS Lambda with C++
Project description
torchlambda is a tool to deploy PyTorch models on Amazon's AWS Lambda using the AWS SDK for C++ and a custom C++ runtime.
Thanks to statically compiled dependencies, the whole package shrinks to only 30MB.
Because the compiled source code is so small, users can ship their models as AWS Lambda layers; services like Amazon S3 are no longer necessary to load your model.
torchlambda keeps its PyTorch & AWS dependencies up to date through a continuous deployment run at 03:00 a.m. every day.
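Because the model is served by a C++ runtime (libtorch), it typically has to be exported to TorchScript first, so it can be loaded without any Python dependency. A minimal sketch of that export step, where `TinyNet` and the `model.ptc` filename are illustrative placeholders for your own network and layer layout:

```python
import torch


class TinyNet(torch.nn.Module):
    """Illustrative stand-in for your trained network."""

    def __init__(self):
        super().__init__()
        self.fc = torch.nn.Linear(3 * 64 * 64, 10)

    def forward(self, x):
        return self.fc(x.flatten(start_dim=1))


model = TinyNet().eval()
# Trace with a representative input; torch.jit.script also works
# for models with data-dependent control flow.
example = torch.randn(1, 3 * 64 * 64)
scripted = torch.jit.trace(model, example)
# The saved TorchScript file is what you ship inside the Lambda layer.
scripted.save("model.ptc")
```

No modification of the network's source code is needed; tracing wraps the existing module as-is.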
Comparison with other deployment tools
- Do one thing and do it well - most deployment tools are complex solutions spanning multiple frameworks and multiple services. torchlambda focuses solely on PyTorch and AWS Lambda integration.
- Write programs to work together - Amazon AWS & PyTorch exist for a reason; there is no need to duplicate their functionality (like aws-cli). No source code modifications of your neural networks are required either.
- Small is beautiful - roughly 3000 LOC (most of it the convenience wrapper that makes up this tool) make it easy to delve into the source code and modify what you want on your own.
- Integration with other tools - since torchlambda focuses on a narrow space, you can use any tools you like with PyTorch (e.g. KubeFlow or BentoML for training) and with AWS (for example Terraform).
- Easy to jump in - no need to learn a new tool. torchlambda has at most 4 commands, simplifying the mostly repetitive steps needed to take a PyTorch model into the cloud, steps which can be automated further.
- Extensible when you need it - all you usually need are a few lines of YAML settings, but if you wish to fine-tune your deployment you can use torchlambda build --flags (changing various properties of the PyTorch and AWS dependencies).
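To give a feel for the "few lines of YAML settings" mentioned above, here is a hedged sketch of what such a settings file might look like. The field names below are illustrative assumptions, not the tool's actual schema; consult the torchlambda documentation for the real keys:

```yaml
# Hypothetical torchlambda settings sketch - illustrative field names only.
model: /opt/model.ptc        # TorchScript model shipped as a Lambda layer
input:
  shape: [1, 3, 64, 64]      # expected tensor shape of the request payload
  type: float
return:
  output: true               # send the raw network output back as JSON
```

The idea is that a short declarative file like this drives code generation for the C++ handler, so most deployments never touch C++ directly.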
Footnotes
1. Support for the latest version of its main DL framework, or main frameworks if multiple are supported
2. Project dependencies are easily customizable. In torchlambda these would be user-specified build procedures for libtorch and the AWS C++ SDK
3. Necessary size of code and dependencies to deploy the model
4. Based on Dockerfile size
Project details
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file torchlambda-1587793412.tar.gz.
File metadata
- Download URL: torchlambda-1587793412.tar.gz
- Upload date:
- Size: 21.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/46.1.3 requests-toolbelt/0.9.1 tqdm/4.45.0 CPython/3.7.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `c196c593329081bf64ad3c8110de9773affe6d3fed53ed583ee118a69b1e518e` |
| MD5 | `413f48204cc6af4e2a7a2b79c35119dd` |
| BLAKE2b-256 | `042e8f6df2c36b307ac73a95f00387b066a339b6af6d7d539b9de156bfebd689` |
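A downloaded file can be checked against the published digests locally. A small stdlib-only sketch (the streaming read is a common pattern so large downloads need not fit in memory):

```python
import hashlib


def file_sha256(path, chunk_size=1 << 16):
    """Return the hex SHA256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            digest.update(block)
    return digest.hexdigest()


# Compare against the digest published above, e.g.:
# assert file_sha256("torchlambda-1587793412.tar.gz") == "c196c593..."
```

The same pattern works for MD5 or BLAKE2b by swapping `hashlib.sha256` for `hashlib.md5` or `hashlib.blake2b`.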
File details
Details for the file torchlambda-1587793412-py3-none-any.whl.
File metadata
- Download URL: torchlambda-1587793412-py3-none-any.whl
- Upload date:
- Size: 29.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/46.1.3 requests-toolbelt/0.9.1 tqdm/4.45.0 CPython/3.7.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `0d7161d634e9560f5746a7709c6f375341f0179945be8af6f731084888dc19e8` |
| MD5 | `5beba4b2d328f0133a3fc1c9307915e3` |
| BLAKE2b-256 | `89abd67bfedf03edfed88606ca397cd6a76fd96426c0b5952561cbff4937e1d7` |