Minimalistic & easy deployment of PyTorch models on AWS Lambda with C++
Project description
torchlambda is a tool to deploy PyTorch models on Amazon's AWS Lambda using the AWS SDK for C++ and a custom C++ runtime.
Thanks to statically compiled dependencies, the whole package is shrunk to only 30MB.
Due to the small size of the compiled source code, users can pass their models as AWS Lambda layers. Services like Amazon S3 are no longer necessary to load your model.
torchlambda keeps its PyTorch & AWS dependencies up to date thanks to continuous deployment runs at 03:00 a.m. every day.
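Since inference runs through libtorch inside the C++ runtime, the model is typically exported to TorchScript before deployment. The sketch below shows such an export; the ResNet model and the `model.ptc` file name are illustrative placeholders, not names mandated by torchlambda.

```python
import torch
import torchvision

# Any torch.nn.Module works here; ResNet-18 is just a placeholder.
model = torchvision.models.resnet18()
model.eval()

# Script the model so the C++ runtime can run it without a Python interpreter.
scripted = torch.jit.script(model)

# The file name is arbitrary; this archive is what you package (e.g. as a Lambda layer).
scripted.save("model.ptc")
```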
Comparison with other deployment tools
- Do one thing and do it well - most deployment tools are complex solutions spanning multiple frameworks and multiple services. torchlambda focuses solely on PyTorch and AWS Lambda integration.
- Write programs to work together - Amazon AWS tools exist for a reason; there is no need to repeat their functionality (like aws-cli) or interfere with your PyTorch source code.
- Small is beautiful - around 3000 LOC, most of it Python argument parsing, make it relatively easy to delve into the source code and modify it on your own.
- Focus on lightweight inference - you can't train your neural network with torchlambda. There are other tools for that, and they should be used when appropriate.
- Easy to jump in - no need to learn a new tool. torchlambda has at most 4 commands, simplifying the steps needed to take a PyTorch model into the cloud (an example invocation of a deployed function is sketched after this list).
- Extensible when you need it - no additional source code is needed (except YAML settings), but writing your own is fully possible if you need it. You can also tune the PyTorch and AWS SDK dependencies for your specific use case.
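Once a function built with torchlambda and its model layer are deployed, inference is an ordinary AWS Lambda invocation. The sketch below uses boto3; the function name `torchlambda-demo` and the payload shape are assumptions for illustration only, since the actual request format depends on your YAML settings and C++ handler.

```python
import json

import boto3

lambda_client = boto3.client("lambda")

# Hypothetical function name and input schema; adjust both to match your deployment.
payload = {"data": [0.0] * (1 * 3 * 224 * 224)}

response = lambda_client.invoke(
    FunctionName="torchlambda-demo",
    Payload=json.dumps(payload).encode("utf-8"),
)

# The response body is whatever your handler returns, typically JSON.
print(json.loads(response["Payload"].read()))
```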
Footnotes
1. Support for the latest version of its main DL framework (or main frameworks if multiple are supported)
2. Project dependencies are easily customizable. In torchlambda these would be user-specified build procedures for libtorch and the AWS C++ SDK
3. Necessary size of code and dependencies needed to deploy a model
4. Based on Dockerfile size
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file torchlambda-1587188625.tar.gz.
File metadata
- Download URL: torchlambda-1587188625.tar.gz
- Upload date:
- Size: 21.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/46.1.3 requests-toolbelt/0.9.1 tqdm/4.45.0 CPython/3.7.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | 45e98241e21a7bfa2db04e1c3c3bfa070a7efa0191c2209dd46ebf511eefd325
MD5 | a706ee4addea87ecc377143f8da55e63
BLAKE2b-256 | 97d7be684b343bb63107ac9bc32a50c6173470c69acb7307076683b104710d2a
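To verify a downloaded archive against the digests above, the hash can be recomputed locally. A minimal sketch, assuming the source distribution was saved to the current directory:

```python
import hashlib

# SHA256 digest listed above for torchlambda-1587188625.tar.gz.
expected = "45e98241e21a7bfa2db04e1c3c3bfa070a7efa0191c2209dd46ebf511eefd325"

with open("torchlambda-1587188625.tar.gz", "rb") as archive:
    digest = hashlib.sha256(archive.read()).hexdigest()

print("OK" if digest == expected else "Hash mismatch!")
```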
File details
Details for the file torchlambda-1587188625-py3-none-any.whl.
File metadata
- Download URL: torchlambda-1587188625-py3-none-any.whl
- Upload date:
- Size: 28.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/46.1.3 requests-toolbelt/0.9.1 tqdm/4.45.0 CPython/3.7.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | 7defe452d47d4d72cac1b34c276a8b2493dc7131ff0465108c5e2a826947192e
MD5 | 5a0bda1035a41f807312af764c035b5c
BLAKE2b-256 | ccb0485e84e9c4d71f81a8f3b65287befe73408ffaeef873bdd45b60176bbf03