
Minimalistic & easy deployment of PyTorch models on AWS Lambda with C++

Project description

torchlambda is a tool for deploying PyTorch models on Amazon's AWS Lambda using the AWS SDK for C++ and a custom C++ runtime.

Thanks to statically compiled dependencies, the whole deployment package is shrunk to only 30MB.

Due to the small size of the compiled source code, users can pass their models as AWS Lambda layers. Services like Amazon S3 are no longer necessary to load your model.
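Shipping the model as a Lambda layer boils down to zipping the serialized network and publishing the archive. A minimal sketch using only the Python standard library (the filenames and archive layout here are hypothetical illustrations, not something torchlambda prescribes):

```python
import os
import zipfile


def package_model_layer(model_path, archive_path, arcname="model.ptc"):
    """Zip a serialized model so it can be published as an AWS Lambda layer.

    Lambda extracts layers under /opt, so the file will be visible to the
    function at /opt/<arcname>.
    """
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(model_path, arcname=arcname)


# Demo with a placeholder file standing in for a real TorchScript export:
with open("model.ptc", "wb") as f:
    f.write(b"\x00" * 16)

package_model_layer("model.ptc", "model-layer.zip")
```

On the AWS side the archive could then be published with `aws lambda publish-layer-version --layer-name <name> --zip-file fileb://model-layer.zip`; because Lambda mounts layers under `/opt`, the function can load the model from a local path instead of fetching it from S3.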

torchlambda keeps its PyTorch & AWS dependencies up to date thanks to a continuous deployment run at 03:00 a.m. every day.


Comparison with other deployment tools

  • Do one thing and do it well - most deployment tools are complex solutions spanning multiple frameworks and services. torchlambda focuses solely on PyTorch and AWS Lambda integration.
  • Write programs to work together - Amazon AWS tools exist for a reason; there is no need to duplicate their functionality (e.g. aws-cli) or interfere with your PyTorch source code.
  • Small is beautiful - roughly 3000 LOC, most of which is Python argument parsing, makes it relatively easy to delve into the source code and modify it on your own.
  • Focus on lightweight inference - you can't train your neural network with torchlambda. Other tools exist for that and should be used when appropriate.
  • Easy to jump in - no need to learn a new tool. torchlambda has at most 4 commands, simplifying the steps to take a PyTorch model into the cloud.
  • Extensible when you need it - no additional source code is needed (except YAML settings), but writing your own is fully possible. You can also tune the PyTorch and AWS SDK dependencies for your specific use case.
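To illustrate the "YAML settings" approach mentioned above, a deployment configuration might look roughly like the sketch below. The field names are invented for this example and are not torchlambda's actual schema; consult the project documentation for the real one:

```yaml
# Hypothetical settings sketch -- field names are illustrative only.
model: /opt/model.ptc        # model shipped as a Lambda layer, mounted under /opt
input:
  shape: [1, 3, 224, 224]    # batch x channels x height x width
  type: float
return:
  type: int
  operation: argmax          # return the predicted class index
```

The idea is that a declarative file like this fully describes inference, so no C++ or Python glue code has to be written for the common case.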


Footnotes

1. Support for the latest version of its main DL framework (or main frameworks, if multiple are supported).

2. Project dependencies are easily customizable. In torchlambda these are the user-specified build procedures for libtorch and the AWS C++ SDK.

3. The size of code and dependencies necessary to deploy a model.

4. Based on Dockerfile size.

Project details



Download files

Source distribution: torchlambda-1587188625.tar.gz (21.5 kB)

Built distribution: torchlambda-1587188625-py3-none-any.whl (28.7 kB, Python 3)
