
Minimalistic & easy deployment of PyTorch models on AWS Lambda with C++

Project description

torchlambda is a tool to deploy PyTorch models on Amazon's AWS Lambda using the AWS SDK for C++ and a custom C++ runtime.

Thanks to statically compiled dependencies, the whole deployment package shrinks to only 30MB.

Due to the small size of the compiled source code, users can pass their models as AWS Lambda layers. Services like Amazon S3 are no longer necessary just to load your model (see the sketch below).

torchlambda keeps its PyTorch & AWS dependencies up to date thanks to a continuous deployment run at 03:00 a.m. every day.
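
The snippet below is a minimal sketch of the model side of this workflow: exporting a model to TorchScript and publishing the exported file as an AWS Lambda layer with boto3 instead of loading it from S3. The file name, layer name, and the assumption that a TorchScript artifact packaged this way is what gets deployed are illustrative only; the torchlambda commands themselves are not shown here.

    # Sketch only: export a model to TorchScript and ship it as a Lambda layer.
    # "model.ptc" and "model-layer" are illustrative names, not values required
    # by torchlambda - consult its documentation for the exact artifact it expects.
    import io
    import zipfile

    import boto3
    import torch
    import torchvision

    # 1. Export the model to TorchScript, the format loadable from C++.
    model = torchvision.models.resnet18().eval()
    torch.jit.script(model).save("model.ptc")

    # 2. Zip the exported model and publish it as an AWS Lambda layer,
    #    so the Lambda function can read it from disk instead of S3.
    buffer = io.BytesIO()
    with zipfile.ZipFile(buffer, "w") as archive:
        archive.write("model.ptc")

    boto3.client("lambda").publish_layer_version(
        LayerName="model-layer",
        Description="TorchScript model shipped as a Lambda layer",
        Content={"ZipFile": buffer.getvalue()},
    )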


Comparison with other deployment tools

  • Do one thing and do it well - most deployment tools are complex solutions including multiple frameworks and multiple services. torchlambda focuses solely on PyTorch and AWS Lambda integration.
  • Write programs to work together - Amazon AWS & PyTorch exist for a reason, so there is no need to duplicate their functionality (e.g. aws-cli). No source code modifications of your neural networks are required either.
  • Small is beautiful - roughly 3000 LOC (most of it the convenience wrapper that makes up this tool) make it easy to delve into the source code and modify whatever you want on your own.
  • Integration with other tools - as torchlambda focuses on a narrow scope, you can use any tools you like with PyTorch (e.g. KubeFlow or BentoML for training) and AWS (for example Terraform).
  • Easy to jump in - no need to learn a new tool. torchlambda has at most 4 commands simplifying the steps needed to take a PyTorch model into the cloud, steps which are mostly repetitive and possible to automate further.
  • Extensible when you need it - all you usually need is a few lines of YAML settings, but if you wish to fine-tune your deployment you can use torchlambda build --flags (changing various properties of the PyTorch and AWS dependencies).


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

torchlambda-1590817644.tar.gz (21.4 kB)

Built Distribution

torchlambda-1590817644-py3-none-any.whl (28.9 kB)

File details

Details for the file torchlambda-1590817644.tar.gz.

File metadata

  • Download URL: torchlambda-1590817644.tar.gz
  • Upload date:
  • Size: 21.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/46.4.0 requests-toolbelt/0.9.1 tqdm/4.46.0 CPython/3.7.7

File hashes

Hashes for torchlambda-1590817644.tar.gz
  • SHA256: d0c1a800e27322f714f923c0d4b7d0f75a234ca52e9f755a21b258182daa8d15
  • MD5: e40eefeabd3fee71405baa5f63c4a9a1
  • BLAKE2b-256: e5908a708557691a0b9c86d2f8dded0adf707f2f82dcad9ea8bf492cdc2d857e

See more details on using hashes here.
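
To check a downloaded archive against the digests above, here is a minimal sketch using only Python's standard library (assuming you kept the original file name):

    import hashlib

    # Expected SHA256 digest, copied from the hash list above.
    EXPECTED = "d0c1a800e27322f714f923c0d4b7d0f75a234ca52e9f755a21b258182daa8d15"

    with open("torchlambda-1590817644.tar.gz", "rb") as archive:
        digest = hashlib.sha256(archive.read()).hexdigest()

    assert digest == EXPECTED, "archive does not match the published SHA256 hash"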

File details

Details for the file torchlambda-1590817644-py3-none-any.whl.

File metadata

  • Download URL: torchlambda-1590817644-py3-none-any.whl
  • Upload date:
  • Size: 28.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/46.4.0 requests-toolbelt/0.9.1 tqdm/4.46.0 CPython/3.7.7

File hashes

Hashes for torchlambda-1590817644-py3-none-any.whl
  • SHA256: f8c0d5e227f2c8fbe3df2e5ec8253a19f29345f6736ed10708aefe04b1126afd
  • MD5: fe5d1e544863095680d5d768d09c1134
  • BLAKE2b-256: 4f28ed08a767ba6e17f12c0d645c1a07908c2c8190fe976cd6460d0b4b6c9e32

See more details on using hashes here.
