
AWS Lambdas on FastAPI is a CLI utility that spins up and runs your Python Lambdas in a local environment, based on an AWS CloudFormation template (JSON or YAML).

Project description

AWS Lambdas on FastAPI - LoF


AWS Lambdas on FastAPI (LoF) is a command-line tool that helps you quickly and easily spin up and run your Python AWS Lambdas for testing and local development.

Note that this works only for Python Lambdas.

It does not support any other programming languages.

How does it work?

Install

pip install lof

Now run lof and pass it the path to your template YAML/JSON file, or run it from the source directory containing template.yaml (or template.json) without any arguments.

How to use

lof

# or, if the template is at a custom path

lof --template example/template.yaml

You can exclude specific Lambdas from running by passing their names:

lof --template example/template.yaml --exclude=PostLambda2Function

To pass environment variables to the Lambdas, use the --env flag. Variables can be provided in two formats: JSON and '.env'. Example files in both formats are available in the example/ folder.

lof --env=.env

# or

lof --env=vars.json
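
For reference, here is a minimal sketch of both formats. The variable names below are illustrative (not taken from the files in example/), and the JSON example assumes a flat key/value mapping:

# .env format (illustrative)
DB_HOST=localhost
DB_PORT=5432

# vars.json format (illustrative, assuming a flat key/value mapping)
{
    "DB_HOST": "localhost",
    "DB_PORT": "5432"
}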

Authorizer Lambda

To emulate the behaviour of an Authorizer Lambda, use the --proxy-lambdas flag, where LambdaAuthorizer must be replaced with your Lambda's name from the CloudFormation template. Requests will go through these proxy Lambdas first, and only if everything is OK will the target Lambda be called.

Just as in API Gateway, all return values from proxy Lambdas update the "requestContext" key in the event.

lof --proxy-lambdas=LambdaAuthorizer

# or

lof --proxy-lambdas=LambdaAuthorizer,CORS # if you need two or more proxy lambdas
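
For illustration, a minimal sketch of what such a proxy (authorizer) Lambda might look like. The handler name and the header check are hypothetical; only the contract described above (return value merged into "requestContext", raised errors stop the chain) comes from this README:

def lambda_handler(event, context):
    # Hypothetical check: reject requests without an Authorization header.
    headers = event.get("headers") or {}
    if "Authorization" not in headers:
        # Raising an error stops the chain; the target Lambda is not called.
        raise Exception("Unauthorized")
    # The returned dict updates the "requestContext" key of the event
    # that is passed on to the target Lambda.
    return {"authorizer": {"principalId": "example-user"}}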

Other settings

  Usage: lof [OPTIONS]

  Options:
    --template TEXT           Path to AWS CloudFormation template with
                              lambdas  [default: template.yaml]
    --env TEXT                Path to file with environment variables
    --exclude TEXT            Exclude lambdas. FastAPI will not up & run
                              them. Pass as a comma-separated string. Example:
                              PostTrafficHook,PreTrafficHook.  [default: ]
    --port INTEGER            Port to run lof  [default: 8000]
    --host TEXT               Host to run lof  [default: 0.0.0.0]
    --proxy-lambdas TEXT      Lambda names that must be used as handlers for
                              requests, for example an Authorizer Lambda or a
                              CORS Lambda. Each request you send to a lambda
                              will go through those lambdas first and populate
                              'requestContext' in the event  [default: ]
    --workers INTEGER         Count of uvicorn workers to run. If you want to
                              run more than 1 worker, LoF will generate temp
                              FastAPI server code for your lambdas.
                              [default: 1]
    --debug / --no-debug      Debug flag for Uvicorn  [default: True]
    --reload / --no-reload    Reload flag for Uvicorn  [default: False]
    --help                    Show this message and exit.

This means that lof will start all Lambdas except these two: PostTrafficHook and Roles.
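
For example, the corresponding command would look like this (the Lambda names are the ones from the sentence above):

lof --exclude=PostTrafficHook,Roles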

Demo

will be added soon

Example

To try out how LoF works, you can use the AWS CloudFormation template.yaml and the Lambdas from the example/ folder.
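
If you just want to see the shape of a Lambda that LoF can serve, here is a minimal illustrative handler (not copied from example/). LoF exposes it as a FastAPI endpoint at the path declared for it in the template:

import json

def lambda_handler(event, context):
    # LoF passes an API Gateway-style event; pathParameters may be absent.
    name = (event.get("pathParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }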

Issues & feature requests

Feel free to open issues and report bugs. I will resolve them as soon as possible. If you have any suggestions or feature requests, also feel free to open an issue.

Problem Context

On my current project I work a lot with AWS Lambdas and tried to run them with SAM local, and there are some issues, especially when you work on a project with a large number of Lambdas.

Some of them:

  1. First of all, it does not allow skipping some Lambdas from the config

  2. It builds Lambdas inside separate Docker containers, so it takes significant time to build/rebuild and start all containers (and you need to start all containers if you want fast integration tests)

Both points combined make it impossible to use SAM in weak developer environments, such as VDI.

Changelog

v0.5.5

  1. One more fix for authorizer response population in the event context

v0.5.4

  1. Fix authorizer response population in the event context

v0.5.3

  1. Fixed a bunch of issues related to missing pathParams in the event and wrong authorizer context being provided

v0.5.2

  1. Bug fixes.

v0.5.0

  1. Added the AWS Context object. Some of its values are currently filled with mock values; values from config will be added in future versions.

v0.4.1 Features:

  1. Added the --proxy-lambdas option, where you can pass a list of Lambdas that will be used as middleware request handlers, for example an Authorization Lambda or a CORS Lambda.

Based on the order in which you provide the Lambda names in the --proxy-lambdas option, the request is sent through them and, just as on AWS, populates the "requestContext" field of the event. For example, --proxy-lambdas=CORS,Authorizer means the request first goes to the CORS Lambda and, if all is OK (no errors raised), then to the Authorizer Lambda, and finally to the target Lambda (the endpoint you call).

Fixes:

  1. Paths with the symbols '-.' no longer cause issues when running with 1 or more workers.

v0.3.0

  1. Added the possibility to run multiple workers with the --workers flag. This is helpful if you need to speed up your local server or if some Lambdas need to call other Lambdas directly.

  2. Added the --reload flag to the CLI if you want the server to auto-reload when the code changes (uvicorn --reload)

  3. Added support for CloudFormation templates in JSON

v0.2.3

  1. Possibility to set port & host to start several instances at the same time.

v0.2.2

  1. Updated README.md

  2. Fixed an issue with Lambdas in the template that do not have Events with a Path (like S3-triggered Lambdas)

  3. Fixed an issue with status code 204 - it now returns a correct response with no failures.

  4. Added some tests

v0.2.1

  1. LoF no longer try/excepts Lambda errors

v0.2.0

  1. Fixed status_code forwarding from the Lambda & JSON body responses

v0.1.0

  1. First version of Lambdas on FastAPI. Based on an AWS CloudFormation template, it serves Lambdas as FastAPI endpoints for local testing.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

lof-0.5.5.tar.gz (13.1 kB)

Uploaded Source

Built Distribution

lof-0.5.5-py3-none-any.whl (13.0 kB)

Uploaded Python 3

File details

Details for the file lof-0.5.5.tar.gz.

File metadata

  • Download URL: lof-0.5.5.tar.gz
  • Upload date:
  • Size: 13.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.4 CPython/3.8.11 Darwin/19.6.0

File hashes

Hashes for lof-0.5.5.tar.gz
Algorithm Hash digest
SHA256 ca77ae0f6941710d3d5fc09c5a49bb66406a44a0d49f7e1ccb3d2c9646e5a8e7
MD5 1e9d8ab3abc2ef97063ac443394d1974
BLAKE2b-256 5b104dc511206c5df244e0ac88ec3ed9d274467ab5bac02631f11391dedc1738


File details

Details for the file lof-0.5.5-py3-none-any.whl.

File metadata

  • Download URL: lof-0.5.5-py3-none-any.whl
  • Upload date:
  • Size: 13.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.4 CPython/3.8.11 Darwin/19.6.0

File hashes

Hashes for lof-0.5.5-py3-none-any.whl
Algorithm Hash digest
SHA256 a65c83b501dde7c9650a21a7b8a6b2a224c4215aaf6934229d5d22ad6b1781b7
MD5 f4dcb5d9793f17824c159158748d6208
BLAKE2b-256 5b25463395eee2be7839a63655e328003f5adf79bc347b3cb40344852fc72651

