OBJECTIVE
To facilitate building AWS Lambda projects in Python: scaffolding a directory structure, deploying to LocalStack, invoking the function inside the container, viewing function logs, and creating zip files for building layers.
NOTE: If you use Windows, you may want to use Docker or WSL to build layers and dependency packages, because some dependencies have C binaries, Linux binaries differ from Windows ones, and Lambda runs on Linux.
REQUIREMENTS
If you want to use LocalStack to run your tests, you need to install Docker. To deploy the function into LocalStack you will also need the Serverless Framework, so Node.js must be installed as well.
HOW TO INSTALL
To install the application, just run the command below:
$ pip install pysls
HOW TO USE
Once the package is installed, you can run it via the command line. The commands are as follows:
CREATE FILE STRUCTURE
$ pysls --create_function=project_name
The file structure is as follows:
├── docker-compose.yml
├── lambda_test
│   ├── __init__.py
│   ├── src
│   │   ├── lambda_function.py
│   │   └── serverless.yml
│   └── tests
│       ├── integration
│       ├── unit
│       └── utils
│           ├── files
│           └── mocks
├── pyproject.toml
├── README.md
└── requirements.txt
- docker-compose.yml: contains a pre-assembled LocalStack setup;
- lambda_test/src/lambda_function.py: contains the main function code; any other source files must live inside the src folder (a minimal handler sketch follows this list);
- lambda_test/src/serverless.yml: contains the Serverless Framework settings (the localstack plugin is already included);
- tests: folder reserved for your tests;
- pyproject.toml: for those who want to use Poetry as their package manager, but pysls also reads it to retrieve project information;
- requirements.txt: pysls uses this file to create a layer and to build the Lambda function package sent to LocalStack.
NOTE: With the free version of LocalStack it is not possible to use layers, but it is possible to ship the library code together with the Lambda function package.
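For reference, a freshly generated lambda_function.py can start from a minimal handler like the sketch below. This is only an illustration (the handler name lambda_handler matches the serverless.yml example in the CONFIGURATIONS section; the response body is made up):

import json


def lambda_handler(event, context):
    # Entry point invoked by Lambda: `event` carries the trigger payload,
    # `context` carries runtime metadata (request id, remaining time, etc.).
    print("Received event:", json.dumps(event))
    return {
        "statusCode": 200,
        "body": json.dumps({"message": "hello from pysls"}),
    }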
ASSEMBLING THE LAYER ZIP FILE
$ pysls --create_layer=layer_name
This command runs pip with ./python/lib/python<python_version>/site-packages as the target directory for the library files, then compresses that folder into the layer zip and deletes the uncompressed copy.
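Under the hood this corresponds roughly to a pip install with a target directory (a hypothetical equivalent, here assuming Python 3.8):

$ pip install -r requirements.txt --target ./python/lib/python3.8/site-packages

The python/lib/python<python_version>/site-packages layout is the directory structure AWS Lambda expects inside a Python layer zip.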
SEND TO LOCALSTACK
$ pysls --deploy
This command copies the src folder to ./src_tmp and then runs an npm command to add the serverless-localstack plugin. After that, it adds to the folder the library files listed in requirements.txt. The script then executes the deploy into LocalStack based on the Serverless Framework's deploy command. Once everything finishes, the ./src_tmp folder is deleted.
NOTE: LocalStack must be running; if it is not, start it with:
$ docker-compose up
VIEW LOGS INSIDE THE LOCALSTACK
$ pysls --logs
This command reads the settings in the pysls_config.json file and assembles the function's logGroupName: /aws/lambda/<service_name>-dev-<function_name>. With the full name, it is possible to view all logs related to the function.
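If you want to query those logs yourself, a minimal sketch using boto3 against LocalStack's default edge endpoint could look like the following (the endpoint, region, and log group name are assumptions based on the CONFIGURATIONS example):

import boto3

# CloudWatch Logs client pointed at LocalStack's edge endpoint.
logs = boto3.client(
    "logs",
    endpoint_url="http://localhost:4566",
    region_name="us-east-1",
)

# Log group name as assembled by pysls: /aws/lambda/<service_name>-dev-<function_name>
response = logs.filter_log_events(
    logGroupName="/aws/lambda/test-lambda-dev-test_lambda",
)
for event in response["events"]:
    print(event["message"])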
CREATE AN EVENT FILE
Every Lambda trigger sends an event in JSON format. AWS SAM is capable of generating these event files, and with them you can test your functions, so I have implemented this feature as well. This part of the code is based on SAM, and all of the event template files are copied from the SAM repository.
$ pysls --generate_event --service=aws_service --event_type=service_event --params --filename=event_file_name.json
Example:
# Creates an event.json file in the root folder for an s3 put event on the my_bucket bucket; the object that triggers the event is in the lambda_folder folder and is named data.csv
$ pysls --generate_event --service=s3 --event_type=put --bucket=my_bucket --key=lambda_folder/data.csv --filename=event.json
All services:
alexa-skills-kit
alexa-smart-home
apigateway
batch
cloudformation
cloudfront
codecommit
codepipeline
cognito
config
connect
dynamodb
cloudwatch
kinesis
lex
rekognition
s3
sagemaker
ses
sns
sqs
stepfunctions
If you execute the commands below, an event.json file will be created in your root folder, with the default parameters.
$ pysls --generate_event --service=s3 --event_type=put
$ pysls --generate_event --service=alexa-skills-kit --event_type=end-session
$ pysls --generate_event --service=dynamodb --event_type=update
If you are interested in all the event types for each service and all the possible params, see the file EVENT_INFO.md.
EXECUTE THE FUNCTION BASED ON AN EVENT
$ pysls --invoke=event_file_path
This command reads the settings in pysls_config.json and assembles the name the function has inside LocalStack: <service_name>-dev-<function_name>. It then uses the Python SDK to invoke the Lambda, passing the event file as the payload, and shows the Lambda's response.
It is also possible to invoke without an event file; in that case, run $ pysls --invoke.
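In essence, the invocation boils down to a boto3 Lambda call like the simplified sketch below (endpoint, region, and function name are assumptions based on the CONFIGURATIONS example):

import json

import boto3

# Lambda client pointed at LocalStack's edge endpoint.
client = boto3.client(
    "lambda",
    endpoint_url="http://localhost:4566",
    region_name="us-east-1",
)

# Send the generated event file as the invocation payload.
with open("event.json", "rb") as f:
    response = client.invoke(
        FunctionName="test-lambda-dev-test_lambda",  # <service_name>-dev-<function_name>
        Payload=f.read(),
    )

# The response payload is a streaming object; read and decode it.
print(json.loads(response["Payload"].read()))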
CONFIGURATIONS
Use the pysls_config.json file to pass some settings. For now, there are only two: service and function_name. It is extremely important that these two names match the ones in serverless.yml:
- pysls_config.json:

{
  "service": "test-lambda",
  "function_name": "test_lambda"
}

- serverless.yml:

service: test-lambda

functions:
  test_lambda:
    handler: lambda_function.lambda_handler
HOW TO CONTRIBUTE
- Open an issue with your idea to discuss;
- Then fork the repository and send your pull request (please avoid overly large pull requests).
FUTURE IDEAS
- Create your own settings file;
- Generate the event files with the tool itself;
- Do not depend on the Serverless Framework to build the function and its dependencies and send them to LocalStack;
- Add new future ideas haha