Get Planetary Data from the Planetary Data System (PDS)
Project description
LabCAS Workflow
Run workflows for LabCAS
Install
Locally
Preferably use a virtual environment with Python 3.9:
pip install -e '.[dev]'
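A quick sanity check inside the activated environment (3.9 matches the interpreter the package was built and uploaded with; treat the exact lower bound as an assumption):

```python
import sys

# Fail fast if the active interpreter is older than the Python 3.9 target.
assert sys.version_info >= (3, 9), "labcas-workflow targets Python 3.9+"
print(sys.version.split()[0])
```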
With Dask on Docker
Create certificates:
cd docker/certs
./generate-certs.sh
Build the Docker image:
docker build -f docker/Dockerfile . -t labcas/workflow
Start the scheduler:
docker network create dask
docker run --network dask -p 8787:8787 -p 8786:8786 labcas/workflow scheduler
Start one worker (no -p mapping is needed here: publishing 8786 again would conflict with the scheduler container, and the worker only needs to reach the scheduler over the dask network):
docker run --network dask labcas/workflow worker
Start the client as described in the following section.
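Before creating a Dask client against this cluster, it can help to confirm that the published scheduler port accepts connections. A stdlib-only sketch (the host and port are assumed from the docker commands above; `scheduler_reachable` is an illustrative helper, not part of the package):

```python
import socket

def scheduler_reachable(host: str = "localhost", port: int = 8786,
                        timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to the Dask scheduler port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Once this returns True, a `dask.distributed.Client` pointed at `tcp://localhost:8786` should be able to connect.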
With Dask on ECS
Push the image built in the previous section to ECR.
Create an S3 bucket named labcas-infra for the Terraform state.
Other pre-requisites are:
- a VPC
- subnets
- a security group allowing incoming requests on ports 8786 and 8787 from wherever the client runs (at JPL, on EC2, or on Airflow)
- a task role allowed to write to CloudWatch
- a task execution role that can pull images from ECR, with the standard ECS task execution role policy "AmazonECSTaskExecutionRolePolicy" attached
Deploy the ECS cluster with the following terraform command:
cd terraform
terraform init
terraform apply \
-var consortium="edrn" \
-var venue="dev" \
-var aws_fg_image=<uri of the docker image deployed on ECR> \
-var aws_fg_subnets=<private subnets of the AWS account> \
-var aws_fg_vpc=<vpc of the AWS account> \
-var aws_fg_security_groups=<security group> \
-var ecs_task_role=<arn of a task role> \
-var ecs_task_execution_role=<arn of task execution role>
Run
Set your local AWS credentials to access the data:
./aws-login.darwin.amd64
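The processing reads from S3, so the standard AWS credential environment variables need to be set. Which variables `aws-login` actually exports is an assumption here, and `missing_credentials` is an illustrative helper, not part of the package:

```python
import os

# Standard AWS SDK variable names; aws-login is assumed to export temporary
# credentials, which include a session token.
REQUIRED_VARS = ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_SESSION_TOKEN")

def missing_credentials(env=None):
    """Return the names of required AWS credential variables that are unset."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED_VARS if not env.get(name)]
```

An empty return value means the workflow can pick up credentials from the environment.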
Start the Dask cluster.
Run the processing
python ./src/labcas/workflow/manager/main.py
Publish the package on PyPI
pip install build
pip install twine
python -m build
twine upload dist/*
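To confirm that the artifact on PyPI is byte-for-byte your local build, compare the SHA-256 digest of the file in dist/ against the one listed under "File hashes" in the file details further down. A stdlib sketch (`sha256_of` is an illustrative helper):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 16) -> str:
    """Hex SHA-256 digest of a file, read in chunks to bound memory use."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

# e.g. sha256_of("dist/labcas_workflow-0.1.3.tar.gz") should match the
# SHA256 value shown on the PyPI file details page.
```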
Apache Airflow
Test locally using https://github.com/aws/aws-mwaa-local-runner
Follow the README instructions.
cd mwaa
Launch the server:
./mwaa-local-env start
Open the console at http://localhost:8080 (login: admin/test).
Test the requirements.txt file:
./mwaa-local-env test-requirements
Project details
Download files
Source Distribution
Built Distribution
File details
Details for the file labcas_workflow-0.1.3.tar.gz.
File metadata
- Download URL: labcas_workflow-0.1.3.tar.gz
- Upload date:
- Size: 4.9 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.9.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | c2f723fef0cea08e0859bd01897e69b51ee444351e86f9c14e5bbe2153929e1e |
| MD5 | 505ce437293cd4f7e6e01660fb2b4c41 |
| BLAKE2b-256 | 13de535002c9effc2d9a03b9e727efaf5d197e8dcd2643d4b35f77361e91b46e |
File details
Details for the file labcas_workflow-0.1.3-py3-none-any.whl.
File metadata
- Download URL: labcas_workflow-0.1.3-py3-none-any.whl
- Upload date:
- Size: 4.9 MB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.9.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 8ae9d6fb0a08bbd9684c32a7a677b90e25142e11aa4363ec45c729cb7350ae44 |
| MD5 | 389a8fc62ade216721f48d16506b85fe |
| BLAKE2b-256 | 59b48ad67e9358d7918995e8577534ddb1aa2647dfcc15359b5e766dbcea576b |