
LabCas Workflow

Run workflows for Labcas

Depending on your role, there are multiple ways of running a LabCAS workflow:

  • Developers: local run, natively on your OS
  • Integrators: AWS Managed Workflows for Apache Airflow (MWAA), with a local MWAA runner
  • System Administrators: deployed and configured on AWS
  • End users: using the AWS deployment

Developers

The workflow tasks run independently of Airflow. TODO: integrate with the Airflow Python API.

Install

With Python 3.11, preferably in a virtual environment:

pip install -e '.[dev]'
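If a virtual environment is preferred, a minimal sketch is to create and activate one first, then run the install command above (the directory name .venv is an arbitrary choice):

python3.11 -m venv .venv
source .venv/bin/activate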

Set AWS connection

./aws-login.darwin.amd64
export AWS_PROFILE=saml-pub
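To verify that the session is valid before running anything (assuming the AWS CLI is installed):

aws sts get-caller-identity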

Run/Test the client

python src/labcas/workflow/manager/main.py

Deploy the package on PyPI

Upgrade the version in the file src/labcas/workflow/VERSION.txt.
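For example, assuming VERSION.txt contains only the version string (the number below is illustrative):

echo "0.1.10" > src/labcas/workflow/VERSION.txt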

Publish the package on PyPI:

pip install build
pip install twine
rm dist/*
python -m build
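# Optionally validate the built archives before uploading
twine check dist/*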
twine upload dist/*

Integrators

Build the Dask worker image

Update the labcas.workflow dependency version as needed in docker/Dockerfile, then:

docker build -f docker/Dockerfile . -t labcas/workflow
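A quick sanity check that the image was built:

docker image ls labcas/workflow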

Create a managed Airflow Docker image to run locally

Clone the repository https://github.com/aws/aws-mwaa-local-runner, then:

./mwaa-local-env build-image

Then from your local labcas_workflow repository:

cd mwaa

As needed, update the requirements in the requirements directory and the DAGs in the dags directory.
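For example, the requirements file would typically pin the workflow package; the file path and version below are illustrative:

echo "labcas_workflow==0.1.9" >> requirements/requirements.txt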

Update the AWS credentials

aws-login.darwin.amd64
cp -r ~/.aws .

Launch the server

docker compose -f docker-compose-local.yml up

Test the server at http://localhost:8080 and log in with admin/test.
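A quick command-line check against Airflow's standard health endpoint:

curl http://localhost:8080/health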

Stop

Ctrl-C

Stop and re-initialize local volumes

docker compose  -f ./docker/docker-compose-local.yml down -v

See the console on http://localhost:8080, admin/test

Test the requirements.txt files

./mwaa-local-env test-requirements

Debug the workflow import

docker container ls

Pick the container ID of the image "amazon/mwaa-local:2_10_3", for example '54706271b7fc'.

Then open a bash interpreter in the docker container:

docker exec -it 54706271b7fc bash

And, in the bash prompt:

cd dags
python3 -c "import nebraska"
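If the import succeeds but the DAG still does not appear, Airflow's CLI can list DAG import errors (assuming the airflow command is on the container's PATH):

airflow dags list-import-errors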

Start the scheduler:

docker network create dask
docker run --network dask -p 8787:8787 -p 8786:8786 labcas/workflow scheduler

Start one worker

docker run  --network dask -p 8786:8786 labcas/workflow worker 

Start the client, as in the following section.
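Before running the full client, a quick connectivity check against the local scheduler can be done with a Dask one-liner (localhost:8786 matches the port mapping above):

python -c "from dask.distributed import Client; print(Client('tcp://localhost:8786'))"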

With Dask on ECS

Deploy the image created in the previous section to ECR.
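A typical push sequence is sketched below; the account ID, region, and repository name are placeholders for your environment:

aws ecr create-repository --repository-name labcas/workflow   # only once, if the repository does not exist yet
aws ecr get-login-password --region <region> | docker login --username AWS --password-stdin <account-id>.dkr.ecr.<region>.amazonaws.com
docker tag labcas/workflow <account-id>.dkr.ecr.<region>.amazonaws.com/labcas/workflow:latest
docker push <account-id>.dkr.ecr.<region>.amazonaws.com/labcas/workflow:latest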

Have an S3 bucket named labcas-infra for the Terraform state.

Other prerequisites are:

  • a VPC
  • subnets
  • a security group allowing incoming requests, from where the client runs (at JPL, on EC2, or in Airflow), to ports 8786 and 8787
  • a task role allowed to write to CloudWatch
  • a task execution role that can pull images from ECR and has the standard ECS task execution role policy "AmazonECSTaskExecutionRolePolicy"

Deploy the ECS cluster with the following terraform command:

cd terraform
terraform init
terraform apply \
    -var consortium="edrn" \
    -var venue="dev" \
    -var aws_fg_image=<uri of the docker image deployed on ECR> \
    -var aws_fg_subnets=<private subnets of the AWS account> \
    -var aws_fg_vpc=<vpc of the AWS account> \
    -var aws_fg_security_groups=<security group> \
    -var ecs_task_role=<arn of a task role> \
    -var ecs_task_execution_role=<arn of task execution role>
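Once the apply completes, a quick way to confirm that the cluster was created (no resource names assumed):

aws ecs list-clusters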

Run

Set your local AWS credentials to access the data:

./aws-login.darwin.amd64
export AWS_PROFILE=saml-pub

Start the dask cluster

Run the processing

python ./src/labcas/workflow/manager/main.py

Publish the package on PyPI:

pip install build
pip install twine
python -m build
twine upload dist/*

