LabCas Workflow
Run workflows for LabCas
Depending on your role, there are several ways of running a LabCas workflow:
- Developers: local run, natively on your OS
- Integrators: AWS Managed Workflows for Apache Airflow (MWAA), run locally
- System administrators: deployed and configured on AWS
- End users: using the AWS deployment
Developers
The tasks of the workflow run independently of Airflow. TODO: integrate with the Airflow Python API.
Install
With Python 3.11, preferably in a virtual environment:
pip install -e '.[dev]'
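The install steps above can be sketched end to end; the virtual-environment name `venv` is an arbitrary choice:

```shell
# create and activate a Python 3.11 virtual environment ("venv" is an arbitrary name)
python3.11 -m venv venv
source venv/bin/activate

# editable install with the development extras
pip install -e '.[dev]'
```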
Set AWS connection
./aws-login.darwin.amd64
export AWS_PROFILE=saml-pub
Run/Test the client
Without a Dask cluster:
python src/labcas/workflow/manager/main.py
With a local Dask cluster
Start the scheduler:
docker network create dask
docker run --network dask -p 8787:8787 -p 8786:8786 labcas/workflow scheduler
Start one worker on the same Docker network (no port publishing is needed, and host port 8786 is already bound by the scheduler; the worker reaches the scheduler over the dask network):
docker run --network dask labcas/workflow worker
Start the client as in the previous section, but pass tcp://localhost:8786 (the scheduler address; 8787 is the dashboard) to the Dask client in the main.py script.
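As a sketch of what the client side looks like, assuming main.py builds a standard dask.distributed.Client (the submitted function here is illustrative; for the Docker setup above, the scheduler listens on tcp://localhost:8786):

```python
from dask.distributed import Client

# For the Docker scheduler above you would connect with:
#   client = Client("tcp://localhost:8786")
# Here we use an in-process cluster so the sketch is self-contained.
client = Client(processes=False)

# submit an illustrative task to the cluster and fetch its result
future = client.submit(sum, [1, 2, 3])
print(future.result())  # 6

client.close()
```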
Deploy the package on PyPI
Upgrade the version in the file src/labcas/workflow/VERSION.txt.
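A hypothetical helper for that version bump (the file path is from this repository; the X.Y.Z format and the patch-level bump are assumptions about the versioning scheme):

```python
from pathlib import Path

def bump_patch(version: str) -> str:
    """Increment the patch component of an X.Y.Z version string."""
    major, minor, patch = version.strip().split(".")
    return f"{major}.{minor}.{int(patch) + 1}"

def bump_version_file(path: str = "src/labcas/workflow/VERSION.txt") -> str:
    """Read, bump, and rewrite VERSION.txt; returns the new version."""
    p = Path(path)
    new_version = bump_patch(p.read_text())
    p.write_text(new_version + "\n")
    return new_version

print(bump_patch("0.1.13"))  # 0.1.14
```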
Publish the package on PyPI:
pip install build
pip install twine
rm dist/*
python -m build
twine upload dist/*
Integrators
Build the Dask worker image
Update the labcas.workflow dependency version as needed in docker/Dockerfile, then:
docker build -f docker/Dockerfile . -t labcas/workflow
Create a managed Airflow Docker image to run locally
Clone the repository https://github.com/aws/aws-mwaa-local-runner, then inside it:
./mwaa-local-env build-image
Then from your local labcas_workflow repository:
cd mwaa
As needed, update requirements in the requirements directory and DAGs in the dags directory.
Update the AWS credentials
aws-login.darwin.amd64
cp -r ~/.aws .
Launch the services
docker compose -f docker-compose-local.yml up
Test the server at http://localhost:8080; log in with admin/test.
Stop
Ctrl-C
Stop and re-initialize local volumes
docker compose -f ./docker-compose-local.yml down -v
Test the requirements.txt files
./mwaa-local-env test-requirements
Debug the workflow import
docker container ls
Pick the container ID of the image "amazon/mwaa-local:2_10_3", for example '54706271b7fc'.
Then open a bash interpreter in the docker container:
docker exec -it 54706271b7fc bash
And, at the bash prompt:
cd dags
python3 -c "import nebraska"
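The same check can be scripted. A small stdlib sketch that mirrors `python3 -c "import nebraska"` and reports the failure instead of raising (the function name is hypothetical):

```python
import importlib
import sys

def check_dag_import(module_name: str) -> bool:
    """Try to import a DAG module the way the scheduler would; report failures."""
    try:
        importlib.import_module(module_name)
        return True
    except Exception as exc:
        print(f"{module_name!r} failed to import: {exc}", file=sys.stderr)
        return False

# e.g. run from the dags/ directory: check_dag_import("nebraska")
print(check_dag_import("json"))  # True: the stdlib json module imports cleanly
```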
System administrators
The deployment requires:
- one ECS cluster for the Dask cluster
- optionally, an EC2 instance acting as a client of the Dask cluster
- one managed Airflow instance
Dask on ECS
Deploy the image created in the previous section to ECR.
Have an S3 bucket named labcas-infra for the Terraform state.
Other prerequisites are:
- a VPC
- subnets
- a security group allowing incoming requests to ports 8786 and 8787 from where the client runs (at JPL, on EC2, or on Airflow)
- a task role allowed to write to CloudWatch
- a task execution role that can pull images from ECR, with the standard ECS task execution role policy "AmazonECSTaskExecutionRolePolicy"
Deploy the ECS cluster with the following Terraform commands:
cd terraform
terraform init
terraform apply \
-var consortium="edrn" \
-var venue="dev" \
-var aws_fg_image=<uri of the docker image deployed on ECR> \
-var aws_fg_subnets=<private subnets of the AWS account> \
-var aws_fg_vpc=<vpc of the AWS account> \
-var aws_fg_security_groups=<security group> \
-var ecs_task_role=<arn of a task role> \
-var ecs_task_execution_role=<arn of task execution role>
Test the Dask cluster
Connect to an EC2 instance that is a client of the Dask cluster:
ssh {ip of the EC2 instance}
aws-login
export AWS_PROFILE=saml-pub
git clone {this repository}
cd workflows
source venv/bin/activate
python src/labcas/workflow/manager/main.py
To see the Dask dashboard, open an SSH tunnel:
ssh -L 8787:{dask scheduler ip on ECS}:8787 {username}@{ec2 instance ip}
In a browser, open http://localhost:8787.
Apache Airflow
An AWS managed Airflow instance is deployed, version 2.10.3.
The managed Airflow is authorized to read from and write to the data bucket.
The managed Airflow is authorized to access the ECS security group.
It uses the S3 bucket {labcas_airflow}.
Upload the ./mwaa/requirements/requirements.txt file to the bucket at s3://{labcas_airflow}/requirements/.
Upload the ./mwaa/dags/nebraska.py file to the bucket at s3://{labcas_airflow}/dags/.
Update the version of the requirements.txt file in the Airflow configuration console.
To test, go to the Airflow web console and trigger the nebraska DAG.