
py-orca


Python package for connecting services and building data pipelines

This Python package provides the components to connect various third-party services such as Synapse, Nextflow Tower, and SevenBridges to build data pipelines using a workflow management system like Airflow.

Demonstration Script

This repository includes a demonstration script called demo.py, which showcases how you can use py-orca to launch and monitor your workflows on Nextflow Tower. Specifically, it illustrates how to process an RNA-seq dataset using a series of workflow runs, namely nf-synapse/synstage, nf-core/rnaseq, and nf-synapse/synindex. py-orca can be used with any Python-compatible workflow management system to orchestrate each step (e.g. Airflow, Prefect, Dagster). The demonstration script uses Metaflow because it's easy to run locally and has an intuitive syntax.
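To give a flavor of the pattern demo.py follows, below is a minimal sketch of launching and monitoring a single workflow run on Nextflow Tower with py-orca. The class and method names (NextflowTowerOps, LaunchInfo, launch_workflow, monitor_workflow) and the LaunchInfo fields shown are assumptions drawn from the demo script and may differ between versions; treat this as a sketch, not the definitive API.

import asyncio

# Assumed py-orca API (based on demo.py); names may differ across versions.
from orca.services.nextflowtower import LaunchInfo, NextflowTowerOps

# The client is configured via the NEXTFLOWTOWER_CONNECTION_URI env var.
tower = NextflowTowerOps()

# Describe a single workflow run (fields and values are illustrative only).
launch_info = LaunchInfo(
    run_name="my_test_dataset_synstage",
    pipeline="Sage-Bionetworks-Workflows/nf-synapse",
    revision="main",
    params={"input": "s3://example-bucket/samplesheet.csv"},
)

# Launch the run, then block until it reaches a terminal state.
run_id = tower.launch_workflow(launch_info, "spot")
asyncio.run(tower.monitor_workflow(run_id))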

The script assumes that the following environment variables are set (a quick sanity check is sketched after the list). Before setting them, ensure that you have an AWS profile configured for a role with access to the dev/ops Tower workspace you plan to launch your workflows from. You can set these environment variables however you prefer (e.g. with an .env file or a sourced shell script). Refer to .env.example for the expected format of their values as well as examples.

  • NEXTFLOWTOWER_CONNECTION_URI
  • SYNAPSE_CONNECTION_URI
  • AWS_PROFILE (or another source of AWS credentials)
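For example, a small self-check like the following (a hypothetical snippet, not part of demo.py) will fail fast if any of the variables are unset:

import os

# Fail fast if required configuration is missing. AWS_PROFILE is only one
# option; any other standard source of AWS credentials also works.
required = ["NEXTFLOWTOWER_CONNECTION_URI", "SYNAPSE_CONNECTION_URI", "AWS_PROFILE"]
missing = [name for name in required if name not in os.environ]
if missing:
    raise SystemExit(f"Missing environment variables: {', '.join(missing)}")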

Once your environment variables are set, you can create a virtual environment, install the Python dependencies, and run the demonstration script (after downloading it) as follows. Note that you will need to update the s3_prefix parameter so that it points to an S3 bucket that is accessible to your Tower workspace.

Creating and setting up your py-orca virtual environment and executing demo.py

Below are instructions for creating and setting up your virtual environment and running demo.py. You can also check the tutorial here. If you would like to set up a developer environment with the relevant dependencies, execute the dev_setup shell script in a local clone of this repository, either on your own machine or on an EC2 instance. Establishing a development environment on an EC2 instance can hit a few hurdles: you might need to install Python build dependencies before using pyenv to manage Python versions (refer to this doc to resolve the dependency issue). On EC2: Linux Docker v1.3.9, openssl11-devel is not available, so install openssl-devel instead. You might also run into a missing-GCC error, which you can fix with sudo yum install gcc.

# Create and activate a Python virtual environment (tested with Python 3.10)
python3 -m venv venv/
source venv/bin/activate

# Install Python dependencies
python3 -m pip install 'py-orca[all]' 'metaflow' 'pyyaml' 's3fs'

Before running the example below, ensure that s3_prefix points to an S3 bucket that your Tower workspace (dev or prod) can access. This example points to the example-dev-project-tower-scratch S3 bucket because the workflows are launched from the example-dev-project workspace in tower-dev; in that case, either of the workflows-nextflow-dev profiles can access the bucket.

# Run the script using an example dataset
python3 demo.py run --dataset_id 'syn51514585' --s3_prefix 's3://example-dev-project-tower-scratch/work'

Once your run takes off, you can follow the output logs in your terminal or track your workflow's progress in the Tower web client. Make sure your synstage workflow run has a unique name and is not an iteration of a previous run (e.g. my_test_dataset_synstage_2, my_test_dataset_synstage_3, and so on); the demo.py script cannot currently locate the staged samplesheet file if it was staged under a non-unique run name. One way to guarantee uniqueness is sketched below.
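The helper below is hypothetical (not part of demo.py); it simply salts the run name with a short random suffix so that every synstage run name is unique:

import uuid

# Hypothetical helper: append a short random suffix so each synstage run
# name is unique and the staged samplesheet can be located afterwards.
def unique_run_name(dataset_id: str) -> str:
    return f"{dataset_id}_synstage_{uuid.uuid4().hex[:8]}"

print(unique_run_name("my_test_dataset"))  # e.g. my_test_dataset_synstage_1f3a9c2b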

The above dataset ID (syn51514585) refers to the following YAML file, which should be accessible to Sage employees. Similarly, the samplesheet ID below (syn51514475) should also be accessible to Sage employees. However, there is no secure way to make the output folder accessible to Sage employees, so the synindex step will fail if you attempt to run this script using the example dataset ID. This should be sufficient to get a feel for using py-orca, but feel free to create your own dataset YAML file on Synapse with an output folder that you own.

id: my_test_dataset
samplesheet: syn51514475
output_folder: syn51514559
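For reference, this is how such a dataset file can be parsed once downloaded; pyyaml is already installed above. The local filename dataset.yaml is a placeholder (demo.py actually fetches the file from Synapse by its dataset ID):

import yaml

# Load the dataset description shown above from a local copy.
with open("dataset.yaml") as f:
    dataset = yaml.safe_load(f)

print(dataset["id"])             # my_test_dataset
print(dataset["samplesheet"])    # syn51514475
print(dataset["output_folder"])  # syn51514559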

PyScaffold

This project has been set up using PyScaffold 4.3. For details and usage information on PyScaffold see https://pyscaffold.org/.

putup --name orca --markdown --github-actions --pre-commit --license Apache-2.0 py-orca
