
Replaces BioWardrobe's backend with CWL Airflow


BioWardrobe backend (airflow+cwl)


A Python package that replaces BioWardrobe's Python/cron scripts. It uses Apache Airflow functionality with CWL v1.0.


Install

  1. Add the biowardrobe MySQL connection to Airflow's connections table (a quick verification sketch follows this list):
    -- inspect the connections that are already registered
    select * from airflow.connection;
    -- register "biowardrobe" (host localhost, schema ems, login wardrobe, dict cursor)
    insert into airflow.connection values(NULL,'biowardrobe','mysql','localhost','ems','wardrobe','',null,'{"cursor":"dictcursor"}',0,0);
  2. Install the package from its source directory:
    sudo pip3 install .
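
To confirm that the connection was registered and that Airflow can actually reach the database, a one-line round trip can be run. A minimal sketch, assuming Airflow 1.x with the MySQL extra (apache-airflow[mysql]) installed; the SELECT 1 query is just a connectivity probe, not part of the package:

    # prints [(1,)] only if the "biowardrobe" connection resolves and MySQL answers
    python3.6 -c "from airflow.hooks.mysql_hook import MySqlHook; print(MySqlHook(mysql_conn_id='biowardrobe').get_records('SELECT 1'))"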


Requirements

  1. Make sure your system satisfies the following criteria:

    • Ubuntu 16.04.3
      • python3.6
        sudo add-apt-repository ppa:jonathonf/python-3.6
        sudo apt-get update
        sudo apt-get install python3.6
      • pip3
        curl https://bootstrap.pypa.io/get-pip.py | sudo python3.6
        pip3 install --upgrade pip
      • setuptools
        pip3 install setuptools
      • docker
        sudo apt-get update
        sudo apt-get install apt-transport-https ca-certificates curl software-properties-common
        curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
        sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"
        sudo apt-get update
        sudo apt-get install docker-ce
        sudo groupadd docker
        sudo usermod -aG docker $USER
        Log out and log back in so that your group membership is re-evaluated.
      • libmysqlclient-dev
        sudo apt-get install libmysqlclient-dev
      • nodejs
        sudo apt-get install nodejs
  2. Get the latest version of cwl-airflow-parser. If Apache Airflow or cwltool are not already installed, they will be installed automatically at the recommended versions. Set the AIRFLOW_HOME environment variable to the Airflow configuration directory (the default is ~/airflow/).

    git clone <cwl-airflow-parser repository URL>
    cd cwl-airflow-parser
    sudo pip3 install .
  3. If required, add extra Airflow packages to extend Airflow's functionality; for instance, for MySQL support: pip3 install apache-airflow[mysql].
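
Before moving on, an optional sanity check that the pieces above are in place (running the hello-world image also confirms that the docker group change took effect):

    python3.6 --version
    pip3 --version
    docker run hello-world
    nodejs --version
    # point Airflow at its configuration directory (the default location shown)
    export AIRFLOW_HOME=~/airflow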


Run

  1. To create BioWardrobe's DAGs, run biowardrobe-init in Airflow's dags directory:

    cd ~/airflow/dags
    biowardrobe-init
  2. Run Airflow scheduler:

    airflow scheduler
  3. Use airflow trigger_dag with the input parameter --conf "JSON", where JSON is either a job definition or a biowardrobe_uid, together with an explicitly specified CWL descriptor dag_id (a uid-based example follows this list).

    airflow trigger_dag --conf "{\"job\":$(cat ./hg19.job)}" "bowtie-index"

    where hg19.job is:

      "fasta_input_file": {
        "class": "File", 
        "location": "file:///wardrobe/indices/bowtie/hg19/chrM.fa", 
        "size": 16909,
        "basename": "chrM.fa",
        "nameroot": "chrM",
        "nameext": ".fa"
      "output_folder": "/wardrobe/indices/bowtie/hg19/",
      "threads": 6,
      "genome": "hg19"
  4. All output files will be moved from the temporary directory into the directory given by the job's output_folder parameter.
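
The biowardrobe_uid form from step 3 follows the same pattern. In this sketch both the uid value and the DAG id are hypothetical placeholders:

    airflow trigger_dag --conf "{\"biowardrobe_uid\": \"<uid>\"}" "<cwl-descriptor-dag-id>"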

Download files


Files for biowardrobe-airflow-analysis, version 1.0.20181214162558

  Filename: biowardrobe-airflow-analysis-1.0.20181214162558.tar.gz (29.0 kB)
  File type: Source
  Python version: None
