
Japanese Document

sapporo-service is a standard implementation conforming to the Global Alliance for Genomics and Health (GA4GH) Workflow Execution Service (WES) API specification.

One of sapporo-service's features is the abstraction of workflow engines, which makes it easy to convert various workflow engines into WES. Currently, the following workflow engines have been confirmed to work: cwltool, Nextflow, Toil, Cromwell, Snakemake, and ep3.

Another feature of sapporo-service is the mode that can only execute workflows registered by the system administrator. This feature is useful when building a WES in a shared HPC environment.

Install and Run

sapporo-service supports Python 3.6 or newer.

$ pip3 install sapporo
$ sapporo


You can also launch sapporo-service with Docker. To use Docker-in-Docker (DinD), you must mount docker.sock, /tmp, and related paths.

# Launch
$ docker-compose up -d

# Launch confirmation
$ docker-compose logs


The help for the sapporo-service startup command is as follows.

$ sapporo --help
usage: sapporo [-h] [--host] [-p] [--debug] [-r] [--disable-get-runs]
               [--disable-workflow-attachment]
               [--run-only-registered-workflows] [--service-info]
               [--executable-workflows] [--run-sh] [--url-prefix]

Implementation of a GA4GH workflow execution service that can easily support
various workflow runners.

optional arguments:
  -h, --help            show this help message and exit
  --host                Host address of Flask. (default: 127.0.0.1)
  -p , --port           Port of Flask. (default: 1122)
  --debug               Enable debug mode of Flask.
  -r , --run-dir        Specify the run dir. (default: ./run)
  --disable-get-runs    Disable endpoint of `GET /runs`.
  --disable-workflow-attachment
                        Disable `workflow_attachment` on endpoint `POST
                        /runs`.
  --run-only-registered-workflows
                        Run only registered workflows. Check the registered
                        workflows using `GET /service-info`, and specify
                        `workflow_name` in the `POST /runs`.
  --service-info        Specify `service-info.json`. The
                        supported_wes_versions, system_state_counts and
                        workflows are overwritten in the application.
  --executable-workflows
                        Specify `executable-workflows.json`.
  --run-sh              Specify `run.sh`.
  --url-prefix          Specify the prefix of the url (e.g. --url-prefix /foo
                        -> /foo/service-info).

Operating Mode

There are two startup modes in sapporo-service.

  • Standard WES mode (Default)
  • Execute only registered workflows mode

These are switched with the startup argument --run-only-registered-workflows. The mode can also be switched by setting the environment variable SAPPORO_ONLY_REGISTERED_WORKFLOWS to True or False. Startup arguments take priority over environment variables.
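The precedence described above (startup argument over environment variable) can be sketched as follows. `resolve_registered_only_mode` and `str2bool` are hypothetical helpers for illustration, not part of sapporo's API:

```python
import os


def str2bool(val: str) -> bool:
    """Interpret 'True'/'False' (case-insensitive) as a boolean."""
    return val.strip().lower() in ("true", "1", "yes")


def resolve_registered_only_mode(cli_flag: bool, environ=os.environ) -> bool:
    """Startup arguments take priority over environment variables.

    `cli_flag` is True when --run-only-registered-workflows was given.
    """
    if cli_flag:
        return True
    return str2bool(environ.get("SAPPORO_ONLY_REGISTERED_WORKFLOWS", "False"))
```

For example, with no CLI flag and `SAPPORO_ONLY_REGISTERED_WORKFLOWS=True`, the registered-only mode is enabled.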

Standard WES mode

For the API specification, please check GitHub - GA4GH WES.

When using sapporo-service, unlike the standard WES API specification, you must specify workflow_engine_name in the request parameters of POST /runs. We consider this a mistake in the standard WES API specification and have submitted a request to fix it.
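As an illustration, the form parameters for a standard-mode POST /runs might be assembled as below. The field names follow the WES specification plus the extra workflow_engine_name; the workflow URL is a placeholder, not a real sapporo endpoint:

```python
import json

# Form data for POST /runs in standard WES mode. Note the extra
# workflow_engine_name field, which sapporo-service requires on top of
# the standard WES parameters.
run_request = {
    "workflow_url": "https://example.com/workflow.cwl",  # placeholder URL
    "workflow_type": "CWL",
    "workflow_type_version": "v1.0",
    "workflow_engine_name": "cwltool",
    "workflow_params": json.dumps({}),  # workflow inputs as a JSON string
}
```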

Execute only registered workflows mode

For the API specification of the execute-only-registered-workflows mode, please check SwaggerUI - sapporo WES.

Basically, it conforms to the standard WES API. The changes are as follows.

  • Executable workflows are returned by GET /service-info as executable_workflows.
  • Specify workflow_name instead of workflow_url in POST /runs.
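In registered mode the request therefore carries workflow_name instead of workflow_url; a minimal sketch, reusing a workflow_name from the service-info example in this document:

```python
import json

# In execute-only-registered-workflows mode, workflow_name replaces
# workflow_url; the name must match an entry in executable_workflows.
run_request = {
    "workflow_name": "CWL_trimming_and_qc_remote",
    "workflow_type": "CWL",
    "workflow_type_version": "v1.0",
    "workflow_engine_name": "cwltool",
    "workflow_params": json.dumps({}),
}
```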

The following is an example of requesting GET /service-info in the execute only registered workflows mode.

GET /service-info
{
  "auth_instructions_url": "",
  "contact_info_url": "",
  "default_workflow_engine_parameters": [],
  "executable_workflows": [
    {
      "workflow_attachment": [],
      "workflow_name": "CWL_trimming_and_qc_remote",
      "workflow_type": "CWL",
      "workflow_type_version": "v1.0",
      "workflow_url": ""
    },
    {
      "workflow_attachment": [
        {
          "file_name": "fastqc.cwl",
          "file_url": ""
        },
        {
          "file_name": "trimming_pe.cwl",
          "file_url": ""
        }
      ],
      "workflow_name": "CWL_trimming_and_qc_local",
      "workflow_type": "CWL",
      "workflow_type_version": "v1.0",
      "workflow_url": ""
    }
  ],
  "supported_filesystem_protocols": ["http", "https", "file", "s3"],
  "supported_wes_versions": ["sapporo-wes-1.0.0"],
  "system_state_counts": {},
  "tags": {
    "debug": true,
    "get_runs": true,
    "registered_only_mode": true,
    "wes_name": "sapporo",
    "workflow_attachment": true
  },
  "workflow_engine_versions": {
    "cromwell": "55",
    "cwltool": "1.0.20191225192155",
    "ep3": "v1.0.0",
    "nextflow": "21.01.1-edge",
    "snakemake": "v5.32.0",
    "toil": "4.1.0"
  },
  "workflow_type_versions": {
    "CWL": { "workflow_type_version": ["v1.0", "v1.1", "v1.1.0-dev1"] },
    "Nextflow": { "workflow_type_version": ["v1.0"] },
    "Snakemake": { "workflow_type_version": ["v1.0"] },
    "WDL": { "workflow_type_version": ["1.0"] }
  }
}

The executable workflows are managed at executable_workflows.json. Also, the schema for this definition is executable_workflows.schema.json. The default location of these files is under the application directory of sapporo-service. You can override them by using the startup argument --executable-workflows or the environment variable SAPPORO_EXECUTABLE_WORKFLOWS.
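As a rough sketch, an entry destined for executable_workflows.json can be checked for the fields that appear in the executable_workflows listing above. This validator is purely illustrative; real validation should use executable_workflows.schema.json:

```python
# Fields observed in the executable_workflows entries of the
# service-info example; illustrative only, not the real schema.
REQUIRED_FIELDS = {"workflow_name", "workflow_type",
                   "workflow_type_version", "workflow_url",
                   "workflow_attachment"}


def check_entry(entry: dict) -> bool:
    """Return True if the entry has every field seen in the
    executable_workflows examples."""
    return REQUIRED_FIELDS <= set(entry)
```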

Run Dir

sapporo-service manages the submitted workflows, workflow parameters, output files, etc. on the file system. You can override the location of run dir by using the startup argument --run-dir or the environment variable SAPPORO_RUN_DIR.

The run dir structure is as follows. Each run can be initialized or deleted by physically removing its directory with rm.

$ tree run
├── 29
│   └── 29109b85-7935-4e13-8773-9def402c7775
│       ├── cmd.txt
│       ├── end_time.txt
│       ├── exe
│       │   └── workflow_params.json
│       ├── exit_code.txt
│       ├── outputs
│       │   ├── ERR034597_1.small.fq.trimmed.1P.fq
│       │   ├── ERR034597_1.small.fq.trimmed.1U.fq
│       │   ├── ERR034597_1.small.fq.trimmed.2P.fq
│       │   ├── ERR034597_1.small.fq.trimmed.2U.fq
│       │   ├── ERR034597_1.small_fastqc.html
│       │   └── ERR034597_2.small_fastqc.html
│       ├── outputs.json
│       ├── run_request.json
│       ├── start_time.txt
│       ├── state.txt
│       ├── stderr.log
│       ├── stdout.log
│       └── workflow_engine_params.txt
├── 2d
│   └── ...
└── 6b
    └── ...
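The layout above suggests that runs are bucketed by the first two characters of their run_id. A hypothetical helper (not part of sapporo's API) to locate a run directory could look like:

```python
from pathlib import Path


def run_dir_path(base_dir: str, run_id: str) -> Path:
    """Runs live under <base>/<first two chars of run_id>/<run_id>,
    as in the tree above."""
    return Path(base_dir) / run_id[:2] / run_id
```

For example, `run_dir_path("run", "29109b85-7935-4e13-8773-9def402c7775")` resolves to `run/29/29109b85-7935-4e13-8773-9def402c7775`.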

The request to POST /runs is rather complex. Examples using curl are provided in GitHub - sapporo/tests/curl. Please use these as references.

We use run.sh to abstract the workflow engine. When POST /runs is called, sapporo-service forks the execution of run.sh after dumping the necessary files into the run dir. Therefore, you can adapt various workflow engines to WES by editing run.sh.

The default location of run.sh is under the application directory of sapporo-service. You can override it by using the startup argument --run-sh or the environment variable SAPPORO_RUN_SH.

Other Startup Arguments

You can change the host and port used by the application by using the startup arguments (--host and --port) or the environment variables SAPPORO_HOST and SAPPORO_PORT.

The following three startup arguments and environment variables are provided to limit the WES.

  • --disable-get-runs
    • SAPPORO_GET_RUNS: True or False.
    • Disables GET /runs.
      • When a WES is shared by an unspecified number of people, anyone who knows a run_id can view that run's contents and cancel other people's runs.
      • Because run_id itself is automatically generated using uuid4, it is difficult to guess by brute force.
  • --disable-workflow-attachment
    • Disables workflow_attachment in POST /runs.
      • The workflow_attachment field is used to attach files for executing workflows.
      • There is a security concern because anything can be attached.
  • --url-prefix
    • Sets the URL prefix.
      • If --url-prefix /foo/bar is set, GET /service-info becomes GET /foo/bar/service-info.
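The URL-prefix behavior can be sketched with a small hypothetical helper (not sapporo's actual routing code):

```python
def apply_url_prefix(prefix: str, route: str) -> str:
    """Prepend the configured URL prefix to a route, e.g.
    --url-prefix /foo/bar turns /service-info into
    /foo/bar/service-info."""
    if not prefix:
        return route
    return prefix.rstrip("/") + route
```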

The contents of the response of GET /service-info are managed in service-info.json. The default location of service-info.json is under the application directory of sapporo-service. You can override it by using the startup argument --service-info or the environment variable SAPPORO_SERVICE_INFO.

Generate download link

sapporo-service provides the files and directories under run_dir as download links.

For details, please check GET /runs/{run_id}/data/{path-to-file-or-dir} in SwaggerUI - sapporo WES.


Development

You can start the development environment as follows.

$ docker-compose -f up -d --build
$ docker-compose -f exec app bash

We use flake8, isort, and mypy as linters.

$ bash ./tests/lint_and_style_check/
$ bash ./tests/lint_and_style_check/
$ bash ./tests/lint_and_style_check/

$ bash ./tests/lint_and_style_check/

We use pytest as a tester.

$ pytest .


License

Apache-2.0. See the LICENSE.


Please note that this repository is participating in a study into sustainability of open source projects. Data will be gathered about this repository for approximately the next 12 months, starting from 2021-06-16.

Data collected will include number of contributors, number of PRs, time taken to close/merge these PRs, and issues closed.

For more information, please visit our informational page or download our participant information sheet.
