
Workflow Templates for Reproducible Data Analysis Benchmarks

Project description

Build status: https://api.travis-ci.org/scailfin/benchmark-templates.svg?branch=master
Code coverage: https://codecov.io/gh/scailfin/benchmark-templates/branch/master/graph/badge.svg
License (MIT): https://img.shields.io/badge/License-MIT-yellow.svg

About

Workflow Templates are parameterized workflow specifications for the Reproducible Open Benchmarks for Data Analysis Platform (ROB). They allow users to run pre-defined data analytics workflows while providing their own input data, parameters, and code modules. Workflow templates are inspired by, but not limited to, workflow specifications for the Reproducible Research Data Analysis Platform (REANA).
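To illustrate the idea, the sketch below expands user-supplied arguments into a parameterized workflow specification. This is a minimal illustration, not the benchmark-templates API; the `$[[name]]` placeholder syntax and the `expand` helper are assumptions modeled loosely on ROB-style templates.

```python
import re

def expand(spec, arguments):
    """Recursively replace $[[name]] placeholders in a workflow spec
    with the values that the user provided as arguments (a sketch of
    template parameterization, not the actual library implementation)."""
    if isinstance(spec, str):
        return re.sub(
            r"\$\[\[(\w+)\]\]",
            lambda m: str(arguments[m.group(1)]),
            spec,
        )
    if isinstance(spec, list):
        return [expand(item, arguments) for item in spec]
    if isinstance(spec, dict):
        return {key: expand(value, arguments) for key, value in spec.items()}
    return spec

# A hypothetical serial workflow template with one free parameter.
template = {
    "workflow": {
        "type": "serial",
        "specification": {
            "steps": [{"commands": ["python analyze.py --input $[[datafile]]"]}]
        },
    }
}

# The user binds the parameter to their own input file at run time.
result = expand(template, {"datafile": "data/names.txt"})
```

The point of the separation is that the workflow logic stays fixed while each submission only supplies parameter bindings, which keeps runs comparable across users.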

More Information

The Workflow Templates Section provides further information about templates and their syntax. Benchmark Templates are an extension of the base templates. These templates are used by the Reproducible Benchmark Engine to run benchmark workflows and maintain benchmark results.
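As a rough illustration of how an engine could maintain benchmark results, the sketch below ranks submitted runs against a template-declared result schema. The schema layout (`file`, `schema`, `orderBy`) and the `leaderboard` helper are hypothetical assumptions for this example, not the documented benchmark-templates format.

```python
# Hypothetical result section of a benchmark template: where the
# workflow writes its results, what columns they contain, and how
# runs should be ranked.
schema = {
    "file": "results/analytics.json",
    "schema": [{"id": "avg_count", "type": "decimal"}],
    "orderBy": [{"id": "avg_count", "sortDesc": True}],
}

def leaderboard(runs, schema):
    """Rank submitted runs by the template's orderBy columns.
    Sorting is applied from the last key to the first so that the
    first orderBy column becomes the primary sort key."""
    for column in reversed(schema["orderBy"]):
        runs = sorted(
            runs,
            key=lambda run: run[column["id"]],
            reverse=column.get("sortDesc", False),
        )
    return runs

runs = [{"team": "A", "avg_count": 3.2}, {"team": "B", "avg_count": 5.7}]
ranked = leaderboard(runs, schema)
```

Because the ranking criteria live in the template rather than in the engine, every benchmark can define its own notion of "better" without engine changes.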

Project details


Download files


Files for benchmark-templates, version 0.1.2:

- benchmark_templates-0.1.2-py3-none-any.whl (46.1 kB, Wheel, Python py3)
- benchmark-templates-0.1.2.tar.gz (43.2 kB, Source)
