
Build data science pipelines and models


Prodmodel

Prodmodel is a build system for data science pipelines. Users, testers, and contributors are welcome!

Motivation · Concepts · Installation · Usage · Contributing · Contact · Licence

Motivation

  • Performance. No need to rerun things: everything is cached, and switching between multiple versions is easy. Prodmodel can figure out whether a particular partial code path has already been executed on a particular piece of data and simply reuse the cached output.
  • Easy debugging. Every single dependency - code or data - is version controlled and tracked.
  • Deploy to production. Models are more than just a file. Prodmodel makes sure that the correct version of label encoders, feature transformation code and data and model files are all packaged together.

Concepts

A build system is a DAG of rules (transformations), inputs and targets. In Prodmodel inputs can be

  • data,
  • Python code,
  • and configuration.

A rule transforms any of the above into an output (which can in turn be depended on by other rules). Rules therefore need to be re-executed (and their outputs re-created) if any of their dependencies change. Prodmodel keeps track of all of these dependencies.

The outputs of the rules are targets. Every target corresponds to an output (e.g. a model or a dataset). These outputs are cached and version controlled.

Prodmodel therefore ensures

  • correctness, by executing all code (e.g. feature transformation, model building, tests) that can potentially be affected by a change, and
  • performance, by executing only the necessary code, saving time compared to rerunning the whole pipeline.
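The caching idea behind this can be illustrated with a toy sketch. This is an illustration of the concept, not Prodmodel's actual implementation: a rule's output is keyed by a hash of the rule's identity and its inputs, so a rule re-executes only when one of its dependencies changes.

```python
import hashlib
import json

# Toy illustration of dependency-based caching (not Prodmodel's actual
# implementation): a rule's output is keyed by a hash of the rule's name
# and its inputs, so it only re-executes when a dependency changes.
cache = {}

def run_rule(fn, inputs):
    key = hashlib.sha256(
        (fn.__name__ + json.dumps(inputs, sort_keys=True)).encode()
    ).hexdigest()
    if key not in cache:  # re-execute only on a cache miss
        cache[key] = fn(inputs)
    return cache[key]
```

Calling `run_rule` twice with the same function and inputs executes the function once; changing either the inputs or the function yields a new cache key and triggers re-execution.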

Rules

Every rule is a statically typed function, where the inputs are targets, data, or configs. The execution of a rule outputs some data (e.g. a different feature set or a model), which can be used in other rules.

In order to use Prodmodel your code has to be structured as functions which the rules can call into.
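For example, a feature-transformation step might be written as a pure function that takes rows in and returns rows out. The function and field names below are hypothetical, chosen only to show the shape such a function takes:

```python
# Hypothetical feature-transformation function a rule could call into:
# a pure function from input rows to output rows, with no hidden state.
def normalize_ages(rows):
    """Scale each row's 'age' field into [0, 1]."""
    ages = [row["age"] for row in rows]
    lo, hi = min(ages), max(ages)
    span = (hi - lo) or 1  # avoid division by zero on constant columns
    return [dict(row, age=(row["age"] - lo) / span) for row in rows]
```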

Targets

Targets are created by rule functions. Targets can be executed to generate output files. IterableDataTarget is a special target which can be used as an iterable of dicts to make iterating over datasets easier. Regular DataTargets can represent any Python object.
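A function consuming an IterableDataTarget therefore just iterates over dicts. A sketch, where the field name "label" is a hypothetical example:

```python
# Sketch of a rule function consuming what an IterableDataTarget yields:
# an iterable of dicts, one per dataset row.
def count_labels(rows):
    counts = {}
    for row in rows:  # each row behaves like a dict
        counts[row["label"]] = counts.get(row["label"], 0) + 1
    return counts
```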

Installation

Prodmodel requires Python 3.6 or newer. Use pip to install prodmodel.

pip install prodmodel --user

Usage

Create a build.py file in your data science folder. The build file contains references to your inputs and the build rules you can execute.

from prodmodel.rules import rules

csv_data = rules.data_source(file='data.csv', type='csv', dtypes={...})

my_model = rules.transform(objects={'data': csv_data}, file='kmeans.py', fn='compute_kmeans')

Now you can build your model by running prodmodel my_model from the directory of build.py, or prodmodel <path_to_my_directory>:my_model from any directory.
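A minimal kmeans.py matching the rule above could look like the following. This is a sketch under stated assumptions: the data argument is taken to be an iterable of dicts with a numeric x field, and a tiny stdlib-only 1-D k-means stands in for a real implementation (in practice you would more likely call a library such as scikit-learn):

```python
# Hypothetical kmeans.py for the build rule above. The exact shape of
# `data` depends on the data_source configuration; here we assume an
# iterable of dicts with a numeric 'x' field and run a tiny 1-D k-means.
def compute_kmeans(data, k=2, iters=10):
    xs = sorted(row["x"] for row in data)
    # Spread the initial centroids across the sorted values.
    if k > 1:
        centroids = [xs[i * (len(xs) - 1) // (k - 1)] for i in range(k)]
    else:
        centroids = [xs[0]]
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for x in xs:
            nearest = min(range(len(centroids)), key=lambda i: abs(x - centroids[i]))
            clusters[nearest].append(x)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids
```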

Prodmodel creates a .prodmodel directory under the home directory of the user to store log and config files.

Documentation

Check out a complete example project for more examples.

The complete list of build rules can be found here.

Prodmodel searches for a config file under <user home dir>/.prodmodel/config. The config file can be created manually based on this template.

Arguments

  • --force_external: Some data sources are remote (e.g. an SQL server), therefore tracking changes is not always feasible. This argument gives the user manual control over when to reload these data sources.
  • --cache_data: Cache local data files if changed. This can be useful for debugging / reproducibility by making sure every data source used for a specific build is saved.
  • --output_format: One of none, str, bytes, or log: the output format used when writing the build target's data to stdout.

List targets in build file

  • Run prodmodel ls <path_to_build> to list the targets in a build file, where <path_to_build> points to the build file or its directory.

Cleaning old cache files

  • Run prodmodel clean <target> --cutoff_date=<cutoff datetime> to delete output cache files of a target created before the cutoff datetime, which has to be in %Y-%m-%dT%H:%M:%S (YYYY-mm-ddTHH:MM:SS) format.
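The cutoff format follows Python's standard strftime/strptime directives; for instance (the date below is a made-up example):

```python
from datetime import datetime

# The --cutoff_date format, %Y-%m-%dT%H:%M:%S, rendered via strftime.
cutoff = datetime(2019, 8, 1, 13, 30, 0).strftime("%Y-%m-%dT%H:%M:%S")
print(cutoff)  # 2019-08-01T13:30:00
```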

Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

Contact

Feel free to email me at gergely.svigruha@prodmodel.com if you have any questions, need help, or would like to contribute to the code.

Licence

Apache 2.0
