
Easy pipelines for pandas.

Project description

Website: https://pdpipe.readthedocs.io/en/latest/

Easy pipelines for pandas DataFrames (learn how!).

>>> import pandas as pd
>>> import pdpipe as pdp
>>> df = pd.DataFrame(
...     data=[[4, 165, 'USA'], [2, 180, 'UK'], [2, 170, 'Greece']],
...     index=['Dana', 'Jane', 'Nick'],
...     columns=['Medals', 'Height', 'Born'])
>>> pipeline = pdp.ColDrop('Medals').OneHotEncode('Born')
>>> pipeline(df)
      Height  Born_UK  Born_USA
Dana     165        0         1
Jane     180        1         0
Nick     170        0         0

1 📚 Documentation

This is the repository of the pdpipe package, and this README file is aimed at helping potential contributors to the project.

To learn more about how to use pdpipe, either visit pdpipe’s homepage or read the getting started section.

2 🔩 Installation

Install pdpipe with:

pip install pdpipe

Some pipeline stages require scikit-learn or nltk. If either package is not found on your system, the corresponding stages are simply not loaded and pdpipe issues a warning; to use them, install the missing package as well.

3 🎁 Contributing

Package author and current maintainer is Shay Palachy (shay.palachy@gmail.com); you are more than welcome to approach him for help. Contributions are very welcome, especially since this package is very much in its infancy and many more pipeline stages can be added.

🪛 Installing for development

Clone:

git clone git@github.com:pdpipe/pdpipe.git

Install in development mode with test dependencies:

cd pdpipe
pip install -e ".[test]"

3.1 ⚗️ Running the tests

To run the tests, use:

python -m pytest

Note that pytest runs are configured by the pytest.ini file. Read it to understand the exact pytest arguments used.

3.2 🔬 Adding tests

At the time of writing, pdpipe is maintained with a test coverage of 100%. Although challenging, I hope to maintain this status. If you add code to the package, please make sure you thoroughly test it. Codecov automatically reports changes in coverage on each PR, so a PR that reduces test coverage will not be examined until that is fixed.

Tests reside under the tests directory in the root of the repository. Each module has a separate test folder, with each class - usually a pipeline stage - having a dedicated file (always starting with the string “test”) containing several tests (each a global function starting with the string “test”). Please adhere to this structure, and try to separate test cases into different test functions; this allows us to quickly focus on problem areas and use cases. Thank you! :)
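
For instance, a new test file for a pipeline stage could look like the sketch below. The file path and test names are hypothetical, and only the ColDrop stage already shown in the quickstart example is used:

# tests/col_drop/test_col_drop.py -- hypothetical path, following the structure above
import pandas as pd

import pdpipe as pdp


def test_col_drop_single_column():
    """ColDrop removes the named column and keeps all others."""
    df = pd.DataFrame(
        data=[[4, 165], [2, 180]],
        index=['Dana', 'Jane'],
        columns=['Medals', 'Height'])
    res = pdp.ColDrop('Medals')(df)
    assert 'Medals' not in res.columns
    assert 'Height' in res.columns


def test_col_drop_keeps_remaining_values():
    """Values of the remaining columns are left untouched."""
    df = pd.DataFrame(
        data=[[4, 165], [2, 180]],
        index=['Dana', 'Jane'],
        columns=['Medals', 'Height'])
    res = pdp.ColDrop('Medals')(df)
    assert list(res['Height']) == [165, 180]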

3.3 ⚙️ Configuration

pdpipe can be configured using both a configuration file - located at either $XDG_CONFIG_HOME/pdpipe/cfg.json or, if the XDG_CONFIG_HOME environment variable is not set, at ~/.pdpipe/cfg.json - and environment variables.

At the moment, these configuration options are only relevant for development. The available options are:

  • LOAD_STAGE_ATTRIBUTES - True by default. If set to False, stage attributes, which enable the chainer construction pattern (e.g. pdp.ColDrop('b').Bin('f')), are not loaded; this is used for sensible documentation generation. Set it with "LOAD_STAGE_ATTRIBUTES": false in cfg.json, or with export PDPIPE__LOAD_STAGE_ATTRIBUTES=False for environment variable-driven configuration. A sketch of the construction pattern this flag controls is shown below.
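
For context, the chainer construction pattern controlled by this flag lets stages be chained off one another instead of being listed explicitly. A rough sketch of the two equivalent styles, using the stages from the quickstart example and assuming pdp.PdPipeline as the explicit pipeline constructor:

import pdpipe as pdp

# Default behaviour (LOAD_STAGE_ATTRIBUTES is True): stages expose attributes
# that let further stages be chained directly off them.
chained = pdp.ColDrop('Medals').OneHotEncode('Born')

# With LOAD_STAGE_ATTRIBUTES set to False, the same pipeline has to be built
# explicitly (pdp.PdPipeline here is an assumption based on pdpipe's docs).
explicit = pdp.PdPipeline([pdp.ColDrop('Medals'), pdp.OneHotEncode('Born')])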

3.4 ✒️ Code style

pdpipe code is written to adhere to the coding style dictated by flake8. Practically, this means that one of the jobs that runs on the project’s Travis CI for each commit and pull request checks for a successful run of the flake8 CLI command in the repository’s root, which means pull requests will be flagged red by the Travis bot if non-flake8-compliant code was added.

To solve this, please run flake8 on your code (whether through your text editor/IDE or using the command line) and fix all resulting errors. Thank you! :)

3.5 📓 Adding documentation

This project is documented using the numpy docstring conventions, which were chosen as they are perhaps the most widespread conventions that are both supported by common tools such as Sphinx and result in human-readable docstrings (in my personal opinion, of course). When documenting code you add to this project, please follow these conventions.

Additionally, if you update this README.rst file, use python setup.py checkdocs to validate that it compiles.

3.6 📋 Adding doctests

Please note that for pdoc3 - the Python package used to generate the HTML documentation files for pdpipe - to successfully include doctests in the generated documentation files, the whole doctest must be indented relative to the opening of the docstring (the multi-line string), like so:

class ApplyByCols(PdPipelineStage):
    """A pipeline stage applying an element-wise function to columns.

    Parameters
    ----------
    columns : str or list-like
        Names of columns on which to apply the given function.
    func : function
        The function to be applied to each element of the given columns.
    result_columns : str or list-like, default None
        The names of the new columns resulting from the mapping operation. Must
        be of the same length as columns. If None, behavior depends on the
        drop parameter: If drop is True, the name of the source column is used;
        otherwise, the name of the source column is used with the suffix
        '_app'.
    drop : bool, default True
        If set to True, source columns are dropped after being mapped.
    func_desc : str, default None
        A function description of the given function; e.g. 'normalizing revenue
        by company size'. A default description is used if None is given.


    Example
    -------
        >>> import pandas as pd; import pdpipe as pdp; import math;
        >>> data = [[3.2, "acd"], [7.2, "alk"], [12.1, "alk"]]
        >>> df = pd.DataFrame(data, [1,2,3], ["ph","lbl"])
        >>> round_ph = pdp.ApplyByCols("ph", math.ceil)
        >>> round_ph(df)
           ph  lbl
        1   4  acd
        2   8  alk
        3  13  alk
    """

4 💳 Credits

Created by Shay Palachy (shay.palachy@gmail.com).

4.1 🐞 Bugfixes & Documentation:


Download files

Source Distribution

pdpipe-0.2.6.tar.gz (721.4 kB, Source)

Built Distribution

pdpipe-0.2.6-py3-none-any.whl (108.4 kB, Python 3)

File details

Details for the file pdpipe-0.2.6.tar.gz.

File metadata

  • Download URL: pdpipe-0.2.6.tar.gz
  • Upload date:
  • Size: 721.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/32.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.8 tqdm/4.62.3 importlib-metadata/4.11.0 keyring/23.5.0 rfc3986/2.0.0 colorama/0.4.4 CPython/3.10.2

File hashes

Hashes for pdpipe-0.2.6.tar.gz

Algorithm    Hash digest
SHA256       5afc4658878356d1f910ddea3f02ef396fba2f2c2af88449e7274dbcfe3929d1
MD5          b2f3c5e25880ead375d8e43659ebfac2
BLAKE2b-256  4da36ca8d84eda9eba965d08dd60b75e56849d2f5953f29a06cbebaf912ef028

File details

Details for the file pdpipe-0.2.6-py3-none-any.whl.

File metadata

  • Download URL: pdpipe-0.2.6-py3-none-any.whl
  • Upload date:
  • Size: 108.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/32.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.8 tqdm/4.62.3 importlib-metadata/4.11.0 keyring/23.5.0 rfc3986/2.0.0 colorama/0.4.4 CPython/3.10.2

File hashes

Hashes for pdpipe-0.2.6-py3-none-any.whl

Algorithm    Hash digest
SHA256       d2d1c9603be790ef4c82ca594e767ce1bf67cf057b3e77ae692c1568387914b2
MD5          90b22bd5fd7012ec096d09bd9ce768e9
BLAKE2b-256  0eec78aedb68fc5bfdc720b187e56b0472f22fc522265484d19b60316e07c686
