DataKitchen Utils Library

Project description

DKUtils

This Python package houses utility functions and classes used in DataKitchen recipes.
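Assuming the package is published on PyPI under the name shown on this page, it can be installed with pip:

```shell
# Install DKUtils from PyPI (package name taken from this page).
pip install DKUtils
```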

Building and testing this module is conveniently done using Make. Run make to see the list of available targets (shown below for convenience). Any target can be suffixed with -ext to run it inside a Docker container, which allows testing and development in a standard, portable environment. To develop inside a running Docker container, use the bash-ext target, which drops you into a bash shell inside the container.

Add '-ext' to any target to run it inside a docker container

Versioning:
    bump/major bump/minor bump/patch - bump the version

Utilities:
    bash         run bash - typically used in conjunction with -ext to enter a docker container
    scan_secrets scan source code for sensitive information

Linting:
    lint         run flake8 and yapf
    flake8       run flake8
    yapf         run yapf and correct issues in-place
    yapf-diff    run yapf and display diff between existing code and resolution if in-place is used

Testing:
    test         run all unit tests
    test_unit    run all unit tests
    clean_unit   remove files from last test run (e.g. report_dir, .coverage, etc.)
    tox          run unit tests in python 2 and 3
    clean_tox    clean tox files (e.g. .tox)

Documentation:
    docs         generate Sphinx documentation
    docs/html    generate Sphinx documentation
    docs/clean   remove generated Sphinx documentation

Build and Upload:
    build        generate distribution archives (i.e. *.tar.gz and *.whl)
    upload       upload distribution archives to PyPI
    clean_build  remove all the build files (i.e. build, dist, *.egg-info)

Cleanup:
    clean        run all the clean targets in one go
    clean_pyc    remove all *.pyc files
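As a sketch of a typical local workflow using the targets listed above (target names are taken from the listing; running them requires a checkout of the repository with its Makefile):

```shell
# Run these from the repository root, where the Makefile lives.
make lint        # flake8 + yapf checks
make test        # all unit tests
make docs        # Sphinx documentation
make lint-ext    # the same lint target, run inside a Docker container
```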

Pre-commit is also included to validate and flag commits containing code that does not pass Flake8 and YAPF. To use it, first install the pre-commit Python package (pip install pre-commit) and then run pre-commit install. All future commits will run these tools and reject commits that don't pass. When running YAPF, pre-commit makes in-place corrections to your code; if the YAPF check fails on the first commit attempt, simply review the changed files, stage them, and commit again.
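The pre-commit setup described above amounts to two one-time commands:

```shell
# One-time setup: install pre-commit and register the git hooks.
pip install pre-commit
pre-commit install
# From now on, every `git commit` runs flake8 and yapf on the staged files.
```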

Download files

Download the file for your platform.

Source Distribution

DKUtils-0.5.0.tar.gz (10.7 kB)

Uploaded Source

Built Distribution


DKUtils-0.5.0-py3-none-any.whl (10.6 kB)

Uploaded Python 3

File details

Details for the file DKUtils-0.5.0.tar.gz.

File metadata

  • Download URL: DKUtils-0.5.0.tar.gz
  • Upload date:
  • Size: 10.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/46.1.3 requests-toolbelt/0.9.1 tqdm/4.46.0 CPython/3.6.9

File hashes

Hashes for DKUtils-0.5.0.tar.gz:

  • SHA256: f70de331f89bbac1e3f9ffe23a174f7ac8d784d3d9e7f8d0878db6ab7f8300a5
  • MD5: e21e07ae42b1e17258d87119cdfd928d
  • BLAKE2b-256: 351b1d6a0be614ae8f41e1cf5120ee14783f8426e3c66b13ded4d1fbb13220f4

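The published hashes above can be checked locally after downloading. A minimal sketch using Python's standard hashlib (the file path and expected digest below are illustrative placeholders):

```python
import hashlib

def sha256_of_file(path, chunk_size=8192):
    """Compute the SHA256 hex digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare against the SHA256 listed on this page (placeholder path):
# expected = "f70de331f89bbac1e3f9ffe23a174f7ac8d784d3d9e7f8d0878db6ab7f8300a5"
# assert sha256_of_file("DKUtils-0.5.0.tar.gz") == expected
```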

File details

Details for the file DKUtils-0.5.0-py3-none-any.whl.

File metadata

  • Download URL: DKUtils-0.5.0-py3-none-any.whl
  • Upload date:
  • Size: 10.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/46.1.3 requests-toolbelt/0.9.1 tqdm/4.46.0 CPython/3.6.9

File hashes

Hashes for DKUtils-0.5.0-py3-none-any.whl:

  • SHA256: eba96ecdc33cf90d7f4067061f42dbd3794633b9e8d286c04f317b86e9b7c093
  • MD5: 9c9bca899e2270c191e1360857effcc5
  • BLAKE2b-256: 91a0d76cdbb92268e9ea8066ade6926bc488a9fc78b50fc453ff024b0cb1c45a

