Project description

data-tasks

Small library to run data related scripts
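
Since data-tasks is published on PyPI, installing it for normal use (as opposed to developing it, which is covered below) is the usual pip one-liner:

pip install data-tasks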

Setting up Your data-tasks Development Environment

First you'll need to install:

  • Git. On Ubuntu: sudo apt install git, on macOS: brew install git.
  • GNU Make. This is probably already installed, run make --version to check.
  • pyenv. Follow the instructions in pyenv's README to install it. The Homebrew method works best on macOS; the Basic GitHub Checkout method works best on Ubuntu. You don't need to set up pyenv's shell integration ("shims"); you can use pyenv without shims.
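
Before moving on, you can verify that all three prerequisites are installed and on your PATH (a minimal sanity check; the exact version output will vary by system):

git --version
make --version
pyenv --version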

Then to set up your development environment:

git clone https://github.com/hypothesis/data-tasks.git
cd data-tasks
make help

Releasing a New Version of the Project

  1. First, to get PyPI publishing working, go to https://github.com/organizations/hypothesis/settings/secrets/actions/PYPI_TOKEN and add data-tasks to the PYPI_TOKEN secret's selected repositories.

  2. Now that the data-tasks project has access to the PYPI_TOKEN secret, you can release a new version by creating a new GitHub release. Publishing the GitHub release automatically triggers a GitHub Actions workflow that builds the new version of the Python package and uploads it to https://pypi.org/project/data-tasks.
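
As a concrete sketch, the release itself can be created either from the GitHub web UI or with the GitHub CLI. The tag name below is only an illustrative assumption; use whatever version you are actually releasing:

gh release create v0.0.3 --title "v0.0.3" --generate-notes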

Changing the Project's Python Versions

To change what versions of Python the project uses:

  1. Change the Python versions in the cookiecutter.json file. For example:

    "python_versions": "3.10.4, 3.9.12",
    
  2. Re-run the cookiecutter template:

    make template
    
  3. Commit everything to git and send a pull request (see below for a sketch of installing the new interpreter versions with pyenv).
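
If you use pyenv to manage local interpreters (as in the setup section above), you will likely also want the new versions installed on your machine. With the example versions above, that would look roughly like:

pyenv install 3.10.4
pyenv install 3.9.12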

Changing the Project's Python Dependencies

To change the production dependencies in the setup.cfg file:

  1. Change the dependencies in the .cookiecutter/includes/setuptools/install_requires file. If this file doesn't exist yet, create it and add some dependencies to it. For example:

    pyramid
    sqlalchemy
    celery
    
  2. Re-run the cookiecutter template:

    make template
    
  3. Commit everything to git and send a pull request (see below for a rough sketch of the regenerated install_requires section in setup.cfg).
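
For reference, re-running the template copies the include file's contents into the install_requires option in setup.cfg, so with the example dependencies above the regenerated section should end up looking roughly like this (a sketch of standard setuptools syntax, not necessarily the file's exact contents):

[options]
install_requires =
    pyramid
    sqlalchemy
    celery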

To change the project's formatting, linting and test dependencies:

  1. Change the dependencies in the .cookiecutter/includes/tox/deps file. If this file doesn't exist yet, create it and add some dependencies to it. Use tox's factor-conditional settings to limit which environment(s) each dependency is used in. For example:

    lint: flake8,
    format: autopep8,
    lint,tests: pytest-faker,
    
  2. Re-run the cookiecutter template:

    make template
    
  3. Commit everything to git and send a pull request (see below for a sketch of running the affected tox environments).
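
The factor conditions above scope each dependency to particular tox environments (lint, format, tests). After regenerating, you can sanity-check the result by listing and running those environments with tox directly; the Makefile normally wraps these for you, and the exact environment names here are an assumption:

tox -l        # list the environments tox knows about
tox -e lint   # run a single environment, e.g. the lint one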

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

data-tasks-0.0.2.tar.gz (18.0 kB)

Built Distribution

data_tasks-0.0.2-py3-none-any.whl (7.9 kB)

File details

Details for the file data-tasks-0.0.2.tar.gz.

File metadata

  • Download URL: data-tasks-0.0.2.tar.gz
  • Upload date:
  • Size: 18.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.15

File hashes

Hashes for data-tasks-0.0.2.tar.gz
  • SHA256: 96049f247ff576e97083fdf95ae942ae501bb2229eb1ccbafa3727e4af64607b
  • MD5: d25d28f2fe8d49645b8af764cd29f858
  • BLAKE2b-256: 9c29ce01fa489f258daaa36ebc7c79970e36c59dd7b428511f28a17ca9d87a88

See more details on using hashes here.
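
If you want to check a downloaded copy of the sdist against these values from the command line, a minimal sketch (using pip's built-in download and hash commands) looks like this:

pip download --no-deps --no-binary :all: data-tasks==0.0.2
pip hash -a sha256 data-tasks-0.0.2.tar.gz   # compare against the SHA256 digest above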

File details

Details for the file data_tasks-0.0.2-py3-none-any.whl.

File metadata

  • Download URL: data_tasks-0.0.2-py3-none-any.whl
  • Upload date:
  • Size: 7.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.15

File hashes

Hashes for data_tasks-0.0.2-py3-none-any.whl
  • SHA256: 4728f440217b14024cfd6aa97134b1248f3176be03f3ebee3cd769c1c25b3f49
  • MD5: ba331ec2f2de7075464283437923c1e0
  • BLAKE2b-256: 78cdae4ea66a64664a06e67d435c2c618122b60cd09c9940d939bef5ff5bc40e

See more details on using hashes here.
