A small library to run data-related scripts

Project description

data-tasks

Setting up Your data-tasks Development Environment

First you'll need to install:

  • Git. On Ubuntu: sudo apt install git; on macOS: brew install git.
  • GNU Make. This is probably already installed; run make --version to check.
  • pyenv. Follow the instructions in pyenv's README to install it. The Homebrew method works best on macOS; the Basic GitHub Checkout method works best on Ubuntu. You don't need to set up pyenv's shell integration ("shims"); you can use pyenv without shims.
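For example, once pyenv is installed you can install an interpreter and run it by its full path, with no shim setup needed (the version number here is illustrative):

```shell
pyenv install 3.10.4
~/.pyenv/versions/3.10.4/bin/python --version
```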

Then to set up your development environment:

git clone https://github.com/hypothesis/data-tasks.git
cd data-tasks
make help

Releasing a New Version of the Project

  1. First, to enable PyPI publishing, go to https://github.com/organizations/hypothesis/settings/secrets/actions/PYPI_TOKEN and add data-tasks to the PYPI_TOKEN secret's selected repositories.

  2. Now that the data-tasks project has access to the PYPI_TOKEN secret, you can release a new version by creating a new GitHub release. Publishing a GitHub release automatically triggers a GitHub Actions workflow that builds the new version of the Python package and uploads it to https://pypi.org/project/data-tasks.
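As a sketch, a release can also be created from the command line with the GitHub CLI (this assumes gh is installed and authenticated; the tag name is only an example):

```shell
gh release create v0.0.2 --title "v0.0.2" --generate-notes
```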

Changing the Project's Python Versions

To change what versions of Python the project uses:

  1. Change the Python versions in the cookiecutter.json file. For example:

    "python_versions": "3.10.4, 3.9.12",
    
  2. Re-run the cookiecutter template:

    make template
    
  3. Commit everything to git and send a pull request.
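Taken together, the steps above look something like this in the shell (the branch name and commit message are illustrative):

```shell
# 1. Edit the Python versions in cookiecutter.json, then:
make template
# 3. Commit and send a pull request:
git checkout -b update-python-versions
git commit -am "Update the project's Python versions"
git push -u origin update-python-versions
```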

Changing the Project's Python Dependencies

To change the production dependencies in the setup.cfg file:

  1. Change the dependencies in the .cookiecutter/includes/setuptools/install_requires file. If this file doesn't exist yet, create it and add the dependencies to it. For example:

    pyramid
    sqlalchemy
    celery
    
  2. Re-run the cookiecutter template:

    make template
    
  3. Commit everything to git and send a pull request.

To change the project's formatting, linting and test dependencies:

  1. Change the dependencies in the .cookiecutter/includes/tox/deps file. If this file doesn't exist yet, create it and add the dependencies to it. Use tox's factor-conditional settings to limit which environment(s) each dependency is used in. For example:

    lint: flake8,
    format: autopep8,
    lint,tests: pytest-faker,
    
  2. Re-run the cookiecutter template:

    make template
    
  3. Commit everything to git and send a pull request.
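For background, tox's factor-conditional syntax prefixes each dependency with the environment factor(s) it applies to, so a deps file like the one above behaves like this hypothetical tox.ini fragment (the environment names are illustrative):

```ini
[testenv]
deps =
    lint: flake8
    format: autopep8
    lint,tests: pytest-faker
```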

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
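Note that most users don't need to download these files by hand; the package can be installed from PyPI in the usual way (pinning to this release as an example):

```shell
pip install data-tasks==0.0.1
```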

Source Distribution

data-tasks-0.0.1.tar.gz (10.4 kB)

Uploaded Source

Built Distribution

data_tasks-0.0.1-py3-none-any.whl (3.9 kB)

Uploaded Python 3

File details

Details for the file data-tasks-0.0.1.tar.gz.

File metadata

  • Download URL: data-tasks-0.0.1.tar.gz
  • Upload date:
  • Size: 10.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.15

File hashes

Hashes for data-tasks-0.0.1.tar.gz
  • SHA256: 30a468265f71520be37281816a768ab17fe3df6de050c300b65b7e795c457647
  • MD5: 3548f69004786abb5590301a97ae941d
  • BLAKE2b-256: 3c48db666a97e744cad4933c19c4863ae05268d78e052a9cec8df65ef07c4be4
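These digests can be checked locally after downloading. A minimal sketch using sha256sum (the verify_sha256 helper is ours, not part of the project; the commented call shows the published sdist digest):

```shell
# verify_sha256 FILE DIGEST: succeeds only if FILE's SHA256 matches DIGEST.
# sha256sum's check format is "<digest>  <filename>" (two spaces).
verify_sha256() {
    echo "$2  $1" | sha256sum --check --quiet
}

# After downloading the sdist from PyPI:
# verify_sha256 data-tasks-0.0.1.tar.gz 30a468265f71520be37281816a768ab17fe3df6de050c300b65b7e795c457647
```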


File details

Details for the file data_tasks-0.0.1-py3-none-any.whl.

File metadata

  • Download URL: data_tasks-0.0.1-py3-none-any.whl
  • Upload date:
  • Size: 3.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.15

File hashes

Hashes for data_tasks-0.0.1-py3-none-any.whl
  • SHA256: 0ec2fcfc7e72fc76f2ea9052386b227accffa14ada2276554de4f2c20d48a2b1
  • MD5: 6bab6384f759598c0b00433fb81b116f
  • BLAKE2b-256: 68a6786014b7b695724f2b90db2b076aaa1c166632dd748c5d38ff21ad851a63

