
A dashboard for visualising bidding data for the National Energy Market

Project description

UNSW CEEM Python Package Template

Replace the heading above with your_package.


Badges can go here

Code style: black


How do I use this template?

  1. Hit "Use this template" in the top right of this page
  2. Work through as many of the basic, intermediate and advanced steps as you like.
  3. Edit this README and make sure you update your_package, your_name and licence_type.

References

Nothing helps as much as examples.

  • This is a great guide that provides a brief overview of all the tools we use in this template.
  • All of the tooling has been implemented in nemseer

Usage

Basic

Updating repo info

  1. Choose a license, and add the LICENSE file to the repo
  2. Update your code of conduct
  3. Update the Get Started! section of the contributing guidelines
    • Note that this currently includes steps for installing poetry v1.2.0 and the dependency groups used by nemseer
  4. (Optional) Make your software citeable

Poetry

Poetry handles dependency management and dependency resolution, and can also be used as a build tool.

  1. Install poetry
    • Note that this repo is using poetry v1.2.0, so install this version (see the contributing guidelines)
    • As of August 2022, 1.2.0 is still pre-release, so make sure you are on the master version of the poetry documentation
    • Edit the project info in pyproject.toml, or delete it and use poetry init to start from scratch (if you are proceeding to the next few sections, it is best not to delete the existing pyproject.toml)
    • You can add dependencies in the pyproject.toml or use the command line:
      • You can add a core dependency via poetry add, e.g. poetry add pandas
      • You can add dependencies to a group (adding to a group is optional) using poetry add pytest --group test
      • You can install the dependencies from poetry.lock, including optional groups, using poetry install --with=test
      • You can update dependencies and create a poetry.lock file using poetry update
    • Run scripts with poetry run, or just spawn a shell in the poetry virtual environment using poetry shell and then run your code
    • Commit pyproject.toml and poetry.lock to version control

Testing

  1. To install testing dependencies, use poetry install --with=test
  2. Put your tests in tests/
  3. Run your tests by running pytest in the project directory
  4. Test coverage will be in tests/htmlcov/index.html
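
As a sketch of what a test in tests/ might look like (tests/test_example.py and add are hypothetical names; your tests would import functions from your own package):

```python
# tests/test_example.py -- a minimal pytest test.
# `add` stands in for a function you would import from your_package.

def add(a, b):
    """Toy function standing in for code imported from your_package."""
    return a + b

def test_add():
    # pytest discovers any function whose name starts with test_
    assert add(1, 2) == 3
    assert add(-1, 1) == 0
```

Running pytest from the project directory will collect and run any such test functions automatically.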

Intermediate

Linters, Auto-formatting and pre-commit

Because code shouldn't look awful, we use isort (import sorting), flake8 (a Python linter) and black (an autoformatter) via pre-commit.

pre-commit streamlines creating pre-commit hooks, which are run prior to a commit being accepted by git (locally). This way, your code won't be committed if there are style issues (some of which will be automatically addressed by black or isort, after which you must stage any further changes).

  1. Install the style packages using poetry install --with=style
  2. (Optional) Configure any additional pre-commit hooks in the YAML
  3. Run pre-commit install to install the hooks
  4. To run manually, you can run pre-commit run -a. Alternatively, these hooks will run as you try to commit changes
  5. (Optional) Install black extensions that auto-format on save in your favourite IDE

Automated testing and publishing to PyPI

Both of these can be achieved via GitHub Actions.

Note that some testing config is specified in the pyproject.toml.

  1. The workflow is located here. It is commented to give you an understanding of what it does
    1. Automatically runs linting and autoformatting as above
    2. If that passes, then runs your code tests on macOS and Ubuntu across a couple of Python versions
    3. If a GitHub release is created based on a Git tag, it will build the package and upload to PyPI
      • To get this to work, you will need to add your PyPI username and password as GitHub secrets
  2. Uncomment the lines specified. This should allow the workflow to run on a push, pull-request or when manually triggered. Note that publishing to PyPI is only triggered on a release
  3. Activate the workflow. Do this by navigating to the Actions tab, selecting ... and activating it.

Advanced

If you've made it this far, well done. Prepare for the trickiest bit: documentation.

This section is a WIP. We will add to it as we come across good resources.

Documentation

Documentation is located in the docs folder.

This project uses:

  1. Sphinx to generate documentation. Sphinx is based on reStructuredText.
    • We use several Sphinx extensions that make using Sphinx easier
      • autodoc which will help you automatically generate the documentation from the docstrings in your code
      • napoleon which lets you write docstrings in your code using NumPy or Google style docstrings (as opposed to reStructuredText)
    • Sphinx is configured in conf.py
  2. MyST, a parser which optionally lets you write your documentation using Markdown. If you know Markdown, this can reduce, but not eliminate, the need for reStructuredText.
  3. readthedocs to host our documentation online. You'll need to link RtD to your repo (see here). Settings can be configured in the YAML
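
As an illustration of what autodoc and napoleon work with, here is a NumPy-style docstring that napoleon can parse and autodoc can render (scale is a hypothetical function, not part of the template):

```python
def scale(values, factor):
    """Multiply each value by a factor.

    Parameters
    ----------
    values : list of float
        Values to scale.
    factor : float
        Multiplier applied to every element.

    Returns
    -------
    list of float
        A new list with each value multiplied by ``factor``.
    """
    return [v * factor for v in values]
```

With autodoc and napoleon enabled in conf.py, a directive such as `.. autofunction:: your_package.scale` would pull this docstring into the built docs.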
Gotcha: clearing your browser cache

If you make changes to your docs, successfully build them locally (see below) or on RtD, and then see that no change has been made, your browser may be caching the old version of the docs. Clear your browser cache and try again.

Building locally

First, install the packages required for building docs using poetry install --with=docs

You can test whether your documentation builds locally by using the commands offered by the Makefile. To do this, change directory to docs and run make to see build options. The easiest option is make html.

Sphinx tutorials

There is a fair bit to learn to be able to write docs. Even if you use MyST, you will need to learn about roles and directives.

Here are some tutorials:

Examples

The source folder in this template repo contains basics for making docs. There is also an example of the markdown file used to generate the API section of the nemseer docs.

You can also refer to:

Tool Config

  • flake8 is configured by .flake8
  • pytest, isort and mypy (not included) can be configured in the pyproject.toml
  • See relevant sections above for config for pre-commit, read-the-docs and Sphinx

Contributing

Interested in contributing? Check out the contributing guidelines, which also includes steps to install your_package for development.

Please note that this project is released with a Code of Conduct. By contributing to this project, you agree to abide by its terms.

License

your_package was created by your_name. It is licensed under the terms of the licence_type.

Credits

This template was created with cookiecutter, using the py-pkgs-cookiecutter template and Marwan Debbiche's excellent walkthrough.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

nem_bidding_dashboard-0.4.0.tar.gz (19.8 kB)

Uploaded Source

Built Distribution

nem_bidding_dashboard-0.4.0-py3-none-any.whl (18.1 kB)

Uploaded Python 3

File details

Details for the file nem_bidding_dashboard-0.4.0.tar.gz.

File metadata

  • Download URL: nem_bidding_dashboard-0.4.0.tar.gz
  • Upload date:
  • Size: 19.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.2.2 CPython/3.8.6 Windows/10

File hashes

Hashes for nem_bidding_dashboard-0.4.0.tar.gz:

  • SHA256: 22813995a596615681c6ccb3261e8060cb13234afe1e9911f531653bf262a764
  • MD5: f0e876d71a5a4e7274e42c917f417f59
  • BLAKE2b-256: 57b25280ea95185b0bfab16495fd521ed3d49d71f45fae2702883cf650d647d9
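
If you want to verify a downloaded archive against the digests listed here, a small sketch using Python's standard-library hashlib (the helper name sha256_of is our own, not a PyPI tool):

```python
import hashlib

def sha256_of(path, chunk_size=8192):
    """Return the hex SHA256 digest of a file, read in chunks so that
    large archives do not need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare the result against the SHA256 digest published above, e.g.
# sha256_of("nem_bidding_dashboard-0.4.0.tar.gz")
```

A mismatch between the computed and published digest means the file was corrupted or tampered with in transit.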



File details

Details for the file nem_bidding_dashboard-0.4.0-py3-none-any.whl.

File hashes

Hashes for nem_bidding_dashboard-0.4.0-py3-none-any.whl
Algorithm Hash digest
  • SHA256: 028280d38629077c96f3a68c7fe90644163a46c131f112196a8c1dd49c64a932
  • MD5: 64237c0e7aa356e671e5fe0da800acda
  • BLAKE2b-256: 8f4d17435de2e5659487d377312f8eb34fb3529252f44e1d07dcdf3459ae07be


