Tool for analyzing cell closeness in spatial transcriptomic data
# Developer documentation
Welcome to the developer guidelines! This document is split into two parts:
- The repository setup. This section is relevant primarily for the repository maintainer and shows how to connect continuous integration services and documents initial set-up of the repository.
- The contributor guide. It contains information relevant to all developers who want to make a contribution.
## Setting up the repository
### Documentation on readthedocs

We recommend using readthedocs.org (RTD) to build and host the documentation for your project. To enable readthedocs, head over to their website and sign in with your GitHub account. On the RTD dashboard choose "Import a Project" and follow the instructions to add your repository.
- Make sure to choose the correct name of the default branch. On GitHub, the default name of the default branch has recently changed from `master` to `main`.
- We recommend enabling documentation builds for pull requests (PRs). This ensures that a PR doesn't introduce changes that break the documentation. To do so, go to `Admin -> Advanced Settings`, check the `Build pull requests for this project` option, and click `Save`. For more information, please refer to the official RTD documentation.
- If you find the RTD builds are failing, you can disable the `fail_on_warning` option in `.readthedocs.yaml`.
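For orientation, the `fail_on_warning` option sits in the `sphinx` section of the RTD config. A minimal `.readthedocs.yaml` might look like this (build OS and Python version are illustrative, not the template's actual values):

```yaml
version: 2
build:
  os: ubuntu-22.04
  tools:
    python: "3.10"
sphinx:
  configuration: docs/conf.py
  fail_on_warning: false
```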
### Coverage tests with Codecov

Coverage tells you what fraction of the code is "covered" by unit tests, thereby encouraging contributors to write tests. To enable coverage checks, head over to codecov and sign in with your GitHub account. You'll find more information in the "getting started" section of the codecov docs.
In brief, you need to:
- Generate a Codecov Token by clicking setup repo in the codecov dashboard.
- Go to the Settings of your newly created repository on GitHub.
- Go to Security > Secrets > Actions.
- Create a new repository secret with the name `CODECOV_TOKEN` and paste the token generated by codecov.
- Go back to the GitHub Actions page and re-run previously failed jobs.
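Once the secret is in place, a workflow step can upload coverage reports using the token. A sketch of such a step, assuming the `codecov/codecov-action` action and a `coverage.xml` report path (both illustrative):

```yaml
# .github/workflows/test.yaml (excerpt, hypothetical)
- name: Upload coverage to Codecov
  uses: codecov/codecov-action@v3
  with:
    token: ${{ secrets.CODECOV_TOKEN }}
    files: ./coverage.xml
```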
### Pre-commit checks
Pre-commit checks are fast programs that check code for errors, inconsistencies and code styles, before the code is committed.
We recommend setting up pre-commit.ci to enforce consistency checks on every commit and pull request.
To do so, head over to pre-commit.ci and click "Sign In With GitHub". Follow the instructions to enable pre-commit.ci for your account or your organization. You may choose to enable the service for an entire organization or on a per-repository basis.
Once authorized, pre-commit.ci should automatically be activated.
#### Overview of pre-commit hooks used by the template

The following pre-commit checks are for code style and format:
- black: standard code formatter in Python.
- isort: sort module imports into sections and types.
- prettier: standard code formatter for non-Python files (e.g. YAML).
- blacken-docs: run black on Python code in docs.
The following pre-commit checks are for errors and inconsistencies:
- flake8: standard check for errors in Python files.
- flake8-tidy-imports: tidy module imports.
- flake8-docstrings: pydocstyle extension of flake8.
- flake8-rst-docstrings: extension of flake8-docstrings for rst docs.
- flake8-comprehensions: write better list/set/dict comprehensions.
- flake8-bugbear: find possible bugs and design issues in your program.
- flake8-blind-except: checks for blind, catch-all `except` statements.
- yesqa: remove unnecessary `# noqa` comments; follows additional dependencies listed above.
- autoflake: remove unused imports and variables.
- pre-commit-hooks: generic pre-commit hooks.
- detect-private-key: checks for the existence of private keys.
- check-ast: check whether files parse as valid python.
- end-of-file-fixer: check files end in a newline and only a newline.
- mixed-line-ending: checks mixed line ending.
- trailing-whitespace: trims trailing whitespace.
- check-case-conflict: check files that would conflict with case-insensitive file systems.
- pyupgrade: upgrade syntax for newer versions of the language.
- forbid-to-commit: make sure that `*.rej` files cannot be committed. These files are created by the automated template sync if there's a merge conflict and need to be addressed manually.
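Each of these hooks is declared in `.pre-commit-config.yaml`. As an illustration of that file's shape (the `rev` pins below are examples, not the template's actual versions):

```yaml
repos:
  - repo: https://github.com/psf/black
    rev: 23.1.0  # example pin
    hooks:
      - id: black
  - repo: https://github.com/PyCQA/flake8
    rev: 6.0.0  # example pin
    hooks:
      - id: flake8
        additional_dependencies:
          - flake8-bugbear
          - flake8-blind-except
```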
#### Notes on pre-commit checks

- To ignore lint warnings from flake8, see Ignore certain lint warnings.
- You can add or remove pre-commit checks by simply deleting the relevant lines in the `.pre-commit-config.yaml` file. Some pre-commit checks have additional options that can be specified either in `pyproject.toml` or in tool-specific config files, such as `.prettierrc.yml` for prettier and `.flake8` for flake8.
### API design

Scverse ecosystem packages should operate on AnnData and/or MuData data structures and typically use an API as originally introduced by scanpy, with the following submodules:

- `pp` for preprocessing
- `tl` for tools (which, compared to `pp`, generate interpretable output, often associated with a corresponding plotting function)
- `pl` for plotting functions

You may add additional submodules as appropriate. While we encourage following a scanpy-like API for ecosystem packages, there may also be good reasons to choose a different approach, e.g. an object-oriented API.
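To illustrate the convention (this is not monkeybread's actual API), the sketch below fakes the three submodules with `SimpleNamespace`; in a real package, `pp`, `tl`, and `pl` would be actual modules:

```python
from types import SimpleNamespace

def _normalize(values):
    """pp-style step: transform raw values (here, scale them to sum to 1)."""
    total = sum(values)
    return [v / total for v in values]

def _score_range(values):
    """tl-style step: derive an interpretable result from the data."""
    return max(values) - min(values)

def _plot_score(score):
    """pl-style step: visualize the tool's output (a string stands in for a plot)."""
    return f"score = {score:.2f}"

# Fake package namespace following the scanpy-like pp/tl/pl layout.
pkg = SimpleNamespace(
    pp=SimpleNamespace(normalize=_normalize),
    tl=SimpleNamespace(score_range=_score_range),
    pl=SimpleNamespace(plot_score=_plot_score),
)

values = pkg.pp.normalize([1.0, 2.0, 3.0])
result = pkg.tl.score_range(values)
print(pkg.pl.plot_score(result))
```

The `pp -> tl -> pl` chain above mirrors the typical workflow: preprocess, compute an interpretable result, then plot it.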
### Ignore certain lint warnings

The pre-commit checks include flake8, which checks for errors in Python files, including stylistic errors. In some cases it might overshoot, and you may have good reasons to ignore certain warnings.
To ignore a specific error on a per-case basis, you can add a comment `# noqa` to the offending line. You can also specify the error ID to ignore, e.g. `# noqa: E731`. Check the flake8 guide for reference.
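For example, flake8's E731 ("do not assign a lambda expression") can be silenced on a single line; the assignment itself is just an illustration:

```python
# Without the trailing noqa comment, flake8 would flag E731 on this line.
square = lambda x: x * x  # noqa: E731
print(square(4))
```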
Alternatively, you can disable certain error messages for the entire project. To do so, edit the `.flake8` file in the root of the repository. Add one line per linting code you wish to ignore, and don't forget to add a comment.
```
...
# line break before a binary operator -> black does not adhere to PEP8
W503
# line break occurred after a binary operator -> black does not adhere to PEP8
W504
...
```
### Using VCS-based versioning

By default, the template uses hard-coded version numbers that are set in `pyproject.toml` and managed with bump2version. If you prefer to have your project automatically infer version numbers from git tags, it is straightforward to switch to VCS-based versioning using hatch-vcs.

In `pyproject.toml`, add the following changes, and you are good to go!
```diff
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -1,11 +1,11 @@
 [build-system]
 build-backend = "hatchling.build"
-requires = ["hatchling"]
+requires = ["hatchling", "hatch-vcs"]

 [project]
 name = "monkeybread"
-version = "0.3.1dev"
+dynamic = ["version"]
@@ -60,6 +60,9 @@
+[tool.hatch.version]
+source = "vcs"
+
 [tool.coverage.run]
 source = ["monkeybread"]
 omit = [
```
Don't forget to update the Making a release section in this document accordingly, after you are done!
## Contributing guide
Scanpy provides extensive developer documentation, most of which applies to this repo, too. This document will not reproduce the entire content from there. Instead, it aims at summarizing the most important information to get you started on contributing.
We assume that you are already familiar with git and with making pull requests on GitHub. If not, please refer to the scanpy developer guide.
### Installing dev dependencies

In addition to the packages needed to use this package, you need additional Python packages to run tests and build the documentation. It's easy to install them using `pip`:

```shell
pip install "monkeybread[dev,test,doc]"
```
### Code-style
This template uses pre-commit to enforce consistent code-styles. On every commit, pre-commit checks will either automatically fix issues with the code, or raise an error message. See pre-commit checks for a full list of checks enabled for this repository.
To enable pre-commit locally, simply run `pre-commit install` in the root of the repository. Pre-commit will automatically download all dependencies when it is run for the first time.
Alternatively, you can rely on the pre-commit.ci service enabled on GitHub. If you didn't run pre-commit before pushing changes to GitHub, it will automatically commit fixes to your pull request or show an error message.

If pre-commit.ci added a commit on a branch you were still working on locally, simply use `git pull --rebase` to integrate the changes into yours.
Finally, most editors have an autoformat on save feature. Consider enabling this option for black and prettier.
### Writing tests

This package uses pytest for automated testing. Please write tests for every function added to the package.
Most IDEs integrate with pytest and provide a GUI to run tests. Alternatively, you can run all tests from the command line by executing `pytest` in the root of the repository. Continuous integration will automatically run the tests on all pull requests.
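A minimal test module might look like this (the file name and function are illustrative; pytest collects any `test_*` function in files matching `test_*.py`):

```python
# tests/test_example.py (hypothetical)

def add(a, b):
    """Toy function standing in for actual package code."""
    return a + b

def test_add():
    # Each assert documents one expected behavior of the function.
    assert add(2, 3) == 5
    assert add(-1, 1) == 0
```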
### Automated template sync

Automated template sync is enabled by default. This means that every night, a GitHub action runs cruft to check whether a new version of the `scverse-cookiecutter` template has been released. If there are any new changes, a pull request proposing these changes is created automatically. This helps keep the repository up-to-date with the latest coding standards.
It may happen that a template sync results in a merge conflict. If this is the case, a `*.rej` file with the diff is created. You need to manually address these changes and remove the `.rej` file when you are done. The pull request can only be merged after all `*.rej` files have been removed.
:::{tip} The following hints may be useful to work with the template sync:

- GitHub automatically disables scheduled actions if there has been no activity in the repository for 60 days. You can re-enable or manually trigger the sync by navigating to `Actions -> Sync Template` in your GitHub repository.
- If you want to ignore certain files from the template update, you can add them to the `[tool.cruft]` section in the `pyproject.toml` file in the root of your repository. More details are described in the cruft documentation.
- To disable the sync entirely, simply remove the file `.github/workflows/sync.yaml`.
:::
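As an example of the cruft exclusion mechanism, a `[tool.cruft]` section could look like this (the listed paths are hypothetical):

```toml
[tool.cruft]
skip = [
    ".github/workflows/build.yaml",  # hypothetical: customized beyond the template
    "docs/conf.py",                  # hypothetical: heavily modified locally
]
```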
### Making a release

#### Updating the version number

Before making a release, you need to update the version number. Please adhere to Semantic Versioning; in brief:
Given a version number MAJOR.MINOR.PATCH, increment the:
- MAJOR version when you make incompatible API changes,
- MINOR version when you add functionality in a backwards compatible manner, and
- PATCH version when you make backwards compatible bug fixes.
Additional labels for pre-release and build metadata are available as extensions to the MAJOR.MINOR.PATCH format.
We use bump2version to automatically update the version number in all places and automatically create a git tag. Run one of the following commands in the root of the repository:

```shell
bump2version patch
bump2version minor
bump2version major
```

Once you are done, run `git push --tags` to publish the created tag on GitHub.
#### Upload on PyPI
Please follow the Python packaging tutorial.
It is possible to automate this with GitHub actions, see also this feature request in the cookiecutter-scverse template.
### Writing documentation

Please write documentation for your package. This project uses sphinx with the following features:

- the myst extension allows writing documentation in markdown/Markedly Structured Text
- Numpy-style docstrings (through the napoleon extension).
- Jupyter notebooks as tutorials through myst-nb (see Tutorials with myst-nb)
- Sphinx autodoc typehints, to automatically reference annotated input and output types
See the scanpy developer docs for more information on how to write documentation.
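A Numpy-style docstring, as parsed by napoleon, follows a fixed section layout. A small self-contained example (the function itself is illustrative):

```python
def moving_average(values, window):
    """Compute a simple moving average.

    Parameters
    ----------
    values : list of float
        The input series.
    window : int
        Number of consecutive values to average.

    Returns
    -------
    list of float
        One average per window position.
    """
    return [
        sum(values[i : i + window]) / window
        for i in range(len(values) - window + 1)
    ]

print(moving_average([1.0, 2.0, 3.0, 4.0], 2))
```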
#### Tutorials with myst-nb and jupyter notebooks

The documentation is set up to render jupyter notebooks stored in the `docs/notebooks` directory using myst-nb. Currently, only notebooks in `.ipynb` format are supported; they will be included with both their input and output cells. It is your responsibility to update and re-run the notebook whenever necessary.

If you are interested in automatically running notebooks as part of the continuous integration, please check out this feature request in the `cookiecutter-scverse` repository.
#### Hints

- If you refer to objects from other packages, please add an entry to `intersphinx_mapping` in `docs/conf.py`. Only if you do so can sphinx automatically create a link to the external documentation.
- If building the documentation fails because of a missing link that is outside your control, you can add an entry to the `nitpick_ignore` list in `docs/conf.py`.
#### Building the docs locally

```shell
cd docs
make html
open _build/html/index.html
```