Generated from aind-library-template

aind-data-schema-models

Installation

To install from PyPI, run

pip install aind-data-schema-models

To install from source, in the root directory, run

pip install -e .

To develop the code, run

pip install -e .[dev]

Contributing

How to add a new model class

The model class files (brain_atlas.py, etc.) are auto-generated; you should never need to modify them directly.

Instead, take a look at the jinja2 templates in the folder _generators/templates. The filename of the template is used to pull the corresponding .csv file and populate the data DataFrame. In the template, you can pull data from the various columns and use it to populate each of the fields in your class.
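As a rough illustration of that mechanism (the file names, columns, and output path below are assumptions for the sketch, not the library's actual code), rendering one model file amounts to loading the matching .csv into a DataFrame and handing it to the template:

from pathlib import Path
import pandas as pd
from jinja2 import Environment, FileSystemLoader

# Hypothetical sketch: render one template with the .csv that shares its name.
env = Environment(loader=FileSystemLoader("_generators/templates"))  # assumed relative location

data = pd.read_csv("platforms.csv")           # assumed csv name; its columns feed the class fields
template = env.get_template("platforms.txt")  # assumed template name

# The template iterates over the DataFrame's rows to emit one model class per record.
Path("platforms.py").write_text(template.render(data=data))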

To rebuild all the models, run the run_all.sh bash script in the root folder, which loops through the template files and runs each one through the generate_code function.

There are a few special cases; for example, if data are missing in a column they will show up as float nan. See the organizations.txt template for examples of how to handle this.
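For instance (a minimal sketch; the csv path and column name are made up for illustration), a missing cell can be normalized before it reaches the template:

import pandas as pd

df = pd.read_csv("_generators/models/organizations.csv")  # assumed csv location

def clean(value):
    # Missing cells are read back as float('nan'); map them to None so the
    # template can skip the field instead of rendering the string "nan".
    return None if pd.isna(value) else value

abbreviation = clean(df.loc[0, "abbreviation"])  # column name is an assumption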

Linters and testing

There are several libraries used to run linters, check documentation, and run tests.

  • Please test your changes using the coverage library, which will run the tests and log a coverage report:
coverage run -m unittest discover && coverage report
  • Use interrogate to check that modules, methods, etc. have been documented thoroughly:
interrogate .
  • Use flake8 to check that code is up to standards (no unused imports, etc.):
flake8 .
  • Use black to automatically format the code to the PEP 8 standard:
black .
  • Use isort to automatically sort import statements:
isort .

Pull requests

For internal members, please create a branch. For external members, please fork the repository and open a pull request from the fork. We'll primarily use Angular style for commit messages. Roughly, they should follow the pattern:

<type>(<scope>): <short summary>

where scope (optional) describes the packages affected by the code changes and type (mandatory) is one of:

  • build: Changes that affect build tools or external dependencies (example scopes: pyproject.toml, setup.py)
  • ci: Changes to our CI configuration files and scripts (examples: .github/workflows/ci.yml)
  • docs: Documentation only changes
  • feat: A new feature
  • fix: A bugfix
  • perf: A code change that improves performance
  • refactor: A code change that neither fixes a bug nor adds a feature
  • test: Adding missing tests or correcting existing tests
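
For example (the scope and summary here are illustrative, not from the project history), a commit that adds a new model might be described as:

feat(organizations): add a new manufacturer to the Organization model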

Semantic Release

The examples below, from semantic-release, show which commit message gets you which release type when semantic-release runs (using the default configuration):

  • fix(pencil): stop graphite breaking when too much pressure applied → Fix Release (patch, the default release type)
  • feat(pencil): add 'graphiteWidth' option → Feature Release (minor)
  • perf(pencil): remove graphiteWidth option
    BREAKING CHANGE: The graphiteWidth option has been removed. The default graphite width of 10mm is always used for performance reasons.
    → Breaking Release (major)

Note that the BREAKING CHANGE: token must be in the footer of the commit.

Documentation

To generate the rst source files for documentation, run

sphinx-apidoc -o doc_template/source/ src 

Then to create the documentation HTML files, run

sphinx-build -b html doc_template/source/ doc_template/build/html

More info on Sphinx installation can be found in the Sphinx documentation.

