
Utilities library for aind ephys team.

Project description

aind-ephys-utils


Helpful methods for exploring in vivo electrophysiology data.

Installation

To use the software, run the following from the root directory:

pip install -e .

To develop the code, run

pip install -e .[dev]
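
After either install, a quick way to confirm the package is importable is the following sketch; it assumes Python ≥ 3.8 (for importlib.metadata), and only the package and module names are taken from this repository.

# minimal sanity check after installation
from importlib.metadata import version

import aind_ephys_utils  # module lives under src/aind_ephys_utils

print(version("aind-ephys-utils"))  # prints the installed version, e.g. 0.0.13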

Contributing

Linters and testing

There are several libraries used to run linters, check documentation, and run tests.

  • Please test your changes using the coverage library, which will run the tests and log a coverage report (a minimal test sketch follows this list):
coverage run -m unittest discover && coverage report
  • Use interrogate to check that modules, methods, etc. have been documented thoroughly:
interrogate .
  • Use black to automatically format the code into PEP standards:
black .
  • Use flake8 to check that code is up to standards (no unused imports, etc.):
flake8 .
  • Use isort to automatically sort import statements:
isort .
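
The coverage command above relies on unittest discovery, so it picks up any test*.py module. A minimal sketch of such a module (the file name and test names here are hypothetical, assuming tests live in a tests/ directory):

# tests/test_example.py (hypothetical)
import unittest


class TestExample(unittest.TestCase):
    def test_placeholder(self):
        # replace with assertions against real aind_ephys_utils functions
        self.assertTrue(True)


if __name__ == "__main__":
    unittest.main()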

Pull requests

For internal members, please create a branch. For external members, please fork the repository and open a pull request from the fork. We'll primarily use Angular style for commit messages. Roughly, they should follow the pattern:

<type>(<scope>): <short summary>

where scope (optional) describes the packages affected by the code changes and type (mandatory) is one of:

  • build: Changes that affect build tools or external dependencies (example scopes: pyproject.toml, setup.py)
  • ci: Changes to our CI configuration files and scripts (examples: .github/workflows/ci.yml)
  • docs: Documentation only changes
  • feat: A new feature
  • fix: A bug fix
  • perf: A code change that improves performance
  • refactor: A code change that neither fixes a bug nor adds a feature
  • test: Adding missing tests or correcting existing tests
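
For example, a commit that adds a new helper function might use a message like the following (the scope name here is purely illustrative):

feat(plotting): add raster plot helper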

Documentation

To generate the rst source files for the documentation, run

sphinx-apidoc -o doc_template/source/ src/aind_ephys_utils

Then to create the documentation HTML files, run

sphinx-build -b html doc_template/source/ doc_template/build/html
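
To preview the result, open doc_template/build/html/index.html in a browser; a small convenience sketch (the path simply mirrors the sphinx-build output directory above):

import pathlib
import webbrowser

webbrowser.open(pathlib.Path("doc_template/build/html/index.html").resolve().as_uri())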

More info on Sphinx installation can be found in the Sphinx documentation (https://www.sphinx-doc.org).

Developing in Code Ocean

Members of the Allen Institute for Neural Dynamics can follow these steps to create a Code Ocean capsule from this repository:

  1. Click the ⨁ New Capsule button and select "Clone from AllenNeuralDynamics"
  2. Type in aind-ephys-utils and click "Clone" (this step requires that your GitHub credentials are configured properly)
  3. Select a Python base image, and optionally change the compute resources
  4. Attach data to the capsule and add any dependencies needed to load it (e.g. pynwb, hdmf-zarr)
  5. Add plotting dependencies (e.g. ipympl, plotly)
  6. Launch a Visual Studio Code cloud workstation

Inside Visual Studio Code, select "New Terminal" from the "Terminal" menu and run the following commands:

$ pip install -e .[dev]
$ git checkout -b <name of feature branch>

Now, you can create Jupyter notebooks in the "code" directory that can be used to test out new functions before updating the library. When prompted, install the "Python" extensions to be able to execute notebook cells.
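
A typical first cell for such a scratch notebook might look like this (a sketch only; the autoreload magics re-import the library as you edit it, and the import assumes the editable install from the previous step):

# hypothetical first cell of a scratch notebook in the "code" directory
%load_ext autoreload
%autoreload 2

import aind_ephys_utils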

Once you've finished writing your code and tests, run the following commands:

$ coverage run -m unittest discover && coverage report
$ interrogate . 
$ black .
$ flake8 .
$ isort .

Assuming all of these pass, you're ready to push your changes:

$ git add <files to add>
$ git commit -m "Commit message"
$ git push -u origin <name of feature branch>

After doing this, you can open a pull request on GitHub.

Note that git will only track files inside the aind-ephys-utils directory, and will ignore everything else in the capsule. You will no longer be able to commit changes to the capsule itself, which is why this workflow should only be used for developing a library, and not for performing any type of data analysis.

When you're done working, it's recommended to put the workstation on hold rather than shutting it down, in order to keep Visual Studio Code in the same state.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

aind_ephys_utils-0.0.13.tar.gz (38.0 kB)

Uploaded Source

Built Distribution

aind_ephys_utils-0.0.13-py3-none-any.whl (7.7 kB)

Uploaded Python 3

File details

Details for the file aind_ephys_utils-0.0.13.tar.gz.

File metadata

  • Download URL: aind_ephys_utils-0.0.13.tar.gz
  • Upload date:
  • Size: 38.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.11.2

File hashes

Hashes for aind_ephys_utils-0.0.13.tar.gz

  • SHA256: 0f85207a716e363106873b7d82a3c1ce6577e5071d8d734f83d908eb2ddd1414
  • MD5: ad0e7f21072e910eab1f3d1a0ddb6435
  • BLAKE2b-256: edd9782d1fc6ac4856692117936fccdc984b085787806b951814a6b8b449c16c

See more details on using hashes here.

File details

Details for the file aind_ephys_utils-0.0.13-py3-none-any.whl.

File metadata

File hashes

Hashes for aind_ephys_utils-0.0.13-py3-none-any.whl

  • SHA256: 7b30750ae5852a755094abd78e0ff0a52fc5303d6c98cac96982bab8da1c9558
  • MD5: da62890a2b76f3fff6368008aba6f8ff
  • BLAKE2b-256: e2c0b861ae3c2145f1177199353d08268ddbdc53e2386c30d976bafd817b540a

See more details on using hashes here.
