
Generated from aind-library-template


aind-behavior-utils


Overview

aind-behavior-utils provides tools for working with behavioral neuroscience data, including:

  • SyncDataset — Load and parse HDF5 sync files with automatic label resolution and edge extraction
  • CamstimDataset — Parse stimulus pickle files and extract frame timing and wheel encoder data
  • Wheel utilities — Generate QC images and metrics for wheel encoder validation
  • Plotting utilities — Generic array visualization helpers

Quick Start

Loading a sync file

from aind_behavior_utils.sync import SyncDataset

sync = SyncDataset('path/to/sync.h5')
print(sync.label_map)  # View resolved line labels

# Get edge timestamps for a specific line
edges = sync.get_edges('camera', slope='rising')
print(f"Camera edges: {edges}")

# Check for dropped events
dropped = sync.get_dropped_events('vsync', timing_threshold=0.025)
print(f"Dropped vsync events: {dropped}")

sync.close()
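Edge timestamps lend themselves to quick sanity checks with plain numpy. The sketch below uses synthetic timestamps standing in for real get_edges output, derives inter-edge intervals, and flags gaps above the same 0.025 s threshold used above:

```python
import numpy as np

# Synthetic rising-edge times in seconds; real data would come from
# sync.get_edges('camera', slope='rising')
edges = np.array([0.0, 0.0167, 0.0334, 0.0502, 0.0950])

intervals = np.diff(edges)                # inter-edge intervals (s)
mean_rate = 1.0 / intervals.mean()        # approximate line frequency (Hz)
gaps = np.flatnonzero(intervals > 0.025)  # indices of suspiciously long intervals
```

The same pattern applies to any line returned by get_edges, whichever threshold suits that signal.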

Parsing stimulus files

from aind_behavior_utils.stimulus import CamstimDataset

# Load from pickle file
stim = CamstimDataset.from_file('path/to/stimulus.pkl')
print(f"Frame rate: {stim.fps} Hz")
print(f"Total frames: {stim.stim_frame_count}")

# Or construct directly from an already-unpickled dict
import pandas as pd
pkl_data = pd.read_pickle('path/to/stimulus.pkl')
stim = CamstimDataset(pkl_data)

Wheel encoder QC

from aind_behavior_utils.stimulus import wheel_utils

# Generate QC images and metrics
images = wheel_utils.calculate_qc_images(stim)
metrics = wheel_utils.calculate_qc_metrics(stim)

print(f"Wheel artifacts: {metrics.get('artifact_count', 0)}")

Dependencies

  • h5py — HDF5 file I/O
  • numpy — Array operations
  • pandas — Data manipulation
  • matplotlib — Plotting

Core sync operations require no runtime dependencies beyond those listed above.

Installation

To use the software, run the following from the root directory:

pip install -e .

To develop the code, run

pip install -e .[dev]

Contributing

Linters and testing

There are several libraries used to run linters, check documentation, and run tests.

  • Please test your changes using the coverage library, which will run the tests and log a coverage report:
coverage run -m unittest discover && coverage report
  • Use interrogate to check that modules, methods, etc. have been documented thoroughly:
interrogate .
  • Use flake8 to check that code is up to standards (no unused imports, etc.):
flake8 .
  • Use black to automatically format the code into PEP standards:
black .
  • Use isort to automatically sort import statements:
isort .

Pull requests

For internal members, please create a branch. For external members, please fork the repository and open a pull request from the fork. We'll primarily use Angular style for commit messages. Roughly, they should follow the pattern:

<type>(<scope>): <short summary>

where scope (optional) describes the packages affected by the code changes and type (mandatory) is one of:

  • build: Changes that affect build tools or external dependencies (example scopes: pyproject.toml, setup.py)
  • ci: Changes to our CI configuration files and scripts (examples: .github/workflows/ci.yml)
  • docs: Documentation only changes
  • feat: A new feature
  • fix: A bugfix
  • perf: A code change that improves performance
  • refactor: A code change that neither fixes a bug nor adds a feature
  • test: Adding missing tests or correcting existing tests
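For example, commits against this repository might look like (illustrative messages, not real history):

```
feat(sync): add falling-edge filtering to get_edges
fix(stimulus): handle empty wheel encoder arrays
docs: expand wheel QC usage notes
```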

Semantic Release

The table below, adapted from the semantic-release documentation, shows which commit message produces which release type when semantic-release runs with the default configuration:

  • fix(pencil): stop graphite breaking when too much pressure applied → Patch (Fix Release; this is the default release type)
  • feat(pencil): add 'graphiteWidth' option → Minor (Feature Release)
  • perf(pencil): remove graphiteWidth option, with the commit footer
    BREAKING CHANGE: The graphiteWidth option has been removed. The default graphite width of 10mm is always used for performance reasons.
    → Major (Breaking Release)

(Note that the BREAKING CHANGE: token must be in the footer of the commit.)

Documentation

To generate the rst source files for the documentation, run

sphinx-apidoc -o docs/source/ src

Then to create the documentation HTML files, run

sphinx-build -b html docs/source/ docs/build/html

More info on installation can be found in the Sphinx documentation.

Read the Docs Deployment

Note: Private repositories require a Read the Docs for Business account. The following instructions are for a public repo.

The following are required to import and build documentation on Read the Docs:

  • A Read the Docs user account connected to GitHub. See the Read the Docs documentation for more details.
  • Read the Docs needs elevated permissions to perform certain operations that ensure that the workflow is as smooth as possible, like installing webhooks. If you are not the owner of the repo, you may have to request elevated permissions from the owner/admin.
  • A .readthedocs.yaml file in the root directory of the repo. Here is a basic template:
# Read the Docs configuration file
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details

# Required
version: 2

# Set the OS, Python version, and other tools you might need
build:
  os: ubuntu-24.04
  tools:
    python: "3.13"

# Path to a Sphinx configuration file.
sphinx:
  configuration: docs/source/conf.py

# Declare the Python requirements required to build your documentation
python:
  install:
    - method: pip
      path: .
      extra_requirements:
        - dev

Here are the steps for building docs on Read the Docs; see the Read the Docs documentation for detailed instructions:

  • From Read the Docs dashboard, click on Add project.
  • For automatic configuration, select Configure automatically and type the name of the repo. A repo with public visibility should appear as you type.
  • Follow the subsequent steps.
  • For manual configuration, select Configure manually and follow the subsequent steps.

Once a project is created successfully, you will be able to configure/modify the project's settings, such as Default version, Default branch, etc.
