
ndx-dbs Extension for NWB

This extension extends the NWB data standard to incorporate the (meta)data required for deep brain stimulation (DBS) experiments. DBSGroup, the main neurodata type in this extension, extends LabMetaData (which itself extends the NWBContainer base type) and incorporates DBSMeta (an extension of LabMetaData), DBSSubject (an extension of LabMetaData), and DBSDevice (an extension of Device), which in turn includes DBSElectrodes (an extension of DynamicTable). Instances of these data types are linked to each other so that all the metadata required for a typical DBS experiment can be represented in one place.
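The type relationships described above can be summarized as follows (indentation shows what extends or contains what):

```text
NWBContainer
└── LabMetaData
    ├── DBSGroup      (main container; links the instances below)
    ├── DBSMeta
    └── DBSSubject
Device
└── DBSDevice         (includes a DBSElectrodes table, an extension of DynamicTable)
```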

Installation

Clone the repository, navigate to its root directory, and run:

pip install .

Test

A round-trip test can be run with pytest from the repository root. The test script can be found here:

src/pynwb/tests

An example use-case

The following is an example use case of ndx-dbs with explanatory comments. First, we create an NWBFile and define an endpoint recording device:

from datetime import datetime
from uuid import uuid4
from dateutil.tz import tzlocal
from pynwb import NWBHDF5IO, NWBFile

from ndx_dbs import (
    DBSDevice,
    DBSElectrodes,
    DBSMeta,
    DBSSubject,
    DBSGroup
)

nwb_file = NWBFile(
    session_description='DBS mock session',
    identifier=str(uuid4()),
    session_start_time=datetime.now(tzlocal()),
    experimenter='experimenter',
    lab='ChiWangLab',
    institution='UKW',
    experiment_description='',
    session_id='',
)

# define an endpoint main recording device
main_device = nwb_file.create_device(
    name='endpoint_recording_device',
    description='description_of_the_ERD',  # ERD: Endpoint recording device
    manufacturer='manufacturer_of_the_ERD'
)

Then, we define an instance of DBSElectrodes to represent the meta-data on the recording electrodes:

'''
creating a DBS electrodes table
as a DynamicTable
'''
dbs_electrodes_table = DBSElectrodes(
    description='descriptive meta-data on DBS stimulus electrodes'
)

# add electrodes
dbs_electrodes_table.add_row(
    el_id='el_0',
    polarity='negative electrode (stimulation electrode, cathode)',
    impedance='0.8 MOhm',
    length='X cm',
    tip='tip surface ~ XX micrometer sq',
    material='platinum/iridium',
    location='STN',
    comment='none',
)
dbs_electrodes_table.add_row(
    el_id='el_1',
    polarity='positive electrode (reference electrode, anode)',
    impedance='1 MOhm',
    length='Y cm',
    tip='tip surface ~ YY micrometer sq',
    material='platinum/iridium',
    location='scalp surface',
    comment='distance D from el_0',
)
# adding the object of DynamicTable
nwb_file.add_acquisition(dbs_electrodes_table)  # storage point for DT

Now, we can define an instance of DBSDevice:

# define a DBSDevice-type device for DBS stimulation
dbs_device = DBSDevice(
    name='DBS_device',
    description='cable-bound multichannel systems stimulus generator; TypeSTG4004',
    manufacturer='MultichannelSystems, Reutlingen, Germany',
    synchronization='taken care of via ...',
    electrodes_group=dbs_electrodes_table,
    endpoint_recording_device=main_device
)
# adding the object of DBSDevice
nwb_file.add_device(dbs_device)

And also an instance of DBSMeta to store the meta-data for a DBS experiment:

dbs_meta_group = DBSMeta(
    name='DBS_meta',
    stim_state='ON',
    stim_type='unipolar',
    stim_area='STN',
    stim_coordinates='-3.6 mm AP, either -2.5 mm (right) or 12.5 mm (left) ML, and -7.7 mm DV',
    pulse_shape='rectangular',
    pulse_width='60 micro-seconds',
    pulse_frequency='130 Hz',
    pulse_intensity='50 micro-Ampere',
    charge_balance='pulse symmetric; set to be theoretically zero',
)
# adding the object of DBSMeta
nwb_file.add_lab_meta_data(dbs_meta_group)  # storage point for custom LMD

Along with an instance of DBSSubject:

dbs_subject_group = DBSSubject(
    name='DBS_subject',
    model='6-OHDA',
    controls='specific control group in this experiment',
    comment='any comments on this subject',
)
# adding the object of DBSSubject
nwb_file.add_lab_meta_data(dbs_subject_group)  # storage point for custom LMD

Now that we have all the required components, we define the main group for DBS to connect them all:

dbs_main_group = DBSGroup(
    name='DBS_main_container',
    DBS_phase='first phase after implantation recovery',
    DBS_device=dbs_device,
    DBS_meta=dbs_meta_group,
    DBS_subject=dbs_subject_group,
    comment='any comments ...',
)
# adding the object of DBSGroup
nwb_file.add_lab_meta_data(dbs_main_group)  # storage point for custom LMD

Now, the nwb_file is ready to be written to disk and read back.
