
Advanced Recording Format for acoustic, behavioral, and physiological data



The Advanced Recording Format (arf) is an open standard for storing data from neuronal, acoustic, and behavioral experiments in a portable, high-performance, archival format. The goal is to enable labs to share data and tools, and to allow valuable data to be accessed and analyzed for many years in the future.

arf is built on the HDF5 format, and all arf files are accessible through standard HDF5 tools, including interfaces to HDF5 written for other languages (e.g., MATLAB and Python). arf comprises a set of specifications on how different kinds of data are stored. The organization of arf files is based around the concept of an entry, a collection of data channels associated with a particular point in time. An entry might contain one or more of the following:

  • raw extracellular neural signals recorded from a multichannel probe

  • spike times extracted from neural data

  • acoustic signals from a microphone

  • times when an animal interacted with a behavioral apparatus

  • the times when a real-time signal analyzer detected vocalization

Entries and datasets have metadata attributes describing how the data were collected. Datasets and entries retain these attributes when copied or moved between arf files, helping to prevent data from becoming orphaned and uninterpretable.

This repository contains:

  • The specification for arf (in specification.md). This is also hosted at https://meliza.org/spec:1/arf/.

  • A fast, type-safe C++ interface for reading and writing arf files

  • A Python interface for reading and writing arf files

You don’t need the Python or C++ libraries to read arf files; they are just standard HDF5 files that can be accessed with standard tools and libraries, like h5py (see below).

installation

ARF files require HDF5>=1.8 (http://www.hdfgroup.org/HDF5).

The Python interface requires Python 3.7 or greater and h5py>=3.8. The last version of this package to support Python 2 was 2.5.1. The last version to support h5py 2 was 2.6.7. To install the module:

pip install arf

To use the C++ interface, you need boost>=1.42 (http://boost.org). In addition, if writing multithreaded code, HDF5 needs to be compiled with --enable-threadsafe. The interface is header-only and does not need to be compiled. To install:

make install

version information

The specification and implementations provided in this project use a form of semantic versioning (http://semver.org). Specifications receive a major and minor version number. Changes to the minor version must be backwards compatible (i.e., minor releases may only add requirements). The current released version of the ARF specification is 2.1.

Implementation versions are synchronized with the major version of the specification but otherwise evolve independently. For example, the Python arf package version 2.1.0 is compatible with any ARF version 2.x.

There was no public release of arf prior to 2.0.

access ARF files with HDF5 tools

The structure of an arf file can be explored using the h5ls tool. For example, to list entries:

$ h5ls file.arf
test_0001                Group
test_0002                Group
test_0003                Group
test_0004                Group

Each entry appears as a Group. To list the contents of an entry, use path notation:

$ h5ls file.arf/test_0001
pcm                      Dataset {609914}

This shows that the data in test_0001 is stored in a single dataset, pcm, with 609914 data points. Typically each channel will have its own dataset.

The h5dump command can be used to output data in binary format. See the HDF5 documentation for details on how to structure the output. For example, to extract sampled data to a 16-bit little-endian file (i.e., PCM format):

h5dump -d /test_0001/pcm -b LE -o test_0001.pcm file.arf

contributing

ARF is under active development and we welcome comments and contributions from neuroscientists and behavioral biologists interested in using it. We’re particularly interested in use cases that don’t fit the current specification. Please post issues or contact Dan Meliza (dan at meliza.org) directly.

