
Variational Animal Motion Embedding

Project description


🌟 Welcome to EthoML/VAME (Variational Animal Motion Encoding), an open-source machine learning tool for behavioral action segmentation and analyses.

VAME documentation.

Click here to read the new peer-reviewed, open-access neuroscience article in Cell Reports.

We are a group of behavioral enthusiasts, comprising the original VAME developers Kevin Luxem and Pavol Bauer, behavioral neuroscientists Stephanie R. Miller and Jorge J. Palop, and computer scientists and statisticians Alex Pico, Reuben Thomas, and Katie Ly. Our aim is to provide scalable, unbiased and sensitive approaches for assessing mouse behavior using computer vision and machine learning approaches.

We are focused on expanding the analytical capabilities of VAME segmentation by providing curated scripts for VAME implementation and tools for data processing, visualization, and statistical analysis.

Recent Improvements to VAME

  • Curated scripts for VAME implementation
  • Addition of compatibility with DeepLabCut, SLEAP, and LightningPose
  • Addition of compatibility with movement for data ingestion
  • Addition of a new cost function for community dendrogram generation
  • Addition of a new egocentric alignment method
  • Refined output filename structure

Authors and Code Contributors

VAME was developed by Kevin Luxem and Pavol Bauer (Luxem et al., 2022). The original VAME repository was deprecated, forked, and is now being maintained at https://github.com/EthoML/VAME.

The development of VAME is heavily inspired by DeepLabCut. As such, the VAME project management codebase has been adapted from the DeepLabCut codebase. The DeepLabCut 2.0 toolbox is © A. & M.W. Mathis Labs deeplabcut.org, released under LGPL v3.0. The implementation of the VRAE model is partially adapted from the Timeseries clustering repository developed by Tejas Lodaya.

VAME in a Nutshell

VAME is a framework for clustering behavioral signals obtained from pose-estimation tools. It is a PyTorch-based deep learning framework that leverages recurrent neural networks (RNNs) to model sequential data. To learn the underlying complex data distribution, we use the RNN in a variational autoencoder setting to extract the latent state of the animal at every step of the input time series. The workflow of VAME consists of five steps, which we explain in detail here.
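Conceptually, the "variational" part of such a model has two ingredients: the encoder RNN emits a mean and log-variance per latent dimension, a latent vector is sampled via the reparameterization trick, and a KL term pulls the latent posterior toward a standard normal. A minimal pure-Python sketch of those two pieces (illustrative only; the function names are ours, not VAME's actual API):

```python
import math
import random

def reparameterize(mu, log_var, eps=None):
    """Sample z = mu + sigma * eps (the VAE reparameterization trick).

    mu, log_var: lists of floats, the encoder's output for one time step.
    eps: optional list of standard-normal draws, injectable for testing.
    """
    if eps is None:
        eps = [random.gauss(0.0, 1.0) for _ in mu]
    return [m + math.exp(0.5 * lv) * e for m, lv, e in zip(mu, log_var, eps)]

def kl_divergence(mu, log_var):
    """KL(q(z|x) || N(0, I)) for a diagonal-Gaussian posterior."""
    return -0.5 * sum(1 + lv - m * m - math.exp(lv)
                      for m, lv in zip(mu, log_var))
```

Sampling through `reparameterize` keeps the draw differentiable with respect to `mu` and `log_var`, which is what lets the encoder be trained by backpropagation.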

Installation

To get started, we recommend using Anaconda with Python 3.11 or higher, which lets you create a virtual environment to hold all the dependencies necessary for VAME. You can also use the environment-<os>.yaml files supplied here: open a terminal, run git clone https://github.com/LINCellularNeuroscience/VAME.git, then cd VAME, then run conda env create -f environment-<os>.yaml.

  • Go to the locally cloned VAME directory and run python setup.py install to install VAME in your active conda environment.
  • Install the current stable PyTorch release using the OS-dependent instructions from the PyTorch website. Currently, VAME is tested on PyTorch 2.2.2. (Note: if you use the conda environment file we supply, PyTorch is already installed and you can skip this step.)
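Put together, the installation steps above look roughly like this (a sketch; substitute the environment file for your OS, and check the yaml for the actual environment name, which we assume here is `vame`):

```shell
# Clone the repository and enter it
git clone https://github.com/LINCellularNeuroscience/VAME.git
cd VAME

# Create and activate the conda environment from the supplied file
conda env create -f environment-<os>.yaml
conda activate vame        # environment name may differ; check the yaml

# Install VAME into the active environment
python setup.py install

# Sanity check: PyTorch should import (VAME is tested on 2.2.2)
python -c "import torch; print(torch.__version__)"
```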

Getting Started

First, make sure that you have a GPU powerful enough to train deep learning networks. In our original 2022 paper, we used a single Nvidia GTX 1080 Ti GPU to train our network. A hardware guide can be found here. VAME can also be trained in Google Colab or on an HPC cluster. Once you have your computing setup ready, begin using VAME by following the workflow guide.

Once you have VAME installed, you can try VAME out on a set of mouse behavioral videos and .csv files publicly available in the examples folder.

References

New 2024 Miller et al.: Machine learning reveals prominent spontaneous behavioral changes and treatment efficacy in humanized and transgenic Alzheimer's disease models
Original 2022 Luxem et al.: Identifying Behavioral Structure from Deep Variational Embeddings of Animal Motion
See also:
Mocellin et al.: A septal-ventral tegmental area circuit drives exploratory behavior
Kingma & Welling: Auto-Encoding Variational Bayes
Pereira & Silveira: Learning Representations from Healthcare Time Series Data for Unsupervised Anomaly Detection

License: GPLv3

See the LICENSE file for the full statement.

Code Reference (DOI)

Download files

Download the file for your platform.

Source Distribution

vame_py-0.12.0.tar.gz (101.0 kB)

Uploaded Source

Built Distribution


vame_py-0.12.0-py3-none-any.whl (120.1 kB)

Uploaded Python 3

File details

Details for the file vame_py-0.12.0.tar.gz.

File metadata

  • Download URL: vame_py-0.12.0.tar.gz
  • Upload date:
  • Size: 101.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.13

File hashes

Hashes for vame_py-0.12.0.tar.gz:

  • SHA256: a9b90d4fbc36fb4ed4dc8c0dacb1dacb550288e98058964a2f40a96e1fcb5e3d
  • MD5: 62d539a9d24c57e9bd5bfe01b3ff2e1a
  • BLAKE2b-256: bd25a8ed45dc2f87bda63fe50edebaf2e1bebdd4f067d3e451c0e456b4527b4d


File details

Details for the file vame_py-0.12.0-py3-none-any.whl.

File metadata

  • Download URL: vame_py-0.12.0-py3-none-any.whl
  • Upload date:
  • Size: 120.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.13

File hashes

Hashes for vame_py-0.12.0-py3-none-any.whl:

  • SHA256: 29d2b2b6c8042a019172ce2130ff64176e474413dd357b5102c8aec48b76b6dd
  • MD5: b35258773148d0d111496f68a0e96742
  • BLAKE2b-256: 4d18173780d272b3d502e7e39e2c482c81d5248274840b59e67871c22b5f25b2

