VPRTempo: A Fast Temporally Encoded Spiking Neural Network for Visual Place Recognition

This repository contains code for VPRTempo, a spiking neural network that uses temporal encoding to perform visual place recognition tasks. The network is based on BLiTNet and adapted to the VPRSNN framework.

VPRTempo method diagram

VPRTempo is built on a torch.nn network that employs custom learning rules based on the temporal codes of spikes to train the layer weights.
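
As a rough illustration of the idea, here is a minimal sketch of a timing-based, STDP-like weight update. This is not the actual VPRTempo learning rule; the layer sizes, learning rate, and update function are placeholders.

import torch
import torch.nn as nn

# Minimal sketch of a spike-timing-based weight update, NOT the actual
# VPRTempo rule. Spike times are assumed to be normalised to [0, 1].
class TemporalLayer(nn.Module):
    def __init__(self, in_features, out_features, lr=1e-3):
        super().__init__()
        self.weight = nn.Parameter(torch.rand(out_features, in_features))
        self.lr = lr

    @torch.no_grad()
    def temporal_update(self, in_times, out_times):
        # dt > 0 means the input spike preceded the output spike (causal),
        # so the connection is strengthened; dt < 0 weakens it, with an
        # exponential falloff in |dt|.
        dt = out_times.unsqueeze(1) - in_times.unsqueeze(0)  # (out, in)
        self.weight += self.lr * torch.sign(dt) * torch.exp(-dt.abs())

layer = TemporalLayer(in_features=784, out_features=100)
layer.temporal_update(torch.rand(784), torch.rand(100))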

In this repository, we provide two networks:

  • VPRTempo: Our base network architecture to perform visual place recognition (fp32)
  • VPRTempoQuant: A modified base network with Quantization Aware Training (QAT) enabled (int8); see the sketch after this list
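
For reference, the sketch below shows the general shape of QAT in PyTorch's eager-mode quantization API. The toy two-layer model is a placeholder, not the VPRTempoQuant architecture.

import torch
import torch.nn as nn

# Toy model standing in for the real network.
model = nn.Sequential(
    torch.quantization.QuantStub(),    # marks the fp32 -> int8 boundary
    nn.Linear(784, 100),
    nn.ReLU(),
    nn.Linear(100, 10),
    torch.quantization.DeQuantStub(),  # marks the int8 -> fp32 boundary
)
model.qconfig = torch.quantization.get_default_qat_qconfig("fbgemm")
torch.quantization.prepare_qat(model, inplace=True)  # insert fake-quant observers

# ... train as usual: the fake-quant modules simulate int8 effects in fp32 ...

model.eval()
quantized = torch.quantization.convert(model)  # swap in real int8 modules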

To use VPRTempo, please follow the instructions below for installation and usage.

License & Citation

This repository is licensed under the MIT License.

If you use our code, please cite the following paper:

@misc{hines2023vprtempo,
      title={VPRTempo: A Fast Temporally Encoded Spiking Neural Network for Visual Place Recognition}, 
      author={Adam D. Hines and Peter G. Stratton and Michael Milford and Tobias Fischer},
      year={2023},
      eprint={2309.10225},
      archivePrefix={arXiv},
      primaryClass={cs.RO}
}

Installation and setup

VPRTempo uses PyTorch with CUDA GPU acceleration. Follow the installation instructions based on your operating system and hardware specifications. Note that macOS is not compatible with CUDA.

Get the repository

Download the GitHub repository:

git clone https://github.com/QVPR/VPRTempo.git
cd VPRTempo

Once downloaded, install the dependencies required to run the network via one of the following options:

Option 1: Pip install

Dependencies for VPRTempo can be downloaded from our PyPI package.

# For Windows/Linux systems
pip install vprtempo

# For macOS
pip install vprtempomacos

Option 2: Local requirements install

Alternatively, dependencies can be installed from our provided requirements.txt files.

# For Windows/Linux
pip install -r requirements.txt

# For macOS
pip install -r requirements_macos.txt

Option 3: Conda install

❗ Recommended: Use Mambaforge instead of conda.

# Windows/Linux - CUDA enabled
conda create -n vprtempo -c pytorch -c nvidia python pytorch torchvision torchaudio pytorch-cuda=11.7 prettytable tqdm numpy pandas scikit-learn

# Windows/Linux - CPU only
conda create -n vprtempo python pytorch torchvision torchaudio cpuonly prettytable tqdm numpy pandas scikit-learn -c pytorch

# MacOS
conda create -n vprtempo -c conda-forge python prettytable tqdm numpy pandas scikit-learn -c pytorch pytorch::pytorch torchvision torchaudio

Datasets

VPRTempo was designed to be simple to train and test on a variety of datasets. See the information below on running a test with the Nordland traversal dataset and on organizing custom datasets.

Nordland

VPRTempo was developed and tested using the Nordland traversal dataset. The software works with either the full-resolution or down-sampled datasets; however, our paper reports results on the full-resolution datasets.

To simplify first usage, we have set the defaults in VPRTempo.py to train and test on a small subset of Nordland data. We recommend downloading Nordland and using the ./src/nordland.py script to unzip and organize the images into the correct file and naming structure.
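
For example, after downloading the Nordland archives the script can be invoked as below. This assumes the script's default arguments locate the downloaded zip files; inspect the script for the expected input paths.

# Hedged example: assumes the defaults in nordland.py point at the zips
python ./src/nordland.py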

Custom datasets

In general, data should be organized in the ./dataset folder as follows in order to train the network on multiple traversals of the same location.

--dataset
  |--training
  |  |--traversal_1
  |  |--traversal_2
  |
  |--testing
  |  |--test_traversal

If you wish to specify a different directory where data is stored, modify the --data_dir default argument in main.py. Similarly, if you wish to train/query different traversals modify --database_dirs and --query_dir in main.py accordingly.
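
Assuming main.py parses these options with argparse (the flag names come from the text above; the value formats shown are an assumption), the same settings could also be supplied at run time instead of editing the defaults:

# Hypothetical invocation; check main.py for the exact argument formats
python main.py --data_dir ./dataset \
               --database_dirs traversal_1 traversal_2 \
               --query_dir test_traversal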

Usage

Both training and testing are handled by the VPRTempo.py script. A fresh install does not contain any pre-trained networks, so a network must be trained before use.

Pre-requisites

  • Training and testing data is organized as above (see Datasets on how to set up the Nordland or custom datasets)
  • The VPRTempo conda environment has been activated

Once these two things have been set up, run VPRTempo.py to train and test your first network with the default settings.
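
For example, from the repository root:

conda activate vprtempo
python VPRTempo.py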

Issues, bugs, and feature requests

If you encounter problems whilst running the code or if you have a suggestion for a feature or improvement, please report it as an issue.
