Project description
NLB Codepack (nlb_tools)
Python tools for participating in Neural Latents Benchmark '21.
Overview
Neural Latents Benchmark '21 (NLB'21) is a benchmark suite for unsupervised modeling of neural population activity. The suite includes four datasets spanning a variety of brain areas and experiments. The primary task in the benchmark is co-smoothing, or inference of firing rates of unseen neurons in the population.
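To make the co-smoothing objective concrete, the sketch below computes a bits-per-spike style score for held-out neurons: the Poisson log-likelihood of their spikes under a model's predicted rates, baselined against each neuron's mean firing rate and normalized by the total spike count. This is only an illustration with assumed array shapes, not the official metric code, which ships with nlb_tools (see nlb_tools/evaluation.py).

```python
import numpy as np

def bits_per_spike(pred_rates, spikes):
    """Rough sketch of the co-smoothing score for held-out neurons.

    pred_rates, spikes: arrays of shape (trials, time bins, held-out neurons),
    with rates expressed in spikes per bin.
    """
    def poisson_ll(rates, k):
        # Poisson log-likelihood up to the log(k!) term, which cancels below
        rates = np.clip(rates, 1e-9, None)
        return np.sum(k * np.log(rates) - rates)

    ll_model = poisson_ll(pred_rates, spikes)
    # Baseline: predict each neuron's mean rate in every bin of every trial
    mean_rates = np.broadcast_to(spikes.mean(axis=(0, 1), keepdims=True), spikes.shape)
    ll_baseline = poisson_ll(mean_rates, spikes)
    # Improvement over the baseline, in bits per held-out spike
    return (ll_model - ll_baseline) / (spikes.sum() * np.log(2))
```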
This repo contains code to facilitate participation in NLB'21:
- nlb_tools/ has code to load and preprocess our dataset files, format data for modeling, and locally evaluate results
- examples/tutorials/ contains tutorial notebooks demonstrating basic usage of nlb_tools
- examples/baselines/ holds the code we used to run our baseline methods; they may serve as helpful references for more extensive usage of nlb_tools
Installation
The package can be installed with the following command:
pip install nlb-tools
However, to run the tutorial notebooks locally or make any modifications to the code, you should clone the repo. The package can then be installed with the following commands:
git clone https://github.com/neurallatents/nlb_tools.git
cd nlb_tools
pip install -e .
This package requires Python 3.7+. It was developed with Python 3.7, which is the version we recommend using.
Getting started
We recommend reading/running through examples/tutorials/basic_example.ipynb to learn how to use nlb_tools to load and format data for our benchmark. You can also find Jupyter notebooks demonstrating how to run GPFA and SLDS for the benchmark in examples/tutorials/.
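As a quick orientation, here is a condensed sketch of the workflow covered in the tutorials: load a dataset from its NWB file, rebin the spikes, build model-ready tensors, and evaluate predicted rates locally. The data path and dataset name are placeholders, and the exact argument names should be treated as a sketch; defer to examples/tutorials/basic_example.ipynb for the full walkthrough.

```python
from nlb_tools.nwb_interface import NWBDataset
from nlb_tools.make_tensors import (
    make_train_input_tensors,
    make_eval_input_tensors,
    make_eval_target_tensors,
)
from nlb_tools.evaluation import evaluate

# Load the NWB file(s) for a dataset; the path here is a placeholder
dataset = NWBDataset("path/to/mc_maze/")
dataset.resample(5)  # rebin spikes to 5 ms

# Build model-ready spiking tensors for the held-in/held-out neuron split
train_dict = make_train_input_tensors(
    dataset, dataset_name="mc_maze", trial_split="train", save_file=False)
eval_dict = make_eval_input_tensors(
    dataset, dataset_name="mc_maze", trial_split="val", save_file=False)

train_spikes_heldin = train_dict["train_spikes_heldin"]  # (trials, time, neurons)
eval_spikes_heldin = eval_dict["eval_spikes_heldin"]
# ... fit a model and predict firing rates for held-in and held-out neurons ...

# Local evaluation: compare predicted rates against held-out targets
target_dict = make_eval_target_tensors(
    dataset, dataset_name="mc_maze",
    train_trial_split="train", eval_trial_split="val", save_file=False)
# submission = {"mc_maze": {"train_rates_heldin": ..., "train_rates_heldout": ...,
#                           "eval_rates_heldin": ..., "eval_rates_heldout": ...}}
# print(evaluate(target_dict, submission))
```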
Other resources
For more information on the benchmark:
- our main webpage contains general information on our benchmark pipeline and introduces the datasets
- our EvalAI challenge is where submissions are evaluated and displayed on the leaderboard
- our datasets are available on DANDI: MC_Maze, MC_RTT, Area2_Bump, DMFC_RSG, MC_Maze_Large, MC_Maze_Medium, MC_Maze_Small
- our paper describes our motivations behind this benchmarking effort as well as various technical details and explanations of design choices made in preparing NLB'21
- our Slack workspace lets you interact directly with the developers and other participants. Please email fpei6 [at] gatech [dot] edu for an invite link
Download files
File details
Details for the file nlb_tools-0.0.4.tar.gz.
File metadata
- Download URL: nlb_tools-0.0.4.tar.gz
- Upload date:
- Size: 37.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.7.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | e1bafc08eb2055c4075f55f8a0b2da95598b774660fa3a32186894dab645b1e3
MD5 | d27179d9b591a34e653d01d238b1442a
BLAKE2b-256 | 26da4e081e80927f70b07fda2d1b874061b44c2ee484975e20f3be4b7437446c
File details
Details for the file nlb_tools-0.0.4-py3-none-any.whl.
File metadata
- Download URL: nlb_tools-0.0.4-py3-none-any.whl
- Upload date:
- Size: 39.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.7.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | 589b24b02d622e099e4c3aec65ebc9ff2079711a10bb092647fab8d94702016c
MD5 | af7a1b6da893ff68504d9ed4d5c11952
BLAKE2b-256 | 1c9362128cb90a6eacb50ffa56aaa78c39467edef0c0b6601afc48e0159c627d
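If you download either file by hand, you can check it against the hashes above. Here is a minimal sketch using Python's built-in hashlib, with the wheel's SHA256 value copied from the table above:

```python
import hashlib

# Recompute the SHA256 digest of the downloaded wheel and compare it to the
# value published in the table above
with open("nlb_tools-0.0.4-py3-none-any.whl", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

expected = "589b24b02d622e099e4c3aec65ebc9ff2079711a10bb092647fab8d94702016c"
print("OK" if digest == expected else "hash mismatch")
```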