
Project description

radiens-drive-catalog

A Python package for programmatically managing large neural datasets stored on Google Drive. It handles Drive scanning, local cataloging, and selective dataset download. Analysis is done locally — this package is purely about data management.

Overview

Neural data is stored as xdat filesets (NeuroNexus format) on a shared Google Drive. Each dataset consists of three files sharing a common base_name:

{base_name}_data.xdat
{base_name}.xdat.json
{base_name}_timestamp.xdat
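As a concrete illustration, the three members of a fileset can be grouped under their shared base_name by stripping the known suffixes. This is a stdlib-only sketch, not the package's actual grouping code:

```python
from collections import defaultdict

# The three suffixes that make up one xdat fileset.
SUFFIXES = ("_data.xdat", ".xdat.json", "_timestamp.xdat")

def group_by_base_name(filenames):
    """Group xdat fileset members under their shared base_name."""
    groups = defaultdict(list)
    for name in filenames:
        for suffix in SUFFIXES:
            if name.endswith(suffix):
                groups[name[: -len(suffix)]].append(name)
                break  # each file matches exactly one suffix
    return dict(groups)

files = [
    "rat01_session3_data.xdat",
    "rat01_session3.xdat.json",
    "rat01_session3_timestamp.xdat",
    "notes.pptx",  # non-xdat content is skipped here (the real scan tracks it as an asset)
]
print(group_by_base_name(files))
# {'rat01_session3': ['rat01_session3_data.xdat', 'rat01_session3.xdat.json', 'rat01_session3_timestamp.xdat']}
```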

radiens-drive-catalog scans the Drive hierarchy, builds a local catalog indexed by base_name, and lets you query and download datasets selectively. Non-xdat content found alongside datasets — logs directories, PowerPoints, writeups — is also discovered and tracked as assets.

Usage

Datasets

from radiens_drive_catalog import Catalog, Config

config = Config.from_file("config.json")
catalog = Catalog(config)

# Scan Drive and build the catalog (discovers datasets and assets)
catalog.scan()

# Query datasets
catalog.list()                                                          # everything
catalog.list(date_folder="2026-02-15_batch")                            # one date folder
catalog.list(date_folder="2026-02-15_batch", experiment="reaching")     # one experiment

# Access the raw DataFrame
catalog.df

# Check what's available locally
catalog.status()

# Download a dataset
catalog.download("rat01_session3")

# Get the local path, downloading automatically if needed
path = catalog.get_path("rat01_session3")
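Conceptually, the catalog behaves like a table of datasets keyed by base_name, with list() filtering on optional fields. A simplified stdlib-only model of those query semantics (the real Catalog is backed by a DataFrame, and its columns may differ):

```python
from dataclasses import dataclass

@dataclass
class DatasetEntry:
    base_name: str
    date_folder: str
    experiment: str
    downloaded: bool = False

# A toy in-memory catalog.
entries = [
    DatasetEntry("rat01_session3", "2026-02-15_batch", "reaching"),
    DatasetEntry("rat02_session1", "2026-02-15_batch", "grasping"),
    DatasetEntry("rat01_session4", "2026-02-16_batch", "reaching"),
]

def list_datasets(entries, date_folder=None, experiment=None):
    """Filter entries by optional date_folder and experiment, like list()."""
    return [
        e for e in entries
        if (date_folder is None or e.date_folder == date_folder)
        and (experiment is None or e.experiment == experiment)
    ]

print([e.base_name for e in list_datasets(entries, date_folder="2026-02-15_batch")])
# ['rat01_session3', 'rat02_session1']
```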

Assets (non-xdat content)

Non-xdat files and folders (e.g. logs/, PowerPoints, writeups) found inside experiment folders are automatically cataloged as assets during scan().

# Query assets
catalog.assets_df                                                           # all assets
catalog.list_assets(date_folder="2026-02-15_batch")                         # assets in a date folder
catalog.list_assets(experiment="reaching", asset_type="folder")             # folder assets only

# Download an asset
# drive_path is the slash-joined path to the asset's parent folder
catalog.download_asset("2026-02-15_batch/reaching", "logs")

# Get the local path, downloading automatically if needed
path = catalog.get_asset_path("2026-02-15_batch/reaching", "logs")

Assets land under local_data_dir/assets/{drive_path}/{asset_name}, separate from the flat xdat dataset files.
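The asset layout described above can be reproduced with pathlib. A sketch assuming only the path scheme quoted in this README:

```python
from pathlib import PurePosixPath

def asset_local_path(local_data_dir, drive_path, asset_name):
    """Mirror the Drive folder hierarchy under local_data_dir/assets/."""
    return PurePosixPath(local_data_dir) / "assets" / drive_path / asset_name

path = asset_local_path("/data", "2026-02-15_batch/reaching", "logs")
print(path)
# /data/assets/2026-02-15_batch/reaching/logs
```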

Configuration

Create a config.json (outside your repo — do not commit it):

{
    "credentials_path": "/path/to/service_account.json",
    "root_folder_id": "your-drive-folder-id",
    "local_data_dir": "/path/to/local/data",
    "catalog_path": "/path/to/local/data/catalog.json"
}
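A from_file implementation along these lines just loads the JSON and checks for the four keys above. This is a hypothetical sketch; the package's real Config may validate more:

```python
import json
from dataclasses import dataclass

REQUIRED_KEYS = ("credentials_path", "root_folder_id", "local_data_dir", "catalog_path")

@dataclass
class Config:
    credentials_path: str
    root_folder_id: str
    local_data_dir: str
    catalog_path: str

    @classmethod
    def from_file(cls, path):
        """Load and validate a JSON config with the four required keys."""
        with open(path) as f:
            raw = json.load(f)
        missing = [k for k in REQUIRED_KEYS if k not in raw]
        if missing:
            raise ValueError(f"config is missing keys: {missing}")
        return cls(**{k: raw[k] for k in REQUIRED_KEYS})
```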

Or set a single environment variable pointing at the config file, and call from_file() with no arguments:

export RADIENS_DRIVE_CATALOG_CONFIG=/path/to/config.json    # in your shell

config = Config.from_file()                                 # in Python

The root_folder_id is the alphanumeric string that follows /folders/ in the Drive URL when you're inside the root data folder.
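For example, given a URL of the form https://drive.google.com/drive/folders/<id>, the id can be pulled out mechanically (a stdlib sketch with a made-up id):

```python
from urllib.parse import urlparse

def folder_id_from_url(url):
    """Return the path segment after /folders/ in a Drive folder URL."""
    parts = urlparse(url).path.split("/")
    return parts[parts.index("folders") + 1]

# Hypothetical folder URL; query parameters like ?usp=sharing are ignored.
url = "https://drive.google.com/drive/folders/1AbCdEfGhIjKlMnOp"
print(folder_id_from_url(url))
# 1AbCdEfGhIjKlMnOp
```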

Authentication

This package uses a Google service account for shared access among collaborators. To set it up:

  1. Create a project in Google Cloud Console
  2. Enable the Google Drive API
  3. Create a service account and download its JSON credentials file
  4. Share your root Drive data folder with the service account's email address (Viewer access is sufficient)
  5. Point credentials_path in your config at the downloaded JSON file

Distribute the credentials file to collaborators securely — treat it like a password.

Installation

This project uses uv for dependency management. If you don't have it:

macOS / Linux:

curl -LsSf https://astral.sh/uv/install.sh | sh

Windows:

powershell -c "irm https://astral.sh/uv/install.ps1 | iex"

Then install the project:

uv sync

Development

uv run pytest          # run tests
uv run mypy            # type checking
uv run ruff check .    # linting
uv run ruff format .   # formatting

Download files

Source Distribution

radiens_drive_catalog-0.0.2.tar.gz (117.8 kB)

Built Distribution

radiens_drive_catalog-0.0.2-py3-none-any.whl (14.4 kB)
File details

Details for the file radiens_drive_catalog-0.0.2.tar.gz.

File metadata

  • Size: 117.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for radiens_drive_catalog-0.0.2.tar.gz:

  • SHA256: 2bb7d499e848aa70771899c21dc03f859eb059143eb7ca565f6ade1e9a829645
  • MD5: 852131b9351f284e515dabf973a90f4a
  • BLAKE2b-256: 6b2dd3d564810bc4ab91e744af470c463bdb3e349c987fcea9aa49bcace350d9

Provenance

The following attestation bundles were made for radiens_drive_catalog-0.0.2.tar.gz:

Publisher: publish.yml on NeuroNexus/radiens-drive-catalog


File details

Details for the file radiens_drive_catalog-0.0.2-py3-none-any.whl.

File hashes

Hashes for radiens_drive_catalog-0.0.2-py3-none-any.whl:

  • SHA256: b93a29a4e27a0a1633b958da7f9c1625b44c591e8544dccdcf8a4e70314d753f
  • MD5: c4e1788223e134dbd468acb0de65f0ce
  • BLAKE2b-256: 5472d82707b8088d5e2af55e78ea5e0d93d1211279782a683c125d3de0960689

Provenance

The following attestation bundles were made for radiens_drive_catalog-0.0.2-py3-none-any.whl:

Publisher: publish.yml on NeuroNexus/radiens-drive-catalog

