
The Interlinked Module

Interlinked is a module for neuroscientific computation and analysis, built from Python submodules and Rust-optimized kernels. It is developed by Andrej Lozevski for the En Yang Lab at UNC Chapel Hill (2026).

Installation

This module can be installed by running:

pip install interlinked-lab

Submodules

All code is organized into the following submodules:

  • utils --- Contains general-purpose utility functions used across the module
  • io --- Contains functions for I/O operations and streamlined file use
  • form --- Contains functions for manipulating labeled and non-labeled arrays
  • stats --- Contains functions for statistical tests and measures
  • info --- Contains functions for information theoretic tests and measures
  • draw --- Contains functions for data visualization
  • config --- Contains methods for updating, resetting, loading, and saving parameter settings for the module's default behaviors and arguments

Each submodule can be accessed through the top-level package or imported directly:

# Method 1
import interlinked
print(interlinked.config.defaults())

# Method 2
from interlinked import config
print(config.defaults())

All submodules and their functions/methods are listed below[^1].

[^1]: Only functions built for typical users are shown below. Additional functions not shown here are considered peripheral and should only be used after reading the source code.

Config

This submodule is exposed as a class instance with the following methods and attributes:

Methods

  • def defaults(self)
    Lists all config default parameters
    returns: None

  • def list(self)
    Lists all current config parameters
    returns: None

  • def configure(self, **kwargs)
    Updates any specified config parameters
    Default parameters cannot be changed to a different type, but custom parameters have no restrictions
    returns: None

  • def reset(self)
    Resets config parameters to defaults
    returns: None

  • def save(self)
    Saves current config parameters in the temp directory
    Overwrites any pre-existing saved config parameters
    returns: None

  • def load(self)
    Loads config parameters saved in the temp directory
    If no saved parameters are found, throws an Exception and loads defaults
    returns: None
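The behavior documented above can be sketched with a minimal stand-in class (hypothetical: only a few of the documented defaults are included, and this is not the package's actual implementation):

```python
class Config:
    """Minimal sketch of a config object with defaults, typed updates, and reset."""

    _DEFAULTS = {"BATCH_SIZE": 1000, "NUM_WORKERS": 8, "ALPHA1": 0.05}

    def __init__(self):
        self._params = dict(self._DEFAULTS)

    def defaults(self):
        # List all default parameters
        for key, value in self._DEFAULTS.items():
            print(f"{key} = {value!r}")

    def list(self):
        # List all current parameters, including custom ones
        for key, value in self._params.items():
            print(f"{key} = {value!r}")

    def configure(self, **kwargs):
        # Default parameters keep their type; custom parameters are unrestricted
        for key, value in kwargs.items():
            if key in self._DEFAULTS and not isinstance(value, type(self._DEFAULTS[key])):
                raise TypeError(f"{key} must remain a {type(self._DEFAULTS[key]).__name__}")
            self._params[key] = value

    def reset(self):
        # Restore defaults, discarding custom parameters
        self._params = dict(self._DEFAULTS)
```

Note how `configure` enforces the documented rule: default parameters cannot change type, while custom parameters carry no restrictions.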

Attributes

  • TEMP_DIR: Path = /tmp/interlinked
    where the temp directory is located (DO NOT CHANGE)

  • TEMP_PREFIX: str = "__temp__"
    file prefix given to temporary files (DO NOT CHANGE)

  • TEMP_SUFFIX: str = ".dat"
    file suffix given to temporary files (DO NOT CHANGE)

  • CLEAR_TEMP: bool = False
    see io.check_temp() below

  • BATCH_SIZE: int = 1000
    size of batches for parallelism

  • NUM_WORKERS: int = 8
    number of workers for parallelism

  • NUM_BINS: int = 5
    number of bins for binning and digitization

  • NUM_KNNS: int = 8
    number of nearest neighbors for KNN operations

  • NUM_ITER: int = 1_000_000
    number of iterations for monte carlo simulation and bootstrap resampling operations

  • ALPHA1: float = 0.05
    α₁ for statistical tests

  • ALPHA2: float = 0.01
    α₂ for statistical tests

  • ALPHA3: float = 0.001
    α₃ for statistical tests

  • ALPHA4: float = 0.0001
    α₄ for statistical tests

  • RADIUS: int
    radius used for graph theory and nearest neighbors operations

  • MIN_SIZE: int
    minimum node count for graph theory operations

Any ALL-CAPS variable appearing in the signatures below refers to the corresponding parameter of the default config instance.

Utils

  • def digitize(x, n, dtype=np.int32)
    Digitizes an input array into discretized values from a given number of fixed-width bins
    x: ndarray (ndim of 1) --- input array to digitize
    n: int --- number of fixed-width bins (output values range from 0 to n-1)
    dtype: dtype --- target dtype of output array
    returns: ndarray (ndim of 1)
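A pure-Python sketch of the fixed-width binning described above (hypothetical: the real utils.digitize operates on numpy arrays and takes a dtype argument):

```python
def digitize(x, n):
    """Map each value of x to a bin index in [0, n-1] using n fixed-width bins."""
    lo, hi = min(x), max(x)
    width = (hi - lo) / n or 1.0  # guard against constant input
    # Clamp so the maximum value lands in the last bin rather than bin n
    return [min(int((v - lo) / width), n - 1) for v in x]
```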

  • def dff(raw, downsample=1, percentile=20, window=300)
    Calculates the ΔF/F of a calcium trace using a percentile filter and a sliding window
    raw: ndarray (ndim of 1) --- input array for which to calculate ΔF/F
    downsample: int --- downsampling factor (setting to 1 prevents downsampling)
    percentile: float --- percentile with which to calculate the baseline of the time series
    window: int --- sliding window size with which to calculate the baseline of the time series
    returns: ndarray (ndim of 1)
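The baseline-and-normalize step can be sketched in pure Python (hypothetical: downsampling is omitted and a nearest-rank percentile is used; the real implementation may differ):

```python
def dff(raw, percentile=20, window=300):
    """ΔF/F with a sliding-window percentile baseline F0: (F - F0) / F0."""
    n = len(raw)
    out = []
    for t in range(n):
        lo = max(0, t - window // 2)
        hi = min(n, t + window // 2 + 1)
        seg = sorted(raw[lo:hi])
        # Nearest-rank percentile of the window serves as the baseline F0
        f0 = seg[min(len(seg) - 1, int(len(seg) * percentile / 100))]
        out.append((raw[t] - f0) / f0 if f0 else 0.0)
    return out
```

A flat trace yields all zeros, while a transient above the baseline shows up as a positive fractional change.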

  • def divisor(arr, minimum=1, default_positive=True)
    Converts an input array into a safe divisor for element-wise division, preserving sign while clamping near-zero values so division does not amplify the numerator
    arr: ndarray --- input array to convert
    minimum: float --- minimum magnitude allowed on either side of 0
    default_positive: bool --- whether any 0 in the input array becomes +minimum or -minimum
    returns: ndarray
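The clamping rule can be sketched in pure Python (hypothetical implementation of the documented behavior):

```python
import math

def divisor(arr, minimum=1.0, default_positive=True):
    """Clamp values toward ±minimum so division by the result never amplifies."""
    out = []
    for v in arr:
        if v == 0:
            # Zeros have no sign, so default_positive decides the direction
            out.append(minimum if default_positive else -minimum)
        elif abs(v) < minimum:
            # Keep the original sign, raise the magnitude to the floor
            out.append(math.copysign(minimum, v))
        else:
            out.append(v)
    return out
```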


IO

  • def find_file(path, pattern, allow_multiple=False)
    Returns a file from a directory with a specified glob pattern
    path: str | Path --- directory to search for the target file
    pattern: str --- glob pattern with which to search
    allow_multiple: bool --- whether to return a list of all matches instead of raising an error when multiple files match the pattern
    returns: Path | list[Path]
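A sketch of the documented behavior using pathlib (hypothetical: the real function's error types and match ordering may differ):

```python
from pathlib import Path

def find_file(path, pattern, allow_multiple=False):
    """Glob a directory; return the single match, or all matches if allowed."""
    matches = sorted(Path(path).glob(pattern))
    if not matches:
        raise FileNotFoundError(f"no file matching {pattern!r} in {path}")
    if len(matches) > 1 and not allow_multiple:
        raise RuntimeError(f"multiple files match {pattern!r} in {path}")
    return matches if allow_multiple else matches[0]
```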

  • def load_file(path, pattern, allow_pickle=False)
    Finds and loads data from a single .npy, .tif, .h5, or .hdf5 file from a directory with a specified glob pattern
    path: str | Path --- directory to search for the target file
    pattern: str --- glob pattern with which to search
    allow_pickle: bool --- if a .npy file is found, whether to allow pickling
    returns: file data (depends on file type)

  • def check_temp(clear=False)
    Checks whether there are any temporary files in the temp directory
    clear: bool --- whether to delete any temporary files found
    returns: None

  • def clear_temp(notify=True)
    Clears any temporary files in the temp directory
    notify: bool --- whether to log that files were cleared
    returns: None

  • class Memmap(shape, dtype)
    Streamlines handling of numpy memmap objects in the temp directory
    shape: tuple --- shape of the stored array
    dtype: dtype --- dtype of the stored array

    • def save(self, data)
      Saves a memmap object in the temp directory
      data: ndarray --- numpy array to save into a memmap file
      returns: None

    • def load(self, read_only=True)
      Loads a memmap object's data
      read_only: bool --- whether the memmap is loaded with read or read/write permissions
      returns: numpy.Memmap

    • def delete(self)
      Deletes a memmap object from the temp directory
      returns: None

  • def load_fps(path)
    Loads the fps from an xml file found in the specified directory
    path: str | Path --- directory to search for the target file
    returns: float

  • def load_resolution(path)
    Loads the (z,y,x) resolution from a txt file found in the specified directory
    path: str | Path --- directory to search for the target file
    returns: tuple(float, float, float)

  • def load_metadata(path)
    Loads the (t,z,y,x) resolution from txt and xml files found in the specified directory
    path: str | Path --- directory to search for the metadata
    returns: tuple(float, float, float, float)

  • def load_suite2p_data(path)
    Loads the labeled volume of all ROIs, cell activity traces, time-averaged brainmap, data shape, and ops from a Suite2p-containing directory
    Requires the existence of stat.npy, F.npy, and ops.npy files
    path: str | Path --- directory containing Suite2p files
    returns:

    • ndarray (ndim of 3) --- labeled volume (z,y,x)
    • ndarray (ndim of 2) --- cell traces (c,t)
    • ndarray (ndim of 3) --- brainmap (z,y,x)
    • tuple(int, int, int, int, int) --- data shape (Lc,Lt,Lz,Ly,Lx)
    • Suite2p Ops
  • def load_voluseg_data(path)
    Loads the labeled volume of all ROIs, cell activity traces, time-averaged brainmap, and data shape from a VoluSeg-containing directory
    Requires the existence of volume0.hdf5 and cells0_clean.hdf5 files
    path: str | Path --- directory containing VoluSeg files
    returns:

    • ndarray (ndim of 3) --- labeled volume (z,y,x)
    • ndarray (ndim of 2) --- cell traces (c,t)
    • ndarray (ndim of 3) --- brainmap (z,y,x)
    • tuple(int, int, int, int, int) --- data shape (Lc,Lt,Lz,Ly,Lx)
  • def load_combined_data(path)
    Loads the labeled volume of all ROIs, ROI activity traces, time-averaged brainmap, and data shape from a Combined-Segmentation-containing directory
    Requires the existence of a combined_segdata.h5 file
    Last Lc rows in ROI traces are Suite2p-identified cells. First (Lr - Lc) rows are VoluSeg-identified ROIs
    path: str | Path --- directory containing combined file
    returns:

    • ndarray (ndim of 3) --- labeled volume (z,y,x)
    • ndarray (ndim of 2) --- ROI traces (r,t)
    • ndarray (ndim of 3) --- brainmap (z,y,x)
    • tuple(int, int, int, int, int) --- data shape (Lr,Lc,Lt,Lz,Ly,Lx)
  • def build_trials(drift, min_length)
    Builds a trials-by-timepoints (Ln,Ltt) array of timepoint indices
    drift: ndarray (ndim of 1) --- drift time series, used to distinguish trials
    min_length: int --- length cutoff to distinguish go period from pulses
    returns: ndarray (ndim of 2)


Form

  • def form_volume(img, shape)
    Forms a volumetric array from a tiled image, using the specified target shape
    img: ndarray (ndim of 2) --- original tiled image
    shape: tuple(int, int, int) --- shape of target volume (z,y,x)
    returns: ndarray (ndim of 3)

  • def form_tiles(vol, shape)
    Forms a tiled image from a volume, using the specified target shape
    vol: ndarray (ndim of 3) --- original volume
    shape: tuple(int, int) --- shape of target tiled image
    returns: ndarray (ndim of 2)

  • def label_rois(stat, shape)
    Labels all ROIs in a volume using a stat.npy file
    stat: list[dict] --- list of pickled and sparse-labeled cells
    shape: tuple(int, int, int) --- shape of the target volume
    returns: ndarray (ndim of 3)

  • def adjust_rois(arr)
    Removes missing labels in a labeled array by shifting ROIs
    arr: ndarray --- labeled array
    returns:

    • ndarray --- relabeled array
    • ndarray (ndim of 1) --- array of indices of the original unique labels
    • list[int] --- list of missing indices from the original array
  • def remove_rois(arr, rois, keep=False)
    Removes the specified labels from a labeled array, leaving missing labels
    arr: ndarray --- labeled array
    rois: list[int] | ndarray (ndim of 1) --- selected labels
    keep: bool --- if True, remove the ROIs that are not selected; if False, remove the selected ROIs
    returns: ndarray

  • def weight_rois(rois, weights)
    Substitutes a labeled array with each label's corresponding weight
    rois: ndarray --- labeled array
    weights: list[float] | ndarray (ndim of 1) --- weights corresponding to each label
    returns: ndarray


Stats

  • def pearson_corr(x, y)
    Calculates the Pearson Correlation and its p-value for two variables
    x: ndarray (ndim of 1)
    y: ndarray (ndim of 1)
    returns: tuple(float, float) --- r and p-value
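The correlation coefficient itself reduces to a short formula; a pure-Python sketch (hypothetical helper name, and the p-value is omitted since it requires the t-distribution):

```python
import math

def pearson_r(x, y):
    """Pearson correlation: covariance divided by the product of standard deviations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```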

  • def spearman_corr(x, y)
    Calculates the Spearman Rank Correlation and its p-value for two variables
    x: ndarray (ndim of 1)
    y: ndarray (ndim of 1)
    returns: tuple(float, float) --- ρ and p-value

  • def phi_coef(x, y)
    Calculates the Phi Coefficient and its p-value for two binary variables
    x: ndarray (ndim of 1, bool)
    y: ndarray (ndim of 1, bool)
    returns: tuple(float, float) --- φ and p-value
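The phi coefficient is computed from the 2x2 contingency table of the two binary variables; a pure-Python sketch (hypothetical: the p-value returned by the real function is omitted):

```python
import math

def phi_coef(x, y):
    """Phi coefficient from the 2x2 contingency table of two binary sequences."""
    n11 = sum(1 for a, b in zip(x, y) if a and b)
    n10 = sum(1 for a, b in zip(x, y) if a and not b)
    n01 = sum(1 for a, b in zip(x, y) if not a and b)
    n00 = sum(1 for a, b in zip(x, y) if not a and not b)
    # φ = (n11*n00 - n10*n01) / sqrt of the product of all four marginals
    denom = math.sqrt((n11 + n10) * (n01 + n00) * (n11 + n01) * (n10 + n00))
    return (n11 * n00 - n10 * n01) / denom if denom else 0.0
```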

  • def quantile_bins(x, n_bins=NUM_BINS)
    Calculates the bin edges for a specified number of quantile bins
    x: ndarray
    n_bins: int
    returns: ndarray (ndim of 1)
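A sketch of quantile bin edges in pure Python (hypothetical: a nearest-rank quantile is used here, while the real function likely interpolates):

```python
def quantile_bins(x, n_bins=5):
    """Bin edges at the k/n_bins quantiles, so bins hold roughly equal counts."""
    s = sorted(x)
    n = len(s)
    edges = [s[0]]
    for k in range(1, n_bins):
        # Interior edge at the nearest-rank k/n_bins quantile
        edges.append(s[min(n - 1, int(round(k * n / n_bins)))])
    edges.append(s[-1])
    return edges
```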


Info

  • def hist_H(x)
    Calculates the Shannon Entropy of a binned variable from a list of bin counts
    x: ndarray (ndim of 1)
    returns: float
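Entropy from bin counts is a one-liner over the normalized counts; a sketch in bits (the real function's log base is not stated, so base 2 is an assumption):

```python
import math

def hist_H(counts):
    """Shannon entropy in bits from a list of bin counts: H = -Σ p·log2(p)."""
    total = sum(counts)
    ps = [c / total for c in counts if c > 0]  # empty bins contribute nothing
    return -sum(p * math.log2(p) for p in ps)
```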

  • def KL_H(x, k)
    Calculates the Shannon Entropy of a continuous variable using the Kozachenko-Leonenko estimator
    x: ndarray (ndim of 1) --- input array
    k: int --- number of nearest neighbors
    returns: float

  • def disc_MI(x, y, normalized=False, n_bins=NUM_BINS, bin_type=BIN_TYPE)
    Calculates the Mutual Information of two discrete variables
    x: ndarray (ndim of 1)
    y: ndarray (ndim of 1)
    normalized: bool
    n_bins: int | list[int] --- number of bins with which to discretize x and y
    bin_type: str = 'fixed' | 'quantile' --- type of bins with which to discretize x and y
    returns: float
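For already-discretized inputs, mutual information follows directly from the joint and marginal frequencies; a pure-Python sketch in bits (hypothetical: the real disc_MI also handles binning via n_bins/bin_type and normalization):

```python
import math
from collections import Counter

def disc_MI(x, y):
    """MI in bits from two discrete sequences: Σ p(a,b)·log2(p(a,b)/(p(a)p(b)))."""
    n = len(x)
    px = Counter(x)
    py = Counter(y)
    pxy = Counter(zip(x, y))
    return sum(
        (c / n) * math.log2((c / n) / ((px[a] / n) * (py[b] / n)))
        for (a, b), c in pxy.items()
    )
```

Identical sequences give the full entropy (1 bit for a balanced binary variable); independent sequences give 0.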

  • def disc_CMI(x, y, z, normalized=False, n_bins=NUM_BINS, bin_type=BIN_TYPE)
    Calculates the Conditional Mutual Information of two discrete variables conditioned on a third variable
    x: ndarray (ndim of 1)
    y: ndarray (ndim of 1)
    z: ndarray (ndim of 1)
    normalized: bool
    n_bins: int | list[int] --- number of bins with which to discretize x, y, and z
    bin_type: str = 'fixed' | 'quantile' --- type of bins with which to discretize x, y, and z
    returns: float

  • def disc_II(x, y, z, normalized=False, n_bins=NUM_BINS, bin_type=BIN_TYPE)
    Calculates the Interaction Information of three discrete variables
    x: ndarray (ndim of 1)
    y: ndarray (ndim of 1)
    z: ndarray (ndim of 1)
    normalized: bool
    n_bins: int | list[int] --- number of bins with which to discretize x, y, and z
    bin_type: str = 'fixed' | 'quantile' --- type of bins with which to discretize x, y, and z
    returns: float

  • def KSG_MI(x, y, k=NUM_KNNS, normalized=False)
    Calculates the Mutual Information of two continuous variables using the Kraskov-Stoegbauer-Grassberger estimator
    x: ndarray (ndim of 1)
    y: ndarray (ndim of 1)
    k: int
    normalized: bool
    returns: float

  • def KSG_CMI(x, y, z, normalized=False, n_bins=NUM_BINS, bin_type=BIN_TYPE)
    Calculates the Conditional Mutual Information of two continuous variables conditioned on a third variable using the Kraskov-Stoegbauer-Grassberger estimator
    x: ndarray (ndim of 1)
    y: ndarray (ndim of 1)
    z: ndarray (ndim of 1)
    normalized: bool
    n_bins: int | list[int] --- number of bins with which to discretize x, y, and z
    bin_type: str = 'fixed' | 'quantile' --- type of bins with which to discretize x, y, and z
    returns: float

  • def KSG_II(x, y, z, normalized=False, n_bins=NUM_BINS, bin_type=BIN_TYPE)
    Calculates the Interaction Information of three continuous variables using the Kraskov-Stoegbauer-Grassberger estimator
    x: ndarray (ndim of 1)
    y: ndarray (ndim of 1)
    z: ndarray (ndim of 1)
    normalized: bool
    n_bins: int | list[int] --- number of bins with which to discretize x, y, and z
    bin_type: str = 'fixed' | 'quantile' --- type of bins with which to discretize x, y, and z
    returns: float


License

MIT
