
NTAC


This is a Python implementation of the neuronal typing algorithm described in:

Gregory Schwartzman, Ben Jourdan, David García-Soriano, Arie Matsliah. Connectivity Is All You Need: Inferring Neuronal Types with NTAC. bioRxiv. 2025

Neuronal Type Assignment from Connectivity

NTAC (Neuronal Type Assignment from Connectivity) groups neurons into cell types based solely on synaptic connectivity. It comes in two variants:

  • Seeded (semi-supervised): Requires a small fraction of neurons with known labels.
  • Unseeded (unsupervised): Requires no labels.
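Because NTAC works from connectivity alone, its natural benchmark input is a stochastic block model (SBM) graph, where ground-truth types are the blocks. The package ships its own `sbm` helper (used in the quickstart below); the sketch here is a hypothetical standalone generator, just to illustrate what such an adjacency matrix looks like:

```python
import numpy as np

def toy_sbm(n=200, k=4, p_in=0.3, p_out=0.02, seed=0):
    """Toy SBM: dense adjacency matrix and block labels.

    Illustrative only -- not the ntac.sbm helper from the package.
    Nodes in the same block connect with probability p_in,
    nodes in different blocks with probability p_out.
    """
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, k, size=n)           # block assignment per node
    same = labels[:, None] == labels[None, :]     # same-block indicator matrix
    probs = np.where(same, p_in, p_out)           # per-edge probability
    A = (rng.random((n, n)) < probs).astype(np.int8)
    np.fill_diagonal(A, 0)                        # no self-loops
    return A, labels

A, labels = toy_sbm()
print(A.shape, labels.shape)  # (200, 200) (200,)
```

With `p_in` well above `p_out`, the block structure is recoverable from connectivity alone, which is exactly the setting NTAC targets.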

Installation:

Install NTAC with: pip install ntac

Optionally, install cudatoolkit with conda to speed up the unseeded variant: conda install -c conda-forge cudatoolkit

Quickstart with NTAC:

import numpy as np
import scipy.sparse as sp
from ntac import Ntac, sbm, GraphData

# Generate an adjacency matrix and labels from an SBM graph
A, labels = sbm(n=1000, k=4)

# NTAC requires a CSR array and labels as a string array
A_csr = sp.csr_array(A)
labels = np.array([str(l) for l in labels])
# NTAC can take a CSR matrix and labels directly,
# but the GraphData class makes train/test splits and metrics easier
data = GraphData(A_csr, labels=labels)

############################################
# Seeded NTAC
# Use only 10% of the labels for training
train_indices, test_indices = data.test_train_split(train_size=0.1)
labels[test_indices] = "?"  # "?" indicates a node is unlabeled
# Initialize NTAC with the data and labels
nt = Ntac(data=data, labels=labels)
# nt = Ntac(data=A_csr, labels=labels) # if you want to use CSR matrix directly
for i in range(5):
    print(f"Step {i}")
    nt.step()
    partition = nt.get_partition()
    # partition = nt.get_topk_partition(5)  # to get the top 5 labels for each node
    metrics = data.get_metrics(partition, test_indices, data.labels)
    print(f"Accuracy: {metrics['acc']:.3f}")  # also available: ARI, weighted F1, and top-k accuracy (with get_topk_partition)

############################################
# Unseeded NTAC example
print("Unseeded NTAC")
# This ignores the labels, even when provided
nt.solve_unseeded(max_k=4)
nt.map_partition_to_gt_labels(data.labels)  # Hungarian algorithm maps the partition to the ground-truth labels
partition = nt.get_partition()  # unseeded does not support top-k partitions
metrics = data.get_metrics(partition, range(data.n), data.labels)
print(f"Accuracy: {metrics['acc']:.3f} ARI: {metrics['ari']:.3f} Weighted F1: {metrics['f1']:.3f}")
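The map-to-ground-truth step in the unseeded example relies on the Hungarian algorithm: predicted cluster ids are arbitrary, so they must be matched to ground-truth labels before accuracy is meaningful. A minimal standalone sketch of that alignment using SciPy (the function name and details here are illustrative, not the ntac API):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def map_clusters_to_labels(pred, gt):
    """Relabel cluster ids in `pred` to maximize agreement with `gt`."""
    clusters = np.unique(pred)
    classes = np.unique(gt)
    # Overlap matrix: how many nodes each cluster shares with each gt class.
    overlap = np.zeros((len(clusters), len(classes)), dtype=int)
    for i, c in enumerate(clusters):
        for j, g in enumerate(classes):
            overlap[i, j] = np.sum((pred == c) & (gt == g))
    # Hungarian algorithm maximizes total overlap (minimize the negation).
    rows, cols = linear_sum_assignment(-overlap)
    mapping = {clusters[r]: classes[c] for r, c in zip(rows, cols)}
    return np.array([mapping[p] for p in pred])

pred = np.array([0, 0, 1, 1, 2, 2])
gt = np.array(["b", "b", "a", "a", "c", "c"])
print(map_clusters_to_labels(pred, gt))  # ['b' 'b' 'a' 'a' 'c' 'c']
```

This one-to-one matching is what lets accuracy, ARI, and weighted F1 be computed for the unseeded variant despite the absence of seed labels.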

Documentation & Examples using the Flywire dataset:

Link to Docs and further examples

