
A small Python package for calculating the small-world propensity of a weighted, undirected network. Translated from the MATLAB version featured in Muldoon et al.

Project description

Small World Propensity


This Python package was adapted from the MATLAB package first presented in Small-World Propensity and Weighted Brain Networks (2016) by Sarah Feldt Muldoon, Eric W. Bridgeford & Danielle S. Bassett. Their original MATLAB implementation can be found here.

Use

The small-world propensity package can be installed using pip:

python -m pip install small-world-propensity

small_world_propensity can be called in two ways: either with a single adjacency matrix, or with a list of adjacency matrices plus a boolean list denoting whether each matrix is binary. In either case, small_world_propensity returns a pandas DataFrame similar to the following:

Dataframe
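As a rough sketch of the two call patterns (assuming the entry point is exposed as swp.small_world_propensity; the exact argument names and DataFrame columns may differ):

```python
import numpy as np
import small_world_propensity as swp

# Build a toy weighted, undirected adjacency matrix
rng = np.random.default_rng(0)
adj = rng.random((52, 52))
adj = (adj + adj.T) / 2
np.fill_diagonal(adj, 0)

# 1) A single adjacency matrix
df_single = swp.small_world_propensity(adj)

# 2) A list of adjacency matrices plus a boolean list marking which are binary
binary = (adj > 0.5).astype(float)
df_multi = swp.small_world_propensity([adj, binary], [False, True])

print(df_multi)
```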

Generation of regular and random matrices

Using the structural network of the cat cortex, obtained from tract-tracing studies between 52 brain regions, we can visualize the process behind calculating the small-world propensity, $\phi$. The matrix is loaded using:

import scipy.io as sio
import small_world_propensity as swp

cat = sio.loadmat('data/cat.mat')['CIJctx']

We can then ensure symmetry by calling:

symm_cat = swp.make_symmetric(cat)

To get the regular version of the cat matrix, we first find the effective average radius:

r = swp.get_avg_rad_eff(symm_cat)
cat_reg = swp.regular_matrix_generator(symm_cat, r)

Finally, we produce the randomized cat matrix:

cat_rand = swp.randomize_matrix(symm_cat)

Cat matrices
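These regular (lattice) and random matrices act as the null models for $\phi$. Following Muldoon et al. (2016), the observed clustering coefficient and characteristic path length are compared against the two null models:

$$
\Delta_C = \frac{C_{\mathrm{latt}} - C_{\mathrm{obs}}}{C_{\mathrm{latt}} - C_{\mathrm{rand}}}, \qquad
\Delta_L = \frac{L_{\mathrm{obs}} - L_{\mathrm{rand}}}{L_{\mathrm{latt}} - L_{\mathrm{rand}}}, \qquad
\phi = 1 - \sqrt{\frac{\Delta_C^{2} + \Delta_L^{2}}{2}},
$$

with $\Delta_C$ and $\Delta_L$ clipped to the interval $[0, 1]$.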

The graphs visualized in a circular layout look as follows:

Cat graphs
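One way to reproduce such circular-layout plots is with NetworkX and matplotlib (a minimal sketch; the styling choices are assumptions, not the original plotting code behind the figures):

```python
import matplotlib.pyplot as plt
import networkx as nx

fig, axes = plt.subplots(1, 3, figsize=(12, 4))
for ax, (name, matrix) in zip(
    axes, [("Regular", cat_reg), ("Original", symm_cat), ("Random", cat_rand)]
):
    graph = nx.from_numpy_array(matrix)  # weighted, undirected graph
    nx.draw_circular(graph, ax=ax, node_size=20, width=0.2)
    ax.set_title(name)
plt.show()
```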

Comparison of $\phi$ in real networks

We can take the networks used in Muldoon et al. and plot $\phi$, $\Delta_L$, $\Delta_C$, and $\delta$. Note that these networks are not exactly the same as those used in Muldoon et al.; because of differences in how NumPy performs permutations, and the use of the NetworkX and igraph libraries, the results are not identical, but they still match closely.
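A rough sketch of producing such a comparison from the returned DataFrame (the toy networks, the binary flags, and the "SWP" column name are assumptions about the inputs and the output format):

```python
import matplotlib.pyplot as plt
import numpy as np
import small_world_propensity as swp

# Toy stand-ins for the real networks; in practice these would be the
# adjacency matrices from Muldoon et al.
rng = np.random.default_rng(0)
networks, flags = [], []
for _ in range(4):
    a = rng.random((52, 52))
    a = (a + a.T) / 2
    np.fill_diagonal(a, 0)
    networks.append(a)
    flags.append(False)

results = swp.small_world_propensity(networks, flags)

# Assuming the DataFrame exposes phi in a column named "SWP"
results["SWP"].plot(kind="bar")
plt.ylabel(r"$\phi$")
plt.tight_layout()
plt.show()
```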

The adjacency matrices: Adjacency matrices

And the results: Summary

To cite this work, please use:

@software{small-world-propensity,
  author       = {{Daniels, R. K.}},
  title        = {small-world-propensity},
  year         = 2023,
  publisher    = {Zenodo},
  version      = {v0.0.8},
  doi          = {10.5281/zenodo.10299681},
  url          = {https://github.com/rkdan/small-world-propensity}
}

Please also cite the authors of the original MATLAB implementation:

@article{Muldoon2016,
    author = "Muldoon, Sarah Feldt and Bridgeford, Eric W. and Bassett, Danielle S.",
    title = "{Small-World Propensity and Weighted Brain Networks}",
    doi = "10.1038/srep22057",
    journal = "Scientific Reports",
    volume = "6",
    number = "1",
    pages = "22057",
    year = "2016"
}

> [!NOTE]
> This software has a GNU AGPL license. If this license is inadequate for your use, please get in touch.



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

small_world_propensity-0.0.19.tar.gz (17.2 kB)

Uploaded Source

Built Distribution

small_world_propensity-0.0.19-py3-none-any.whl (17.7 kB)

Uploaded Python 3

File details

Details for the file small_world_propensity-0.0.19.tar.gz.

File metadata

File hashes

Hashes for small_world_propensity-0.0.19.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 5bb1dadc86fd1155633e5c1f3e70cb46f0041819023758b03d27e24b9796afd0 |
| MD5 | c9f00c7d9fd85a374a6d94d68dcdd0b2 |
| BLAKE2b-256 | 0c74108ec2f66e032fe2d0e40304e0e74a92ad995533600a4010f8dc91e125c1 |

See more details on using hashes here.
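For example, a downloaded sdist can be checked against the published SHA256 digest with Python's standard hashlib module (a minimal sketch; it assumes the file sits in the current directory):

```python
import hashlib

# SHA256 digest published above for the sdist
expected = "5bb1dadc86fd1155633e5c1f3e70cb46f0041819023758b03d27e24b9796afd0"

with open("small_world_propensity-0.0.19.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected else "Hash mismatch!")
```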

File details

Details for the file small_world_propensity-0.0.19-py3-none-any.whl.

File metadata

File hashes

Hashes for small_world_propensity-0.0.19-py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 7262f4f8d8d9406c0fc8cba65cda59a9917c66ecc0180c74a4cfe5d8e07e0bb0 |
| MD5 | c9f2b6de0669f7724fd0bf4bb6d185bb |
| BLAKE2b-256 | 3d36221f53bec76bf2a4f03832b4566a1a8e0dfd5c96a2d2cb8f546bd53ec107 |

See more details on using hashes here.
