
SurpriseMeMore is a Python module providing methods, based on the surprise framework, to detect mesoscale structures (e.g. communities, core-periphery, bow-tie) in graphs and multigraphs.

Project description

SurpriseMeMore

SurpriseMeMore is a toolbox for detecting mesoscale structure in networks, released as a Python 3 module.

SurpriseMeMore provides the user with a variety of solvers, based on the surprise framework, for the detection of mesoscale structures (e.g. communities, core-periphery) in networks.

The models implemented in SurpriseMeMore are presented in the paper "Detecting mesoscale structures by surprise" (arXiv:2106.05055). If you use the module in your scientific research, please consider citing us:

    @misc{marchese2021detecting,
      title={Detecting mesoscale structures by surprise}, 
      author={Emiliano Marchese and Guido Caldarelli and Tiziano Squartini},
      year={2021},
      eprint={2106.05055},
      archivePrefix={arXiv},
      primaryClass={physics.soc-ph}
    }


Currently Implemented Methods

The available methods, for both directed and undirected networks, are:

  • Community detection on binary networks
  • Community detection on weighted networks with integer weights
  • Community detection on weighted networks with continuous weights
  • Core-periphery detection on binary networks
  • Core-periphery detection on weighted networks with integer weights

Installation

SurpriseMeMore can be installed via pip. From your terminal, run:

    $ pip install surprisememore

If you already installed the package and wish to upgrade it, simply type in your terminal:

    $ pip install surprisememore --upgrade

Dependencies

SurpriseMeMore uses the numba library, which is installed automatically with surprisememore. If you use Python 3.5 you may run into an error; in that case we suggest installing numba with the following command:

    $ pip install --prefer-binary numba

This avoids an error during the installation of llvmlite due to the absence of a pre-built wheel for Python 3.5.
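
As an optional sanity check, not part of the official instructions, you can verify that surprisememore and its numba dependency import correctly:

    # Quick check that the package and its numba dependency are importable.
    import numba
    import surprisememore

    print("numba version:", numba.__version__)
    print("surprisememore imported successfully")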

Some Examples

As an example, we run community detection on Zachary's karate club network.

    import networkx as nx
    from surprisememore import UndirectedGraph

    # Load Zachary's karate club network and build its adjacency matrix.
    G = nx.karate_club_graph()
    adj_kar = nx.to_numpy_array(G)

    # Initialize the SurpriseMeMore undirected graph object.
    graph = UndirectedGraph(adj_kar)

    # Run binary community detection.
    graph.run_discrete_community_detection(weighted=False,
                                           num_sim=2)

The algorithm finds the best partition by optimizing the surprise score function. At the end of the optimization process, the optimal partition is saved as an attribute of the graph object.

    # optimal partition
    graph.solution
    
    # Surprise of the optimal partition
    graph.surprise
    
    # Log surprise
    graph.log_surprise

Similarly, we can run the algorithm for detecting bimodular (core-periphery) structure. For Zachary's karate club, the code snippet is the following.


    from surprisememore import UndirectedGraph
    import networkx as nx
    
    G = nx.karate_club_graph()
    adj_kar = nx.to_numpy_array(G)
    graph = UndirectedGraph(adjacency=adj_kar)

Here we initialized our SurpriseMeMore UndirectedGraph object with the adjacency matrix. The available input formats are an adjacency matrix or an edgelist.

  • If you use an adjacency matrix, then you have to pass the matrix as a numpy.ndarray;

  • If you use an edgelist, then it has to be passed as a list of tuples (see the sketch below):

    • [(u, v), (u, t), ...] for binary networks;
    • [(u, v, w1), (u, t, w2), ...] for weighted networks.

For more details about the edgelist format, see the package documentation.
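
As an illustration, the following is a minimal sketch that initializes the same karate club graph from an edgelist instead of an adjacency matrix; the keyword name edgelist is an assumption here, so check the constructor signature of your installed version:

    from surprisememore import UndirectedGraph
    import networkx as nx

    # Build the karate club network as a list of (u, v) tuples.
    G = nx.karate_club_graph()
    edges = list(G.edges())

    # NOTE: the keyword `edgelist` is assumed for illustration; adjust it if
    # UndirectedGraph uses a different argument name in your version.
    graph = UndirectedGraph(edgelist=edges)

Once the graph object is initialized, core-periphery (bimodular) detection is run as follows: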

    graph.run_discrete_cp_detection(weighted=False, num_sim=2)

In the previous example we passed two optional arguments to the function: weighted and num_sim. The argument weighted specifies which version of surprise you want to use: binary or weighted. If the network is binary, you don't need to pass weighted: the class detects by itself that the graph is binary and uses the proper method for community/bimodular detection. If the network has weights, instead, the default method is the weighted one; to run binary community/bimodular detection on a weighted network you must specify weighted=False.

The argument num_sim specifies the number of times the algorithm runs over all the edges of the network during the optimization. You can find more details about the algorithm in [1, 2].
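
For concreteness, here is a minimal sketch of a weighted run on a toy network with integer weights; the random weights carry no planted structure and only illustrate the call, and passing weighted=True explicitly is an assumption (the weighted method is already the default for weighted input):

    import numpy as np
    from surprisememore import UndirectedGraph

    # Toy undirected weighted network: a random symmetric integer-weighted
    # adjacency matrix with zero diagonal.
    rng = np.random.default_rng(42)
    upper = np.triu(rng.integers(0, 4, size=(30, 30)), k=1)
    adj = upper + upper.T

    graph = UndirectedGraph(adj)

    # weighted=True makes the (default) weighted surprise explicit;
    # weighted=False would switch to the binary method instead.
    graph.run_discrete_community_detection(weighted=True, num_sim=2)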

All the implemented algorithms are heuristic; we suggest running them more than once and picking the best solution (the one with the highest log_surprise).
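
A minimal sketch of that advice, re-instantiating the graph object on each run and keeping the partition with the highest log_surprise:

    import networkx as nx
    from surprisememore import UndirectedGraph

    adj = nx.to_numpy_array(nx.karate_club_graph())

    best_solution, best_log_surprise = None, float("-inf")
    for _ in range(5):
        # Fresh object per run: the heuristic is stochastic, so different
        # runs may return different partitions.
        graph = UndirectedGraph(adj)
        graph.run_discrete_community_detection(weighted=False, num_sim=2)
        if graph.log_surprise > best_log_surprise:
            best_log_surprise = graph.log_surprise
            best_solution = graph.solution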

To learn more, please read the two IPython notebooks in the examples directory: one is a case study on community detection, while the other is on bimodular detection.

Development

Please work on a feature branch and create a pull request to the development branch. If you need to merge manually, do so without fast-forward:

    $ git merge --no-ff myfeature

To build a development environment run:

    $ python3 -m venv venv 
    $ source venv/bin/activate 
    $ pip install -e '.[dev]'

Credits

Author:

Emiliano Marchese (a.k.a. EmilianoMarchese)

Acknowledgements:

The module was developed under the supervision of Tiziano Squartini.

Download files

Download the file for your platform.

Source Distribution

surprisememore-0.1.0.tar.gz (23.4 kB)

File details

Details for the file surprisememore-0.1.0.tar.gz.

File metadata

  • Download URL: surprisememore-0.1.0.tar.gz
  • Upload date:
  • Size: 23.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.6.1 requests/2.24.0 setuptools/47.1.0 requests-toolbelt/0.9.1 tqdm/4.51.0 CPython/3.7.3

File hashes

Hashes for surprisememore-0.1.0.tar.gz:

  • SHA256: 17d20aacac582915ff1b90cbf4f6a5745359b8d2855ff597fee6e41874c32705
  • MD5: 6694e7fc6af39d8d413953c766024497
  • BLAKE2b-256: 9331448f6c300aa4cd943933929da46437f13d355c9dbe05429642a431c80ce4

