
An experimental parallel backend for NetworkX


nx-parallel

nx-parallel is a NetworkX backend that uses joblib for parallelization. This project aims to provide parallelized implementations of various NetworkX functions to improve performance. Refer to the NetworkX backends documentation to learn more about the backend architecture in NetworkX.
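The core idea of splitting node-level work across workers can be sketched with the standard library alone. Note that this is only an illustration of the pattern: nx-parallel itself uses joblib, and none of the names below come from the project's actual code.

```python
from collections import deque
from concurrent.futures import ThreadPoolExecutor

def bfs_reach_count(adj, source):
    """Count the nodes reachable from `source` in `adj` (a dict of neighbor sets)."""
    seen = {source}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return len(seen)

def all_reach_counts_parallel(adj, max_workers=4):
    """Run one BFS per node, distributing the nodes across worker threads --
    the same node-wise splitting pattern nx-parallel applies via joblib."""
    nodes = list(adj)
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        counts = pool.map(lambda n: bfs_reach_count(adj, n), nodes)
    return dict(zip(nodes, counts))

# A 4-node path graph, mirroring nx.path_graph(4)
path = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
print(all_reach_counts_parallel(path))  # every node reaches all 4 nodes
```

Per-node algorithms such as betweenness centrality parallelize well under this pattern, since each node's contribution can be computed independently and combined afterwards.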

Algorithms in nx-parallel

Script used to generate the above list:
import _nx_parallel as nxp
d = nxp.get_funcs_info() # temporarily add `from .update_get_info import *` to _nx_parallel/__init__.py
for func in d:
    print(f"- [{func}]({d[func]['url']})")

Installation

It is recommended to first refer to NetworkX's INSTALL.rst. nx-parallel requires Python >=3.10. Right now, the only dependencies of nx-parallel are networkx and joblib.

Install the released version

You can install the stable version of nx-parallel using pip:

$ pip install nx-parallel

The above command also installs the two main dependencies of nx-parallel, i.e., networkx and joblib. To upgrade to a newer release, use the --upgrade flag:

$ pip install --upgrade nx-parallel

Install the development version

Before installing the development version, you may need to uninstall the standard version of nx-parallel and its two dependencies using pip:

$ pip uninstall nx-parallel networkx joblib

Then do:

$ pip install git+https://github.com/networkx/nx-parallel.git@main

Backend usage

You can run your networkx code with the nx-parallel backend by setting the NETWORKX_AUTOMATIC_BACKENDS environment variable to parallel:

$ export NETWORKX_AUTOMATIC_BACKENDS=parallel && python nx_code.py

Note that any function inside nx_code.py that does not have an nx-parallel implementation will fall back to its original networkx implementation. You can also use the nx-parallel backend for only specific function calls in your code, in the following ways:

import networkx as nx
import nx_parallel as nxp

G = nx.path_graph(4)
H = nxp.ParallelGraph(G)

# method 1 : passing ParallelGraph object in networkx function (Type-based dispatching)
nx.betweenness_centrality(H)

# method 2 : using the 'backend' kwarg
nx.betweenness_centrality(G, backend="parallel")

# method 3 : using nx-parallel implementation with networkx object
nxp.betweenness_centrality(G)

# method 4 : using nx-parallel implementation with ParallelGraph object
nxp.betweenness_centrality(H)

# output : {0: 0.0, 1: 0.6666666666666666, 2: 0.6666666666666666, 3: 0.0}
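Method 1 works because networkx inspects the type of the graph argument and hands the call off to the backend that claims that type. The following stripped-down sketch shows the idea behind such type-based dispatching; it is illustrative only, and all names in it are invented here rather than taken from networkx's real _dispatchable machinery:

```python
# Toy registry mapping (wrapper type, function name) to a backend implementation.
_backends = {}

class ParallelGraphToy:
    """Stand-in for a backend wrapper like nxp.ParallelGraph: holds a plain graph."""
    def __init__(self, graph):
        self.graph = graph

def dispatchable(func):
    """Route the call to a registered backend when the argument is a wrapped graph."""
    def wrapper(G, *args, **kwargs):
        impl = _backends.get((type(G), func.__name__))
        if impl is not None:
            return impl(G.graph, *args, **kwargs)  # unwrap and use the backend
        return func(G, *args, **kwargs)            # default implementation
    return wrapper

@dispatchable
def degree_sum(G):
    """Sum of node degrees in a dict-of-sets graph (default implementation)."""
    return sum(len(nbrs) for nbrs in G.values())

def degree_sum_parallel(G):
    """'Backend' version; tags its result so we can see which path ran."""
    return ("parallel", sum(len(nbrs) for nbrs in G.values()))

_backends[(ParallelGraphToy, "degree_sum")] = degree_sum_parallel

plain = {0: {1}, 1: {0}}
print(degree_sum(plain))                    # plain dict -> default implementation
print(degree_sum(ParallelGraphToy(plain)))  # wrapped graph -> backend implementation
```

This is why passing H = nxp.ParallelGraph(G) into a plain nx.betweenness_centrality call is enough to trigger the parallel implementation: the dispatcher recognizes the wrapper type.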

Notes

  1. Some functions in networkx share a name but have different implementations. To avoid name conflicts at dispatch time, networkx differentiates them by specifying the name parameter in the _dispatchable decorator of such algorithms. For this reason, method 3 and method 4 are not recommended, but you can use them if you know the correct name. For example:

    # using `name` parameter - nx-parallel as an independent package
    nxp.all_pairs_node_connectivity(H) # runs the parallel implementation in `connectivity/connectivity`
    nxp.approximate_all_pairs_node_connectivity(H) # runs the parallel implementation in `approximation/connectivity`
    

    Also, if you are using nx-parallel as a backend, then mentioning the subpackage to which the algorithm belongs is recommended, to ensure that networkx dispatches to the correct implementation. For example:

    # with subpackage - nx-parallel as a backend
    nx.all_pairs_node_connectivity(H)
    nx.approximation.all_pairs_node_connectivity(H)
    
  2. Right now there is not much difference between nx.Graph and nxp.ParallelGraph, so method 3 works fine, but it is not recommended because that may change in the future.

Feel free to contribute to nx-parallel. You can find the contributing guidelines here. If you'd like to implement a feature or fix a bug, we'd be happy to review a pull request. Please make sure to explain the changes you made in the pull request description. And feel free to open issues for any problems you face, or for new features you'd like to see implemented.

This project is managed under the NetworkX organisation, so the code of conduct of NetworkX applies here as well.

All code in this repository is available under the Berkeley Software Distribution (BSD) 3-Clause License (see LICENSE).

Thank you :)

