
torch-ppr

(Personalized) PageRank computation using PyTorch


This package computes PageRank and personalized PageRank via power iteration with PyTorch, so the computation can also run on a GPU (or other accelerators).
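
To make the power-iteration idea concrete, here is a minimal dense sketch of the computation. It is purely illustrative, not the package's actual implementation: the function name, the default values (e.g. alpha=0.15), and the dense adjacency matrix are assumptions of this sketch.

import torch

def page_rank_sketch(edge_index, alpha=0.15, max_iter=100, eps=1.0e-06):
    """Illustrative dense PageRank power iteration (hypothetical helper)."""
    n = int(edge_index.max()) + 1
    # build a dense, symmetric adjacency matrix from the edge list
    adj = torch.zeros(n, n)
    adj[edge_index[0], edge_index[1]] = 1.0
    adj = adj + adj.t()
    # column-normalize so that each column sums to one
    adj = adj / adj.sum(dim=0, keepdim=True).clamp_min(1.0)
    x = torch.full((n,), 1.0 / n)         # uniform starting vector
    teleport = torch.full((n,), 1.0 / n)  # uniform teleportation distribution
    for _ in range(max_iter):
        x_next = (1.0 - alpha) * (adj @ x) + alpha * teleport
        if (x_next - x).abs().sum() < eps:  # L1 convergence criterion
            return x_next
        x = x_next
    return x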

💪 Getting Started

As a simple example, consider a small graph with five nodes.

Its edge list is given as

>>> import torch
>>> edge_index = torch.as_tensor(data=[(0, 1), (1, 2), (1, 3), (2, 4)]).t()

We can use

>>> from torch_ppr import page_rank
>>> page_rank(edge_index=edge_index)
tensor([0.1269, 0.3694, 0.2486, 0.1269, 0.1281])

to calculate the PageRank, i.e., a measure of global importance. We notice that the central node 1 receives the largest importance score, while all other nodes have lower importance. Moreover, the two structurally indistinguishable nodes 0 and 3 receive the same PageRank.
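
Since PageRank corresponds to the stationary distribution of a random walk with restarts, the scores form a probability distribution and should sum to one, up to numerical precision. A quick sanity check:

>>> scores = page_rank(edge_index=edge_index)
>>> bool(torch.isclose(scores.sum(), torch.ones(1)))
True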

We can also calculate the personalized PageRank, which measures importance from the perspective of a single node. For instance, for node 2, we have

>>> from torch_ppr import personalized_page_rank
>>> personalized_page_rank(edge_index=edge_index, indices=[2])
tensor([[0.1103, 0.3484, 0.2922, 0.1103, 0.1388]])

Again, the most important node is the central node 1. Nodes 0 and 3 receive the same importance value, which is below that of node 4, a direct neighbor of the source node 2.
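
The indices argument need not be limited to a single node: judging from the two-dimensional result above, passing several indices should stack one personalized PageRank vector per requested node along the first dimension. An illustrative (unverified) call:

>>> personalized_page_rank(edge_index=edge_index, indices=[0, 1, 2, 3, 4]).shape
torch.Size([5, 5])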

By virtue of using PyTorch, the code works seamlessly on GPUs, too, and supports automatic differentiation. Moreover, the calculation of personalized PageRank supports automatic batch size optimization via torch_max_mem.
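
As a sketch of what GPU usage could look like, assuming the computation follows the device of its input tensors (the package may also expose an explicit device argument; consult the API documentation):

>>> # hypothetical usage: run on a GPU when one is available
>>> device = "cuda" if torch.cuda.is_available() else "cpu"
>>> scores = page_rank(edge_index=edge_index.to(device))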

🚀 Installation

The most recent release can be installed from PyPI with:

$ pip install torch_ppr

The most recent code and data can be installed directly from GitHub with:

$ pip install git+https://github.com/mberr/torch-ppr.git

👐 Contributing

Contributions, whether filing an issue, making a pull request, or forking, are appreciated. See CONTRIBUTING.md for more information on getting involved.

👋 Attribution

⚖️ License

The code in this package is licensed under the MIT License.

🍪 Cookiecutter

This package was created with @audreyfeldroy's cookiecutter package using @cthoyt's cookiecutter-snekpack template.

🛠️ For Developers

This final section of the README is for those who want to get involved by making a code contribution.

Development Installation

To install in development mode, use the following:

$ git clone https://github.com/mberr/torch-ppr.git
$ cd torch-ppr
$ pip install -e .

🥼 Testing

After cloning the repository and installing tox with pip install tox, the unit tests in the tests/ folder can be run reproducibly with:

$ tox

Additionally, these tests are automatically re-run with each commit in a GitHub Action.

📖 Building the Documentation

The documentation can be built locally using the following:

$ git clone https://github.com/mberr/torch-ppr.git
$ cd torch-ppr
$ tox -e docs
$ open docs/build/html/index.html

Building the documentation automatically installs the package as well as the docs extra specified in setup.cfg. Sphinx plugins like texext can be added there. Additionally, they need to be added to the extensions list in docs/source/conf.py.

📦 Making a Release

After installing the package in development mode and installing tox with pip install tox, the commands for making a new release are contained within the finish environment in tox.ini. Run the following from the shell:

$ tox -e finish

This script does the following:

  1. Uses Bump2Version to switch the version number in setup.cfg, src/torch_ppr/version.py, and docs/source/conf.py to remove the -dev suffix
  2. Packages the code in both a tar archive and a wheel using build
  3. Uploads to PyPI using twine. Be sure to have a .pypirc file configured to avoid the need for manual input at this step
  4. Pushes to GitHub. You'll need to create a release for the commit where the version was bumped.
  5. Bumps the version to the next patch. If you made big changes and want to bump the version by a minor increment, you can run tox -e bumpversion minor afterwards.
