(Personalized) Page-Rank computation using PyTorch
torch-ppr
This package allows calculating page-rank and personalized page-rank via power iteration with PyTorch, which also supports calculation on GPU (or other accelerators).
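For intuition, page-rank via power iteration repeatedly applies x ← α·A·x + (1 − α)/n until the scores stop changing, where A is a column-normalized adjacency matrix, α is a damping factor, and n is the number of nodes. Below is a minimal, self-contained sketch of that iteration in plain PyTorch. It is only meant to illustrate the idea; the function name, the dense adjacency construction, and the damping value 0.85 are assumptions and do not describe this package's actual implementation or defaults.

```python
import torch


def power_iteration_page_rank(edge_index, num_nodes, alpha=0.85, num_iterations=100, tol=1e-8):
    """Toy power-iteration PageRank (illustration only, not torch-ppr's implementation)."""
    # Dense adjacency with A[j, i] = 1 for every directed edge i -> j ...
    src, dst = edge_index
    adj = torch.zeros(num_nodes, num_nodes)
    adj[dst, src] = 1.0
    # ... column-normalized so that each column sums to (at most) one.
    adj = adj / adj.sum(dim=0, keepdim=True).clamp_min(1.0)

    # Start from the uniform distribution and iterate x <- alpha * A x + (1 - alpha) / n.
    x = torch.full((num_nodes,), 1.0 / num_nodes)
    for _ in range(num_iterations):
        x_next = alpha * (adj @ x) + (1.0 - alpha) / num_nodes
        if torch.allclose(x, x_next, atol=tol):
            break
        x = x_next
    return x
```

The package exposes this kind of computation through the page_rank and personalized_page_rank functions shown below, so you do not have to write the iteration yourself.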
💪 Getting Started
As a simple example, consider this graph with five nodes, whose edge list is given as

>>> import torch
>>> from torch_ppr import page_rank, personalized_page_rank
>>> edge_index = torch.as_tensor(data=[(0, 1), (1, 2), (1, 3), (2, 4)]).t()
We can use

>>> page_rank(edge_index=edge_index)
tensor([0.1269, 0.3694, 0.2486, 0.1269, 0.1281])

to calculate the page rank, i.e., a measure of global importance. We notice that the central node 1 receives the largest importance score, followed by node 2, which connects the rest of the graph to node 4, while the remaining nodes receive roughly equal, smaller scores.
We can also calculate personalized page rank, which measures importance from the perspective of a single node. For instance, for node 2, we have

>>> personalized_page_rank(edge_index=edge_index, indices=[2])
tensor([[0.1103, 0.3484, 0.2922, 0.1103, 0.1388]])
By virtue of using PyTorch, the code seamlessly works on GPUs, too, and supports auto-grad differentiation. Moreover, the calculation of personalized page rank supports automatic batch size optimization via torch_max_mem.
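Concretely, a GPU run could look like the following sketch. It assumes that a CUDA device is available and that the computation follows the device of the edge_index tensor you pass in; check the package documentation for the exact device handling.

```python
import torch
from torch_ppr import page_rank, personalized_page_rank

# Sketch: assumes the computation runs on whatever device edge_index lives on.
edge_index = torch.as_tensor(data=[(0, 1), (1, 2), (1, 3), (2, 4)]).t()
if torch.cuda.is_available():
    edge_index = edge_index.to("cuda")

# Global page rank for the whole graph.
scores = page_rank(edge_index=edge_index)

# Personalized page rank for several source nodes at once; this is the batched
# computation where torch_max_mem can tune the batch size automatically.
ppr = personalized_page_rank(edge_index=edge_index, indices=[0, 1, 2])
```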
🚀 Installation
The most recent code and data can be installed directly from GitHub with:
$ pip install git+https://github.com/mberr/torch-ppr.git
👐 Contributing
Contributions, whether filing an issue, making a pull request, or forking, are appreciated. See CONTRIBUTING.md for more information on getting involved.
👋 Attribution
⚖️ License
The code in this package is licensed under the MIT License.
🍪 Cookiecutter
This package was created with @audreyfeldroy's cookiecutter package using @cthoyt's cookiecutter-snekpack template.
🛠️ For Developers
The final section of the README is for you if you want to get involved by making a code contribution.
Development Installation
To install in development mode, use the following:
$ git clone https://github.com/mberr/torch-ppr.git
$ cd torch-ppr
$ pip install -e .
🥼 Testing
After cloning the repository and installing tox with pip install tox, the unit tests in the tests/ folder can be run reproducibly with:
$ tox
Additionally, these tests are automatically re-run with each commit in a GitHub Action.
📖 Building the Documentation
The documentation can be built locally using the following:
$ git clone https://github.com/mberr/torch-ppr.git
$ cd torch-ppr
$ tox -e docs
$ open docs/build/html/index.html
The documentation automatically installs the package as well as the docs extra specified in the setup.cfg. sphinx plugins like texext can be added there. Additionally, they need to be added to the extensions list in docs/source/conf.py.
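For instance, registering a plugin in docs/source/conf.py would look roughly like this (a sketch; the repository's actual extension list may differ):

```python
# docs/source/conf.py (illustrative sketch, not the repository's actual list)
extensions = [
    "sphinx.ext.autodoc",  # ships with sphinx
    "texext",              # third-party plugin; also add it to the docs extra in setup.cfg
]
```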
📦 Making a Release
After installing the package in development mode and installing tox with pip install tox, the commands for making a new release are contained within the finish environment in tox.ini. Run the following from the shell:
$ tox -e finish
This script does the following:
- Uses Bump2Version to switch the version number in the setup.cfg, src/torch_ppr/version.py, and docs/source/conf.py to not have the -dev suffix
- Packages the code in both a tar archive and a wheel using build
- Uploads to PyPI using twine. Be sure to have a .pypirc file configured to avoid the need for manual input at this step
- Pushes to GitHub. You'll need to make a release going with the commit where the version was bumped.
- Bumps the version to the next patch. If you made big changes and want to bump the version by minor, you can use tox -e bumpversion minor after.
File details
Details for the file torch_ppr-0.0.1.tar.gz.
File metadata
- Download URL: torch_ppr-0.0.1.tar.gz
- Upload date:
- Size: 21.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.0 CPython/3.8.10
File hashes
Algorithm | Hash digest
---|---
SHA256 | d48e3e8ea31709a45e8124e9d4795e3ee4296ad0f5f08050ef1c424631166b44
MD5 | 0f565d35370d0210fca23e2ea8a80c54
BLAKE2b-256 | 763cb0d09b895980075c4f39b577be87bc971bbad7f6417b9db1786a43a53566
File details
Details for the file torch_ppr-0.0.1-py3-none-any.whl.
File metadata
- Download URL: torch_ppr-0.0.1-py3-none-any.whl
- Upload date:
- Size: 11.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.0 CPython/3.8.10
File hashes
Algorithm | Hash digest
---|---
SHA256 | cb1b9b3fcda940eb653f410090ad4f3fe383b73848c3e2cff3ae554abb3dad58
MD5 | 6c9199f4e28c9d44441e8a52828d46c2
BLAKE2b-256 | c72e2524e9f4465e89bf5ce4a8c76b44acd7105fed8ee87daa618cd733f2a63c