Graph Partitioning Algorithms
Project description
Graph Partitioning
Graph Partitioning is an age-old problem with applications in various domains, such as routing vehicles for delivery and finding the right targets for immunization to control a pandemic. Graph Partitioning involves dividing a graph's vertices into roughly equal-sized subsets such that the total cost of the edges spanning different subsets is kept as small as possible. In this package we have implemented three major algorithms:
Graph Convolution Networks (GCN)
Graph Convolutional Networks apply neural networks to structured graphs. Graph convolutions are generalisations of ordinary convolutions and are easier to apply in the spectral domain. A GCN can use both graph structure and node feature information. This Python implementation is largely inspired by the paper written by Thomas N. Kipf and Max Welling. Paper
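For intuition, here is a minimal NumPy sketch of a single graph-convolution layer following the symmetric-normalisation propagation rule from the Kipf and Welling paper. The function name gcn_layer and the toy inputs are illustrative assumptions, not part of this package's API.
import numpy as np

def gcn_layer(A, H, W):
    # One propagation step: ReLU(D^-1/2 (A + I) D^-1/2 H W)
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    A_norm = d_inv_sqrt @ A_hat @ d_inv_sqrt  # symmetric normalisation
    return np.maximum(A_norm @ H @ W, 0.0)    # linear transform + ReLU

# Toy example: 4 nodes on a path graph, 3 input features, 2 hidden units
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = np.random.rand(4, 3)
W = np.random.rand(3, 2)
print(gcn_layer(A, H, W).shape)  # (4, 2)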
Spectral Clustering
The spectral clustering method is defined for general weighted graphs; it identifies K clusters using the eigenvectors of a matrix derived from the graph, typically the graph Laplacian.
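As a rough sketch of the idea (not the package's internal code), the snippet below embeds the nodes using the smallest eigenvectors of the unnormalised graph Laplacian and then runs k-means on that embedding; the function spectral_clusters and the use of SciPy's kmeans2 are assumptions made for illustration.
import numpy as np
from scipy.linalg import eigh
from scipy.cluster.vq import kmeans2

def spectral_clusters(W, k):
    # W: symmetric affinity/weight matrix of the graph, k: number of clusters
    D = np.diag(W.sum(axis=1))
    L = D - W                    # unnormalised graph Laplacian
    _, vecs = eigh(L)            # eigenvectors in ascending eigenvalue order
    embedding = vecs[:, :k]      # first k eigenvectors as node coordinates
    _, labels = kmeans2(embedding, k, minit="++")
    return labels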
Constrained K-Means Clustering
K-means clustering implementation whereby a minimum and/or maximum size for each cluster can be specified.
This K-means implementation modifies the cluster-assignment step (the E step of EM) by formulating it as a Minimum Cost Flow (MCF) linear network optimisation problem. The MCF problem is then solved with a cost-scaling push-relabel algorithm using SimpleMinCostFlow from Google's OR-Tools, a fast C++ implementation.
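The sketch below shows how such an assignment step can be posed as a min-cost-flow problem with OR-Tools' SimpleMinCostFlow, here using the classic pywrapgraph interface (newer OR-Tools releases expose the same solver under ortools.graph.python.min_cost_flow). The function assign_points, the node layout, and the integer cost scaling are illustrative assumptions, not the package's actual internals.
import numpy as np
from ortools.graph import pywrapgraph

def assign_points(costs, min_size, max_size):
    # costs: (n_points, k) point-to-centroid distances; returns a cluster label per point
    n, k = costs.shape
    sink = n + k
    scaled = (np.asarray(costs) * 1000).astype(int)   # the solver needs integer costs
    mcf = pywrapgraph.SimpleMinCostFlow()
    for i in range(n):
        mcf.SetNodeSupply(i, 1)                       # each point ships one unit of flow
        for j in range(k):
            mcf.AddArcWithCapacityAndUnitCost(i, n + j, 1, int(scaled[i, j]))
    for j in range(k):
        mcf.SetNodeSupply(n + j, -min_size)           # each cluster must absorb >= min_size units
        mcf.AddArcWithCapacityAndUnitCost(n + j, sink, max_size - min_size, 0)
    mcf.SetNodeSupply(sink, -(n - k * min_size))      # sink balances the remaining supply
    assert mcf.Solve() == mcf.OPTIMAL
    labels = np.empty(n, dtype=int)
    for arc in range(mcf.NumArcs()):
        if mcf.Tail(arc) < n and mcf.Flow(arc) > 0:
            labels[mcf.Tail(arc)] = mcf.Head(arc) - n
    return labels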
Installation
You can install graph-partition from PyPI:
pip install graph-partition
How to Use
Primarily, there are three major algorithms:
- Graph Convolutional Neural Network
- Spectral Clustering
- Constrained K-Means Clustering
Using the Graph Convolutional Network
import urllib.request
from scipy.spatial import distance_matrix
from graph_partition import *
# Artificial test Data
url = "https://cs.joensuu.fi/sipu/datasets/s1.txt"
data = urllib.request.urlopen(url)
ds = []
for line in data:
    ds.append([float(x) for x in line.strip().split()])
# Calculating the Cost Matrix
cost_matrix = distance_matrix(ds[:50], ds[:50])
# Defining the GCN Model
gcn_model = GraphConvolutionNetwork(
    cost_matrix=cost_matrix,
    num_class=2,
    hidden_layers=10
)
# GCN fit and predict
gcn_model.fit()
# Printing the cluster label
print(gcn_model.cluster_label)
# Printing the cluster evaluation metrics
print(gcn_model.evaluation_metric)
Using Spectral Clustering
import urllib.request
from scipy.spatial import distance_matrix
from graph_partition import *
# Artificial test Data
url = "https://cs.joensuu.fi/sipu/datasets/s1.txt"
data = urllib.request.urlopen(url)
ds = []
for line in data:
    ds.append([float(x) for x in line.strip().split()])
# Calculating the Cost Matrix
cost_matrix = distance_matrix(ds[:50], ds[:50])
# Defining Spectral Clustering Model
sc_model = SpectralClustering(cost_matrix=cost_matrix, num_class=2)
# Printing the cluster labels and evaluation metrics
print(sc_model.cluster_label)
print(sc_model.evaluation_metric)
Using Constrained K-Means Clustering
import urllib.request
from graph_partition import *
# Artificial test Data
url = "https://cs.joensuu.fi/sipu/datasets/s1.txt"
data = urllib.request.urlopen(url)
ds = []
for line in data:
    ds.append([float(x) for x in line.strip().split()])
# Defining the Constrained K-Means Model
k_means_model = ConstrainedKMeans(
    design_matrix=ds[:50],
    num_class=2,
    evaluate_metric=True
)
# Printing the cluster labels and evaluation metrics
print(k_means_model.cluster_label)
print(k_means_model.evaluation_metric)
Support
For any queries, please contact us via email.
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file graph_partition-0.1.16.tar.gz.
File metadata
- Download URL: graph_partition-0.1.16.tar.gz
- Upload date:
- Size: 22.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.1.14 CPython/3.9.13 Darwin/21.6.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | a872250cb7f78c5ab0eb5e8430a7b10364907dfabfb713ce036863ad16617528
MD5 | 13e77fe26c52a0655fa50e1e14fe1105
BLAKE2b-256 | f8a1f0f1a5d4e767b7647035a10e10923ea0677b20fb66fce1c409256ee5cd33
File details
Details for the file graph_partition-0.1.16-py3-none-any.whl.
File metadata
- Download URL: graph_partition-0.1.16-py3-none-any.whl
- Upload date:
- Size: 15.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.1.14 CPython/3.9.13 Darwin/21.6.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | a964444f7fdaba0c7b737249ef17976ae2104ac7af00723b0beeeaa97c71dfbc
MD5 | c90ea51a5c4773858b9bfcee5d14839d
BLAKE2b-256 | ea8c7eaeac96f282d0e26e659136d83e22010bd7bd06f4c116a8e8b161a0d42f