
torch_kmeans

PyTorch implementations of KMeans, Soft-KMeans and Constrained-KMeans

torch_kmeans features implementations of the well-known k-means algorithm as well as its soft and constrained variants.

All algorithms are implemented entirely as PyTorch modules and can easily be incorporated into a PyTorch pipeline or model. They therefore support execution on GPU and operate on (mini-)batches of data. Moreover, they provide a scikit-learn-style interface featuring the

model.fit(), model.predict() and model.fit_predict()

functions.

-> view official documentation

Highlights

  • Fully implemented in PyTorch. (PyTorch and NumPy are the only package dependencies!)

  • GPU support like native PyTorch.

  • JIT compiled via TorchScript for the most performance-sensitive parts.

  • Works with mini-batches of samples:
    • each instance can have a different number of clusters.

  • Constrained KMeans works with cluster constraints such as:
    • a maximum number of samples per cluster, or

    • a maximum weight per cluster, where each sample has an associated weight.

  • SoftKMeans is a fully differentiable clustering procedure and can readily be used in a PyTorch neural network model which requires backpropagation.

  • Unit tested against the scikit-learn KMeans implementation.

  • GPU execution enables very fast computation even for large batch sizes or very high-dimensional feature spaces (see the speed comparison).
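The differentiability of the soft variant comes from replacing hard nearest-center assignments with a softmax over (negative) distances, so gradients can flow through the clustering step. The following is a minimal sketch of that principle in plain PyTorch (an illustration only, not the library's internal implementation; the `temperature` parameter and `soft_assign` helper are assumptions for this sketch):

```python
import torch

def soft_assign(x, centers, temperature=5.0):
    """Differentiable soft cluster memberships (illustrative sketch)."""
    # x: (BS, N, D) points, centers: (BS, K, D) cluster centers
    d2 = torch.cdist(x, centers).pow(2)      # (BS, N, K) squared distances
    # softmax over clusters: each row sums to 1 and is differentiable
    return torch.softmax(-temperature * d2, dim=-1)

x = torch.randn(4, 20, 2)                           # (BS, N, D)
centers = torch.randn(4, 3, 2, requires_grad=True)  # (BS, K, D)
w = soft_assign(x, centers)                         # (BS, N, K) memberships
loss = (w * torch.cdist(x, centers).pow(2)).sum()   # soft "inertia" loss
loss.backward()                                     # gradients reach the centers
```

Because every step is a differentiable tensor operation, such a module can sit inside a larger network trained end-to-end with backpropagation, which is what makes SoftKMeans usable inside a PyTorch model.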

Installation

Simply install from PyPI:

pip install torch-kmeans

Usage

Pytorch style usage

import torch
from torch_kmeans import KMeans

model = KMeans(n_clusters=4)

x = torch.randn((4, 20, 2))   # (BS, N, D)
result = model(x)
print(result.labels)

Scikit-learn style usage

import torch
from torch_kmeans import KMeans

model = KMeans(n_clusters=4)

x = torch.randn((4, 20, 2))   # (BS, N, D)
model = model.fit(x)
labels = model.predict(x)
print(labels)

or

import torch
from torch_kmeans import KMeans

model = KMeans(n_clusters=4)

x = torch.randn((4, 20, 2))   # (BS, N, D)
labels = model.fit_predict(x)
print(labels)
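For the constrained variant, the key difference from plain k-means is that the assignment step must respect per-cluster capacities. As a rough illustration of the idea only (a greedy heuristic sketch, not torch_kmeans' actual assignment algorithm; `capped_assign` and `cap` are hypothetical names):

```python
import torch

def capped_assign(x, centers, cap):
    # Assign each point to the nearest center that still has capacity.
    # Greedy heuristic for illustration; assumes cap * K >= N.
    d = torch.cdist(x, centers)            # (N, K) point-to-center distances
    order = d.min(dim=1).values.argsort()  # handle "easiest" points first
    counts = torch.zeros(centers.size(0), dtype=torch.long)
    labels = torch.empty(x.size(0), dtype=torch.long)
    for i in order:
        for k in d[i].argsort():           # try centers from nearest to farthest
            if counts[k] < cap:
                labels[i] = k
                counts[k] += 1
                break
    return labels

x = torch.randn(20, 2)
centers = torch.randn(4, 2)
labels = capped_assign(x, centers, cap=5)  # 20 points, at most 5 per cluster
```

In the library itself, such constraints are configured on ConstrainedKMeans; see the official documentation for the exact parameter names and the weighted-capacity variant.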

Examples

You can find more examples and usage in the detailed example notebooks.

