
PyTorch implementations of KMeans, Soft-KMeans and Constrained-KMeans which can be run on GPU and work on (mini-)batches of data.

Project description


torch_kmeans

PyTorch implementations of KMeans, Soft-KMeans and Constrained-KMeans

torch_kmeans features implementations of the well-known k-means algorithm as well as its soft and constrained variants.

All algorithms are completely implemented as PyTorch modules and can easily be incorporated into a PyTorch pipeline or model. As a result, they support execution on GPU and work on (mini-)batches of data. Moreover, they provide a scikit-learn style interface featuring

model.fit(), model.predict() and model.fit_predict()

functions.
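To illustrate what batched operation means, here is a minimal sketch of one batched k-means assignment step in plain PyTorch (illustrative only, not the library's actual code): every tensor carries a leading batch dimension, so a single call handles all instances at once, on GPU whenever the tensors live there.

```python
import torch

def assign_labels(x: torch.Tensor, centers: torch.Tensor) -> torch.Tensor:
    """One batched k-means assignment step (illustrative sketch).

    x:       (BS, N, D) mini-batch of instances
    centers: (BS, K, D) current cluster centers per instance
    returns: (BS, N)    index of the nearest center for each sample
    """
    dist = torch.cdist(x, centers)   # (BS, N, K) pairwise distances
    return dist.argmin(dim=-1)       # nearest center per sample

x = torch.randn(4, 20, 2)
centers = torch.randn(4, 3, 2)
labels = assign_labels(x, centers)
print(labels.shape)  # torch.Size([4, 20])
```

Because the step is a single `cdist` plus `argmin`, it parallelizes over the whole batch exactly like any other PyTorch tensor operation.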

-> view official documentation

Highlights

  • Fully implemented in PyTorch. (PyTorch and NumPy are the only package dependencies!)

  • GPU support like native PyTorch.

  • The most performance-sensitive parts are JIT-compiled via TorchScript.

  • Works with mini-batches of samples:
    • each instance can have a different number of clusters.

  • Constrained KMeans works with cluster constraints like:
    • a maximum number of samples per cluster, or

    • a maximum weight per cluster, where each sample has an associated weight.

  • SoftKMeans is a fully differentiable clustering procedure and can readily be used in a PyTorch neural network model which requires backpropagation.

  • Unit tested against the scikit-learn KMeans implementation.

  • GPU execution enables very fast computation even for large batch sizes or very high-dimensional feature spaces (see the speed comparison).
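A max-samples-per-cluster constraint of the kind listed above can be enforced with a simple greedy assignment. The following is a minimal sketch in plain PyTorch for a single instance (illustrative only; it is not the library's algorithm):

```python
import torch

def capacity_assign(x: torch.Tensor, centers: torch.Tensor,
                    max_per_cluster: int) -> torch.Tensor:
    """Greedy capacity-constrained assignment (illustrative sketch).

    x: (N, D) samples, centers: (K, D) cluster centers.
    Samples with the smallest best-center distance are assigned first;
    each goes to the nearest cluster that still has free capacity.
    """
    dist = torch.cdist(x, centers)                  # (N, K)
    order = dist.min(dim=1).values.argsort()        # most confident samples first
    counts = torch.zeros(centers.size(0), dtype=torch.long)
    labels = torch.empty(x.size(0), dtype=torch.long)
    for i in order.tolist():
        for k in dist[i].argsort().tolist():        # clusters ordered by distance
            if counts[k] < max_per_cluster:
                labels[i] = k
                counts[k] += 1
                break
    return labels

x = torch.randn(12, 2)
centers = torch.randn(3, 2)
labels = capacity_assign(x, centers, max_per_cluster=4)
print(torch.bincount(labels, minlength=3))  # every count is <= 4
```

The weight-per-cluster variant follows the same pattern, with `counts` replaced by accumulated sample weights checked against the cluster's weight budget.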

Installation

Simply install from PyPI:

pip install torch-kmeans

Usage

PyTorch style usage

import torch
from torch_kmeans import KMeans

model = KMeans(n_clusters=4)

x = torch.randn((4, 20, 2))   # (BS, N, D)
result = model(x)
print(result.labels)

Scikit-learn style usage

import torch
from torch_kmeans import KMeans

model = KMeans(n_clusters=4)

x = torch.randn((4, 20, 2))   # (BS, N, D)
model = model.fit(x)
labels = model.predict(x)
print(labels)

or

import torch
from torch_kmeans import KMeans

model = KMeans(n_clusters=4)

x = torch.randn((4, 20, 2))   # (BS, N, D)
labels = model.fit_predict(x)
print(labels)
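The differentiable SoftKMeans variant mentioned in the highlights replaces the hard argmin with a softmax over negative (scaled) distances. A self-contained sketch of one soft update step in plain PyTorch (illustrative only; the library's implementation differs in detail, and `temp` is a hypothetical temperature parameter chosen here for the sketch):

```python
import torch

def soft_kmeans_step(x: torch.Tensor, centers: torch.Tensor, temp: float = 5.0):
    """One soft k-means update (illustrative sketch).

    x: (BS, N, D), centers: (BS, K, D). The softmax over negative
    scaled distances keeps the whole step differentiable.
    """
    dist = torch.cdist(x, centers)                    # (BS, N, K)
    assign = torch.softmax(-temp * dist, dim=-1)      # soft assignments, rows sum to 1
    # soft (assignment-weighted mean) center update
    new_centers = (assign.transpose(1, 2) @ x) / assign.sum(dim=1).unsqueeze(-1)
    return assign, new_centers

x = torch.randn(4, 20, 2, requires_grad=True)
centers = torch.randn(4, 3, 2)
assign, new_centers = soft_kmeans_step(x, centers)
new_centers.sum().backward()    # gradients flow back through the clustering step
print(x.grad is not None)       # True
```

Because every operation in the step is differentiable, such a procedure can sit inside a larger network and be trained end-to-end with backpropagation, which is what makes the SoftKMeans module usable inside a PyTorch model.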

Examples

You can find more examples and usage in the detailed example notebooks.

Project details


Download files

Download the file for your platform.

Source Distribution

torch_kmeans-0.2.0.tar.gz (74.5 kB)

Uploaded Source

Built Distribution

torch_kmeans-0.2.0-py3-none-any.whl (24.6 kB)

Uploaded Python 3

File details

Details for the file torch_kmeans-0.2.0.tar.gz.

File metadata

  • Download URL: torch_kmeans-0.2.0.tar.gz
  • Upload date:
  • Size: 74.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.0 CPython/3.9.12

File hashes

Hashes for torch_kmeans-0.2.0.tar.gz

  • SHA256: 1a07614ca499147110a58135155ab27b06f1ee9b0bb32d0f74ddadf03788da29
  • MD5: daa6fb762d3e542f4846725aac567c5c
  • BLAKE2b-256: 4053025447c9c725f2f83d194606ee8c6867a71975d7e393697956e2907521a9


File details

Details for the file torch_kmeans-0.2.0-py3-none-any.whl.

File hashes

Hashes for torch_kmeans-0.2.0-py3-none-any.whl

  • SHA256: 6b6140d08b2b2ff5db9b5f8501a2d37d6f90197c2aa804f94545db3f8474a2ca
  • MD5: 7319eb44b89c70c0526c89d1e54fc41d
  • BLAKE2b-256: bc210748ecc11e7dfa741178a373d8814841eba8cf81fcf8f0f3dd5a56c9a91e

