A fast hierarchical dimensionality reduction algorithm.

Project description

h-NNE: Hierarchical Nearest Neighbor Embedding

Great news, we have a new version of the algorithm 🥳🥳🥳

h-NNE v2

🎉 New in h-NNE v2: a fast, geometry-aware packing layer for the top (coarse) levels that allocates space proportional to cluster size and preserves local anchor relations. This fixes the “dotifying” look (tiny, collapsed clusters) and the “squeezed big cluster” issue that could appear in v1 when many anchors sat too close.

What changed & why

  • Space by cluster mass. Coarse-level anchors are laid out via a lightweight circle-packing step, guided by PCA anchors and 1-NN relations, to produce a compact, overlap-free arrangement. Larger clusters get more area; small ones remain visible (a toy sketch of the packing idea follows this list).

  • Better global framing. A vectorized overlap resolver plus relaxed/capped k-NN edge targets yields tight layouts without blow-ups, reducing unused whitespace and artifacts.

  • Drop-in for h-NNE. v2 runs on the top few FINCH levels, then hands the result to the standard h-NNE refinement. Deeper (fine) levels still use the original fast point-to-anchor updates.

  • Note: On large datasets (>= 1M points), v2 by default starts the tree layout at a FINCH level with a bounded number of clusters (between 10 and 10,000) to improve point spread.
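
To make the packing idea concrete, here is a toy sketch (an illustration only, not the h-NNE v2 implementation; pack_circles is a hypothetical helper): circles with area proportional to cluster mass are seeded at the 2D PCA anchor positions and iteratively pushed apart until no pair overlaps.

import numpy as np

def pack_circles(anchors_2d, cluster_sizes, iterations=200):
    """Toy packing: anchors_2d is (n, 2) PCA positions of cluster anchors,
    cluster_sizes is (n,) point counts per cluster (positions assumed distinct)."""
    pos = np.asarray(anchors_2d, dtype=float).copy()
    # Radius chosen so that circle area is proportional to cluster mass.
    radii = np.sqrt(np.asarray(cluster_sizes, dtype=float) / np.pi)
    for _ in range(iterations):
        moved = False
        for i in range(len(pos)):
            for j in range(i + 1, len(pos)):
                d = pos[j] - pos[i]
                dist = np.linalg.norm(d) + 1e-12
                overlap = radii[i] + radii[j] - dist
                if overlap > 0:  # separate the pair along their center axis
                    shift = (overlap / 2) * d / dist
                    pos[i] -= shift
                    pos[j] += shift
                    moved = True
        if not moved:  # stop early once the layout is overlap-free
            break
    return pos, radii

# Example: five clusters of very different sizes all stay visible, with
# larger clusters claiming proportionally more area.
rng = np.random.default_rng(0)
centers, radii = pack_circles(rng.normal(size=(5, 2)), np.array([1000, 500, 200, 50, 10]))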

Using it

  • v2 is on by default. To use the legacy pipeline:

    HNNE(..., hnne_version="v1")
  • Defaults generally work well. Optional advanced knobs (e.g., start_cluster_view) let you steer the look: choose a deeper level for a more uniform, zoomed-in spread, or a coarser (top) level for a global, zoomed-out view before refinement. A short usage sketch follows.
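
Putting these options together, a minimal usage sketch (the start_cluster_view line is commented out because its accepted values are an assumption here; check the documentation for the exact semantics):

import numpy as np
from hnne import HNNE

data = np.random.random(size=(10_000, 128))

# v2 packing layer is the default.
projection_v2 = HNNE(n_components=2).fit_transform(data)

# Legacy v1 pipeline.
projection_v1 = HNNE(n_components=2, hnne_version="v1").fit_transform(data)

# Assumed usage: steer the starting level of the coarse layout; a deeper
# level gives a more uniform, zoomed-in spread, a coarser one a global view.
# projection = HNNE(n_components=2, start_cluster_view=...).fit_transform(data)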

More technical details (algorithms, complexity, limitations) are in our arXiv paper (linked soon).

h-NNE (overview)

h-NNE is a hierarchical dimensionality reduction method—akin in spirit to t-SNE/UMAP—but designed for speed, simplicity, and structured views. It first builds a clustering hierarchy with FINCH, then embeds cluster anchors level-by-level and maps points to their anchors with simple, vectorized updates. This preserves local neighborhoods while providing a natural coarse→fine “zoom” over the data.
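
As a conceptual numpy sketch of this level-by-level scheme (an illustration of the idea, not the library's internals; coarse_to_fine_sketch and its blending step are assumptions):

import numpy as np

def coarse_to_fine_sketch(level_labels, level_anchors, step=0.5):
    """level_labels: list of (n_points,) integer arrays, coarse -> fine,
    giving each point's cluster id at that level.
    level_anchors: list of (n_clusters, 2) arrays with each level's
    already-embedded 2D anchor positions."""
    # Start every point at its coarsest anchor's position...
    emb = level_anchors[0][level_labels[0]].astype(float)
    for labels, anchors in zip(level_labels[1:], level_anchors[1:]):
        # ...then pull it toward its progressively finer anchors with a
        # simple vectorized point-to-anchor update.
        emb = (1 - step) * emb + step * anchors[labels]
    return emb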

Key properties

  • Fast & scalable. No global nonconvex optimization; point updates are vectorized and memory-friendly.

  • Structure-aware zoom. Choose a parent level for global context, then expand children for detail.

  • Labels included. FINCH provides cluster labels out of the box—useful for unlabeled data and structured visualization (see the access sketch after this list).

  • Parameter-light. FINCH is parameter-free; h-NNE uses robust defaults.
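
For instance, the hierarchy labels come for free after fitting. A short sketch, assuming the fitted model exposes them via a hierarchy_parameters attribute as in the "Clustering for Free" demo (the attribute name is an assumption; verify against the documentation):

import numpy as np
from hnne import HNNE

data = np.random.random(size=(1000, 64))
hnne = HNNE()
projection = hnne.fit_transform(data)

# Assumed attribute, based on the "Clustering for Free" demo: per-point
# FINCH labels with one column per hierarchy level.
partitions = hnne.hierarchy_parameters.partitions
print(partitions.shape)  # expected (n_points, n_levels) if the assumption holds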

See our corresponding CVPR 2022 paper for the original algorithm and the arXiv addendum for the v2 packing layer.

M. Saquib Sarfraz*, Marios Koulakis*, Constantin Seibold, Rainer Stiefelhagen. Hierarchical Nearest Neighbor Graph Embedding for Efficient Dimensionality Reduction. CVPR 2022.

More details are available in the project documentation.

Installation

The project is available on PyPI. To install, run:

pip install hnne

How to use h-NNE

The HNNE class implements the standard scikit-learn estimator interface (fit, fit_transform, transform).

Simple projection example

Below, a 256-dimensional dataset is projected to 2 dimensions.

import numpy as np
from hnne import HNNE

data = np.random.random(size=(1000, 256))

hnne = HNNE(n_components=2)
projection = hnne.fit_transform(data)

Projecting on new points

Once a dataset has been projected, one can apply transform to map new points into the same embedding space.

new_data = np.random.random(size=(100, 256))  # same dimensionality as `data`

hnne = HNNE()
projection = hnne.fit_transform(data)

new_data_projection = hnne.transform(new_data)

Demos

The following demo notebooks are available:

  1. Basic Usage

  2. Multiple Projections

  3. Clustering for Free

  4. Monitor Quality of Network Embeddings

Citation

If you make use of this project in your work, please cite the h-NNE paper:

@inproceedings{hnne,
  title = {Hierarchical Nearest Neighbor Graph Embedding for Efficient Dimensionality Reduction},
  author = {M. Saquib Sarfraz and Marios Koulakis and Constantin Seibold and Rainer Stiefelhagen},
  booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  year = {2022}
}

If you make use of the clustering properties of the algorithm, please also cite:

@inproceedings{finch,
  author = {M. Saquib Sarfraz and Vivek Sharma and Rainer Stiefelhagen},
  title = {Efficient Parameter-free Clustering Using First Neighbor Relations},
  booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  pages = {8934--8943},
  year = {2019}
}

Contributing

Contributions are very welcome :-) Please check the contributions guide for more details.
