
An Explainable Deep Network for Dimension Reduction (EVNet)

Project description

DMT: An Explainable Deep Network for Dimension Reduction

The code includes the following modules:

  • Training
  • Inference
  • Comparison with t-SNE, UMAP and PCA

Requirements

  • torch>=2.3.1
  • torchaudio>=2.3.1
  • torchvision>=0.18.1
  • pytorch-lightning==2.4.0

Installation

Create a new conda environment and install torch, torchvision, torchaudio:

conda create -n DMT python=3.10
conda activate DMT
pip install torch==2.3.1 torchvision==0.18.1 torchaudio==2.3.1 --index-url https://download.pytorch.org/whl/cu121
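
To confirm the environment before installing DMT, a minimal check of the torch installation (the CUDA query is only meaningful if you installed the cu121 wheels and have a usable GPU):

# Optional sanity check of the torch installation.
import torch

print(torch.__version__)          # expected: 2.3.1 or newer
print(torch.cuda.is_available())  # True if a usable GPU was detected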

Then you can install the package from source or from PyPI. Install from source:

pip install -e git+https://github.com/Westlake-AI/DMT-learn.git#egg=dmt-learn

Install from PyPI:

pip install dmt-learn
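
As a quick sanity check that the installation succeeded, the package should import cleanly (this assumes the import path from dmt import DMT used in the examples below):

# Minimal import check after installation.
from dmt import DMT

print(DMT)  # prints the DMT class if the package was installed correctly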

Running the code

Use the following code to fit the model to the dataset and visualize the results.

import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from dmt import DMT

# Load sample dataset
iris = load_iris()
X = iris.data
y = iris.target

# Fit DMT and embed the data
dmt = DMT(num_fea_aim=100)
X_dmt = dmt.fit_transform(X)

# Plot the result
plt.figure(figsize=(8, 6))
scatter = plt.scatter(X_dmt[:, 0], X_dmt[:, 1], c=y, cmap='viridis')

# Create legend
legend1 = plt.legend(*scatter.legend_elements(), title="Classes")
plt.gca().add_artist(legend1)  # Add the legend to the current axes

plt.title('DMT visualization of Iris dataset')
plt.xlabel('DMT Component 1')
plt.ylabel('DMT Component 2')
plt.savefig('dmt.png')

You can also separate the training and inference steps:

dmt.fit(X)
X_dmt = dmt.transform(X)
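
This is useful, for example, when embedding held-out samples with a model trained on a separate split. The sketch below only reuses the fit/transform calls shown above; it assumes that transform accepts unseen data with the same feature dimensionality and returns a 2-D embedding, as in the scatter plot earlier.

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from dmt import DMT

# Hold out part of the data so transform() sees samples not used during fit().
X_train, X_test = train_test_split(load_iris().data, test_size=0.2, random_state=0)

dmt = DMT(num_fea_aim=100)
dmt.fit(X_train)                    # train the embedding network once
X_test_dmt = dmt.transform(X_test)  # embed held-out samples without retraining
print(X_test_dmt.shape)             # assumed shape: (n_test_samples, 2)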

If you want to compare the results with other dimension reduction methods (t-SNE, UMAP), you can use the following code:

dmt.compare(X, "comparison.png")
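
For instance, the same call can be run on a larger dataset such as scikit-learn's digits. This is a sketch that relies only on the compare(X, path) signature shown above; the exact layout of the generated figure is up to the package.

from sklearn.datasets import load_digits
from dmt import DMT

digits = load_digits()                      # 1,797 samples, 64 features
dmt = DMT(num_fea_aim=100)
dmt.compare(digits.data, "comparison.png")  # writes the comparison figure to disk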

Cite the paper

@article{zang2023evnet,
  title={EVNet: An Explainable Deep Network for Dimension Reduction},
  author={Zang, Zelin and Cheng, Shenghui and Lu, Linyan and Xia, Hanchen and Li, Liangyu and Sun, Yaoting and Xu, Yongjie and Shang, Lei and Sun, Baigui and Li, Stan Z},
  journal={IEEE Transactions on Visualization and Computer Graphics},
  year={2023},
  publisher={IEEE}
}

License

DMT is released under the MIT license.
