Neural NMF

This package is an implementation of Neural NMF, a method for detecting latent hierarchical structure in data based on non-negative matrix factorization, as presented in the paper "Neural Nonnegative Matrix Factorization for Hierarchical Multilayer Topic Modeling" by T. Will, R. Zhang, E. Sadovnik, M. Gao, J. Vendrow, J. Haddock, D. Molitor, and D. Needell (2020).

Neural NMF solves a hierarchical nonnegative matrix factorization problem by representing it with a neural network architecture and applying backpropagation methods. In the unsupervised case, Neural NMF applies backpropagation directly to the given loss function (usually either Energy Loss or Reconstruction Loss). In the supervised case, Neural NMF adds a linear layer to the last S matrix to estimate the given labels.
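
For intuition, a two-layer hierarchical NMF first factors the data X ≈ A0 S0 and then factors S0 ≈ A1 S1, so that X ≈ A0 A1 S1 overall. Below is a minimal, self-contained sketch of the two loss functions named above as they are commonly written for this two-layer case; it illustrates the objective under those standard definitions, not the package's internal implementation, and the exact weighting of the terms may differ.

>>> import torch
>>> X = torch.rand(100, 20)                            # nonnegative data matrix
>>> A0, S0 = torch.rand(100, 10), torch.rand(10, 20)   # first-layer factors
>>> A1, S1 = torch.rand(10, 5), torch.rand(5, 20)      # second-layer factors
>>> recon_loss = torch.norm(X - A0 @ A1 @ S1)**2       # Reconstruction Loss: ||X - A0 A1 S1||_F^2
>>> energy_loss = torch.norm(X - A0 @ S0)**2 + torch.norm(S0 - A1 @ S1)**2  # Energy Loss: per-layer residuals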


Installation

To install Neural NMF, run this command in your terminal:

$ pip install NeuralNMF

This is the preferred method to install Neural NMF, as it will always install the most recent stable release.

If you don't have pip installed, the pip installation guide can walk you through the process.

Usage

Quick Start

To use Neural NMF, we first initialize our neural network with the layer sizes and, if applicable, the number of classes (a supervised sketch follows the example below). We give the layer sizes as a list, where the first element is the first dimension of the input matrix (the number of rows, X.shape[0] below) and each subsequent element is the rank of the approximation at the corresponding layer.

>>> import torch
>>> from NeuralNMF import Neural_NMF
>>> X = 10 * torch.mm(torch.randn(100, 5), torch.randn(5, 20))  # produce random low-rank (rank-5) data
>>> m, k1, k2 = X.shape[0], 10, 5
>>> net = Neural_NMF([m, k1, k2])
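
If labels are available, the supervised variant described above attaches a linear layer to the last S matrix. The sketch below is a hedged example of initializing such a network; it assumes the number of classes can be passed as a second argument to Neural_NMF, so check the class documentation for the exact parameter name.

>>> num_classes = 3                                        # hypothetical number of label classes
>>> net_supervised = Neural_NMF([m, k1, k2], num_classes)  # assumed signature: layer sizes plus class count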

Once we have initialized our network, we train it using the train function (see the documentation in train.py for details on each optional parameter).

>>> from NeuralNMF import train
>>> history = train(net, X, epoch=6, lr=500, supervised=False)
epoch =  1 
 tensor(485.2435, dtype=torch.float64)
epoch =  2 
 tensor(475.1584, dtype=torch.float64)
epoch =  3 
 tensor(461.2400, dtype=torch.float64)
epoch =  4 
 tensor(444.1705, dtype=torch.float64)
epoch =  5 
 tensor(430.4947, dtype=torch.float64)
epoch =  6 
 tensor(422.7317, dtype=torch.float64)
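
After training, the learned factor matrices can be inspected through the network object. The sketch below assumes only the standard PyTorch forward pass; the exact return value (assumed here to be one factor tensor per layer, e.g. the S matrices) and the attribute holding each layer's A matrix should be confirmed in the package documentation.

>>> outputs = net(X)           # standard forward pass of the trained network (assumed input: X)
>>> for S in outputs:          # assumed: one factor tensor per layer
...     print(S.shape)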

Citing

If you use our code in an academic setting, please consider citing the following paper:

Will, T., Zhang, R., Sadovnik, E., Gao, M., Vendrow, J., Haddock, J., Molitor, D., & Needell, D. (2020). Neural nonnegative matrix factorization for hierarchical multilayer topic modeling.

Authors

  • Joshua Vendrow
  • Jamie Haddock

