Implementation of the Gaussian LDA topic model, with efficiency tricks

Project description

Gaussian LDA

Another implementation of the model from the paper Gaussian LDA for Topic Models with Word Embeddings.

This is a Python implementation based as closely as possible on the Java implementation released by the paper's authors.

Installation

You'll first need to install the choldate package, following its installation instructions. (It's not possible to include this as a dependency for the PyPI package.)
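choldate is typically installed directly from its Git repository rather than from PyPI. As a sketch only, with an assumed repository URL (follow choldate's own installation instructions for the exact command):

pip install git+https://github.com/modusdatascience/choldate.git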

Then install gaussianlda using Pip:

pip install gaussianlda

Usage

The package provides two classes for training Gaussian LDA:

  • Cholesky only, gaussianlda.GaussianLDATrainer: Simple Gibbs sampler with optional Cholesky decomposition trick.
  • Cholesky+aliasing, gaussianlda.GaussianLDAAliasTrainer: Cholesky decomposition (not optional) and the Vose aliasing trick.
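Both classes are importable from the top level of the gaussianlda package:

from gaussianlda import GaussianLDATrainer, GaussianLDAAliasTrainer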

The trainer is prepared by instantiating the training class with the following arguments:

  • corpus: List of documents, where each document is a list of int IDs of words. These are IDs into the vocabulary and the embeddings matrix.
  • vocab_embeddings: (V, D) Numpy array, where V is the number of words in the vocabulary and D is the dimensionality of the embeddings.
  • vocab: Vocabulary, given as a list of words, whose positions correspond to the IDs used in the data. This is not strictly needed for training, but is used to output topics.
  • num_tables: Number of topics to learn.
  • alpha, kappa: Hyperparameters of the document-topic Dirichlet distribution and the inverse Wishart prior.
  • save_path: Path to write the model out to after each iteration.
  • mh_steps (aliasing only): Number of Metropolis-Hastings steps for each topic sample.

Then you set the sampler running for a specified number of iterations over the training data by calling trainer.sample(num_iters).

Example

import numpy as np
from gaussianlda import GaussianLDAAliasTrainer

# A small vocabulary as a list of words
vocab = "money business bank finance sheep cow goat pig".split()
# A random embedding for each word
# Really, you'd want to load something more useful!
embeddings = np.random.sample((8, 100)).astype(np.float32)
corpus = [
    [0, 2, 1, 1, 3, 0, 6, 1],
    [3, 1, 1, 3, 7, 0, 1, 2],
    [7, 5, 4, 7, 7, 4, 6],
    [5, 6, 1, 7, 7, 5, 6, 4],
]
# Prepare a trainer
trainer = GaussianLDAAliasTrainer(
    corpus, embeddings, vocab, 2, 0.1, 0.1
)
# Set training running
trainer.sample(10)
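
To also write the model out after each iteration and control the number of Metropolis-Hastings steps used by the alias sampler, the extra parameters described above can be passed in. This is a sketch, assuming the keyword names match the parameter list given earlier:

import numpy as np
from gaussianlda import GaussianLDAAliasTrainer

vocab = "money business bank finance sheep cow goat pig".split()
# Random embeddings again, just for illustration
embeddings = np.random.sample((8, 100)).astype(np.float32)
corpus = [
    [0, 2, 1, 1, 3, 0, 6, 1],
    [3, 1, 1, 3, 7, 0, 1, 2],
]
trainer = GaussianLDAAliasTrainer(
    corpus, embeddings, vocab,
    num_tables=2,             # number of topics to learn
    alpha=0.1, kappa=0.1,     # doc-topic Dirichlet and inverse Wishart hyperparameters
    save_path="saved_model",  # model state written here after every iteration
    mh_steps=2,               # Metropolis-Hastings steps per topic sample (alias trainer only)
)
trainer.sample(10)            # run 10 sampling iterations over the corpus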


Download files


Source Distribution

gaussianlda-0.2.12.tar.gz (55.7 kB)

Built Distribution

gaussianlda-0.2.12-py3-none-any.whl (61.9 kB)

File details

Details for the file gaussianlda-0.2.12.tar.gz.

File metadata

  • Download URL: gaussianlda-0.2.12.tar.gz
  • Size: 55.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/47.3.1 requests-toolbelt/0.9.1 tqdm/4.47.0 CPython/3.6.9

File hashes

Hashes for gaussianlda-0.2.12.tar.gz

  • SHA256: 941cf92e7befef4fca24a92e1e959bc74fba6687d1d987e4a1ff526d74543d31
  • MD5: 8bedc196fabead36a9ff30a4dcd9f235
  • BLAKE2b-256: f2b61a26eba3bd9ffd98ea7f3a5025a7cb793e3b70b710cee4b3d98badffb50a


File details

Details for the file gaussianlda-0.2.12-py3-none-any.whl.

File metadata

  • Download URL: gaussianlda-0.2.12-py3-none-any.whl
  • Size: 61.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/47.3.1 requests-toolbelt/0.9.1 tqdm/4.47.0 CPython/3.6.9

File hashes

Hashes for gaussianlda-0.2.12-py3-none-any.whl

  • SHA256: 953bbe7eef0ececd919d877df74c731d133ab9a97b53f12bf083aa85dd24999c
  • MD5: d88b949158fa0a216e6664a609644c1c
  • BLAKE2b-256: 9490c6d28f335719045466cfa200a261770633bbbc77903a6a1598755e43c6c8

