
Automatic hyperparameter tuning for topic models (ARTM approach) using evolutionary algorithms

Project description

Library scheme

AutoTM

Project Status: Active – The project has reached a stable, usable state and is being actively developed.

:sparkles:News:sparkles: We have fully updated our framework to AutoTM 2.0, enriched with new functionality! Stay tuned!

Automatic parameter selection for topic models (ARTM approach) using evolutionary and Bayesian algorithms. AutoTM provides the necessary tools to preprocess English and Russian text datasets and to tune additively regularized topic models.

What is AutoTM?

Topic modeling is one of the basic methods for a whole range of tasks:

  • Exploratory data analysis of unlabelled text data
  • Extracting interpretable features (topics and their combinations) from text data
  • Searching for hidden insights in the data

While the ARTM (additive regularization of topic models) approach provides significant flexibility and quality comparable to or better than neural approaches, such models are hard to tune due to the number of hyperparameters and their combinations. That is why we provide optimization pipelines to effortlessly process custom datasets.

To overcome the tuning problem, AutoTM presents an easy way to represent a learning strategy for training specific models on input corpora. We implement two strategy variants:

  • a fixed-size variant, which provides a learning strategy following the best practices collected from manual tuning history
Learning strategy representation (fixed-size)
  • a graph-based variant with more flexibility: the order and number of stages are not fixed (New in AutoTM 2.0). An example pipeline is shown below:

Learning strategy representation (graph-based)
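The difference between the two representations can be sketched with plain data structures. This is a toy illustration only; the class and field names below are hypothetical and do not reflect AutoTM's actual API:

```python
from dataclasses import dataclass, field

# Hypothetical sketch: a fixed-size strategy is an ordered, fixed-length
# sequence of stage hyperparameters, while a graph-based strategy is a DAG
# whose nodes are stages and whose ordering and length may vary.

@dataclass
class Stage:
    regularizer: str   # e.g. "smooth_theta", "decorrelate_phi"
    tau: float         # regularization coefficient
    n_iters: int       # collection passes for this stage

@dataclass
class FixedSizeStrategy:
    stages: list       # always the same number and order of stages

@dataclass
class GraphStrategy:
    nodes: dict = field(default_factory=dict)  # node_id -> Stage
    edges: list = field(default_factory=list)  # (from_id, to_id) pairs

    def topo_order(self):
        """Return stages in a valid execution order (Kahn's algorithm)."""
        indeg = {n: 0 for n in self.nodes}
        for _, dst in self.edges:
            indeg[dst] += 1
        ready = [n for n, d in indeg.items() if d == 0]
        order = []
        while ready:
            n = ready.pop()
            order.append(n)
            for src, dst in self.edges:
                if src == n:
                    indeg[dst] -= 1
                    if indeg[dst] == 0:
                        ready.append(dst)
        return [self.nodes[n] for n in order]
```

A graph individual must be linearized into an execution order before training, which is what makes its search space larger and more flexible than the fixed-size vector.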

The optimization procedure is performed by a genetic algorithm (GA) whose operators are specifically tuned for each of the strategy creation variants (the GA for the graph-based variant is New in AutoTM 2.0). Bayesian optimization is available only for the fixed-size strategy.
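The general GA loop can be sketched in a few lines. This is a minimal, self-contained toy, not AutoTM's implementation: the fitness function below is a stand-in for "train an ARTM model with these coefficients and score it", and the operators are deliberately simple:

```python
import random

def toy_fitness(params):
    # Stand-in for the expensive step: train a topic model with these
    # hyperparameters and return its quality score (higher is better).
    return -sum((p - 0.5) ** 2 for p in params)

def evolve(n_params=4, pop_size=20, generations=30, seed=0):
    rng = random.Random(seed)
    # Each individual is a vector of hyperparameters in [0, 1].
    pop = [[rng.random() for _ in range(n_params)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=toy_fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_params)      # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(n_params)           # point mutation
            child[i] = min(1.0, max(0.0, child[i] + rng.gauss(0, 0.1)))
            children.append(child)
        pop = survivors + children
    return max(pop, key=toy_fitness)

best = evolve()
```

In the real setting each fitness call means training a full topic model, which is exactly why surrogate modeling (below) pays off.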

To speed up the procedure, AutoTM also contains a surrogate modeling implementation for fixed-size and graph-based (New in AutoTM 2.0) learning strategies: for some iterations the fitness function is approximated by the surrogate, reducing the computation cost of training topic models.
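The idea can be illustrated with a deliberately simple, self-contained sketch (a 1-nearest-neighbor surrogate over a 1-D parameter, not AutoTM's surrogate models): expensive evaluations are cached, and on most iterations the cached model answers instead of the costly one:

```python
import random

def expensive_fitness(x):
    # Stand-in for the costly step: train a topic model and score it.
    return -(x - 0.3) ** 2

class NearestNeighborSurrogate:
    """Toy 1-NN regressor: predict the fitness of the closest seen point."""
    def __init__(self):
        self.memory = []                   # list of (x, fitness) pairs

    def add(self, x, y):
        self.memory.append((x, y))

    def predict(self, x):
        return min(self.memory, key=lambda m: abs(m[0] - x))[1]

surrogate = NearestNeighborSurrogate()
rng = random.Random(42)
true_calls = 0
best_x, best_y = None, float("-inf")
for it in range(50):
    x = rng.random()
    # Evaluate for real every 5th iteration (and while memory is small);
    # otherwise ask the cheap surrogate instead.
    if it % 5 == 0 or len(surrogate.memory) < 3:
        y = expensive_fitness(x)
        surrogate.add(x, y)
        true_calls += 1
    else:
        y = surrogate.predict(x)
    if y > best_y:
        best_x, best_y = x, y
```

Only a fraction of the 50 candidate evaluations hit the expensive function; the rest are approximated, which is the cost saving the surrogate provides.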


AutoTM also offers a range of metrics that can be used as the fitness function, from classical ones such as coherence to LLM-based metrics (New in AutoTM 2.0).
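As a flavor of the classical end of that range, here is a toy NPMI-style coherence score over a topic's top words, computed from document co-occurrence (a simplified, self-contained sketch, not AutoTM's metric implementation):

```python
import math
from itertools import combinations

def npmi_coherence(top_words, documents, eps=1e-12):
    """Average pairwise NPMI over a topic's top words.

    NPMI = PMI(w1, w2) / -log p(w1, w2), ranging from -1 (never
    co-occur) to +1 (always co-occur); higher means more coherent.
    """
    docs = [set(d) for d in documents]
    n = len(docs)

    def p(*words):
        return sum(all(w in d for w in words) for d in docs) / n

    scores = []
    for w1, w2 in combinations(top_words, 2):
        pw1, pw2, pj = p(w1), p(w2), p(w1, w2)
        if pj == 0:
            scores.append(-1.0)            # words never co-occur
            continue
        pmi = math.log(pj / (pw1 * pw2 + eps))
        scores.append(pmi / (-math.log(pj + eps)))
    return sum(scores) / len(scores)

docs = [["cat", "dog", "pet"], ["cat", "dog"], ["car", "road"], ["car", "road", "fuel"]]
coherent = npmi_coherence(["cat", "dog"], docs)
incoherent = npmi_coherence(["cat", "road"], docs)
```

A topic whose top words frequently co-occur ("cat", "dog") scores high, while a mixed topic ("cat", "road") scores low, so maximizing such a metric drives the optimizer toward interpretable topics.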

Installation

! Note: Topic model training is available only on Linux distributions.

Via pip:

```shell
pip install autotm
python -m spacy download en_core_web_sm
```

From source:

```shell
poetry install
python -m spacy download en_core_web_sm
```

Quickstart

Start with the notebook:

Running from the command line

To fit a model:

```shell
autotmctl --verbose fit --config conf/config.yaml --in data/sample_corpora/sample_dataset_lenta.csv
```

To predict with a fitted model:

```shell
autotmctl predict --in data/sample_corpora/sample_dataset_lenta.csv --model model.artm
```

Citation

@article{10.1093/jigpal/jzac019,
    author = {Khodorchenko, Maria and Butakov, Nikolay and Sokhin, Timur and Teryoshkin, Sergey},
    title = {Surrogate-based optimization of learning strategies for additively regularized topic models},
    journal = {Logic Journal of the IGPL},
    year = {2022},
    month = {02},
    issn = {1367-0751},
    doi = {10.1093/jigpal/jzac019},
    url = {https://doi.org/10.1093/jigpal/jzac019},
}

