
General deep learning utility library


Deep Zensols Deep Learning Framework


This deep learning library was designed to provide consistent and reproducible results.

Features:

  • Easy to configure framework that allows for programmatic debugging of neural networks.
  • Reproducibility of results
    • All random seed state is persisted in the trained model files.
    • Persisting of keys and key order across train, validation and test sets.
  • Analysis of results with complete metrics available.
  • A vectorization framework that allows for pickling tensors.
  • Additional layers:
    • Full BiLSTM-CRF and stand-alone CRF implementations using easy to configure constituent layers.
    • Easy to configure N-deep convolution layers with automatic dimensionality calculation and configurable pooling and batch centering.
    • Convolutional layer factory with dimensionality calculation.
    • Recurrent layers that abstract RNN, GRU and LSTM.
    • N-deep linear layers.
    • Each layer is configurable with activation, dropout and batch normalization.
  • Pandas integration for data loading, easily managing vectorized features, and reporting results.
  • Multi-processing for time-consuming CPU feature vectorization, requiring little to no coding.
  • Resource and tensor deallocation with memory management.
  • Real-time performance and loss metrics with plotting while training.
  • Thorough unit test coverage.
  • Debugging of layers using the easy to configure Python logging module and control points.
  • A workflow and API to package and distribute models, then automatically download, install and run inference with them in (optionally) two separate code bases.
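The automatic dimensionality calculation performed by the convolutional layer factory can be sketched with the standard output-size formula. This is a hedged illustration in plain Python, not the framework's actual API; the function names are hypothetical:

```python
def conv_out_dim(size: int, kernel: int, stride: int = 1, padding: int = 0) -> int:
    """Length of one spatial dimension after a convolution."""
    return (size + 2 * padding - kernel) // stride + 1

def pool_out_dim(size: int, kernel: int, stride: int = 0) -> int:
    """Length of one spatial dimension after pooling; stride defaults to the kernel size."""
    stride = stride or kernel
    return (size - kernel) // stride + 1

# chain layers on a 28-pixel-wide input: conv(5) -> pool(2) -> conv(5) -> pool(2)
d = conv_out_dim(28, kernel=5)   # 24
d = pool_out_dim(d, kernel=2)    # 12
d = conv_out_dim(d, kernel=5)    # 8
d = pool_out_dim(d, kernel=2)    # 4
```

Chaining such calculations is what lets a layer factory size each layer's output, and the final linear layer's input, without the user computing dimensions by hand.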

Much of the code provides convenience functionality to PyTorch. However, there is functionality that could be used for other deep learning APIs.
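The reproducibility feature of persisting random seed state can be illustrated with the standard library alone. The checkpoint layout below is hypothetical; the framework stores this state in its trained model files, but the idea is the same: save the RNG state with the model so a later run resumes with identical randomness:

```python
import pickle
import random
import tempfile
from pathlib import Path

random.seed(42)
_ = [random.random() for _ in range(3)]   # some training-time draws

# hypothetical checkpoint: model parameters plus the RNG state
ckpt = {'weights': [0.1, 0.2], 'rng_state': random.getstate()}
path = Path(tempfile.gettempdir()) / 'model_ckpt.pkl'
with path.open('wb') as fh:
    pickle.dump(ckpt, fh)

expected = [random.random() for _ in range(3)]  # draws made after the checkpoint

# later, possibly in another process: restore state and reproduce the same draws
with path.open('rb') as fh:
    restored = pickle.load(fh)
random.setstate(restored['rng_state'])
reproduced = [random.random() for _ in range(3)]  # identical to `expected`
```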

Documentation

See the full documentation.

Obtaining

The easiest way to install the command line program is via the pip installer:

pip3 install zensols.deeplearn

Binaries are also available on PyPI.

Workflow

This package provides a workflow for processing features, training and then testing a model. A high level outline of this process follows:

  1. Container objects are used to represent and access data as features.
  2. Instances of data points wrap the container objects.
  3. Vectorize the features of each data point into tensors.
  4. Store the vectorized tensor features to disk so they can be retrieved quickly and frequently.
  5. At train time, load the vectorized features into memory and train.
  6. Test the model and store the results to disk.
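The steps above can be sketched with plain Python and pickle. The class and function names here are illustrative only; the framework's actual container, data point, and vectorizer APIs differ, and it vectorizes to PyTorch tensors rather than lists:

```python
import pickle
import tempfile
from dataclasses import dataclass
from pathlib import Path

@dataclass
class IrisFeature:
    """1. Container object holding raw feature data."""
    sepal_len: float
    sepal_width: float
    label: str

@dataclass
class DataPoint:
    """2. Data point wrapping the container object."""
    feature: IrisFeature

LABELS = {'setosa': 0, 'versicolor': 1}

def vectorize(dp: DataPoint):
    """3. Turn a data point's features into a tensor-like vector and label id."""
    f = dp.feature
    return [f.sepal_len, f.sepal_width], LABELS[f.label]

def write_batch(points, path: Path) -> None:
    """4. Persist the vectorized batch to disk for fast repeated retrieval."""
    with path.open('wb') as fh:
        pickle.dump([vectorize(p) for p in points], fh)

def read_batch(path: Path):
    """5. Load the vectorized batch back into memory at train time."""
    with path.open('rb') as fh:
        return pickle.load(fh)

points = [DataPoint(IrisFeature(5.1, 3.5, 'setosa')),
          DataPoint(IrisFeature(7.0, 3.2, 'versicolor'))]
path = Path(tempfile.gettempdir()) / 'batch.pkl'
write_batch(points, path)
batch = read_batch(path)   # ready for a training loop (step 6 tests the model)
```

Vectorizing once and reloading from disk is what makes repeated training runs over the same splits fast and deterministic.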

To jump right in, see the examples section. However, it is better to peruse the in-depth explanation that follows the Iris example code:

  • The initial data processing, which includes data representation to batch creation.
  • Creating and configuring the model.
  • Using a facade to train, validate and test the model.
  • Analysis of results, including training/validation loss graphs and performance metrics.

Examples

The Iris example (also see the Iris example configuration) is the most basic example of how to use this framework. It is detailed in the workflow documentation.

There are also examples in the form of Jupyter notebooks.

Attribution

This project, or example code, uses:

Corpora used include:

Torch CRF

The CRF class was taken and modified from Kemal Kurniawan's pytorch_crf GitHub repository. See the README.md module documentation for more information. This module was forked from pytorch_crf with modifications; however, the modifications were not merged upstream and the project appears to be inactive.

Important: This project will change to use pytorch_crf as a dependency pending the merging of the changes needed by this project. Until then, the code will remain as a separate class in this project, which is easier to maintain since the only code involved is the CRF class.

The pytorch_crf repository uses the same license as this repository, which is the MIT License. For this reason, there are no software/package tainting issues.
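As an illustration of what a linear-chain CRF decoder computes, here is a minimal Viterbi decoder in plain Python. This is a sketch of the algorithm only, not the pytorch_crf implementation, which operates on batched PyTorch tensors with masking:

```python
def viterbi_decode(emissions, transitions):
    """Return the highest-scoring tag sequence and its score.

    emissions[t][j] is the score of tag j at step t; transitions[i][j]
    is the score of moving from tag i to tag j.
    """
    n_tags = len(emissions[0])
    # score[j]: best score of any path ending in tag j at the current step
    score = list(emissions[0])
    history = []
    for emit in emissions[1:]:
        next_score, backptr = [], []
        for j in range(n_tags):
            # best previous tag transitioning into tag j
            best_i = max(range(n_tags), key=lambda i: score[i] + transitions[i][j])
            next_score.append(score[best_i] + transitions[best_i][j] + emit[j])
            backptr.append(best_i)
        score = next_score
        history.append(backptr)
    # backtrace from the best final tag
    best_last = max(range(n_tags), key=lambda j: score[j])
    path = [best_last]
    for backptr in reversed(history):
        path.append(backptr[path[-1]])
    path.reverse()
    return path, max(score)

emissions = [[3.0, 0.0], [0.0, 3.0], [3.0, 0.0]]
transitions = [[0.0, 0.0], [0.0, 0.0]]
path, best = viterbi_decode(emissions, transitions)  # path [0, 1, 0], score 9.0
```

Training a CRF additionally requires the forward algorithm for the partition function; decoding at inference time is the dynamic program shown here.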

See Also

The zensols deepnlp project, which builds on this project, is a deep learning utility library for natural language processing that aids in feature engineering and provides embedding layers.

Citation

If you use this project in your research, please use the following BibTeX entry:

@inproceedings{landes-etal-2023-deepzensols,
    title = "{D}eep{Z}ensols: A Deep Learning Natural Language Processing Framework for Experimentation and Reproducibility",
    author = "Landes, Paul  and
      Di Eugenio, Barbara  and
      Caragea, Cornelia",
    editor = "Tan, Liling  and
      Milajevs, Dmitrijs  and
      Chauhan, Geeticka  and
      Gwinnup, Jeremy  and
      Rippeth, Elijah",
    booktitle = "Proceedings of the 3rd Workshop for Natural Language Processing Open Source Software (NLP-OSS 2023)",
    month = dec,
    year = "2023",
    address = "Singapore, Singapore",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2023.nlposs-1.16",
    pages = "141--146"
}

Changelog

An extensive changelog is available here.

Community

Please star the project and let me know how and where you use this API. Contributions as pull requests, feedback and any input is welcome.

License

MIT License

Copyright (c) 2020 - 2023 Paul Landes
