Easily train your own text-generating neural network of any size and complexity

Project description

Easily train your own text-generating neural network of any size and complexity on any text dataset with a few lines of code, or quickly train on a text using a pretrained model.
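
For example, fine-tuning the bundled pretrained model on a text file takes only a few lines. This is a minimal sketch: the call pattern (textgenrnn(), train_from_file, generate) follows the API documented for earlier textgenrnn releases and may differ in 2.0.0, and hello.txt is a placeholder file name.

    from textgenrnn import textgenrnn

    # Instantiate with the bundled pretrained weights
    # (names below assume the earlier textgenrnn API)
    textgen = textgenrnn()

    # Fine-tune the pretrained model on your own text file
    textgen.train_from_file('hello.txt', num_epochs=1)

    # Generate new text from the retrained model
    textgen.generate()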

  • A modern neural network architecture which utilizes new techniques such as attention-weighting and skip-embedding to accelerate training and improve model quality.
  • Able to train on and generate text at either the character-level or word-level.
  • Able to configure the RNN size, the number of RNN layers, and whether to use bidirectional RNNs (see the configuration sketch after this list).
  • Able to train on any generic input text file, including large files.
  • Able to train models on a GPU and then use them with a CPU.
  • Able to utilize a powerful CuDNN implementation of RNNs when trained on a GPU, which massively reduces training time compared to standard LSTM implementations.
  • Able to train the model using contextual labels, allowing it to learn faster and produce better results in some cases.
  • Able to generate text interactively for customized stories.
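
As an illustration of the configuration, save/load, and interactive options above, the sketch below trains a new word-level model and reloads the saved files for CPU-only generation. It is a sketch under assumptions: the keyword names (new_model, word_level, rnn_size, rnn_layers, rnn_bidirectional, interactive, top_n) and the saved-file naming follow the earlier textgenrnn API and may differ in this release; corpus.txt and my_model are placeholders.

    from textgenrnn import textgenrnn

    textgen = textgenrnn(name='my_model')

    # Train a new model from scratch; RNN size, depth, bidirectionality,
    # and character- vs. word-level tokenization are configured here
    # (parameter names assume the earlier textgenrnn API)
    textgen.train_from_file(
        'corpus.txt',
        new_model=True,
        word_level=True,
        rnn_size=128,
        rnn_layers=3,
        rnn_bidirectional=True,
        num_epochs=10,
    )

    # Weights, vocabulary, and config are saved under the model name, so a
    # model trained on a GPU can later be reloaded for generation on a CPU
    textgen_cpu = textgenrnn(
        weights_path='my_model_weights.hdf5',
        vocab_path='my_model_vocab.json',
        config_path='my_model_config.json',
    )
    textgen_cpu.generate(5, temperature=0.5)

    # Interactive generation: pick each next character/word from the top candidates
    textgen_cpu.generate(interactive=True, top_n=5)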

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Files for textgenrnn, version 2.0.0
  • textgenrnn-2.0.0.tar.gz (1.7 MB), file type: Source, Python version: None
