Project description


Easily train your own text-generating neural network of
any size and complexity on any text dataset with a few lines
of code, or quickly train on a text using a pretrained model.
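
A minimal sketch of that workflow, following the package's usual quick-start pattern with the textgenrnn class, train_from_file, and generate; the corpus filename and the exact keyword values here are placeholders:

```python
from textgenrnn import textgenrnn

# Instantiate with the bundled pretrained weights.
textgen = textgenrnn()

# Fine-tune on a plain-text file; 'my_corpus.txt' is a hypothetical filename.
textgen.train_from_file('my_corpus.txt', num_epochs=1)

# Generate a few samples; lower temperature means more conservative output.
textgen.generate(3, temperature=0.5)
```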

- A modern neural network architecture that utilizes new techniques such as
attention-weighting and skip-embedding to accelerate training
and improve model quality.
- Able to train on and generate text at either the
character level or the word level.
- Able to configure the RNN size, the number of RNN layers,
and whether to use bidirectional RNNs (see the configuration sketch after this list).
- Able to train on any generic input text file, including large files.
- Able to train models on a GPU and then use them with a CPU (see the weight-loading sketch after this list).
- Able to utilize a powerful CuDNN implementation of RNNs
when training on a GPU, which massively speeds up training time
compared to standard LSTM implementations.
- Able to train the model using contextual labels,
allowing it to learn faster and produce better results in some cases (see the contextual-training sketch after this list).
- Able to generate text interactively for customized stories (see the interactive-generation sketch after this list).
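
A hedged configuration sketch for training a new word-level, bidirectional model from scratch; the keyword arguments shown (new_model, word_level, rnn_size, rnn_layers, rnn_bidirectional) are assumptions about the training API rather than guaranteed names, and the values are arbitrary examples:

```python
from textgenrnn import textgenrnn

textgen = textgenrnn()

# Train a new model from scratch rather than fine-tuning the pretrained weights.
textgen.train_from_file('my_corpus.txt',      # placeholder filename
                        new_model=True,        # assumed flag: build a fresh model
                        word_level=True,       # word-level instead of character-level
                        rnn_size=128,          # units per RNN layer
                        rnn_layers=3,          # number of stacked RNN layers
                        rnn_bidirectional=True,
                        num_epochs=10)
```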
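Weights trained on a GPU can later be reloaded on a CPU-only machine and used for interactive generation. In this weight-loading and interactive-generation sketch, the weights_path, interactive, and top_n arguments are assumptions based on typical usage of the package:

```python
from textgenrnn import textgenrnn

# Reload weights saved by an earlier (possibly GPU) training run.
# The filename is whatever that run produced; an .hdf5 file is typical.
textgen = textgenrnn(weights_path='textgenrnn_weights.hdf5')

# Interactive generation: pick the next character/word from the top
# candidates at each step to steer the story.
textgen.generate(interactive=True, top_n=5)
```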
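Finally, a contextual-training sketch, where each text is paired with a label describing its source or theme; the train_on_texts method and the context_labels keyword are assumptions drawn from typical examples, and the texts and labels are made up:

```python
from textgenrnn import textgenrnn

textgen = textgenrnn()

# Hypothetical texts and matching context labels (one label per text).
texts = ['an example sentence from source A',
         'another example sentence from source B']
labels = ['source_a', 'source_b']

# The labels give the model extra context during training, which can
# help it learn faster and produce better results on some datasets.
textgen.train_on_texts(texts, context_labels=labels, num_epochs=5)
```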


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Filename, size                    File type   Python version   Upload date
textgenrnn-1.4.tar.gz (1.7 MB)    Source      None             Aug 8, 2018
