Pretrained character-based neural network for easily generating text.
Project description
Easily train your own text-generating neural network of
any size and complexity on any text dataset with a few lines
of code, or quickly train on a text using a pretrained model.
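For example, a short script like the following (a minimal sketch using the package's documented `textgenrnn` class; the dataset filename is a placeholder) fine-tunes the included pretrained model on a text file and generates new text:

```python
from textgenrnn import textgenrnn

# Instantiate with the included pretrained weights.
textgen = textgenrnn()

# Fine-tune on your own text file
# ('my_corpus.txt' is a placeholder for any plain-text dataset).
textgen.train_from_file('my_corpus.txt', num_epochs=5)

# Generate 5 new texts from the retrained model.
textgen.generate(5)
```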
* A modern neural network architecture which utilizes new techniques such as
attention-weighting and skip-embedding to accelerate training
and improve model quality.
* Able to train on and generate text at either the
character-level or word-level.
* Able to configure RNN size, the number of RNN layers,
and whether to use bidirectional RNNs (see the configuration sketch after this list).
* Able to train on any generic input text file, including large files.
* Able to train models on a GPU and then use them with a CPU.
* Able to utilize a powerful CuDNN implementation of RNNs
when training on a GPU, which massively speeds up training
compared to standard LSTM implementations.
* Able to train the model using contextual labels,
allowing it to learn faster and produce better results in some cases.
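The sketch below illustrates how the options above might be combined when training a new model from scratch at the word level. The keyword arguments mirror the feature list (`rnn_size`, `rnn_layers`, `rnn_bidirectional`, `word_level`), but exact parameter names and defaults are assumptions to verify against the package documentation:

```python
from textgenrnn import textgenrnn

textgen = textgenrnn(name='my_word_model')

# Train a new model from scratch rather than fine-tuning the
# pretrained weights; the keyword arguments below follow the
# feature list above and should be checked against the docs.
textgen.train_from_file(
    'my_corpus.txt',         # placeholder dataset path
    new_model=True,          # build a fresh model instead of reusing pretrained weights
    word_level=True,         # train at the word level instead of the character level
    rnn_size=128,            # number of units per RNN layer
    rnn_layers=3,            # number of stacked RNN layers
    rnn_bidirectional=True,  # use bidirectional RNNs
    num_epochs=10)

# Weights, vocabulary, and config are saved under the model name,
# so a model trained on a GPU can later be reloaded for CPU generation.
```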
Download files
Source distribution: textgenrnn-1.2.1.tar.gz (1.7 MB)