
Pretrained character-based neural network for easily generating text.

Project description


Generate text using a pretrained neural network with a few lines of code,
or easily train your own text-generating neural network of any size
and complexity on any text dataset.
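As a minimal sketch of the quick-start flow described above, sampling from the included pretrained model takes only a couple of lines (the sample count and temperature shown are illustrative values, not required settings):

```python
from textgenrnn import textgenrnn

# Load the included pretrained character-level weights.
textgen = textgenrnn()

# Generate and print a few samples; lower temperatures give more
# conservative output, higher temperatures give more creative output.
textgen.generate(5, temperature=0.5)
```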

* A modern neural network architecture that uses techniques such as
attention-weighting and skip-embedding to accelerate training
and improve model quality.
* Able to train on and generate text at either the
character-level or word-level.
* Able to configure RNN size, the number of RNN layers,
and whether to use bidirectional RNNs (see the sketch after this list).
* Able to train on any generic input text file, including large files.
* Able to train models on a GPU and then use them with a CPU.
* Able to utilize a powerful CuDNN implementation of RNNs
when training on a GPU, which dramatically reduces training time
compared to standard LSTM implementations.
* Able to train the model using contextual labels,
allowing it to learn faster and produce better results in some cases.
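As a sketch of how the configuration options above fit together, a new model might be trained from a plain-text file as follows. This assumes the configuration keywords (rnn_size, rnn_layers, rnn_bidirectional, word_level) match the package's config options; the file name and values are placeholders:

```python
from textgenrnn import textgenrnn

# "name" sets the prefix used for the saved weights/vocab files.
textgen = textgenrnn(name="my_model")

# Train a fresh model on a text file (one document per line).
textgen.train_from_file(
    "my_corpus.txt",
    new_model=True,          # build a new model instead of fine-tuning the pretrained one
    rnn_size=128,            # neurons per RNN layer
    rnn_layers=3,            # number of stacked RNN layers
    rnn_bidirectional=True,  # process text in both directions
    word_level=False,        # False = character-level, True = word-level
    num_epochs=10,
)

# Sample from the newly trained model.
textgen.generate(3, temperature=0.8)
```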


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

textgenrnn-1.2.tar.gz (1.7 MB)
