textgenrnn
Project description
Easily train your own text-generating neural network of any size and complexity on any text dataset with a few lines of code, or quickly train on a text using a pretrained model.
- A modern neural network architecture which utilizes techniques such as attention-weighting and skip-embedding to accelerate training and improve model quality.
- Able to train on and generate text at either the character level or the word level.
- Able to configure the RNN size, the number of RNN layers, and whether to use bidirectional RNNs.
- Able to train on any generic input text file, including large files.
- Able to train models on a GPU and then use them with a CPU.
- Able to utilize a powerful CuDNN implementation of RNNs when training on a GPU, which massively reduces training time compared to standard LSTM implementations.
- Able to train the model using contextual labels,
allowing it to learn faster and produce better results in some cases.
- Able to generate text interactively for customized stories.
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
textgenrnn-1.4.tar.gz (1.7 MB)
File details
Details for the file textgenrnn-1.4.tar.gz.
File metadata
- Download URL: textgenrnn-1.4.tar.gz
- Upload date:
- Size: 1.7 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: Python-urllib/3.6
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 585cdeaffa38798bace91d689f93fe70234846f19c74c2faa7b3c289ceabe643 |
| MD5 | f926cb41343f4ba7980fe5cc95bd2402 |
| BLAKE2b-256 | 7ed68fb904517987818e18799295a3c1129d1856cf808c3cc77dc9197755cd59 |