Project description
Next Word Prediction
Experiment on the Generative Pretrained Transformer 2 (GPT-2) for Language Modeling task using the PyTorch-Transformers library.
Installation
Requires `python>=3.5`, `pytorch>=1.6.0`, and `pytorch-transformers>=1.2.0`.

```shell
pip install next-word-prediction==0.1.5 pytorch-transformers==1.2.0 torch==1.6.0
```
How to use
```python
>>> from next_word_prediction import GPT2
>>> gpt2 = GPT2()
>>> text = "The course starts next"
>>> gpt2.predict_next(text)
['week', 'to', 'month', 'year', 'Monday']
```
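Under the hood, `predict_next` presumably runs GPT-2 over the prompt to get a score (logit) for every vocabulary token, converts those scores to probabilities, and returns the top few candidates. A minimal, model-free sketch of that ranking step, using a made-up toy vocabulary and scores in place of the real model's logits:

```python
import math

def top_k_next_words(logits, vocab, k=5):
    """Return the k highest-probability words given raw model scores."""
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Rank words by probability, highest first, and keep the top k.
    ranked = sorted(zip(vocab, probs), key=lambda pair: pair[1], reverse=True)
    return [word for word, _ in ranked[:k]]

# Toy stand-ins for the model's vocabulary and logits (illustration only).
vocab = ["week", "to", "month", "year", "Monday", "banana"]
logits = [5.1, 4.7, 4.2, 3.9, 3.5, 0.2]
print(top_k_next_words(logits, vocab, k=5))
# → ['week', 'to', 'month', 'year', 'Monday']
```

The real library scores the full GPT-2 vocabulary (~50k tokens) rather than a hand-written list; only the top-k selection is shown here.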
Project details
Download files
Source Distribution
Built Distribution
Hashes for next_word_prediction-0.1.0.tar.gz
| Algorithm | Hash digest |
|---|---|
| SHA256 | 578b67d38d53d282c956ac01277c35455f29b09ec90289b1ba524f748c947f90 |
| MD5 | 9897741d59892549f848c36af886b115 |
| BLAKE2b-256 | 18e8d2c42247c6c983b3d93aa90a3244cb4c40dd50ac74c1b49ce8bbc8dcf05d |
Hashes for next_word_prediction-0.1.0-py3-none-any.whl
| Algorithm | Hash digest |
|---|---|
| SHA256 | 2fd39d181a2989ff515854529d734d26fa3235cbfabe384d30dae52f01f859d6 |
| MD5 | 25b82b59607b695f9a37f1968adb9dae |
| BLAKE2b-256 | 748aafb7233d1c5cf57257131f1eb5853817dbedaf972d8d8bf0ed1b00645528 |