Project description
Next Word Prediction
Experiment on the Generative Pretrained Transformer 2 (GPT-2) for Language Modeling task using the PyTorch-Transformers library.
Installation
Requires `python>=3.5`, `pytorch>=1.6.0`, and `pytorch-transformers>=1.2.0`.
pip install next-word-prediction
How to use
>>> from next_word_prediction import GPT2
>>> gpt2 = GPT2()
>>> text = "The course starts next"
>>> gpt2.predict_next(text)
The course starts next ['week', 'to', 'month', 'year', 'Monday']
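Under the hood, `predict_next` presumably runs GPT-2 over the prompt and keeps the highest-scoring candidate tokens. The sketch below illustrates just that final selection step (softmax over raw scores, then top-k); the `top_k_next_words` function and the toy score values are invented for this example and are not part of the library's API.

```python
import math

def top_k_next_words(logits, k=5):
    """Pick the k most probable next words from a word -> logit mapping.

    `logits` is a hypothetical dict of candidate words to raw model scores;
    a real GPT-2 head produces one logit per token in its vocabulary.
    """
    # Softmax: shift by the max for numerical stability, exponentiate, normalize.
    m = max(logits.values())
    exps = {w: math.exp(v - m) for w, v in logits.items()}
    total = sum(exps.values())
    probs = {w: e / total for w, e in exps.items()}
    # Keep the k words with the highest probability, most probable first.
    return sorted(probs, key=probs.get, reverse=True)[:k]

# Toy scores standing in for GPT-2's output after "The course starts next".
scores = {"week": 9.1, "to": 8.7, "month": 8.2, "year": 7.9,
          "Monday": 7.5, "banana": 1.0}
print(top_k_next_words(scores))  # → ['week', 'to', 'month', 'year', 'Monday']
```

Because softmax is monotonic, sorting by probability gives the same order as sorting by raw logit; the normalization only matters if you also want the probabilities themselves.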
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
Hashes for next_word_prediction-0.1.1.tar.gz
| Algorithm | Hash digest |
|---|---|
| SHA256 | 57934c9fbac1b4924d86d2c75c7d318dcfe7a23dc586898fcac4b702d18d7914 |
| MD5 | 01ea3e07934c1bffa43d238ef9f04897 |
| BLAKE2b-256 | 6fe6f2497f83aa73ff3f58cd17b66643992eeae70c0f8670ebb2e7603ffd9c94 |
Hashes for next_word_prediction-0.1.1-py3-none-any.whl
| Algorithm | Hash digest |
|---|---|
| SHA256 | f5b26671a2f62d9661cde5bf0f6fdff18da40a5faec5e910d8be9e47cd0c5370 |
| MD5 | f73f7c95343d5b2f86000500baf7c759 |
| BLAKE2b-256 | c275735e6402bca00f903fe95dde9645d0f967db6897a9086e18909b5d471648 |