RelBERT: a state-of-the-art lexical relation embedding model.
RelBERT
This is the official implementation of Distilling Relation Embeddings from Pre-trained Language Models, accepted to the EMNLP 2021 main conference (the camera-ready version of the paper will be available soon!).
In the paper, we propose RelBERT, a lexical relation embedding model based on a large-scale pre-trained masked language model. We release the trained model together with this package.
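Until the sample usage lands in the README (see the TODO below), here is a minimal sketch of the underlying idea: prompt a pre-trained masked language model with a word pair and pool the hidden states into a relation vector. The checkpoint name, prompt template, and mean pooling below are illustrative assumptions, not this package's actual API or defaults.

```python
# Illustrative sketch only: RelBERT's real prompts, pooling, and released
# checkpoints may differ from the choices assumed here.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-large")
model = AutoModel.from_pretrained("roberta-large")
model.eval()

def relation_embedding(head: str, tail: str) -> torch.Tensor:
    """Encode a word pair with a prompt and mean-pool the hidden states."""
    # Hypothetical prompt; the paper's templates may differ.
    prompt = f"Today, I finally discovered the relation between {head} and {tail}."
    inputs = tokenizer(prompt, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, dim)
    return hidden.mean(dim=1).squeeze(0)            # (dim,)

# Relationally similar pairs should yield nearby embeddings.
paris_france = relation_embedding("Paris", "France")
tokyo_japan = relation_embedding("Tokyo", "Japan")
cosine = torch.nn.functional.cosine_similarity(paris_france, tokyo_japan, dim=0)
print(f"cosine similarity: {cosine.item():.3f}")
```

Because both pairs above encode the same capital-of relation, their embeddings should be closer to each other than to an unrelated pair such as ("Paris", "Japan").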
TODO
- README (Hugging Face model)
- sample usage
- clean up unused parameters