Faithfulness 😇
An easy-to-use library to evaluate faithfulness (factual correctness) of abstractive summaries. Faithfulness is computed by comparing a summary with its original source document.
This library includes multiple faithfulness metrics based on:
- BERTScore
- Entailment
- Question Generation & Question Answering framework (QGQA)
- Named Entity Recognition (NER)
- Open Information Extraction (OpenIE)
- Semantic Role Labeling (SRL)
- Sentence Similarity (SentSim)
Installation ⚙️
$ conda create -n my_project python=3.8
This creates a new virtual environment for your project with conda. You can activate it with
$ conda activate my_project

$ conda install pytorch==1.7.1 torchvision==0.8.2 torchaudio==0.7.2 cudatoolkit=10.1 -c pytorch
Please install PyTorch by following the instructions here. Make sure to install the CUDA variant that matches the CUDA version of your GPU.

$ pip install faithfulness
This installs the faithfulness library and its dependencies. Read more about the dependencies below.
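After installing PyTorch, you can quickly confirm that the CUDA build matches your GPU setup. This check is plain PyTorch, not part of this library:
import torch

# Should print the installed version (e.g. 1.7.1) and True if a CUDA-capable GPU is visible.
print(torch.__version__)
print(torch.cuda.is_available())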
All faithfulness metrics are model-based. Some models have to be installed manually (a small sanity check is sketched after this list):
- Download the SRL model here and save it in your project, e.g. /models/srl_model.tar.gz
- Download a spacy model:
$ python -m spacy download en_core_web_sm
- Download CoreNLP:
import stanza
stanza.install_corenlp()
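To sanity-check these manual downloads, a small verification snippet can help. The SRL model path below is the example path from above; adjust it to wherever you saved the file:
import os
import spacy

# The spaCy model should load without raising an OSError.
nlp = spacy.load("en_core_web_sm")

# The SRL model archive should exist at the path you chose (example path from above).
assert os.path.exists("/models/srl_model.tar.gz"), "SRL model not found"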
Usage 🔥
from faithfulness.QGQA import QGQA
qgqa = QGQA()
summary = "Lorem ipsum dolor sit amet"
source = "Lorem ipsum dolor sit amet, consetetur sadipscing elitr, sed diam ..."
faithfulness, info = qgqa.score(summary, source)
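The score call above can also be reused to compare several candidate summaries of the same source, for example to pick the most faithful one. This is a minimal sketch on top of the example above; the candidate texts are placeholders and a higher score is assumed to mean a more faithful summary:
candidates = [
    "Lorem ipsum dolor sit amet",
    "Lorem ipsum dolor sit amet, consetetur sadipscing elitr",
]

# Score each candidate against the same source and keep the highest-scoring one
# (assumes a higher score means a more faithful summary).
scores = [qgqa.score(candidate, source)[0] for candidate in candidates]
best = candidates[scores.index(max(scores))]
print(best, max(scores))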
More examples can be found here 💯.
Evaluation 📊
We evaluated all faithfulness metrics by correlating them with human judgements on the XSum dataset (link). You will soon be able to read more about the evaluation in our paper (a Master's thesis).
Method | Pearson (r) | Spearman (ρ) |
---|---|---|
🥇 BERTScore | 0.501 | 0.486 |
🥈 Entailment | 0.366 | 0.422 |
🥉 SentSim | 0.392 | 0.389 |
SRL | 0.393 | 0.377 |
NER | 0.252 | 0.259 |
QGQA | 0.228 | 0.258 |
OpenIE | 0.169 | 0.185 |
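The Pearson and Spearman values above measure how well each metric's scores track the human judgements. A comparison like this can be computed with scipy; the two lists below are made-up placeholders, not the data behind the table:
from scipy.stats import pearsonr, spearmanr

# Placeholder values: one metric score and one human faithfulness rating per summary.
metric_scores = [0.91, 0.40, 0.73, 0.12, 0.65]
human_ratings = [0.95, 0.30, 0.80, 0.20, 0.55]

r, _ = pearsonr(metric_scores, human_ratings)     # Pearson correlation coefficient
rho, _ = spearmanr(metric_scores, human_ratings)  # Spearman rank correlation
print(f"Pearson r = {r:.3f}, Spearman rho = {rho:.3f}")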
Dependencies 🔗
Running $ pip install faithfulness installs this library as well as the following dependencies:
- 🤗 transformers
- spaCy (used for Entailment, NER, QGQA, SentSim, SRL)
- Stanza (used for OpenIE)
- AllenNLP (used for SRL)
- SentenceTransformers (used for NER, OpenIE, QGQA, SentSim, SRL)
Troubleshooting 🛠
There are currently problems when installing allennlp and jsonnet. If you encounter "Building wheel for jsonnet (setup.py) ... error" during the installation, please try:
apt-get install make
apt-get install g++
or install jsonnet before installing this library:
conda install -c conda-forge jsonnet
pip install faithfulness
File details
Details for the file faithfulness-0.0.4.tar.gz.
File metadata
- Download URL: faithfulness-0.0.4.tar.gz
- Upload date:
- Size: 37.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.7.1 importlib_metadata/4.8.2 pkginfo/1.8.2 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.8.12
File hashes
Algorithm | Hash digest |
---|---|
SHA256 | 999112d6f0ee195258240abd554587dc76b29c999667643b725e24121f21225b |
MD5 | 467c3d0f9e25a8f6b903f5f961eb37bb |
BLAKE2b-256 | bb0edd0c1c11aa7614de3d87e671c68fc726be43f058461461273b3e33ac8693 |
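To verify a downloaded archive against the digests listed above, you can compute the hashes locally. This is a generic sketch using Python's standard library; the filename assumes the source distribution described in this section:
import hashlib

# Compute the SHA256 and MD5 digests of the downloaded source distribution
# and compare them with the values published above.
with open("faithfulness-0.0.4.tar.gz", "rb") as f:
    data = f.read()

print("SHA256:", hashlib.sha256(data).hexdigest())
print("MD5:   ", hashlib.md5(data).hexdigest())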
File details
Details for the file faithfulness-0.0.4-py3-none-any.whl.
File metadata
- Download URL: faithfulness-0.0.4-py3-none-any.whl
- Upload date:
- Size: 49.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.7.1 importlib_metadata/4.8.2 pkginfo/1.8.2 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.8.12
File hashes
Algorithm | Hash digest |
---|---|
SHA256 | a298237847dac0adfe84ae22443ccdceaa404fec7f0405aa27de45f58b1c0f6c |
MD5 | 4b484ef2fb52a21eba7ff4a1ba02368a |
BLAKE2b-256 | 3f71ff9cb42ee1537755b74ecabcc57726333657a9d8793727174d7a8571cbb5 |