SRL Transformer model
Project description
Semantic Role Labeling with BERT
Semantic Role Labeling based on the AllenNLP implementation of Shi et al., 2019. It uses the VerbAtlas inventory and is also trained on predicate disambiguation, in addition to argument identification and disambiguation.
- Language Model: BERT
- Dataset: CoNLL 2012
Results
With bert-base-cased:
# Dev set
- F1 arguments 87.6
- F1 predicates 95.5
# Test set
- F1 arguments x
- F1 predicates x
With bert-base-multilingual-cased:
# Dev set
- F1 arguments 86.2
- F1 predicates 94.2
# Test set
- F1 arguments 86.1
- F1 predicates 94.9
Project details
Release history
Download files
Source Distribution
File details
Details for the file transformer_srl-2.0.dev20202507.tar.gz.
File metadata
- Download URL: transformer_srl-2.0.dev20202507.tar.gz
- Upload date:
- Size: 93.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.48.0 CPython/3.8.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | d001a9adde85845c4f73f8d1fa9b67bef0b6bd1a661a0a6799a27700bfd606b7
MD5 | f5a98d33257456515bd864bb94722498
BLAKE2b-256 | 696c57a0604a2d7cb21564577c9ef8685c02c8c40b8c5c1e6d406f1bde127d8a
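After downloading the source distribution, the digests above can be used to check file integrity. A minimal sketch using Python's standard-library `hashlib` (the helper name `sha256_of` and the local file path are assumptions for illustration, not part of the package):

```python
import hashlib

# Expected SHA256 digest for transformer_srl-2.0.dev20202507.tar.gz,
# taken from the file-hashes table above.
EXPECTED_SHA256 = "d001a9adde85845c4f73f8d1fa9b67bef0b6bd1a661a0a6799a27700bfd606b7"

def sha256_of(path: str, chunk_size: int = 8192) -> str:
    """Return the hex SHA256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Usage (assuming the sdist was saved under its listed file name):
#   sha256_of("transformer_srl-2.0.dev20202507.tar.gz") == EXPECTED_SHA256
```

Reading in fixed-size chunks keeps memory use constant regardless of archive size.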