Fine-tune transformer-based language models for named entity recognition
Project description
A Python package to fine-tune transformer-based language models for named entity recognition (NER).
Resources
Source Code: https://github.com/af-ai-center/nerblackbox
Documentation: https://af-ai-center.github.io/nerblackbox
About
Transformer-based language models like BERT have had a game-changing impact on Natural Language Processing.
To use Hugging Face's publicly available pretrained models for Named Entity Recognition, one needs to retrain ("fine-tune") them on labeled text.
nerblackbox makes this easy.
You give it
a Dataset (labeled text)
a Pretrained Model (transformers)
and you get
the best Fine-tuned Model
its Performance on the dataset
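The workflow above can be sketched with the package's Python API. This is a minimal, hedged example: the class name NerBlackBox and the method names (init, run_experiment, get_experiment_results) follow the linked documentation, but the experiment name "my_experiment" is a placeholder and exact signatures should be verified against the docs for the installed version.

```python
# Hypothetical usage sketch -- check the nerblackbox documentation
# (https://af-ai-center.github.io/nerblackbox) for exact signatures.
from nerblackbox import NerBlackBox

nerbb = NerBlackBox()
nerbb.init()  # set up the local data directory and default experiment configs

# Run a fine-tuning experiment: the experiment config specifies the
# dataset (labeled text) and the pretrained transformers model.
nerbb.run_experiment("my_experiment")

# Retrieve the best fine-tuned model and its performance on the dataset.
results = nerbb.get_experiment_results("my_experiment")
```

Experiments are configured declaratively (dataset, pretrained model, hyperparameters), so the same call covers different model/dataset combinations.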
Installation
pip install nerblackbox
Usage
See the documentation: https://af-ai-center.github.io/nerblackbox
Citation
@misc{nerblackbox,
  author = {Stollenwerk, Felix},
  title  = {nerblackbox: a python package to fine-tune transformer-based language models for named entity recognition},
  year   = {2021},
  url    = {https://github.com/af-ai-center/nerblackbox},
}
Hashes for nerblackbox-0.0.11-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | ff04c7b064c45bac4226d031b94684c8bae6c0af64195844f3c4d4f0b08f26da
MD5 | 4f3590ec3a49e325229df6b431d8fa30
BLAKE2b-256 | e9a56e3442dbd551ca3c6a2ce204a4a1b9ad6c0f1a3be2f4d1c9620ee26e7b71