Fine-tune transformer-based language models for Named Entity Recognition
Project description
A Python package to fine-tune transformer-based language models for Named Entity Recognition (NER).
Resources
Source Code: https://github.com/af-ai-center/nerblackbox
Documentation: https://af-ai-center.github.io/nerblackbox
About
Transformer-based language models like BERT have had a game-changing impact on Natural Language Processing.
To use Hugging Face's publicly available pretrained models for Named Entity Recognition, you need to retrain (or "fine-tune") them on labeled text.
nerblackbox makes this easy.
You give it
a Dataset (labeled text)
a Pretrained Model (transformers)
and you get
the best Fine-tuned Model
its Performance on the dataset
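The "Dataset (labeled text)" above is typically a corpus where each token carries an entity tag. As an illustration only (this is not nerblackbox's API; the exact input format it expects is described in its documentation), NER datasets are often distributed in a CoNLL-style layout: one token and its BIO tag per line, with blank lines separating sentences. A minimal stdlib-only reader for that layout might look like this:

```python
# Illustrative sketch: parse CoNLL-style labeled text into
# (tokens, tags) pairs, one pair per sentence. This assumes the
# common "token TAG" two-column layout with blank-line sentence
# boundaries; real datasets may use extra columns.

def read_conll(text: str) -> list[tuple[list[str], list[str]]]:
    """Return a list of (tokens, tags) pairs, one per sentence."""
    sentences = []
    tokens, tags = [], []
    for line in text.splitlines():
        line = line.strip()
        if not line:  # blank line closes the current sentence
            if tokens:
                sentences.append((tokens, tags))
                tokens, tags = [], []
            continue
        token, tag = line.split()  # e.g. "Stockholm B-LOC"
        tokens.append(token)
        tags.append(tag)
    if tokens:  # flush a trailing sentence without a final blank line
        sentences.append((tokens, tags))
    return sentences

sample = """Angela B-PER
Merkel I-PER
visited O
Stockholm B-LOC

She O
left O
"""

for toks, labels in read_conll(sample):
    print(list(zip(toks, labels)))
```

The B-/I-/O prefixes (Begin, Inside, Outside) mark multi-token entities such as "Angela Merkel" as a single PER span.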
Installation
pip install nerblackbox
Usage
See the documentation: https://af-ai-center.github.io/nerblackbox
Project details
Release history
Hashes for nerblackbox-0.0.9-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 839a8e4eac563e7007b382f266a48a1f54aca4874b0ccfd7f542b8028897860e
MD5 | 33ef6d486a8651deb5e3832475e588bc
BLAKE2b-256 | 4015818787488cd6856a89eddb3d331b7432d791968b16a476245433cb110382