nerblackbox: fine-tune transformer-based language models for named entity recognition
Project description
A Python package to fine-tune transformer-based language models for named entity recognition (NER).
Resources
Source Code: https://github.com/flxst/nerblackbox
Documentation: https://flxst.github.io/nerblackbox
About
Transformer-based language models like BERT have had a game-changing impact on Natural Language Processing.
To use Hugging Face's publicly available pretrained models for Named Entity Recognition, one needs to retrain (or "fine-tune") them on labeled text.
nerblackbox makes this easy.
You give it
a Dataset (labeled text)
a Pretrained Model (transformers)
and you get
the best Fine-tuned Model
its Performance on the dataset
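As a rough illustration of this workflow (a sketch only, not the definitive API: the class and method names, the experiment name, and the model/dataset identifiers below are assumptions based on the project documentation and may differ between versions; consult https://flxst.github.io/nerblackbox for the authoritative interface), a fine-tuning run might look like:

```python
# Illustrative sketch only -- names here are assumptions, not a verified API;
# see the official documentation: https://flxst.github.io/nerblackbox
from nerblackbox import NerBlackBox

nerbb = NerBlackBox()
nerbb.init()  # one-time setup of data and results directories (assumed)

# fine-tune a pretrained transformer on a labeled NER dataset
nerbb.run_experiment(
    "my_experiment",          # hypothetical experiment name
    model="bert-base-cased",  # pretrained model from Hugging Face
    dataset="conll2003",      # labeled NER dataset
)

# retrieve the best fine-tuned model's performance on the dataset
results = nerbb.get_experiment_results("my_experiment")
```

The inputs mirror the list above (a dataset and a pretrained model), and the outputs are the best fine-tuned model and its performance.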
Installation
pip install nerblackbox
Usage
See the documentation: https://flxst.github.io/nerblackbox
Citation
@misc{nerblackbox,
  author = {Stollenwerk, Felix},
  title  = {nerblackbox: a python package to fine-tune transformer-based language models for named entity recognition},
  year   = {2021},
  url    = {https://github.com/flxst/nerblackbox},
}