Fine-tune transformer-based models for Named Entity Recognition
Project description
A Python package to fine-tune transformer-based models for Named Entity Recognition (NER).
Resources
Source Code: https://github.com/af-ai-center/nerblackbox
Documentation: https://af-ai-center.github.io/nerblackbox
About
Transformer-based models like BERT have had a game-changing impact on Natural Language Processing.
To use the publicly available pretrained models for Named Entity Recognition, one needs to retrain (or "fine-tune") them on labeled text.
nerblackbox makes this easy.
You give it
- a Dataset (labeled text)
- a Pretrained Model (transformers)

and you get
- the best Fine-tuned Model
- its Performance on the dataset
Installation
pip install nerblackbox
Usage
Fine-tuning can be done in a few simple steps using an "experiment configuration file":
# cat <experiment_name>.ini
dataset_name = swedish_ner_corpus
pretrained_model_name = af-ai-center/bert-base-swedish-uncased
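The configuration file above can also be generated from Python with the standard library's `configparser`. This is only a sketch: the flat key layout mirrors the two keys shown above, but nerblackbox's exact expected file format (section headers, additional keys) is an assumption here — consult the documentation linked above for the authoritative format.

```python
# Sketch: writing a minimal <experiment_name>.ini programmatically.
# The two keys (dataset_name, pretrained_model_name) come from the example
# above; everything else about the layout is an assumption.
from configparser import ConfigParser
from pathlib import Path

def write_experiment_config(path: str, dataset: str, model: str) -> None:
    """Write a minimal experiment configuration file with the two keys."""
    config = ConfigParser()
    # configparser requires a section header; [DEFAULT] keeps keys visible
    # to all sections. Whether nerblackbox expects one is not confirmed here.
    config["DEFAULT"] = {
        "dataset_name": dataset,
        "pretrained_model_name": model,
    }
    with open(path, "w") as f:
        config.write(f)

write_experiment_config(
    "my_experiment.ini",
    dataset="swedish_ner_corpus",
    model="af-ai-center/bert-base-swedish-uncased",
)
print(Path("my_experiment.ini").read_text())
```
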
and either the Command Line Interface (CLI) or the Python API:
# CLI
nerbb run_experiment <experiment_name> # fine-tune
nerbb get_experiment_results <experiment_name> # get results/performance
nerbb predict <experiment_name> <text_input> # apply best model
# Python API
from nerblackbox import NerBlackBox

nerbb = NerBlackBox()
nerbb.run_experiment(<experiment_name>) # fine-tune
nerbb.get_experiment_results(<experiment_name>) # get results/performance
nerbb.predict(<experiment_name>, <text_input>) # apply best model
Hashes for nerblackbox-0.0.8-py3-none-any.whl
| Algorithm | Hash digest |
|---|---|
| SHA256 | df413efb73d09d2e4a5a38defec77f179f6e3add1d8c95139339e51be3e61ca9 |
| MD5 | d0689c0fd43737929da87cdae0d863d1 |
| BLAKE2b-256 | 84c5835c2fecccaa04dac2e7a9cc6649a0bad20206a766378f5fe7f7a74fb676 |