Fine-tune transformer-based models for Named Entity Recognition
Project description
A Python package to fine-tune transformer-based models for Named Entity Recognition (NER).
Resources
Source Code: https://github.com/af-ai-center/nerblackbox
Documentation: https://af-ai-center.github.io/nerblackbox
About
Transformer-based models like BERT have had a game-changing impact on Natural Language Processing.
To use the publicly available pretrained models for Named Entity Recognition, one needs to retrain (or “fine-tune”) them on labeled text.
nerblackbox makes this easy.
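“Labeled text” for NER usually means token-level annotations, most commonly in the BIO scheme (B- begins an entity, I- continues it, O is outside). A minimal, library-agnostic sketch of how such labels are read back into entity spans (this illustrates the general idea, not nerblackbox's exact dataset format):

```python
# Illustration of token-level NER labels in the standard BIO scheme.
# Not tied to nerblackbox's internal data format.

def extract_entities(tokens, tags):
    """Collect (entity_text, entity_type) pairs from BIO-tagged tokens."""
    entities = []
    current_tokens, current_type = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            # a new entity begins; close any open one first
            if current_tokens:
                entities.append((" ".join(current_tokens), current_type))
            current_tokens, current_type = [token], tag[2:]
        elif tag.startswith("I-") and current_tokens:
            # continuation of the current entity
            current_tokens.append(token)
        else:
            # "O" (or a stray I- tag) ends any open entity
            if current_tokens:
                entities.append((" ".join(current_tokens), current_type))
            current_tokens, current_type = [], None
    if current_tokens:
        entities.append((" ".join(current_tokens), current_type))
    return entities

tokens = ["Anna", "works", "at", "Ericsson", "in", "Stockholm"]
tags   = ["B-PER", "O", "O", "B-ORG", "O", "B-LOC"]
print(extract_entities(tokens, tags))
# [('Anna', 'PER'), ('Ericsson', 'ORG'), ('Stockholm', 'LOC')]
```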
You give it
- a Dataset (labeled text)
- a Pretrained Model (transformers)

and you get
- the best Fine-tuned Model
- its Performance on the dataset
Installation
pip install nerblackbox
Usage
Fine-tuning can be done in a few simple steps using an “experiment configuration file”:
# cat <experiment_name>.ini
dataset_name = swedish_ner_corpus
pretrained_model_name = af-ai-center/bert-base-swedish-uncased
and either the Command Line Interface (CLI) or the Python API:
# CLI
nerbb run_experiment <experiment_name> # fine-tune
nerbb get_experiment_results <experiment_name> # get results/performance
nerbb predict <experiment_name> <text_input> # apply best model
# Python API
from nerblackbox import NerBlackBox

nerbb = NerBlackBox()
nerbb.run_experiment(<experiment_name>) # fine-tune
nerbb.get_experiment_results(<experiment_name>) # get results/performance
nerbb.predict(<experiment_name>, <text_input>) # apply best model
Project details
Hashes for nerblackbox-0.0.7-py3-none-any.whl
| Algorithm | Hash digest |
|---|---|
| SHA256 | 7d797718515033760f70defb3295ebf53f013f58d40447ea73af7b4b66172aa4 |
| MD5 | 9561d03a0d70ea7549da5c30db714794 |
| BLAKE2b-256 | a182fc8251cad4d74de6f0fa244ec759a51067cb28b592a43ff4f985a6dbe672 |