fine-tune transformer-based language models for named entity recognition
Project description
A Python package to fine-tune transformer-based language models for Named Entity Recognition (NER).
Resources
Source Code: https://github.com/af-ai-center/nerblackbox
Documentation: https://af-ai-center.github.io/nerblackbox
About
Transformer-based language models like BERT have had a game-changing impact on Natural Language Processing.
To use Hugging Face's publicly available pretrained models for Named Entity Recognition, one must retrain (or "fine-tune") them on labeled text.
nerblackbox makes this easy.
You give it
a Dataset (labeled text)
a Pretrained Model (transformers)
and you get
the best Fine-tuned Model
its Performance on the dataset
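The workflow above might look like the sketch below. Note that the class and method names here are assumptions based on the package's documented workflow, not a verified API; consult the linked documentation for the actual interface.

```
# Illustrative sketch only — names are assumptions, see the documentation
from nerblackbox import NerBlackBox

nerbb = NerBlackBox()
nerbb.init()                           # set up data and experiment directories
nerbb.run_experiment("my_experiment")  # fine-tune the pretrained model on the dataset
results = nerbb.get_experiment_results("my_experiment")  # best model + its performance
```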
Installation
pip install nerblackbox
Usage
See the documentation: https://af-ai-center.github.io/nerblackbox
Hashes for nerblackbox-0.0.10-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 86f8acd45bff54fe2f97a3ecd5abe451d0682cc8fe3b2af367cf8026e09ed42e
MD5 | 6ca4373c4f831bb8d2c666cb46789a93
BLAKE2b-256 | 2e438c87e4e21c3da1fee3645614d2645198bbdc23c887cbeee49670942739b1