BENT: Biomedical Entity Annotator

Project description

Python Library for Named Entity Recognition (NER) and Linking (NEL) in the biomedical domain.

BENT can be used for:

  • Named Entity Recognition (NER)

  • Named Entity Linking (NEL)

  • Named Entity Recognition and Linking (NER+NEL)

Access the full documentation.

Citation:

Pedro Ruas and Francisco M. Couto. NILINKER: Attention-based approach to NIL entity linking.
Journal of Biomedical Informatics, 132:104137, 2022.
doi: https://doi.org/10.1016/j.jbi.2022.104137.

Installation

The current version of BENT requires:

  • OS: Debian>=11/Ubuntu>=20.04

  • Python >=3.7, <=3.10.13

  • Required space: between 5.5 GB and 10 GB
      • Dependencies: 2.5 GB
      • Data: between 3.0 GB (base) and 7.5 GB (if you use all available knowledge bases for Named Entity Linking)

If you have Docker installed in your system, the easiest way is to pull the BENT Docker image from DockerHub:

docker pull pedroruas18/bent

Alternatively, you can install the BENT package using pip:

pip install bent

After the pip installation, a further step is required to install non-Python dependencies and to download the necessary data. Run in the command line:

bent_setup

Only the default knowledge bases ‘medic’ and ‘chebi’ are available at this point.

To disable annoying messages in the terminal, run:

export TF_CPP_MIN_LOG_LEVEL='3'
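If you prefer to set this from Python instead of the shell, the variable must be assigned before TensorFlow (or any library that imports it) is loaded. A minimal sketch, using only the variable name shown above:

```python
import os

# Suppress TensorFlow C++ log messages (level 3 keeps errors only).
# This must run before the first import that pulls in TensorFlow,
# otherwise the setting has no effect.
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'

print(os.environ['TF_CPP_MIN_LOG_LEVEL'])
```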

You can download more knowledge bases later by specifying the desired ones among those available:

python -c "from bent.get_kbs import get_additional_kbs;get_additional_kbs([<kb1>, <kb2>])"

Several additional knowledge bases can be configured; see the documentation for the full list.

Example: to download the NCBI Taxonomy and the NCBI Gene run:

python -c "from bent.get_kbs import get_additional_kbs;get_additional_kbs(['ncbi_taxon', 'ncbi_gene'])"

Get started

To apply the complete entity extraction pipeline (NER+NEL), set the arguments:

  • recognize: set to True to apply the NER module

  • link: set to True to apply the NEL module

  • types: entity types to recognize and the respective target knowledge bases.

  • in_dir: directory path containing the text files to be annotated (the directory must contain text files exclusively)

  • out_dir: the output directory that will contain the annotation files

Python example:

import bent.annotate as bt

bt.annotate(
        recognize=True,
        link=True,
        types={
            'disease': 'medic',
            'chemical': 'chebi',
        },
        in_dir='data/txt/',
        out_dir='data/ann/'
)
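Because in_dir must contain text files exclusively, a quick pre-check before annotating can catch stray files early. The helper below is hypothetical (not part of the BENT API), using only the standard library:

```python
from pathlib import Path

def check_input_dir(in_dir):
    """Return the sorted .txt file names in in_dir.

    Raises ValueError if the directory contains any non-.txt file,
    since BENT expects the input directory to hold text files only.
    """
    files = [p for p in Path(in_dir).iterdir() if p.is_file()]
    non_txt = [p.name for p in files if p.suffix != '.txt']
    if non_txt:
        raise ValueError(f'in_dir must contain text files only; found: {non_txt}')
    return sorted(p.name for p in files)
```

For example, calling check_input_dir('data/txt/') before bt.annotate(...) fails fast with a clear message instead of surfacing an error mid-pipeline.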

It is also possible to apply the pipeline (NER+NEL) to a string or a list of strings instantiated in the execution script.

To see more usage examples, access the documentation.

