
Le's Lesk

A fast Python 3 Word-Sense Disambiguation (WSD) package using the extended Lesk algorithm

Install

lelesk is available on PyPI and can be installed using pip:

pip install lelesk

Lelesk uses the NLTK lemmatizer and the yawlib WordNet API. To install the NLTK data, start a Python prompt, import nltk, and download the required packages:

$ python3
Python 3.6.9 (default, Jan 26 2021, 15:33:00) 
[GCC 8.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import nltk
>>> nltk.download(['stopwords', 'punkt', 'averaged_perceptron_tagger', 'wordnet'])
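If you prefer a non-interactive setup (for example in a provisioning script or CI job), the same downloads can be run as a short script; this is only a convenience wrapper around the nltk.download call shown above.

# download_nltk_data.py - fetch the NLTK resources used by lelesk
import nltk

# Same package ids as in the interactive session above
nltk.download(['stopwords', 'punkt', 'averaged_perceptron_tagger', 'wordnet'])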

Download and extract the yawlib pre-built databases to ~/wordnet.
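As a quick sanity check before running the commands below, you can confirm that the data folder is in place. The exact database filenames depend on which pre-built archive you downloaded, so this sketch only verifies that ~/wordnet exists and is not empty:

# check_wordnet_home.py - verify the yawlib data folder is in place (sketch)
from pathlib import Path

wordnet_home = Path.home() / "wordnet"
entries = list(wordnet_home.glob("*")) if wordnet_home.is_dir() else []
if not entries:
    raise SystemExit("No yawlib databases found under {}".format(wordnet_home))
print("Found {} item(s) under {}".format(len(entries), wordnet_home))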


Command-line tools

To disambiguate a sentence, run this command in the terminal:

python3 -m lelesk wsd "I go to the bank to get money."

To perform word-sense disambiguation on a text file, prepare a text file with one sentence per line.

For example, here is the content of the file demo.txt:

I go to the bank to withdraw money.
I sat at the river bank.

You can then run one of the following commands:

# output to TTL/JSON (a single file)
python3 -m lelesk file demo.txt demo_wsd_output.json --ttl json

# output to TTL/TSV (multiple TSV files)
python3 -m lelesk file demo.txt demo_wsd_output.json --ttl tsv
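The same batch workflow can be scripted with the Python standard library. The sketch below writes demo.txt and then invokes the documented file command; it makes no assumptions about the structure of the resulting TTL/JSON output.

# batch_wsd.py - prepare demo.txt and run the lelesk CLI on it (sketch)
import subprocess
import sys

sentences = [
    "I go to the bank to withdraw money.",
    "I sat at the river bank.",
]

# One sentence per line, as expected by the file command
with open("demo.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(sentences) + "\n")

# Equivalent to: python3 -m lelesk file demo.txt demo_wsd_output.json --ttl json
subprocess.run(
    [sys.executable, "-m", "lelesk", "file", "demo.txt", "demo_wsd_output.json", "--ttl", "json"],
    check=True,
)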

Issues

If you have any issues, please report them at https://github.com/letuananh/lelesk/issues
