
Python package for creating labeled examples from wiki dumps

Project description

Wikipedia NER
-------------

Tool to obtain labeled named entity recognition examples from Wikipedia dumps for training.

Usage in [IPython notebook](http://nbviewer.ipython.org/github/JonathanRaiman/wikipedia_ner/blob/master/Wikipedia%20to%20Named%20Entity%20Recognition.ipynb) (*nbviewer* link).
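
The package is published on PyPI as `wikipedia-ner` (the name of the source distribution listed below), so it can presumably be installed with `pip install wikipedia-ner`.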

## Usage

Here is an example using the first 200 articles from the English Wikipedia dump (dated late 2013):

    import wikipedia_ner

    # Parse the first 200 articles of the dump.
    parseresult = wikipedia_ner.parse_dump("enwiki.bz2",
                                           max_articles=200)

    # Most frequently seen category across the parsed pages.
    most_common_category = wikipedia_ner.ParsedPage.categories_counter.most_common(1)[0][0]

    # Resolve the category's children back to their page names.
    most_common_category_children = [
        parseresult.index2target[child]
        for child in list(wikipedia_ner.ParsedPage.categories[most_common_category].children)
    ]

    "In '%s' the children are %r" % (
        most_common_category,
        ", ".join(most_common_category_children)
    )

    #=> "In 'Category : Member states of the United Nations' the children are 'Afghanistan, Algeria, Andorra, Antigua and Barbuda, Azerbaijan, Angola, Albania'"



Download files


Source Distribution

wikipedia-ner-0.0.7.tar.gz (73.5 kB)

File details

Hashes for wikipedia-ner-0.0.7.tar.gz
| Algorithm   | Hash digest |
|-------------|-------------|
| SHA256      | 1c290ea1d4af2f6e0d2ca867a32d91c1443c3746b2cf75eed0d57ea88a2a19b4 |
| MD5         | 1eae40c5b140a02528a4e281d9d6c041 |
| BLAKE2b-256 | 9190a0d8a18c81747c5d6eb34c55b75905902ee56dcf5fb54e3e753ab00b8f04 |

