Python bindings to a Rust library that helps process Wikipedia's data dumps.
Wikicleaner
Wikipedia provides a series of dumps in XML format. While developing this project, I personally used the Simple English Wikipedia dumps.
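As a point of reference, here is a minimal stdlib sketch (not part of this library) of how pages can be streamed out of a dump-like XML document with `xml.etree.ElementTree.iterparse`. The sample XML and the `iter_pages` helper are hypothetical; real dumps are bz2-compressed and use an XML namespace, so tags would need a namespace prefix.

```python
import xml.etree.ElementTree as ET
from io import BytesIO

# A tiny stand-in for a dump file. Real dumps are .xml.bz2 and declare an
# xmlns, which prefixes every tag (e.g. "{http://...}page").
SAMPLE = b"""<mediawiki>
  <page>
    <title>April</title>
    <revision><text>[[File:x.jpg|thumb]] April comes between [[March]] and [[May]]</text></revision>
  </page>
</mediawiki>"""

def iter_pages(xml_bytes):
    """Yield (title, raw wikitext) pairs from a dump-like XML stream."""
    for _, elem in ET.iterparse(BytesIO(xml_bytes), events=("end",)):
        if elem.tag == "page":
            title = elem.findtext("title")
            text = elem.findtext("revision/text") or ""
            yield title, text
            elem.clear()  # free the subtree so memory stays flat on big dumps

pages = list(iter_pages(SAMPLE))
```

The raw `<text>` extracted this way is what a cleaning routine would then operate on.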
Purpose
The intent of this project is to provide low-level routines for manipulating the content of XML dumps. One such routine strips annotations like [[File:*]] and {{some metadata}} in order to make the input more digestible for AI applications that use embeddings.
Example Usage
>>> import wikicleaner as wc
>>> article_raw_text = "[[File:Colorful spring garden.jpg|thumb|180px|right|[[Spring]] flowers in April in the [[Northern Hemisphere]].]] April comes between [[March]] and [[May]]"
>>> wc.clean_article_text(article_raw_text)
' April comes between March and May'
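To illustrate what this kind of cleaning involves, here is a pure-Python approximation of the behavior shown above. It is a hypothetical sketch, not the library's actual implementation: the Rust routine may handle more markup and edge cases. Note that a simple regex is not enough for wikilinks, because [[File:...]] blocks can contain nested [[...]] links.

```python
import re

def clean_article_text(text: str) -> str:
    """Approximate wikitext cleaner (illustrative only).

    - {{...}} template/metadata blocks are dropped.
    - [[File:...]] / [[Image:...]] blocks are dropped, including nested links.
    - Plain [[link]] and [[link|label]] wikilinks are unwrapped to their text.
    """
    # Strip {{ ... }} templates, innermost first, until none remain.
    while True:
        text, n = re.subn(r"\{\{[^{}]*\}\}", "", text)
        if n == 0:
            break

    out = []
    i = 0
    while i < len(text):
        if text.startswith("[[", i):
            # Find the matching "]]", accounting for nested "[[...]]".
            depth, j = 1, i + 2
            while j < len(text) and depth:
                if text.startswith("[[", j):
                    depth += 1; j += 2
                elif text.startswith("]]", j):
                    depth -= 1; j += 2
                else:
                    j += 1
            inner = text[i + 2 : j - 2]
            if not inner.startswith(("File:", "Image:")):
                # Keep the label after the last '|', cleaning it recursively.
                out.append(clean_article_text(inner.rsplit("|", 1)[-1]))
            i = j
        else:
            out.append(text[i]); i += 1
    return "".join(out)
```

On the example article text above, this sketch reproduces the documented output, including the leading space left behind where the [[File:...]] block was removed.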
Source Distribution
wikicleaner-0.1.7.tar.gz (24.2 kB)
Built Distribution
Hashes for wikicleaner-0.1.7-cp311-cp311-manylinux_2_34_x86_64.whl

Algorithm | Hash digest
---|---
SHA256 | 56d38be6a8a740bb0cb522b1dcc819b6594dc05d6345931846432e6f9ea8a3fa
MD5 | fa57c710c0e0ba155a0f5b05efd94399
BLAKE2b-256 | 83bb686d1c655e752943f2db39e6421eaf2cb58a0a32ae2f8a6cfca4f8c5e7cf