A set of utilities for processing MediaWiki XML dump data.
# MediaWiki XML
This library contains a collection of utilities for efficiently processing MediaWiki's XML database dumps. It addresses two concerns: the complexity and the performance of streaming XML parsing. The library enables memory-efficient stream processing of XML dumps through a simple [iterator](https://pythonhosted.org/mwxml/iteration.html) strategy, and it implements a distributed processing strategy (see [map()](https://pythonhosted.org/mwxml/map.html)) that enables parallel processing of many XML dump files at the same time.
* **Installation:** `pip install mwxml`
* **Documentation:** https://pythonhosted.org/mwxml
* **Repository:** https://github.com/mediawiki-utilities/python-mwxml
* **License:** MIT
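For intuition about what memory-efficient streaming means here, the approach can be sketched with the standard library's incremental parser. The toy XML below is a stand-in for a real dump (which would be far larger and use XML namespaces), and `iter_revision_ids` is a hypothetical helper, not part of mwxml:

```python
import io
import xml.etree.ElementTree as ET

# Toy stand-in for a dump; real MediaWiki dumps are namespaced and much larger.
DUMP = io.StringIO(
    "<mediawiki>"
    "<page><title>A</title><revision><id>1</id></revision></page>"
    "<page><title>B</title><revision><id>2</id></revision></page>"
    "</mediawiki>"
)

def iter_revision_ids(fileobj):
    """Stream revision ids without loading the whole tree into memory."""
    for event, elem in ET.iterparse(fileobj, events=("end",)):
        if elem.tag == "page":
            for rev in elem.iter("revision"):
                yield int(rev.find("id").text)
            elem.clear()  # discard the processed subtree to keep memory flat

print(list(iter_revision_ids(DUMP)))  # -> [1, 2]
```

Clearing each `<page>` element after it is handled is the key move: only one page's subtree is ever resident, which is what makes streaming over multi-gigabyte dumps feasible.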
## Example
```python
>>> import mwxml
>>>
>>> dump = mwxml.Dump.from_file(open("dump.xml"))
>>> print(dump.site_info.name, dump.site_info.dbname)
Wikipedia enwiki
>>>
>>> for page in dump:
...     for revision in page:
...         print(revision.id)
...
1
2
3
```
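The distributed strategy mentioned above hands each dump file to its own worker and collects the results. The idea can be sketched with a pool from the standard library; the `DUMPS` mapping, paths, and `revision_ids` worker below are illustrative stand-ins, not mwxml's API (mwxml's `map()` would instead open each path as a `Dump` and pass it to your function):

```python
import xml.etree.ElementTree as ET
from multiprocessing.dummy import Pool  # thread-based pool, for the sketch only

# Stand-in dump contents keyed by hypothetical paths; a real run would
# read actual dump files from disk.
DUMPS = {
    "enwiki-part1.xml": "<mediawiki><page><revision><id>1</id></revision></page></mediawiki>",
    "enwiki-part2.xml": "<mediawiki><page><revision><id>2</id></revision></page></mediawiki>",
}

def revision_ids(path):
    """Per-file worker: parse one dump and return its revision ids."""
    root = ET.fromstring(DUMPS[path])
    return [int(e.text) for e in root.iter("id")]

with Pool(processes=2) as pool:
    results = pool.map(revision_ids, sorted(DUMPS))
print(results)  # -> [[1], [2]]
```

Because each worker only ever touches its own file, the files can be processed in any order and in parallel, which is what makes this strategy scale across many dump parts.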
## Author
* Aaron Halfaker – https://github.com/halfak
## See also
* http://dumps.wikimedia.org/
* http://community.wikia.com/wiki/Help:Database_download