
Wikipedia and MediaWiki API wrapper for Python

Project description


*mediawiki* is a Python wrapper and parser for the MediaWiki API. Its goal is to let users quickly and efficiently pull data from the MediaWiki site of their choice without dealing directly with the API. As such, it does not force the use of a particular MediaWiki site: it defaults to Wikipedia, but any other MediaWiki site can be used.

mediawiki wraps the MediaWiki API so you can focus on using your favorite MediaWiki site's data, not on fetching it. Please check out the code on GitHub!
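For example, targeting a non-Wikipedia MediaWiki site only requires passing that site's api.php endpoint at construction time. This is a sketch following the library's documented `url` parameter; the endpoint and search term below are purely illustrative:

>>> from mediawiki import MediaWiki
>>> # Point the wrapper at another MediaWiki installation by its API endpoint
>>> asoiaf = MediaWiki(url='https://awoiaf.westeros.org/api.php')
>>> asoiaf.search('arya')

All of the methods shown in the Quickstart below work the same way against the custom site.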

Note: this library was designed for ease of use and simplicity. If you plan on doing serious scraping, automated requests, or editing, please look into Pywikibot, which has a larger API, advanced rate limiting, and other features that help keep load on the MediaWiki infrastructure considerate. Pywikibot also has additional features such as support for Wikibase (which runs Wikidata).

Installation

Pip Installation:

$ pip install pymediawiki

To install from source, clone the repository from GitHub, then run the following from within the folder:

$ python setup.py install

mediawiki supports Python versions 3.5–3.9.

For Python 2.7 support, install release 0.6.7:

$ pip install pymediawiki==0.6.7

Documentation

Documentation of the latest release is hosted on readthedocs.io

To build the documentation yourself run:

$ pip install sphinx
$ cd docs/
$ make html

Automated Tests

To run the automated tests, run the following command from the project folder:

$ python setup.py test

Quickstart

Import mediawiki and run a standard search against Wikipedia:

>>> from mediawiki import MediaWiki
>>> wikipedia = MediaWiki()
>>> wikipedia.search('washington')

Run more advanced searches:

>>> wikipedia.opensearch('washington')
>>> wikipedia.allpages('a')
>>> wikipedia.geosearch(title='washington, d.c.')
>>> wikipedia.geosearch(latitude='0.0', longitude='0.0')
>>> wikipedia.prefixsearch('arm')
>>> wikipedia.random(pages=10)

Pull a MediaWiki page and some of the page properties:

>>> p = wikipedia.page('Chess')
>>> p.title
>>> p.summary
>>> p.categories
>>> p.images
>>> p.links
>>> p.langlinks
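Page lookups can land on disambiguation pages. A minimal sketch of handling that case, using the library's exported DisambiguationError (the page title below is illustrative, and the error's options attribute lists the candidate pages):

>>> from mediawiki import MediaWiki, DisambiguationError
>>> wikipedia = MediaWiki()
>>> try:
...     p = wikipedia.page('Mercury')
... except DisambiguationError as e:
...     # e.options holds the titles the ambiguous query could refer to
...     print(e.options)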

See the documentation for more examples!

Changelog

Please see the changelog for a list of all changes.

License

MIT licensed. See the LICENSE file for full details.
