Python-based ETL and ELT framework for Wikidata
Jump to: Data • Query Data • Upload Data • Maps • Query Maps • Upload Maps • To-Do
wikirepo is a Python package that provides an ETL framework to easily source and leverage standardized Wikidata information. The current focus is to create an intuitive interface so that Wikidata can function as a common repository for public social science statistics.
Installation via PyPI

```bash
pip install wikirepo
```

```python
import wikirepo
```
Data
wikirepo's data structure is built around Wikidata.org. Human-readable access to Wikidata statistics is achieved by converting requests into Wikidata item IDs (QIDs) and property IDs (PIDs), with the Python package `wikidata` serving as the basis for data loading and indexing. The wikirepo community aims to work with Wikidata to derive and add needed statistics, thus playing an integral role in growing the premier free and open-source online knowledge base.
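For orientation, this is the kind of QID/PID lookup that wikirepo automates. A minimal sketch using the `wikidata` package's documented `Client` API (Q183 is Germany's QID and P36 is the PID for "capital"):

```python
from wikidata.client import Client

# Resolve entities directly by their Wikidata identifiers
client = Client()
germany = client.get("Q183", load=True)       # item: Germany
capital_prop = client.get("P36", load=True)   # property: capital

print(germany.label)                # Germany
print(germany[capital_prop].label)  # Berlin
```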
Query Data
wikirepo's main ETL access function, `wikirepo.data.query`, returns a `pandas.DataFrame` of locations and property data across time. `wikirepo.data.query` accesses `data.data_utils.query_repo_dir`, with desired statistics coming from the `query_prop_data` functions of the `wikirepo/data` directory modules; results are then merged across modules and directories.
The query structure streamlines not just data extraction, but also the process of adding new wikirepo properties for all to use. Adding a new property is as simple as adding a module to an appropriate `wikirepo/data` directory, with most data modules amounting to six defined variables and a single function call (see the sketch below). wikirepo is self-indexing, so any property module added is automatically accessible by `wikirepo.data.query`. See `data/demographic/population` for the general structure of data modules, and `examples/add_property` for a quick demo on adding new properties to wikirepo.
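A hypothetical sketch of what such a module might look like. The variable names and the `query_wd_prop` helper call below are assumptions patterned on the description above, not copied from the package; consult `data/demographic/population` for the actual structure:

```python
# wikirepo/data/demographic/median_age.py (hypothetical example module)
from wikirepo.data import data_utils

# Assumed module-level variables telling the query engine what to fetch
pid = "P0000"            # placeholder Wikidata PID for the statistic
sub_pid = None           # qualifier PID, if the value needs one
col_name = "median_age"  # dataframe column the values land in
col_prefix = None        # prefix for multi-column properties
ignore_char = ""         # characters to strip from raw values
span = False             # False: point-in-time values; True: start/end spans


def query_prop_data(dir_name, ents_dict, locations, depth, time_lvl, timespan):
    """Single function call handing the defined variables to the shared loader
    (the helper name is an assumption for illustration)."""
    df, ents_dict = data_utils.query_wd_prop(
        dir_name, ents_dict, locations, depth, time_lvl, timespan,
        pid, sub_pid, col_name, col_prefix, ignore_char, span,
    )
    return df, ents_dict
```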
Each query needs the following inputs:
- `locations`: the locations that data should be queried for
  - Strings are accepted for `Earth`, continents, and countries
  - The user can also pass Wikidata QIDs directly
- `depth`: the geographic level of the given locations to query
  - A depth of 0 is the locations themselves
  - Greater depths correspond to lower geographic levels (states of countries, etc.)
  - A dictionary of locations is generated for lower depths (see the second example below)
- `time_lvl`: `yearly`, `monthly`, `weekly`, or `daily` as strings
  - If not provided, the most recent data will be retrieved, with an annotation for when it's from
- `timespan`: start and end `datetime.date` objects to subset results based on `time_lvl`
- Further arguments: the names of modules in `wikirepo/data`'s directories
  - These are passed to the arguments corresponding to their directories
  - Data will be queried for these properties for the given `locations`, `depth`, `time_lvl`, and `timespan`, with results merged as dataframe columns
Queries are also able to access information in Wikidata sub-pages for locations. For example: if inflation rate is not found on a location's main page, then wikirepo checks the location's economic topic page, as `inflation_rate.py` is found in `wikirepo/data/economic` (see Germany and economy of Germany).
wikirepo further provides a unique dictionary class, `EntitiesDict`, that stores all loaded Wikidata entities during a query. This speeds up data retrieval, as entities are loaded once and then accessed in the `EntitiesDict` object for any other needed properties.
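Assuming `EntitiesDict` exposes the standard mapping interface (a sketch of the caching idea, not verified against the package):

```python
from wikirepo.data import wd_utils

# One EntitiesDict can back any number of queries
ents_dict = wd_utils.EntitiesDict()

# ... run wikirepo.data.query(ents_dict=ents_dict, ...) one or more times ...

# Entities are cached by QID, so later queries touching the same
# locations read from ents_dict rather than re-calling Wikidata.
print(list(ents_dict.keys()))  # e.g. ['Q183', 'Q30', 'Q148'] after a query
```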
Examples of `wikirepo.data.query` follow:
Querying information for given countries
```python
import wikirepo
from wikirepo.data import wd_utils
from datetime import date

ents_dict = wd_utils.EntitiesDict()
# Strings must match their Wikidata English page name
countries = ["Germany", "United States of America", "People's Republic of China"]
depth = 0
time_lvl = "yearly"
timespan = (date(2009, 1, 1), date(2010, 1, 1))

df = wikirepo.data.query(
    ents_dict=ents_dict,
    locations=countries,
    depth=depth,
    time_lvl=time_lvl,
    timespan=timespan,
    demographic_props=["population", "life_expectancy"],
    economic_props="median_income",
    electoral_poll_props=False,
    electoral_result_props=False,
    geographic_props=False,
    institutional_props="human_dev_idx",
    political_props="executive",
    misc_props=False,
    verbose=True,
)

col_order = [
    "location", "qid", "year", "executive", "population",
    "life_exp", "human_dev_idx", "median_income",
]
df = df[col_order]

df.head(6)
```
| location | qid | year | executive | population | life_exp | human_dev_idx | median_income |
|---|---|---|---|---|---|---|---|
| Germany | Q183 | 2010 | Angela Merkel | 8.1752e+07 | 79.9878 | 0.921 | 33333 |
| Germany | Q183 | 2009 | Angela Merkel | nan | 79.8366 | 0.917 | nan |
| United States of America | Q30 | 2010 | Barack Obama | 3.08746e+08 | 78.5415 | 0.914 | 43585 |
| United States of America | Q30 | 2009 | George W. Bush | nan | 78.3902 | 0.91 | nan |
| People's Republic of China | Q148 | 2010 | Wen Jiabao | 1.35976e+09 | 75.236 | 0.706 | nan |
| People's Republic of China | Q148 | 2009 | Wen Jiabao | nan | 75.032 | 0.694 | nan |
Querying information for all US counties
```python
# Note: >3000 regions, expect an hour runtime
import wikirepo
from wikirepo.data import lctn_utils, wd_utils
from datetime import date

ents_dict = wd_utils.EntitiesDict()
depth = 2  # 2 for counties, 1 for states and territories
country = "United States of America"
sub_lctns = True  # for all
time_lvl = "yearly"
# Only valid sub-locations given the timespan will be queried
timespan = (date(2016, 1, 1), date(2018, 1, 1))

us_counties_dict = lctn_utils.gen_lctns_dict(
    ents_dict=ents_dict,
    depth=depth,
    locations=country,
    sub_lctns=sub_lctns,
    time_lvl=time_lvl,
    timespan=timespan,
    verbose=True,
)

df = wikirepo.data.query(
    ents_dict=ents_dict,
    locations=us_counties_dict,
    depth=depth,
    time_lvl=time_lvl,
    timespan=timespan,
    demographic_props="population",
    economic_props=False,
    electoral_poll_props=False,
    electoral_result_props=False,
    geographic_props="area",
    institutional_props="capital",
    political_props=False,
    misc_props=False,
    verbose=True,
)

df[df["population"].notnull()].head(6)
```
| location | sub_lctn | sub_sub_lctn | qid | year | population | area_km2 | capital |
|---|---|---|---|---|---|---|---|
| United States of America | California | Alameda County | Q107146 | 2018 | 1.6602e+06 | 2127 | Oakland |
| United States of America | California | Contra Costa County | Q108058 | 2018 | 1.14936e+06 | 2078 | Martinez |
| United States of America | California | Marin County | Q108117 | 2018 | 263886 | 2145 | San Rafael |
| United States of America | California | Napa County | Q108137 | 2018 | 141294 | 2042 | Napa |
| United States of America | California | San Mateo County | Q108101 | 2018 | 774155 | 1919 | Redwood City |
| United States of America | California | Santa Clara County | Q110739 | 2018 | 1.9566e+06 | 3377 | San Jose |
Upload Data
`wikirepo.data.upload` will be the core of the eventual wikirepo ELT process. The goal is to record edits that a user makes to a previously queried dataframe such that these changes can then be pushed back to Wikidata. This process could be as simple as making changes to a `df.copy()` of a queried dataframe, then using pandas to compare the new and original dataframes after the user has added information that they have access to. The unique information in the edited dataframe could then be loaded into Wikidata for all to use.
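A minimal sketch of that comparison step, continuing from the first query example above and using only standard pandas (the filled-in value is illustrative):

```python
# Edit a copy of a previously queried dataframe
df_edited = df.copy()

# Fill in a value the user has access to (illustrative number only)
mask = (df_edited["location"] == "Germany") & (df_edited["year"] == 2009)
df_edited.loc[mask, "population"] = 81_802_000

# pandas isolates exactly which cells differ between the two frames;
# these differences are what an upload function would push to Wikidata
changes = df.compare(df_edited)
print(changes)
```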
The same process used to query information from Wikidata could be reversed for the upload process. `wikirepo/data` property modules could each have a corresponding `upload_prop_data` function that links dataframe columns to their Wikidata properties and, through the modules' defined variables, indicates whether time qualifiers are points in time or spans using start time and end time; other qualifiers needed for proper data indexing could also be derived. Source information could additionally be added in columns corresponding to the given property edits.
Put simply: a full-featured `wikirepo.data.upload` function would realize the potential of a single open-source repository for all public social science information.
Maps
`wikirepo/maps` is a further goal of the project, as it would combine wikirepo's focus on easily accessible open-source data with quick, high-level analytics.
Query Maps
As in `wikirepo.data.query`, passing the `depth`, `locations`, `time_lvl`, and `timespan` arguments could access GeoJSON files stored on Wikidata, thus providing mapping files in parallel to the user's data. These files could then be leveraged using existing Python plotting libraries to provide detailed presentations of geographic analysis.
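If such files were returned, pairing them with a query result could look like the following. This is a hypothetical sketch: `wikirepo.maps.query` does not exist yet, the file name and join columns are assumptions, and the GeoJSON handling uses the geopandas API:

```python
import geopandas as gpd

# Hypothetical: a maps query would return a GeoJSON file alongside
# the queried dataframe df (placeholder file name below)
gdf = gpd.read_file("us_counties.geojson")

# Join geometries to a wikirepo dataframe on a shared location column
gdf = gdf.merge(df, left_on="name", right_on="sub_sub_lctn")

# One-line choropleth of a queried property
gdf.plot(column="population", legend=True)
```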
Upload Maps
Similar to the potential of adding statistics through `wikirepo.data.upload`, GeoJSON map files could also be uploaded to Wikidata using appropriate arguments. The potential exists for a myriad of variable maps given `depth`, `locations`, `time_lvl`, and `timespan` information that would allow all wikirepo users to get the exact mapping file they need for their given task.
To-Do
Expanding wikirepo's data infrastructure
The growth of wikirepo's database relies on that of Wikidata. Beyond simply adding entries to already existing properties, the following are examples of property types that could be included:
- Those for electoral polling and results for locations
  - This would allow direct access to all needed election information in a single function call
  - Data could be added to Wikidata sub-pages for locations and linked to via `data.wd_utils.dir_to_topic_page`
- A property that links political parties and their regions in `data/political`
  - For easy professional presentation of electoral results (ex: loading in party hex colors, abbreviations, and alignments)
- `data/demographic` properties such as:
  - age, education, religious, and linguistic diversities across time
- `data/economic` properties such as:
  - female workforce participation, workforce industry diversity, wealth diversity, and total working age population across time
- Distinct properties for the Freedom House and Press Freedom indexes, as well as other descriptive metrics
  - These could be added to `data/institutional`
Further ways to help
- Integrating current Python tools with wikirepo ETL structures for ELT uploads to Wikidata
- Adding multiprocessing support to the `wikirepo.data.query` process and `data.lctn_utils.gen_lctns_dict`
- Optimizing `wikirepo.data.query`:
  - Potentially converting `EntitiesDict` and `LocationsDict` to slotted object classes for memory savings (see the sketch after this list)
  - Deriving and optimizing other slow parts of the query process
- Adding access to GeoJSON files for mapping via `wikirepo.maps.query`
- Designing and adding GeoJSON files indexed by time properties to Wikidata
- Creating and improving examples, as well as sharing them around the web
- Testing for wikirepo
- A Read the Docs page
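For the slotted-classes idea above, the memory saving comes from replacing per-instance `__dict__`s with fixed slots. A generic sketch of the technique, not wikirepo's actual classes:

```python
import sys

class EntityRecord:
    """Regular class: each instance carries a mutable __dict__."""
    def __init__(self, qid, label):
        self.qid = qid
        self.label = label

class SlottedEntityRecord:
    """Slotted class: fixed attribute slots, no per-instance __dict__."""
    __slots__ = ("qid", "label")
    def __init__(self, qid, label):
        self.qid = qid
        self.label = label

plain = EntityRecord("Q183", "Germany")
slotted = SlottedEntityRecord("Q183", "Germany")

print(sys.getsizeof(plain) + sys.getsizeof(plain.__dict__))  # instance + dict
print(sys.getsizeof(slotted))  # slotted instance alone is smaller
```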