Break WikiData dumps into smaller knowledge graphs
Project description
Breaking WikiData dumps into smaller knowledge graphs (e.g. the graph of human entities).
Free software: BSD license
Documentation: https://wikidatasets.readthedocs.io.
Dataset downloads: here
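For orientation, the sketch below shows the kind of workflow the package supports: scanning a full Wikidata JSON dump, keeping only entities that are instances of a target class (here humans, Q5), and writing the resulting subgraph files to disk. The module and function names (wikidatasets.processFunctions, get_subclasses, query_wikidata_dump, build_dataset) follow the documentation linked above, but the exact signatures are assumptions and should be checked against the installed version.

# Hypothetical usage sketch: module and function names follow the package's
# documentation, but exact signatures may differ between versions.
from wikidatasets.processFunctions import get_subclasses, query_wikidata_dump, build_dataset

dump_path = 'latest-all.json.bz2'  # full Wikidata JSON dump (assumed to be a local file)
path = 'humans/'                   # working directory for intermediate and output files

# Q5 is the Wikidata item for "human"; collect it together with its subclasses.
test_entities = get_subclasses('Q5')

# Pass over the dump, keeping only facts about entities that are instances of
# one of the target classes, and collect entity labels along the way.
# n_lines is an approximate line count of the dump, used for progress reporting (assumption).
query_wikidata_dump(dump_path, path, n_lines=56000000,
                    test_entities=test_entities, collect_labels=True)

# Turn the intermediate files into the final edge-list and label files.
build_dataset(path, labels=None)  # signature is an assumption; check the documentation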
Citations
If you find this code useful in your research, please consider citing our paper:
@article{boschin_wikidatasets_2019,
    title = {{WikiDataSets}: {Standardized} sub-graphs from {Wikidata}},
    shorttitle = {{WikiDataSets}},
    url = {http://arxiv.org/abs/1906.04536},
    journal = {arXiv:1906.04536 [cs, stat]},
    author = {Boschin, Armand and Bonald, Thomas},
    month = oct,
    year = {2019},
    note = {arXiv: 1906.04536},
    keywords = {Computer Science - Artificial Intelligence, Computer Science - Machine Learning, Computer Science - Social and Information Networks, Statistics - Machine Learning}
}
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
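Both distributions can also be installed directly from PyPI with pip install wikidatasets.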
Source Distribution
wikidatasets-0.3.0.tar.gz (6.9 kB)
Built Distribution
wikidatasets-0.3.0-py2.py3-none-any.whl

Hashes for wikidatasets-0.3.0-py2.py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 9c57dff4b627bf362c5b33c7d877a192beca12f151c89b411038827e716390f4
MD5 | be6b4a12dd09cd7b96ed8402049584bb
BLAKE2b-256 | 284da2e856c35bdf2a6bcd538d70480f2a3ec5bdcdb805c7df392d347c94ad47