
pyeurlex package

This is a Python module to create SPARQL queries for the EU Cellar repository, run them, and download the resulting data. Notably, it directly supports all resource types. Some parts, like the SPARQL queries, are based on the R eurlex package by Michal Ovadek, but I wanted one for Python and was not satisfied with the existing Python packages.

Usage

Import and instantiate the module

from eurlex import Eurlex
eur = Eurlex()

Then you can construct your query (or, alternatively, use your own, e.g. one built with the wizard at https://op.europa.eu/en/advanced-sparql-query-editor).

q = eur.make_query(resource_type = "caselaw", order = True, limit = 10)
print(q)
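If you prefer to write the query by hand, the sketch below shows the general shape of SPARQL that Cellar accepts. The cdm: prefix and the resource_legal_id_celex predicate come from Cellar's CDM ontology; the exact variables and predicates make_query() emits may differ, so treat this as an illustration rather than the package's literal output.

```python
# A minimal hand-written SPARQL query against Cellar. The variable
# names are illustrative; only the cdm: namespace is Cellar's own.
q = """
PREFIX cdm: <http://publications.europa.eu/ontology/cdm#>
SELECT DISTINCT ?work ?celex
WHERE {
  ?work cdm:resource_legal_id_celex ?celex .
}
ORDER BY ?celex
LIMIT 10
"""
print(q)
```

Such a string can be passed to query_eurlex() in place of a generated query.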

Finally, you can run this query.

d = eur.query_eurlex(q)  # where q is a query generated in a previous step or a string defined by you
print(d)

This will return a pandas DataFrame of the results. Its columns depend on the fields that you included. At the moment, not all fields are named properly in the DataFrame, so you will have to rename them manually if desired.
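Renaming the improperly named columns is plain pandas; the frame and the unnamed column "callret-1" below are hypothetical stand-ins for whatever your query returns:

```python
import pandas as pd

# Stand-in for the frame returned by query_eurlex(); "callret-1"
# is a made-up example of an unnamed result column.
d = pd.DataFrame({"work": ["w1", "w2"],
                  "callret-1": ["61962CJ0026", "32016R0679"]})

# Give the unnamed column a meaningful name.
d = d.rename(columns={"callret-1": "celex"})
print(list(d.columns))
```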

Once you pick a single URL or identifier from the DataFrame, you can download a notice or data based on that identifier. To download the notices as XML, use download_xml() as below.

x = eur.download_xml("32014R0001", notice="tree") # without the file parameter, the CELEX number is used as the filename
print(x)
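If you want to persist the notice yourself, the returned string can be written to disk with the standard library; the stub below stands in for the actual XML returned by download_xml():

```python
from pathlib import Path

# Stub for the XML string returned by download_xml("32014R0001", ...).
x = "<NOTICE><WORK/></NOTICE>"

# Save it under the CELEX number, mirroring the default filename.
Path("32014R0001.xml").write_text(x, encoding="utf-8")
print(Path("32014R0001.xml").stat().st_size)
```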

To get the data associated with an identifier, use get_data(). This will return the data as a string.

d = eur.get_data("http://publications.europa.eu/resource/celex/32016R0679", type="text")
print(d)
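Since the result is a plain string, it is ready for ordinary text processing; for example, a quick phrase count (the text below is a stub for what get_data() actually returns):

```python
# Stub for the document text returned by get_data(..., type="text").
d = "Article 1 protects personal data. Article 2 defines personal data."

# Count how often a phrase occurs in the retrieved text.
hits = d.lower().count("personal data")
print(hits)
```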

Why another package/module?

While there is already the R package by Michal Ovadek, I wanted a Python implementation. There is also https://github.com/seljaseppala/eu_corpus_compiler, but it only handles regulatory/legislative documents. Additionally, there is https://pypi.org/project/eurlex/, but it has, for example, no way to generate SPARQL queries and is also very focused on legislation; and although it uses SPARQL and Cellar internally as well, its documentation concentrates on accessing and processing documents via CELEX number, which is not really helpful to me. Another one is https://github.com/Lexparency/eurlex2lexparency, which also seems to focus on legislative documents.

