Project description

pyeurlex package

This is a Python module to create SPARQL queries for the EU Cellar repository, run them, and download the resulting data. Notably, it directly supports all resource types. Some parts, such as the SPARQL queries, are based on the R eurlex package by Michal Ovadek, but I wanted an equivalent for Python and was not satisfied with the existing Python packages.

Usage

Import and instantiate the module:

from eurlex import Eurlex
eur = Eurlex()

Then you can construct your query (or alternatively, use your own query or one built with the wizard at https://op.europa.eu/en/advanced-sparql-query-editor):

q = eur.make_query(resource_type = "caselaw", order = True, limit = 10)
print(q)

Finally, you can run this query.

d = eur.query_eurlex(q)  # where q is a query generated in a previous step or a string defined by you
print(d)
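If you prefer to skip make_query(), query_eurlex() also accepts a hand-written query string. Below is a minimal sketch of what such a query against Cellar's CDM vocabulary might look like; the exact property names and the resource-type URI are assumptions based on the public CDM ontology, so check them against the official documentation before relying on them:

```python
# A hand-written SPARQL query string. cdm:work_date_document and the
# resource-type URI are assumptions based on the public CDM ontology.
query = """
PREFIX cdm: <http://publications.europa.eu/ontology/cdm#>
SELECT DISTINCT ?work ?date
WHERE {
  ?work cdm:work_has_resource-type
        <http://publications.europa.eu/resource/authority/resource-type/REG> .
  ?work cdm:work_date_document ?date .
}
ORDER BY DESC(?date)
LIMIT 10
"""
print(query)
```

Such a string can then be passed to query_eurlex() in place of the output of make_query().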

This will return a pandas data frame of the results. Its columns depend on the fields that you included. At the moment, not all fields are named properly in the data frame, so you may have to rename them manually if desired.
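Since the result is a plain pandas data frame, renaming can be done with DataFrame.rename(). The column names below are illustrative stand-ins, not the package's actual output:

```python
import pandas as pd

# Illustrative stand-in for a query result; the actual column names
# depend on the fields included in your SPARQL query.
df = pd.DataFrame({
    "work": ["http://publications.europa.eu/resource/cellar/abc123"],
    "callret-1": ["32016R0679"],  # hypothetical auto-generated column name
})

# Give the auto-generated column a meaningful name.
df = df.rename(columns={"callret-1": "celex"})
print(df.columns.tolist())  # ['work', 'celex']
```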

Once you pick a single URL or identifier from the data frame, you can download a notice or data based on that identifier. To download a notice as XML, use download_xml() as below.

x = eur.download_xml("32014R0001", notice="tree") # without the file parameter to specify the filename, the celex number will be used.
print(x)
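Assuming download_xml() returns the notice XML as a string, it can be parsed with the standard library's xml.etree.ElementTree. The snippet below uses a tiny invented notice structure purely for illustration; the real tree notice returned by Cellar is far richer and its element names may differ:

```python
import xml.etree.ElementTree as ET

# Invented notice snippet standing in for the string returned by
# download_xml(); real Cellar notices have a much richer structure.
xml_string = """<NOTICE>
  <EXPRESSION>
    <TITLE>Example regulation title</TITLE>
  </EXPRESSION>
</NOTICE>"""

root = ET.fromstring(xml_string)
title = root.findtext("EXPRESSION/TITLE")
print(title)  # Example regulation title
```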

To get the data associated with an identifier, use get_data(). This will return the data as a string.

d = eur.get_data("http://publications.europa.eu/resource/celex/32016R0679", type="text")
print(d)
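If your query returned full CELEX URIs, the identifier part can be split off with plain string handling before building the call to get_data() (the URI below is the one from the example above):

```python
# Split the CELEX number out of a Cellar resource URI.
uri = "http://publications.europa.eu/resource/celex/32016R0679"
celex = uri.rsplit("/", 1)[-1]
print(celex)  # 32016R0679
```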

Why another package/module?

While there was already the R package by Michal Ovadek, I wanted a Python implementation. There is also https://github.com/seljaseppala/eu_corpus_compiler, but it only handles regulatory/legislative documents. Additionally, there is https://pypi.org/project/eurlex/, but it has, for example, no way to generate SPARQL queries and is also very focused on legislation. In addition, while it uses SPARQL and Cellar internally as well, its documentation focuses on accessing and processing documents via CELEX number, which is not really helpful to me. Another one is https://github.com/Lexparency/eurlex2lexparency, which also seems to focus on legislative documents.

Project details


Download files


Source Distribution

pyeurlex-0.2.1.tar.gz (24.1 kB)

Uploaded Source

Built Distribution


pyeurlex-0.2.1-py3-none-any.whl (23.5 kB)

Uploaded Python 3

File details

Details for the file pyeurlex-0.2.1.tar.gz.

File metadata

  • Download URL: pyeurlex-0.2.1.tar.gz
  • Upload date:
  • Size: 24.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.2.0 CPython/3.10.6 Linux/5.15.0-1017-azure

File hashes

Hashes for pyeurlex-0.2.1.tar.gz

  • SHA256: bc03cda9800051f10d61720614edee59a7c8f0b4fa9ff1031cb7f8168be37897
  • MD5: ec69d1d8464aa13f8dcbc2c4a4dba199
  • BLAKE2b-256: af5b87150fca268d71c7bc295c2f699714f56fbb353890f1f679b09579470531


File details

Details for the file pyeurlex-0.2.1-py3-none-any.whl.

File metadata

  • Download URL: pyeurlex-0.2.1-py3-none-any.whl
  • Upload date:
  • Size: 23.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.2.0 CPython/3.10.6 Linux/5.15.0-1017-azure

File hashes

Hashes for pyeurlex-0.2.1-py3-none-any.whl

  • SHA256: 7bcaf4aa6d7622daf7eaae539fffd521adde625c6fd2851844e1d646a0a7154d
  • MD5: 638ff2808f68c1b4ed08728a5589847f
  • BLAKE2b-256: 51cba14a464ab9ede7bfed9e87ea9f44dd5acf1dff607c64cbe3b984df688ba2

