
The XML-to-OCDS parser for the TEDective project based on lxml


ETL


The code in this repo is part of the TEDective project. It defines an ETL pipeline to transform European public procurement data from Tenders Electronic Daily (TED) into a format that's easier to handle and analyse. Primarily, the TED XMLs (and eForms, WIP) are transformed into Open Contracting Data Standard (OCDS) JSON and parquet files to ease importing the data into a:

  • Graph database (KuzuDB in our case, but the processed data should be generic enough to support any graph database) and a
  • Search engine (Meilisearch in our case)

Organizations are deduplicated using Splink and linked to their GLEIF identifiers (WIP) before they are imported into the graph database.
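The XML-to-OCDS step can be illustrated with a minimal sketch. The notice snippet and field names below are hypothetical and greatly simplified; the real pipeline uses lxml and the full OCDS EU profile, but the stdlib `ElementTree` keeps the example self-contained:

```python
import xml.etree.ElementTree as ET

# A tiny, made-up notice snippet standing in for a real TED XML document.
TED_SNIPPET = """
<NOTICE>
  <NOTICE_ID>2017/S 001-000001</NOTICE_ID>
  <TITLE>Road maintenance services</TITLE>
  <BUYER>City of Example</BUYER>
  <VALUE CURRENCY="EUR">150000</VALUE>
</NOTICE>
"""

def notice_to_ocds_release(xml_text: str) -> dict:
    """Map a simplified TED notice to a minimal OCDS-like release dict."""
    root = ET.fromstring(xml_text)
    value = root.find("VALUE")
    return {
        # Hypothetical ocid scheme, for illustration only.
        "ocid": "ocds-example-"
        + root.findtext("NOTICE_ID").replace("/", "-").replace(" ", ""),
        "tender": {
            "title": root.findtext("TITLE"),
            "value": {
                "amount": float(value.text),
                "currency": value.get("CURRENCY"),
            },
        },
        "buyer": {"name": root.findtext("BUYER")},
    }

release = notice_to_ocds_release(TED_SNIPPET)
print(release["tender"]["title"])  # Road maintenance services
```

The real transformation covers far more fields (parties, awards, documents) and emits both JSON and parquet, but the shape of the mapping is the same.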


Background

The TEDective project aims to make European public procurement data explorable for non-experts. This transformation is more or less based on the Open Contracting Data Standard (OCDS) EU Profile.

As such, this pipeline can be used standalone or as part of your project that does something interesting with TED data. We use it ourselves for the TEDective API that powers the TEDective UI.

Install

:construction: Disclaimer: the install instructions work as of 12 April 2024, but they may be subject to change.

The ETL consists of two parts: the pipeline and the Luigi server (scheduler).

Using the PyPI package

The easiest way to install TEDective ETL is from PyPI via pipx:

pipx install tedective-etl
pipx ensurepath # to make sure it has been added to your path
run-pipeline --help

Using Nix:

# Install flake into your profile
nix profile install git+https://git.fsfe.org/TEDective/etl
run-pipeline --help

Alternatively, you can clone this repository and build it with Nix yourself:

# Cloning the repository and entering it
git clone https://git.fsfe.org/TEDective/etl && cd etl

# Nix build using the provided flake
nix build

# Note: nix-command and flakes are experimental features, so you may need to
# add this flag for the commands above to work:
--extra-experimental-features 'nix-command flakes'
# You may also be prompted to accept or decline some extra configuration. To
# accept it without being prompted, add:
--accept-flake-config

Manually

Another way is to use Poetry directly.

After cloning this repo:

poetry install
poetry run run-pipeline --help

Usage

:construction: Disclaimer: the usage instructions work as of 12 April 2024, but they may be subject to change.

General usage options

run-pipeline [-h] [--first-month FIRST_MONTH] [--last-month LAST_MONTH]
                    [--meilisearch-url MEILISEARCH_URL] [--in-dir IN_DIR]
                    [--output-dir OUTPUT_DIR] [--graph-dir GRAPH_DIR] [--local-scheduler]

options:
  -h, --help            show this help message and exit
  --first-month FIRST_MONTH
                        The first month to process. Defaults to '2017-01'.
  --last-month LAST_MONTH
                        The last month to process. Defaults to the last month.
  --meilisearch-url MEILISEARCH_URL
                        The URL of the Meilisearch server. Defaults to
                        'http://localhost:7700'
  --in-dir IN_DIR       The directory to store the TED XMLs. Defaults to '/tmp/ted_notices'
  --output-dir OUTPUT_DIR
                        The directory to store the output data. Defaults to '/tmp/output'
  --graph-dir GRAPH_DIR
                        The name of the KuzuDB graph. Defaults to '/tmp/graph'
  --local-scheduler     Use the local scheduler.
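The documented defaults can be mirrored in a small argparse sketch. This is a hypothetical reconstruction of the CLI from its help text, not the project's actual source:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """Reconstruct the documented run-pipeline interface with argparse."""
    parser = argparse.ArgumentParser(prog="run-pipeline")
    parser.add_argument("--first-month", default="2017-01",
                        help="The first month to process.")
    parser.add_argument("--last-month", default=None,
                        help="The last month to process. Defaults to the last month.")
    parser.add_argument("--meilisearch-url", default="http://localhost:7700",
                        help="The URL of the Meilisearch server.")
    parser.add_argument("--in-dir", default="/tmp/ted_notices",
                        help="The directory to store the TED XMLs.")
    parser.add_argument("--output-dir", default="/tmp/output",
                        help="The directory to store the output data.")
    parser.add_argument("--graph-dir", default="/tmp/graph",
                        help="The name of the KuzuDB graph.")
    parser.add_argument("--local-scheduler", action="store_true",
                        help="Use the local scheduler.")
    return parser

# With no CLI arguments, every option falls back to its documented default.
args = build_parser().parse_args([])
print(args.first_month, args.in_dir)  # 2017-01 /tmp/ted_notices
```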

Using the PyPI package

After installation you should be able to run both the Luigi scheduler and the pipeline:

run-server
# In different window
run-pipeline

To build the search indexes, you can additionally run a Meilisearch instance. It is NOT bundled with the PyPI package; install it with your favourite package manager. Running it is recommended if you plan to use the parsed data with the TEDective API.

Using Nix

# The Nix build creates a result folder containing these scripts.
# Get more information about the possible arguments like this:
result/bin/run-pipeline --help

# IMPORTANT: as mentioned above, the ETL has two parts. Spin up the Luigi
# server so the pipeline can run:
result/bin/run-server

# For development, we suggest using the --last-month flag for a quick setup.
# You can also set --first-month if you want a specific time window of data;
# by default the first month is 2017-01.
run-pipeline --last-month 2017-02

In this case you can also run Meilisearch to build the search indexes. That can be done inside the devenv; more on that below.

Manually (using Poetry)

Running the pipeline requires a running Luigi daemon. It is included in the project; start it with the following command:

poetry run run-server
# And the pipeline itself in a different window
poetry run run-pipeline

Running Meilisearch is recommended as well; with this method, you also have to install it manually.

Maintainers

@linozen
@micgor32

Contributing

1. Nix development environment

If you are using Nix, the easiest way to start developing is to use devenv via the provided flake.nix.

# If you have Nix installed
nix develop --impure
# This will drop you into a shell with all the dependencies installed.
# Note: nix-command and flakes are experimental features, so you may need to
# add this flag:
--extra-experimental-features 'nix-command flakes'
# You may also be prompted to accept or decline some extra configuration. To
# accept it without being prompted, add:
--accept-flake-config

# Inside you have all the needed tools

# These provide the excellent kuzu-explorer, which lets you run queries
# against the database:
kuzu-up
# And
kuzu-down

# Inside the devenv you also have access to Meilisearch.

# Pre-commit hooks with all other checks are set up inside the devenv, so that
# is the easiest way to make a commit to the repo.
2. Editing documentation

Small note: If editing the README, please conform to the standard-readme specification, and keep the documentation in sync with the code. Note that the main documentation repository is added to this repository via git-subrepo. To update the documentation, use the following commands:

git-subrepo pull docs
cd ./docs

# Make your changes
git commit -am "docs: update documentation for new feature"

# Preview your changes
pnpm install
pnpm run dev

# If you're happy with your changes, push them
git-subrepo push docs

License

EUPL-1.2 © 2024 Free Software Foundation Europe e.V.
