A minimal modern data stack with working data pipelines in a single Docker container.

Project description


mimosa

The ELT part of a modern data stack with practical data pipelines using cloud functionality.
Explore the docs »

Report Bug · Request Feature


Table of Contents
  1. About the Project
  2. Getting Started
  3. Usage
  4. Roadmap
  5. Contributing
  6. Contact

(back to top)

About the Project



The ELT part of a modern data stack with practical data pipelines and reporting using cloud functionality. It is similar in concept to mimodast, but built with alternative software options and cloud services.

Mimosa encompasses the ELT (extract, load, transform) components needed to generate the webpage at gas.aspireto.win, which provides detailed reports on natural gas storage volumes within the European Union. The process retrieves data from a REST API, transforms it, and stores it in a database tailored for reporting.

The source data is published by Gas Infrastructure Europe and exposed through a REST API.

Beyond gas storage data, Mimosa offers a hands-on experience with essential tools (a minimal pipeline sketch follows the list):

  • 🚀 dlt for smooth data loading.
  • 🔍 dbt for powerful data transformation.
  • ☁️ MotherDuck for storing the data in a cloud-based DuckDB database.
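
To show how these tools fit together, here is a minimal, hypothetical sketch of a dlt pipeline that pulls one gas day of storage data from the GIE REST API and loads it into MotherDuck. The endpoint URL, x-key header, field names, and dataset name are illustrative assumptions; mimosa's actual pipeline may be organized differently.

import os

import dlt
import requests

# Hypothetical sketch only: endpoint, header, field and dataset names are placeholders.
@dlt.resource(name="gas_storage", write_disposition="merge", primary_key=("gasDayStart", "name"))
def gas_storage(gas_day):
    """Yield EU gas storage records for a single gas day."""
    response = requests.get(
        "https://agsi.gie.eu/api",                      # assumed GIE AGSI endpoint
        headers={"x-key": os.environ["ENV_GIE_XKEY"]},  # API key from the environment
        params={"date": gas_day},
        timeout=30,
    )
    response.raise_for_status()
    yield from response.json().get("data", [])

pipeline = dlt.pipeline(
    pipeline_name="gie_gas_storage",
    destination="motherduck",  # reads DESTINATION__MOTHERDUCK__CREDENTIALS
    dataset_name="gie_raw",
)
print(pipeline.run(gas_storage("2023-11-01")))

dbt would then transform the raw tables into reporting models; that step is outside this sketch.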

Further, the full tech stack used to create the gas.aspireto.win pages is detailed below.


(back to top)

Getting Started

Prerequisites

Set up a Python development environment.

API Keys

Ensure the following sensitive information is securely stored in environment variables or within a .env file (a sketch for verifying the setup follows the list):

  • An API key is required to access the GIE Gas Inventory REST API. Obtain one by signing up for a free GIE account, then expose it using the following environment variable:

    • ENV_GIE_XKEY = "YOUR-API-KEY"
  • For MotherDuck, you'll need the service token and the database name. Set up the following environment variables to establish the connection:

    • DESTINATION__MOTHERDUCK__CREDENTIALS = "md:///YOUR-DATABASE-NAME?token=YOUR-SERVICE-TOKEN"
    • Note that the MotherDuck documentation shows a different connection-string format; the format above is the one dlt requires.
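
To check the setup, the following sketch loads a local .env file and verifies that both variables are present. It assumes the python-dotenv package; any other way of setting the environment variables works just as well.

import os

from dotenv import load_dotenv  # assumes python-dotenv is installed

# Pull variables from a local .env file into the process environment.
load_dotenv()

required = ["ENV_GIE_XKEY", "DESTINATION__MOTHERDUCK__CREDENTIALS"]
missing = [name for name in required if not os.environ.get(name)]
if missing:
    raise SystemExit(f"Missing environment variables: {', '.join(missing)}")
print("GIE key and MotherDuck credentials are set.")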

(back to top)

Installation

Execute the following command. Consider using a venv.

pip install ternyxmimosa

Alternatively, clone this repository and run poetry install, or pip install the package directly from GitHub.

(back to top)

Usage

Command Line

Not currently supported.

As a Python Package

The following sample obtains the storage data for the last available date and stores it in MotherDuck.

# Run the full pipeline: fetch the latest storage data and load it into MotherDuck.
import mimosa.cli as GEI

GEI.main()
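
After a run, the loaded data can be inspected directly from Python with the duckdb client. The connection string below is a placeholder, and the schema and table names depend on what dlt created, so list them first.

import duckdb

# Connect to MotherDuck; substitute your own database name and service token.
con = duckdb.connect("md:YOUR-DATABASE-NAME?motherduck_token=YOUR-SERVICE-TOKEN")

# List the schemas and tables that the pipeline created.
print(con.sql("SELECT table_schema, table_name FROM information_schema.tables").fetchall())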

Tech Stack

These are the technologies driving the content on gas.aspireto.win:

  • Google Cloud Function for the ELT component:
    • The function is a bare-bones wrapper around the mimosa Python package (the current repository); the function code itself lives in its own repository. A minimal sketch of such a wrapper follows this list.
    • It is scheduled to run the ELT twice daily (using Google Cloud Scheduler and a Pub/Sub message).
    • The result is updated data in MotherDuck.
  • Reporting notebook
    • Built using the Evidence reporting tool, defined in its own GitHub repository.
    • Rebuilt and published to a web host using a GitHub workflow.
      • Runs on a twice-daily schedule; the workflow is defined in the notebook repository.
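
The actual Cloud Function code lives in its own repository; the sketch below only illustrates what a bare-bones Pub/Sub-triggered wrapper could look like, assuming the functions-framework package. The entry-point name is an arbitrary placeholder.

import functions_framework

import mimosa.cli as GEI

@functions_framework.cloud_event
def run_elt(cloud_event):
    """Entry point triggered by the scheduled Pub/Sub message: run the mimosa ELT."""
    # The message payload is ignored; the schedule simply triggers a full pipeline run.
    GEI.main()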

NOTE: As of November 2023 it is possible to deploy this stack entirely on the free tiers of the cloud services used. All the code is available in this GitHub repository and the linked repositories for the Google Cloud Function and the Evidence notebook. 🚀

(back to top)

NOTE: When using the dev container, the environment variable DESTINATION__MOTHERDUCK__CREDENTIALS is sometimes set incorrectly between runs. Run unset DESTINATION__MOTHERDUCK__CREDENTIALS to clear it.

Roadmap

Consider:
  • Get source data (using the REST API)
  • Transform data, possibly with SQLMesh or dbt
  • dlt update/error messages using Slack
  • Storage (initially local DuckDB; a cloud alternative strays somewhat from the data-stack-in-a-Docker concept, but MotherDuck was adopted)
  • Scheduling tool (Google Cloud Scheduler)
  • Reporting tool (Metabase?) (Evidence.dev, in a separate repository)
  • Bare-bones CLI

(back to top)

Contributing

Any contributions you make are greatly appreciated.

If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also open a feature request or bug report. Don't forget to give the project a star! Thanks again!

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

(back to top)

Contact

Project Link: mimosa

(back to top)

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ternyxmimosa-0.4.2.tar.gz (22.6 kB)

Uploaded Source

Built Distribution

ternyxmimosa-0.4.2-py3-none-any.whl (23.3 kB)

Uploaded Python 3

File details

Details for the file ternyxmimosa-0.4.2.tar.gz.

File metadata

  • Download URL: ternyxmimosa-0.4.2.tar.gz
  • Upload date:
  • Size: 22.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.7.1 CPython/3.11.4 Linux/5.15.133.1-microsoft-standard-WSL2

File hashes

Hashes for ternyxmimosa-0.4.2.tar.gz:
  • SHA256: a52e210d116956f9d415717d7f28d803cbb8458bbb8aa57d881dc2d6e1b6a41e
  • MD5: 387fc6faf7fd165d48c5314b9837c504
  • BLAKE2b-256: e997e496a74a699fb36f3bdf3e43a638724e0f2fd71e115b64aa9e791d2f8b19

See more details on using hashes here.

File details

Details for the file ternyxmimosa-0.4.2-py3-none-any.whl.

File metadata

  • Download URL: ternyxmimosa-0.4.2-py3-none-any.whl
  • Upload date:
  • Size: 23.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.7.1 CPython/3.11.4 Linux/5.15.133.1-microsoft-standard-WSL2

File hashes

Hashes for ternyxmimosa-0.4.2-py3-none-any.whl:
  • SHA256: 3b11d54f66aa33b05600408e7ed4b8e6441b74259d5e92cf0b8c151c77a6bbd8
  • MD5: 9b6fe68e5cac5bc89a73d3e42d577315
  • BLAKE2b-256: 301d39b752830abe2cdcadd6928229fc18075e6cc1f5fa6da09223d5f67b6232

See more details on using hashes here.
