
Core ETL pipeline framework for mkpipe.

Project description

MkPipe

MkPipe is a modular, open-source ETL (Extract, Transform, Load) tool that allows you to integrate various data sources and sinks easily. It is designed to be extensible with a plugin-based architecture that supports extractors, transformers, and loaders.

Features

  • Extract data from multiple sources (e.g., PostgreSQL, MongoDB).
  • Transform data using custom Python logic and Apache Spark.
  • Load data into various sinks (e.g., ClickHouse, PostgreSQL, Parquet).
  • Plugin-based architecture that supports future extensions.
  • Cloud-native architecture that can be deployed on Kubernetes and other environments.
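
The plugin-based extract/transform/load flow above can be sketched roughly as follows. All names here (the registries, `run_pipeline`, the "memory" and "collect" plugins) are illustrative assumptions for the sketch, not MkPipe's actual API:

```python
# Illustrative sketch of a plugin-based ETL pipeline (hypothetical names,
# not MkPipe's real API): extractors and loaders register by name and are
# composed with a transform into a pipeline.
from typing import Callable, Dict, Iterable, List

EXTRACTORS: Dict[str, Callable[[], Iterable[dict]]] = {}
LOADERS: Dict[str, Callable[[List[dict]], List[dict]]] = {}

def register_extractor(name: str):
    def wrap(fn):
        EXTRACTORS[name] = fn
        return fn
    return wrap

def register_loader(name: str):
    def wrap(fn):
        LOADERS[name] = fn
        return fn
    return wrap

@register_extractor("memory")
def extract_memory():
    # Stand-in for a real extractor plugin (e.g. PostgreSQL, MongoDB).
    yield {"id": 1, "value": 10}
    yield {"id": 2, "value": 20}

SINK: List[dict] = []

@register_loader("collect")
def load_collect(rows: List[dict]) -> List[dict]:
    # Stand-in for a real loader plugin (e.g. ClickHouse, Parquet).
    SINK.extend(rows)
    return SINK

def run_pipeline(extractor: str, transform, loader: str) -> List[dict]:
    # Extract -> transform each row -> load into the sink.
    rows = [transform(r) for r in EXTRACTORS[extractor]()]
    return LOADERS[loader](rows)

result = run_pipeline("memory", lambda r: {**r, "value": r["value"] * 2}, "collect")
```

Registering plugins by name is what lets new sources and sinks (like the `mkpipe-extractor-postgres` and `mkpipe-loader-postgres` packages installed below) be added without changing the core.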

Quick Setup

You can deploy MkPipe using one of the following strategies:

1. Using Docker Compose

This method sets up all required services automatically using Docker Compose.

Steps:

  1. Clone or copy the deploy folder from the repository.

  2. Modify the configuration files: .env for environment variables and mkpipe_project.yaml for your ETL configurations.

  3. Run the following command to start the services:

    docker-compose up --build
    

    This will set up the following services:

    • PostgreSQL: Required for data storage.
    • RabbitMQ: Required when running with run_coordinator=celery.
    • Celery Worker: Required when running with run_coordinator=celery.
    • Flower UI: Optional; needed only for monitoring Celery tasks.

    Note: If you only want to use run_coordinator=single (without Celery), only PostgreSQL is necessary.
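
The services above might be wired together along these lines. This is a hedged sketch only: the service names, images, and credentials are placeholders, not the contents of the repository's actual deploy/docker-compose.yml:

```yaml
# Illustrative compose sketch (placeholder values, not the real deploy file).
services:
  postgres:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: changeme      # placeholder credential
  rabbitmq:
    image: rabbitmq:3-management       # only needed when run_coordinator=celery
  worker:
    build: .                           # hypothetical Celery worker container
    depends_on: [postgres, rabbitmq]
  flower:
    image: mher/flower                 # optional Celery monitoring UI
    depends_on: [rabbitmq]
```

With run_coordinator=single you would keep only the postgres service from this sketch.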

2. Running Locally

You can also set up the environment manually and run MkPipe locally.

Steps:

  1. Set up and configure the following services:
    • RabbitMQ: Required for the Celery run_coordinator.
    • PostgreSQL: Required for data storage.
    • Flower UI: Optional; needed only for monitoring Celery tasks.
  2. Update the following configuration files in the deploy folder:
    • .env for environment variables.
    • mkpipe_project.yaml for your ETL configurations.
  3. Install the Python packages:
    pip install mkpipe mkpipe-extractor-postgres mkpipe-loader-postgres
    
  4. Set the project directory environment variable:
    export MKPIPE_PROJECT_DIR={YOUR_PROJECT_PATH}
    
  5. Start MkPipe using the following command:
    mkpipe run
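
Step 4's MKPIPE_PROJECT_DIR variable tells MkPipe where your project lives; conceptually, the CLI can then locate mkpipe_project.yaml inside that directory. The resolution logic below is an assumption for illustration, not MkPipe's actual implementation:

```python
# Sketch of how MKPIPE_PROJECT_DIR might be resolved into a config path
# (illustrative only; MkPipe's real lookup logic may differ).
import os
from pathlib import Path

def resolve_project_config(env=None) -> Path:
    env = os.environ if env is None else env
    project_dir = Path(env.get("MKPIPE_PROJECT_DIR", "."))
    return project_dir / "mkpipe_project.yaml"

# Usage with a hypothetical project path:
cfg = resolve_project_config({"MKPIPE_PROJECT_DIR": "/opt/my_pipeline"})
```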
    

Documentation

For more detailed documentation, please visit the GitHub repository.

License

This project is licensed under the Apache 2.0 License - see the LICENSE file for details.

DB Support Plan

For actively supported databases/plugins, please visit the MkPipe-hub repository!

Core Relational Databases

  • PostgreSQL
  • MySQL
  • MariaDB
  • SQL Server
  • Oracle Database
  • SQLite
  • Snowflake
  • Google BigQuery
  • Amazon Redshift
  • ClickHouse
  • Amazon S3

NoSQL Databases

  • MongoDB
  • Cassandra
  • DynamoDB
  • Redis
  • Azure Data Lake Storage (ADLS)
  • Google Cloud Storage
  • Elasticsearch
  • TimescaleDB
  • HDFS
  • InfluxDB

ERP/CRM Systems

  • Salesforce
  • SAP
  • Microsoft Dynamics
  • NetSuite
  • Workday
  • HubSpot
  • Zoho CRM
  • Freshsales
  • Zendesk
  • Oracle NetSuite

Emerging Databases & Analytical Tools

  • Apache Druid
  • Vertica
  • SingleStore (MemSQL)
  • Exasol
  • SAP HANA
  • IBM Db2
  • Neo4j (Graph Database)
  • Greenplum
  • CockroachDB
  • AWS Athena

Streaming Systems

  • Kafka
  • RabbitMQ
  • Pulsar
  • Apache Flink
  • Amazon Kinesis
  • Google Pub/Sub
  • Azure Event Hubs
  • Apache NiFi
  • ActiveMQ
  • Redpanda

File Formats & Data Lakes

  • Parquet
  • Avro
  • JSON
  • CSV
  • XML
  • ORC
  • Google Drive (for raw files)
  • Dropbox
  • Box
  • FTP/SFTP Servers

Specialized Analytics Tools

  • Metabase (Data Visualization)
  • Tableau Data Extracts
  • Power BI
  • Looker
  • Google Analytics (GA4)
  • Mixpanel
  • Amplitude
  • Adobe Analytics
  • Heap
  • Klipfolio

Industry-Specific Databases

  • Aerospike
  • RocksDB
  • FaunaDB
  • ScyllaDB
  • ArangoDB
  • MarkLogic
  • CrateDB
  • TigerGraph
  • HarperDB
  • SAP ASE (Sybase)

Legacy Databases

  • Teradata
  • Netezza
  • Informix
  • Ingres
  • Firebird
  • Progress OpenEdge
  • ParAccel
  • MaxDB
  • HP Vertica
  • Sybase IQ

Emerging Cloud & Hybrid Databases

  • PlanetScale (MySQL-based)
  • YugabyteDB
  • TiDB
  • OceanBase
  • Citus (PostgreSQL-based)
  • Snowplow Analytics
  • Spanner (Google Cloud)
  • MariaDB ColumnStore
  • CockroachDB Serverless
  • Weaviate (Vector Search)


Download files

Download the file for your platform.

Source Distribution

mkpipe-0.3.5.tar.gz (26.8 kB)

Uploaded Source

Built Distribution


mkpipe-0.3.5-py3-none-any.whl (34.6 kB)

Uploaded Python 3

File details

Details for the file mkpipe-0.3.5.tar.gz.

File metadata

  • Download URL: mkpipe-0.3.5.tar.gz
  • Upload date:
  • Size: 26.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.11.13

File hashes

Hashes for mkpipe-0.3.5.tar.gz

  • SHA256: a57e39d098098ce8cbc333ed4133440264d0357b336af07b3c61cdf21975f1d4
  • MD5: 3e57d48e9d3f3ef3af3f73d97e5cdbe8
  • BLAKE2b-256: d3614d2345b382a597eb29376b5ee0ad46dcb3d5eca6a8d9de5772f3a6c3c7ee

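After downloading, the published digests above can be checked locally; for example (the file path below is a placeholder for wherever you saved the archive):

```python
# Verify a downloaded file against a published SHA-256 digest by hashing
# it in chunks, so large files don't need to fit in memory.
import hashlib

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Usage (placeholder path; compare against the SHA256 value listed above):
# sha256_of("mkpipe-0.3.5.tar.gz") == "a57e39d098098ce8cbc333ed4133440264d0357b336af07b3c61cdf21975f1d4"
```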

File details

Details for the file mkpipe-0.3.5-py3-none-any.whl.

File metadata

  • Download URL: mkpipe-0.3.5-py3-none-any.whl
  • Upload date:
  • Size: 34.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.11.13

File hashes

Hashes for mkpipe-0.3.5-py3-none-any.whl

  • SHA256: af985b7bc6df5d0b1df407ae6cdad42f05abfab5d99dbe519446b0f6fdcc6d6a
  • MD5: 3aeabf0f536b92667cd3487e36feacd3
  • BLAKE2b-256: f6d6631e5b6ff68c12016cfb60f6a4cc7012f6e21fc05fa1720f73d2fd112cfe

