Python APIs for using Delta Lake with Apache Spark

Delta Lake

Delta Lake is an open source storage layer that brings reliability to data lakes. Delta Lake provides ACID transactions and scalable metadata handling, and unifies streaming and batch data processing. Delta Lake runs on top of your existing data lake and is fully compatible with Apache Spark APIs.

This PyPI package contains the Python APIs for using Delta Lake with Apache Spark.

Installation and usage

  1. Install using pip install delta-spark
  2. To use Delta Lake with Apache Spark, you need to set additional configurations when creating the SparkSession, as shown in the sketch below. See the online project web page for details.
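
The following is a minimal sketch of that configuration, based on the quick-start pattern in the project documentation; the application name and table path are arbitrary placeholders, and the project web page remains the authoritative reference.

    import pyspark
    from delta import configure_spark_with_delta_pip

    # Builder with the two Spark settings Delta Lake needs.
    builder = (
        pyspark.sql.SparkSession.builder.appName("delta-quickstart")
        .config("spark.sql.extensions",
                "io.delta.sql.DeltaSparkSessionExtension")
        .config("spark.sql.catalog.spark_catalog",
                "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    )

    # configure_spark_with_delta_pip (from this package) adds the Delta Lake
    # JARs matching the pip-installed version to the session.
    spark = configure_spark_with_delta_pip(builder).getOrCreate()

    # Once configured, Delta tables are used through the normal DataFrame API;
    # the path below is just an example location.
    spark.range(0, 5).write.format("delta").save("/tmp/delta-table")
    spark.read.format("delta").load("/tmp/delta-table").show()

With the session configured this way, existing Spark code can read and write Delta tables simply by using "delta" as the DataFrame format.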

Documentation

This README contains only basic information about the pip-installed Delta Lake package. You can find the full documentation on the project web page.

Download files

Source Distribution: delta-spark-3.1.0.tar.gz (21.9 kB)

Built Distribution: delta_spark-3.1.0-py3-none-any.whl (21.0 kB)
