
A Pydantic -> PySpark schema library


SparkDantic

1️⃣ version: 0.1.0

✍️ author: Mitchell Lisle

PySpark Model Conversion Tool

This Python module provides a utility for converting Pydantic models to PySpark schemas. It's implemented as a class named SparkModel that extends Pydantic's BaseModel.

Features

  • Conversion from Pydantic models to PySpark schemas.
  • Determination of nullable fields from type annotations (e.g. Optional).
  • Customizable type mapping between Python and PySpark data types (see the sketch after this list).
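The exact rules live inside SparkDantic, but the idea behind type mapping and nullable detection can be sketched as follows. This is a hypothetical illustration, not the library's actual code; base_mapping and to_spark_type are invented names for this example:

from typing import Optional, get_args, get_origin

from pyspark.sql.types import IntegerType, StringType

# Invented for illustration -- not SparkDantic's internal API.
base_mapping = {
    str: StringType(),
    int: IntegerType(),
}

def to_spark_type(annotation):
    """Return (spark_type, nullable) for a simple annotation."""
    # Optional[X] is Union[X, None]; treat it as a nullable X.
    args = get_args(annotation)
    if get_origin(annotation) is not None and type(None) in args:
        inner = next(a for a in args if a is not type(None))
        return base_mapping[inner], True
    return base_mapping[annotation], False

to_spark_type(Optional[int])  # (IntegerType(), True)
to_spark_type(str)            # (StringType(), False)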

Dependencies

This module aims to have a small dependency footprint:

  • pydantic
  • pyspark
  • Python's built-in datetime, decimal, types, and typing modules

Usage

Creating a new SparkModel

A SparkModel is a Pydantic model; you can define one by inheriting from SparkModel and declaring some fields:

from sparkdantic import SparkModel
from typing import List

class MyModel(SparkModel):
    name: str
    age: int
    hobbies: List[str]
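Because SparkModel inherits from Pydantic's BaseModel, instances are created and validated like any other Pydantic model. Continuing the example above (the field values are illustrative):

user = MyModel(name='Jane', age=30, hobbies=['chess', 'cycling'])
print(user.age)  # 30

# Invalid input raises pydantic.ValidationError, as with any Pydantic model:
# MyModel(name='Jane', age='not a number', hobbies=[])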

Generating a PySpark Schema

Pydantic has built-in support for generating JSON schemas (via model_json_schema). Similarly, with a SparkModel you can generate a PySpark schema from the model fields using the model_spark_schema() method:

my_model = MyModel(name='Jane', age=30, hobbies=['chess', 'cycling'])  # field values are illustrative
spark_schema = my_model.model_spark_schema()

This produces the following schema:

StructType([
    StructField('name', StringType(), True),
    StructField('age', IntegerType(), True),
    StructField('hobbies', ArrayType(StringType(), False), True)
])
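Because the result is an ordinary PySpark StructType, it can be passed anywhere PySpark accepts a schema, such as spark.createDataFrame. A minimal sketch (the SparkSession setup and sample row are illustrative):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName('sparkdantic-demo').getOrCreate()

rows = [('Jane', 30, ['chess', 'cycling'])]
df = spark.createDataFrame(rows, schema=spark_schema)
df.printSchema()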

