spark-plan-to-sql

Convert Apache Spark Catalyst LogicalPlan JSON dumps back into a logically equivalent SQL statement.

The converter walks the pre-order JSON serialization Spark produces for its LogicalPlan (the same shape exposed via df.queryExecution.logical.toJSON) and emits readable SQL that, when re-executed, returns the same rows as the original query.
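The pre-order encoding itself is simple to work with: each node carries a `"num-children"` count, and its children follow immediately in the list, recursively. A minimal stdlib-only sketch of rebuilding the tree from that list (illustrative only, not this package's internals):

```python
import json

def build_tree(nodes, i=0):
    """Rebuild a plan tree from Spark's pre-order node list.

    Each entry carries a "num-children" count; its children are the
    next entries in the list, consumed recursively.
    """
    node = dict(nodes[i])
    children = []
    next_i = i + 1
    for _ in range(node.get("num-children", 0)):
        child, next_i = build_tree(nodes, next_i)
        children.append(child)
    node["children"] = children
    return node, next_i

# A minimal Filter-over-relation plan, in the shape toJSON emits
plan = json.loads("""[
  {"class": "org.apache.spark.sql.catalyst.plans.logical.Filter", "num-children": 1},
  {"class": "org.apache.spark.sql.catalyst.plans.logical.OneRowRelation", "num-children": 0}
]""")
tree, consumed = build_tree(plan)
print(tree["class"].rsplit(".", 1)[-1])                 # Filter
print(tree["children"][0]["class"].rsplit(".", 1)[-1])  # OneRowRelation
```

Converting a tree like this back to SQL is then a matter of pattern-matching on each node's class, which is what the converter does.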

Install

pip install spark-plan-to-sql

Python API

import json
from spark_plan_to_sql import plan_to_sql, dict_to_sql

# 1) Plain JSON string
sql = plan_to_sql('[{"class": "...OneRowRelation", "num-children": 0}]')

# 2) Already-parsed Python list (Spark's native format)
with open("plan.json") as f:
    sql = plan_to_sql(json.load(f))

# 3) A dict wrapper, e.g. {"plan": [...]} or {"logicalPlan": [...]}
with open("plan.json") as f:
    sql = plan_to_sql({"plan": json.load(f)})

# 4) Strict dict-only helper
sql = dict_to_sql({"logicalPlan": [...]})

The function accepts:

  • str / bytes — JSON text
  • list[dict] — Spark's native pre-order plan list
  • dict — either a single leaf node or a wrapper such as {"plan": [...]} / {"logicalPlan": [...]} / {"nodes": [...]}
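How these shapes collapse to one canonical node list can be sketched as follows (the wrapper keys come from the list above; the helper is illustrative, not the library's actual implementation):

```python
import json

# Wrapper keys the README documents: {"plan": ...}, {"logicalPlan": ...}, {"nodes": ...}
WRAPPER_KEYS = ("plan", "logicalPlan", "nodes")

def normalize_plan(obj):
    """Coerce str/bytes/list/dict input into Spark's pre-order node list."""
    if isinstance(obj, (str, bytes)):
        obj = json.loads(obj)  # JSON text -> Python value
    if isinstance(obj, dict):
        for key in WRAPPER_KEYS:
            if key in obj:
                return obj[key]  # unwrap {"plan": [...]} and friends
        return [obj]  # a bare dict is treated as a single leaf node
    return obj  # already a list of nodes

print(normalize_plan('{"plan": []}'))                 # []
print(normalize_plan({"class": "X", "num-children": 0}))
```

Whatever shape you pass in, the converter ends up walking the same pre-order list.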

CLI

# Convert one or more files (prints to stdout)
spark-plan-to-sql plan.json plan2.json

# Batch convert a directory of plans into ./restored_sql/
spark-plan-to-sql --dir test_json --out restored_sql

Supported plan nodes

DDL/DML: CreateNamespace, DropNamespace, SetCatalogAndNamespace, CreateTable*, DropTable*, DropView, CreateViewCommand, CacheTable, UncacheTable, InsertInto*, AppendData.

Query: Project, Filter, Sort, GlobalLimit/LocalLimit, Distinct/Deduplicate, Aggregate (incl. ROLLUP/CUBE/GROUPING SETS through the Expand+spark_grouping_id pattern), Join (Inner / Left / Right / Full / LeftSemi / LeftAnti / Cross), Union/Intersect/Except, Window (with frame suppression for LAG/LEAD/RANK/ROW_NUMBER/...), Generate (LATERAL VIEW), WithCTE/CTERelationDef/CTERelationRef, SubqueryAlias, LogicalRelation, DataSourceV2Relation, LocalRelation, OneRowRelation.
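For intuition, a nested Project → Filter → relation tree flattens to the pre-order list the converter consumes. The class names below are real Catalyst nodes; the flattening helper and the tree itself are an illustrative sketch, not part of this package:

```python
def flatten(node):
    """Serialize a nested plan tree into Spark's pre-order list form.

    Mutates each node: replaces "children" with a "num-children" count
    and appends the children's nodes right after their parent.
    """
    children = node.pop("children", [])
    node["num-children"] = len(children)
    out = [node]
    for child in children:
        out.extend(flatten(child))
    return out

# A sketch of the tree for a simple projected, filtered table scan
tree = {
    "class": "org.apache.spark.sql.catalyst.plans.logical.Project",
    "children": [{
        "class": "org.apache.spark.sql.catalyst.plans.logical.Filter",
        "children": [{
            "class": "org.apache.spark.sql.execution.datasources.LogicalRelation",
            "children": [],
        }],
    }],
}
nodes = flatten(tree)
print([n["class"].rsplit(".", 1)[-1] for n in nodes])
# → ['Project', 'Filter', 'LogicalRelation']
```

Each supported node above corresponds to one such class, and the converter maps it back to the SQL clause it came from (Project → SELECT list, Filter → WHERE, and so on).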

Expressions: literals (incl. interval/decimal/date/timestamp), Cast, Alias, AttributeReference, OuterReference, binary/unary operators, CaseWhen/If, Coalesce/IfNull/Nvl, aggregates (Count/Sum/Avg/...), datetime (Year/Month/AddMonths/DateAdd rewriting from ExtractANSIIntervalDays/...), windowing (WindowExpression/WindowSpecDefinition/SpecifiedWindowFrame), array/map/struct (GetStructField/GetArrayItem/ElementAt/ArrayTransform/LambdaFunction/CreateNamedStruct/MapFromArrays/...), JSON (JsonToStructs/GetJsonObject/StructsToJson).

License

MIT

