
cylint

A PySpark linter that catches the anti-patterns costing you real money.

Static analysis for PySpark code. No Spark runtime needed. Zero dependencies. Runs anywhere Python runs.

Install

pip install cylint

Usage

# Lint files or directories
cy lint src/pipelines/

# JSON output for CI
cy lint --format json src/

# Only warnings and critical
cy lint --min-severity warning .

Example output:

pipeline.py:47:8: CY003 [critical] .withColumn() inside a loop creates O(n²) plan complexity.
  Use .select([...]) with all column expressions instead.

pipeline.py:82:4: CY001 [warning] .collect() called without filtering.
  Consider .limit(N).collect(), .take(N), or using .show() for inspection.

pipeline.py:103:4: CY005 [warning] .cache() with single downstream use.
  Cache is only beneficial when the same DataFrame is used in multiple actions.

Found 3 issues (1 critical, 2 warnings) in 1 file.

Rules

Rule Severity What it catches
CY001 warning .collect() without .filter() or .limit() — the #1 OOM cause
CY002 warning UDF where a builtin exists (e.g. udf(lambda x: x.lower()) → use F.lower())
CY003 critical .withColumn() in a loop — creates O(n²) Catalyst plans
CY004 info SELECT * in spark.sql() strings — prevents column pruning
CY005 warning .cache() / .persist() with ≤1 downstream use — wastes memory
CY006 warning .toPandas() on unfiltered DataFrame — collects everything to driver
CY007 critical .crossJoin() or .join() without condition — cartesian product
CY008 info .repartition() before .write() — unnecessary shuffle
CY009 critical UDF in .filter()/.where() — blocks predicate pushdown
CY010 warning .join() without explicit how= — ambiguous join type
CY011 warning .withColumnRenamed()/.drop() in a loop — O(n²) plan nodes
CY012 warning .show()/.display()/.printSchema() left in production code
CY013 warning .coalesce(1) before .write() — single-executor bottleneck
CY014 critical Multiple actions without .cache() — recomputes full lineage each time
CY015 critical Non-equi .join() condition — implicit cartesian product
CY016 info Invalid escape sequence in string literal — use raw strings for regex

List all rules:

cy rules
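To see why CY003 is marked critical: each .withColumn() call wraps the previous logical plan in a new projection node, and Spark re-analyzes the whole plan on every call, so n calls do O(n²) total analysis work. A toy model in plain Python (illustrative only — this is not Catalyst, just the node-count arithmetic):

```python
# Toy model of logical-plan growth (illustrative only, not Spark's Catalyst).
# Each .withColumn() wraps the prior plan in a new projection node, and the
# whole plan is re-analyzed on every call, giving O(n^2) total work.

def plan_nodes_with_column_loop(n_cols: int) -> int:
    """Total plan nodes analyzed across n successive .withColumn() calls."""
    analyzed = 0
    depth = 1  # the base relation
    for _ in range(n_cols):
        depth += 1          # one new projection node per call
        analyzed += depth   # each call re-analyzes the full plan
    return analyzed

def plan_nodes_single_select(n_cols: int) -> int:
    """A single .select([...]) adds one projection node, analyzed once."""
    return 1 + 1  # base relation + one projection

if __name__ == "__main__":
    print(plan_nodes_with_column_loop(100))  # 5150 — quadratic
    print(plan_nodes_single_select(100))     # 2 — constant
```

The gap widens fast: at 100 columns the loop form touches 5,150 plan nodes where the select form touches 2.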

How it works

cylint uses Python's ast module to parse your source files and track DataFrame variables through assignment chains. It knows that anything coming from spark.read.*, spark.sql(), or spark.table() is a DataFrame, and follows method chains from there.

No type stubs. No Spark installation. No imports resolved. Just fast, heuristic analysis that catches the patterns that matter.

Configuration

Out of the box, every rule runs at its default severity with no exclusions. No config file needed.

If a rule doesn't apply to your codebase, or you want to skip certain directories, drop a .cylint.yml in your project root or add a [tool.cylint] section to your existing pyproject.toml. The linter picks it up automatically.

.cylint.yml

# Only fail on warnings and above (ignore info-level findings)
min-severity: warning

rules:
  CY004: off        # we use SELECT * intentionally in dynamic queries
  CY008: warning    # promote repartition-before-write to warning

exclude:
  - tests/
  - vendor/
  - notebooks/scratch/

pyproject.toml

[tool.cylint]
min-severity = "warning"
exclude = ["tests/", "notebooks/scratch/"]

[tool.cylint.rules]
CY004 = "off"
CY008 = "warning"

CI Integration

GitHub Actions

name: PySpark Lint
on: pull_request

jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install cylint
      - run: cy lint --format github src/

The --format github flag outputs findings as workflow annotations — they appear inline on the PR diff.
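Under the hood, inline annotations use GitHub's workflow-command syntax: a line beginning with ::error or ::warning plus file/line properties. The findings above would presumably map to lines like these (the exact message text is an assumption):

```shell
# GitHub workflow-command syntax for inline PR annotations.
# cylint's --format github output presumably emits this shape;
# the specific messages below are illustrative.
echo "::error file=pipeline.py,line=47,col=8::CY003 .withColumn() inside a loop"
echo "::warning file=pipeline.py,line=82,col=4::CY001 .collect() without filtering"
```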

pre-commit

repos:
  - repo: https://github.com/clusteryield/cylint
    rev: v0.1.2   # pin to a release tag
    hooks:
      - id: spark-lint
        args: [--min-severity, warning]

Exit codes

Code Meaning
0 No findings
1 Warnings or info findings
2 Critical findings

Why these rules?

Every rule targets a pattern that either causes OOM crashes, triggers unnecessary shuffles, or prevents Spark's Catalyst optimizer from doing its job. These aren't style opinions — they're the patterns you find in postmortems after a 3am page about a failed pipeline or a $40K surprise on your Databricks bill.

If you've read a "PySpark anti-patterns to avoid" blog post, you've seen these patterns described. This tool catches them automatically, before the code hits production.

License

Apache 2.0
