
polars-expr-hopper


Polars plugin providing an “expression hopper”—a flexible, DataFrame-level container of Polars expressions (pl.Expr) that apply themselves as soon as the relevant columns are available.

Powered by polars-config-meta for persistent DataFrame-level metadata.

Simplify data pipelines by storing your expressions in a single location and letting them apply as soon as the corresponding columns exist in the DataFrame schema.

Installation

pip install polars-expr-hopper

The polars dependency is required at runtime but is not installed with the package by default. It is declared as an optional extra, which you select by passing it in square brackets:

pip install polars-expr-hopper[polars]           # for standard Polars
pip install polars-expr-hopper[polars-lts-cpu]   # for older CPUs

Requirements

  • Python 3.9+
  • Polars (any recent version, installed via [polars] or [polars-lts-cpu] extras)
  • (Optional) pyarrow if you want Parquet I/O features that preserve metadata in the hopper

Features

  • DataFrame-Level Expression Management: Store multiple Polars expressions on a DataFrame via the .hopper namespace.
  • Apply When Ready: Each expression is automatically applied once the DataFrame has all columns required by that expression.
  • Namespace Plugin: Access everything through df.hopper.*(...)—no subclassing or monkey-patching.
  • Metadata Preservation: Transformations called through df.hopper.<method>() keep the same expression hopper on the new DataFrame.
  • No Central Orchestration: Avoid fiddly pipeline step names or schemas—just attach your expressions once, and they get applied in the right order automatically.
  • Optional Serialisation: If you want to store or share expressions across runs (e.g., Parquet round-trip), you can serialise them to JSON or binary and restore them later—without forcing overhead in normal usage.

Usage

Basic Usage Example

import polars as pl
import polars_hopper  # This registers the .hopper plugin under pl.DataFrame

# Create an initial DataFrame
df = pl.DataFrame({
    "user_id": [1, 2, 3, 0],
    "name": ["Alice", "Bob", "Charlie", "NullUser"]
})

# Add expressions to the hopper:
#  - This one is valid right away: pl.col("user_id") != 0
#  - Another needs a future 'age' column
df.hopper.add_filters(pl.col("user_id") != 0)
df.hopper.add_filters(pl.col("age") > 18)  # 'age' doesn't exist yet

# Apply what we can; the first expression is immediately valid:
df = df.hopper.apply_ready_filters()
print(df)
# Rows with user_id=0 are dropped.

# Now let's do a transformation that adds an 'age' column.
# By calling df.hopper.with_columns(...), the plugin
# automatically copies the hopper metadata to the new DataFrame.
df2 = df.hopper.with_columns(
    pl.Series("age", [25, 15, 30])  # new column
)

# Now the second expression can be applied:
df2 = df2.hopper.apply_ready_filters()
print(df2)
# Only rows with age > 18 remain. That expression is then removed from the hopper.

How It Works

Internally, polars-expr-hopper attaches a small “manager” object (a plugin namespace) to each DataFrame. This manager uses polars-config-meta to store metadata, retrievable via df.config_meta.get_metadata() and keyed by id(df).

  1. List of In-Memory Expressions:

    • Maintains a hopper_filters list of Polars expressions (pl.Expr) in the DataFrame’s metadata.
    • Stores only pl.Expr objects (no Python callables or lambdas), so that .meta.root_names() can drive schema checks and optional serialisation remains possible.
  2. Automatic Column Check (apply_ready_filters())

    • On apply_ready_filters(), each expression’s required columns (via .meta.root_names()) are compared to the current DataFrame schema.
    • Expressions referencing missing columns remain pending.
    • Expressions referencing all present columns are applied via df.filter(expr).
    • Successfully applied expressions are removed from the hopper.
  3. Metadata Preservation

    • Because we rely on polars-config-meta, transformations called through df.hopper.select(...), df.hopper.with_columns(...), etc. automatically copy the same hopper_filters list to the new DataFrame.
    • This ensures pending expressions remain valid throughout your pipeline until their columns finally appear.
  4. No Monkey-Patching

    • Polars’ plugin system is used, so there is no monkey-patching of core Polars classes.
    • The plugin registers a .hopper namespace—just like df.config_meta, but specialised for expression management.

Together, these features allow you to:

  • store a set of Polars expressions in one place
  • apply them as soon as their required columns exist
  • easily carry them forward through the pipeline

All without global orchestration or repeated expression checks.

This was motivated by wanting a flexible CLI tool whose result filters could be expressed at different steps without a proliferation of CLI flags. From there came the idea of a 'queue' that is pulled from on demand, in FIFO order, on the condition that the current schema can satisfy each expression.

This idea could be extended to select statements, but initially filtering was the primary deliverable.

API Methods

  • add_filters(*exprs: pl.Expr) Add one or more Polars expressions (pl.Expr) to the hopper. Python callables and lambdas are not accepted, since the schema check relies on .meta.root_names().

  • apply_ready_filters() -> pl.DataFrame Check each stored expression’s root names. If the columns exist, df.filter(expr) is applied. Successfully applied expressions are removed.

  • list_filters() -> List[pl.Expr] Inspect the still-pending expressions in the hopper.

  • serialise_filters(format="binary"|"json") -> List[str|bytes] Convert expressions to JSON strings or binary bytes.

  • deserialise_filters(serialised_list, format="binary"|"json") Re-create in-memory pl.Expr objects from the serialised data, overwriting any existing expressions.

Contributing

Maintained by Louis Maddox. Contributions welcome!

  1. Issues & Discussions: Please open a GitHub issue or discussion for bugs, feature requests, or questions.
  2. Pull Requests: PRs are welcome!
    • Install the dev extra (e.g. with uv): uv pip install -e .[dev]
    • Run tests (when available) and include updates to docs or examples if relevant.
    • If reporting a bug, please include the version and any error messages/tracebacks.

License

This project is licensed under the MIT License.
