
polars-expr-hopper


Polars plugin providing an “expression hopper”—a flexible, DataFrame-level container of Polars expressions (pl.Expr) that apply themselves as soon as the relevant columns are available.

Powered by polars-config-meta for persistent DataFrame-level metadata.

Simplify data pipelines by storing your expressions in a single location and letting them apply as soon as the corresponding columns exist in the DataFrame schema.

Installation

pip install polars-expr-hopper

Polars itself is required but is not bundled as a hard dependency. It ships as an optional extra, selected with square brackets:

pip install polars-expr-hopper[polars]           # for standard Polars
pip install polars-expr-hopper[polars-lts-cpu]   # for older CPUs

Requirements

  • Python 3.9+
  • Polars (any recent version, installed via [polars] or [polars-lts-cpu] extras)
  • (Optional) pyarrow if you want Parquet I/O features that preserve metadata in the hopper

Features

  • DataFrame-Level Expression Management: Store multiple Polars expressions on a DataFrame via the .hopper namespace.
  • Apply When Ready: Each expression is automatically applied once the DataFrame has all columns required by that expression.
  • Namespace Plugin: Access everything through df.hopper.*(...)—no subclassing or monkey-patching.
  • Metadata Preservation: Transformations called through df.hopper.<method>() keep the same expression hopper on the new DataFrame.
  • No Central Orchestration: Avoid fiddly pipeline step names or schemas—just attach your expressions once, and they get applied in the right order automatically.
  • Optional Serialisation: If you want to store or share expressions across runs (e.g., Parquet round-trip), you can serialise them to JSON or binary and restore them later—without forcing overhead in normal usage.

Usage

Basic Usage Example

import polars as pl
import polars_hopper  # This registers the .hopper plugin under pl.DataFrame

# Create an initial DataFrame
df = pl.DataFrame({
    "user_id": [1, 2, 3, 0],
    "name": ["Alice", "Bob", "Charlie", "NullUser"]
})

# Add expressions to the hopper:
#  - This one is valid right away: pl.col("user_id") != 0
#  - Another needs a future 'age' column
df.hopper.add_filter(pl.col("user_id") != 0)
df.hopper.add_filter(pl.col("age") > 18)  # 'age' doesn't exist yet

# Apply what we can; the first expression is immediately valid:
df = df.hopper.apply_ready_filters()
print(df)
# Rows with user_id=0 are dropped.

# Now let's do a transformation that adds an 'age' column.
# By calling df.hopper.with_columns(...), the plugin
# automatically copies the hopper metadata to the new DataFrame.
df2 = df.hopper.with_columns(
    pl.Series("age", [25, 15, 30])  # new column
)

# Now the second expression can be applied:
df2 = df2.hopper.apply_ready_filters()
print(df2)
# Only rows with age > 18 remain. That expression is then removed from the hopper.

How It Works


Internally, polars-expr-hopper attaches a small “manager” object (a plugin namespace) to each DataFrame. This manager leverages polars-config-meta to store data in df.config_meta.get_metadata(), keyed by the id(df).

  1. List of In-Memory Expressions:

    • Maintains a hopper_filters list of Polars expressions (pl.Expr) in the DataFrame’s metadata.
    • Avoids Python callables or lambdas so that .meta.root_names() can be used for schema checks and optional serialisation is possible.
  2. Automatic Column Check (apply_ready_filters())

    • On apply_ready_filters(), each expression’s required columns (via .meta.root_names()) are compared to the current DataFrame schema.
    • Expressions referencing missing columns remain pending.
    • Expressions referencing all present columns are applied via df.filter(expr).
    • Successfully applied expressions are removed from the hopper.
  3. Metadata Preservation

    • Because we rely on polars-config-meta, transformations called through df.hopper.select(...), df.hopper.with_columns(...), etc. automatically copy the same hopper_filters list to the new DataFrame.
    • This ensures pending expressions remain valid throughout your pipeline until their columns finally appear.
  4. No Monkey-Patching

    • Polars’ plugin system is used, so there is no monkey-patching of core Polars classes.
    • The plugin registers a .hopper namespace—just like df.config_meta, but specialised for expression management.

Together, these features allow you to:

  • store a set of Polars expressions in one place
  • apply them as soon as their required columns exist
  • easily carry them forward through the pipeline

All without global orchestration or repeated expression checks.

This plugin was motivated by the desire to build a flexible CLI tool that could express filters on results at different pipeline steps, without a proliferation of CLI flags. From there came the idea of a 'queue' pulled from on demand, in FIFO order, on the condition that the schema is amenable.

This idea could be extended to select statements, but initially filtering was the primary deliverable.

API Methods

  • add_filter(expr: pl.Expr) Add a new filter expression to the hopper. Expressions (rather than Python callables or lambdas) are stored so that .meta.root_names() can be used for schema checks.

  • apply_ready_filters() -> pl.DataFrame Check each stored expression’s root names. If the columns exist, df.filter(expr) is applied. Successfully applied expressions are removed.

  • list_filters() -> List[pl.Expr] Inspect the still-pending expressions in the hopper.

  • serialise_filters(format="binary"|"json") -> List[str|bytes] Convert expressions to JSON strings or binary bytes.

  • deserialise_filters(serialised_list, format="binary"|"json") Re-create in-memory pl.Expr objects from the serialised data, overwriting any existing expressions.

Contributing

Maintained by Louis Maddox. Contributions welcome!

  1. Issues & Discussions: Please open a GitHub issue or discussion for bugs, feature requests, or questions.
  2. Pull Requests: PRs are welcome!
    • Install the dev extra (e.g. with uv): uv pip install -e .[dev]
    • Run tests (when available) and include updates to docs or examples if relevant.
    • If reporting a bug, please include the version and any error messages/tracebacks.

License

This project is licensed under the MIT License.
