
Flow-Disentangled Feature Importance

Project description

FDFI - Flow-Disentangled Feature Importance

License: MIT · Python 3.8+ · Available on PyPI

A Python library for computing feature importance using disentangled methods, inspired by SHAP.

Current release: 0.0.2

Overview

FDFI (Flow-Disentangled Feature Importance) is a Python module that provides interpretable machine learning explanations through disentangled feature importance methods. This package implements both DFI (Disentangled Feature Importance) and FDFI (Flow-DFI) methods. Similar to SHAP, FDFI helps you understand which features are driving your model's predictions.

Features

  • 🎯 Multiple Explainer Types: Tree, Linear, and Kernel explainers for different model types
  • 🧭 OT-Based DFI: Gaussian OT (OTExplainer) and Entropic OT (EOTExplainer)
  • 🌊 Flow-DFI: FlowExplainer with CPI and SCPI methods for non-Gaussian data
  • 📊 Rich Visualizations: Summary, waterfall, force, and dependence plots
  • 🔧 Easy to Use: Simple API similar to SHAP
  • 🚀 Extensible: Built with modularity in mind for future enhancements

Installation

From Source

git clone https://github.com/jaydu1/FDFI.git
cd FDFI
pip install -e .

Dependencies

Optional dependency groups are available as pyproject.toml extras:

pip install -e ".[dev]"
pip install -e ".[plots]"
pip install -e ".[flow]"

Quick Start

import numpy as np
from fdfi.explainers import OTExplainer

# Define your model
def model(X):
    return X.sum(axis=1)

# Create background data
X_background = np.random.randn(100, 10)

# Create an explainer
explainer = OTExplainer(model, data=X_background, nsamples=50)

# Explain test instances
X_test = np.random.randn(10, 10)
results = explainer(X_test)

# Confidence intervals (post-hoc)
ci = explainer.conf_int(alpha=0.05, target="X", alternative="two-sided")
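As background on the Gaussian OT approach used by OTExplainer, the optimal transport map between two Gaussian distributions has a closed form. The sketch below is illustrative only (it fits Gaussians to two samples and applies the textbook map; it is not the library's internal implementation):

```python
import numpy as np

def psd_sqrt(S):
    """Symmetric PSD matrix square root via eigendecomposition."""
    w, V = np.linalg.eigh(S)
    return (V * np.sqrt(np.clip(w, 0.0, None))) @ V.T

def gaussian_ot_map(X_src, X_tgt):
    """Closed-form OT map between the Gaussian fits of two samples:
    T(x) = m2 + A (x - m1),  A = S1^{-1/2} (S1^{1/2} S2 S1^{1/2})^{1/2} S1^{-1/2}."""
    m1, m2 = X_src.mean(0), X_tgt.mean(0)
    S1 = np.cov(X_src, rowvar=False)
    S2 = np.cov(X_tgt, rowvar=False)
    S1h = psd_sqrt(S1)
    S1h_inv = np.linalg.inv(S1h)
    A = S1h_inv @ psd_sqrt(S1h @ S2 @ S1h) @ S1h_inv  # A is symmetric
    return lambda x: m2 + (x - m1) @ A

rng = np.random.default_rng(0)
X1 = rng.normal(0.0, 1.0, size=(500, 2))
X2 = rng.normal(3.0, 2.0, size=(500, 2))
T = gaussian_ot_map(X1, X2)
Z = T(X1)  # transported sample: matches X2's sample mean and covariance
```

By construction the transported points reproduce the target's first two moments exactly, which is the sense in which a Gaussian OT map aligns two distributions.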

CI Defaults in v0.0.2

By default, conf_int() now uses:

  • var_floor_method="mixture"
  • margin_method="mixture"

This improves stability for weak effects and avoids ad hoc thresholding in many use cases. You can still override both methods explicitly if needed.

EOT Options (Entropic OT)

EOTExplainer supports adaptive epsilon, stochastic transport sampling, and Gaussian/empirical targets:

from fdfi.explainers import EOTExplainer

explainer = EOTExplainer(
    model.predict,
    X_background,
    auto_epsilon=True,
    stochastic_transport=True,
    n_transport_samples=10,
    target="gaussian",  # or "empirical"
)
results = explainer(X_test)
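For context on what an entropic-OT coupling is: it is typically computed with Sinkhorn scaling iterations. A minimal self-contained sketch (a generic balanced Sinkhorn with uniform marginals, not fdfi's implementation):

```python
import numpy as np

def sinkhorn(C, epsilon=1.0, n_iters=500):
    """Balanced Sinkhorn: entropic-OT coupling for cost matrix C, uniform marginals."""
    n, m = C.shape
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)
    K = np.exp(-C / epsilon)              # Gibbs kernel
    u = np.ones(n)
    for _ in range(n_iters):
        v = b / (K.T @ u)                 # alternate marginal scalings
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]    # coupling P: P @ 1 = a, P.T @ 1 ≈ b

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
Y = rng.normal(size=(30, 3))
C = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)  # squared-Euclidean cost
P = sinkhorn(C)
```

Smaller epsilon gives a sharper (closer to unregularized OT) coupling but slower, less stable iterations; adaptive-epsilon options like auto_epsilon trade this off automatically.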

Flow-DFI with FlowExplainer

FlowExplainer uses normalizing flows for non-Gaussian data, supporting both CPI (Conditional Permutation Importance) and SCPI (Sobol-CPI):

  • CPI: average the counterfactual predictions first, then take the squared difference: $(Y - E[f(\tilde{X})])^2$
  • SCPI: take the squared differences first, then average: $E[(Y - f(\tilde{X}_b))^2]$

from fdfi.explainers import FlowExplainer

# Create explainer with CPI (default)
explainer = FlowExplainer(
    model.predict,
    X_background,
    fit_flow=True,
    method='cpi',     # 'cpi', 'scpi', or 'both'
    num_steps=200,    # flow training steps
    nsamples=50,      # counterfactual samples
    sampling_method='resample',  # 'resample', 'permutation', 'normal', 'condperm'
)

results = explainer(X_test)
# results['phi_Z']: Z-space importance
# results['phi_X']: same as phi_Z (Z-space methods)

# Confidence intervals
ci = explainer.conf_int(alpha=0.05, target="Z", alternative="two-sided")
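The CPI/SCPI distinction above can be checked numerically. A toy numpy sketch (hypothetical helper code, not part of fdfi) with a linear model and one resampled feature:

```python
import numpy as np

rng = np.random.default_rng(0)
f = lambda X: X.sum(axis=1)                     # toy model
X = rng.normal(size=(200, 5))
y = f(X)

# B counterfactual copies of X with feature j resampled from its marginal
j, B = 0, 50
X_tilde = np.repeat(X[None], B, axis=0)          # shape (B, n, d)
X_tilde[..., j] = rng.normal(size=(B, 200))

preds = np.stack([f(X_tilde[b]) for b in range(B)])   # shape (B, n)

cpi  = np.mean((y - preds.mean(axis=0)) ** 2)    # (Y - E[f(X~)])^2, averaged over n
scpi = np.mean((y[None] - preds) ** 2)           # E[(Y - f(X~_b))^2]
```

By Jensen's inequality scpi >= cpi; the gap is the variance of the counterfactual predictions, which is why the two estimands can rank features differently.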

Explainer diagnostics (new in v0.0.2)

Disentangled explainers (OTExplainer, EOTExplainer, and FlowExplainer) report two diagnostics with qualitative labels (GOOD / MODERATE / POOR) using consistent [FDFI][DIAG] logging:

  • Latent independence (median dCor): lower is better (thresholds: <0.10 good, <0.25 moderate).
  • Distribution fidelity (MMD): lower is better (thresholds: <0.05 good, <0.15 moderate).

Example log:

[FDFI][DIAG] Flow Model Diagnostics
[FDFI][DIAG] Latent independence (median dCor): 0.0421 [GOOD]  → lower is better
[FDFI][DIAG] Distribution fidelity (MMD):       0.0187 [GOOD]  → lower is better

Access diagnostics directly:

diag = explainer.diagnostics
print(diag["latent_independence_median"], diag["latent_independence_label"])
print(diag["distribution_fidelity_mmd"], diag["distribution_fidelity_label"])
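As background on the fidelity diagnostic, MMD (maximum mean discrepancy) compares two samples through kernel means. A generic RBF-kernel estimator can be sketched as follows (the kernel, bandwidth, and estimator fdfi uses may differ):

```python
import numpy as np

def rbf_mmd2(X, Y, sigma=1.0):
    """Squared MMD between samples X and Y with an RBF kernel (biased V-statistic)."""
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))
    return k(X, X).mean() + k(Y, Y).mean() - 2.0 * k(X, Y).mean()

rng = np.random.default_rng(0)
# two draws from the same distribution -> MMD^2 near zero
same  = rbf_mmd2(rng.normal(size=(200, 2)), rng.normal(size=(200, 2)))
# a mean-shifted draw -> clearly larger MMD^2
shift = rbf_mmd2(rng.normal(size=(200, 2)), rng.normal(2.0, 1.0, size=(200, 2)))
```

This mirrors how the diagnostic is read: values near zero indicate the flow-generated sample is distributionally close to the data.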

For advanced users, flow models can be trained separately:

from fdfi.models import FlowMatchingModel

# Train flow model externally
flow_model = FlowMatchingModel(X_background, dim=X_background.shape[1])
flow_model.fit(num_steps=500, verbose='final')

# Set pre-trained flow
explainer = FlowExplainer(model.predict, X_background, fit_flow=False)
explainer.set_flow(flow_model)

Project Structure

FDFI/
├── fdfi/                  # Main package directory
│   ├── __init__.py       # Package initialization
│   ├── explainers.py     # Explainer classes
│   ├── plots.py          # Visualization functions
│   └── utils.py          # Utility functions
├── tests/                 # Test suite
│   ├── test_explainers.py
│   ├── test_plots.py
│   └── test_utils.py
├── docs/                  # Documentation & tutorials
│   └── tutorials/        # Jupyter notebook tutorials
├── pyproject.toml        # Package configuration
└── README.md             # This file

Development Status

🚧 This is starter code for DFI development. The core structure and API are in place, but full implementations are coming soon.

Current status:

  • ✅ Package structure established
  • ✅ Base classes and interfaces defined
  • ✅ Testing framework set up
  • ✅ Documentation structure created
  • 🚧 Core algorithms (in development)
  • 🚧 Visualization functions (in development)

Testing

Run the test suite:

# Install development dependencies
pip install -e ".[dev]"

# Run tests
pytest

# Run tests with coverage
pytest --cov=fdfi --cov-report=html

Documentation

Full documentation and tutorials are available in the docs/ directory.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

This project is licensed under the MIT License - see the LICENSE file for details.

References

FDFI is based on:

  • Du, J.-H., Roeder, K., & Wasserman, L. (2025). Disentangled Feature Importance. arXiv preprint arXiv:2507.00260.
  • Chen, X., Guo, Y., & Du, J.-H. (2026). Flow-Disentangled Feature Importance. In The Thirteenth International Conference on Learning Representations (ICLR).

Related work:

  • SHAP: A game theoretic approach to explain machine learning models

Citation

If you use DFI in your research, please cite:

@software{dfi2026,
  title={DFI: Python Library for Disentangled Feature Importance},
  author={DFI Team},
  year={2026},
  url={https://github.com/jaydu1/FDFI}
}

@article{du2025disentangled,
  title={Disentangled Feature Importance},
  author={Du, Jin-Hong and Roeder, Kathryn and Wasserman, Larry},
  journal={arXiv preprint arXiv:2507.00260},
  year={2025}
}

@inproceedings{chen2026flow,
  title={Flow-Disentangled Feature Importance},
  author={Chen, Xin and Guo, Yifan and Du, Jin-Hong},
  booktitle={The Thirteenth International Conference on Learning Representations},
  year={2026}
}

Contact

For questions and issues, please use the GitHub issue tracker.

Download files

Download the file for your platform.

Source Distribution

fdfi-0.0.4.tar.gz (44.8 kB)

Uploaded Source

Built Distribution


fdfi-0.0.4-py3-none-any.whl (32.1 kB)

Uploaded Python 3

File details

Details for the file fdfi-0.0.4.tar.gz.

File metadata

  • Download URL: fdfi-0.0.4.tar.gz
  • Size: 44.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for fdfi-0.0.4.tar.gz
  • SHA256: 5b6569dce2446591c5643094f9f2b868a6cf8e9b91c2286316333367efccc635
  • MD5: 90f6b67d9b47956bc5345b08804d8e27
  • BLAKE2b-256: 5027777c600509f3c1935f2a4a5dd4fdc6dd1a593dfe1dfdd8333383e99e6062


Provenance

The following attestation bundles were made for fdfi-0.0.4.tar.gz:

Publisher: publish.yml on jaydu1/FDFI

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file fdfi-0.0.4-py3-none-any.whl.

File metadata

  • Download URL: fdfi-0.0.4-py3-none-any.whl
  • Size: 32.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for fdfi-0.0.4-py3-none-any.whl
  • SHA256: c8ce7fb82eda4ce139c9584d0809c68d3cc9429f9f01d9e710e2cf935878e843
  • MD5: 6ad05f52e959d565dd0e43fdbf1b537a
  • BLAKE2b-256: 39882820ff2b353ff8297529f3def3d4359510ff11ba77da28801aebe6069318


Provenance

The following attestation bundles were made for fdfi-0.0.4-py3-none-any.whl:

Publisher: publish.yml on jaydu1/FDFI

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
