

FDFI - Flow-Disentangled Feature Importance

License: MIT · Python 3.8+ · PyPI

A Python library for computing feature importance using disentangled methods, inspired by SHAP.

Current release: 0.0.2

Overview

FDFI (Flow-Disentangled Feature Importance) is a Python module that provides interpretable machine learning explanations through disentangled feature importance methods. This package implements both DFI (Disentangled Feature Importance) and FDFI (Flow-DFI) methods. Similar to SHAP, FDFI helps you understand which features are driving your model's predictions.

Features

  • 🎯 Multiple Explainer Types: Tree, Linear, and Kernel explainers for different model types
  • 🧭 OT-Based DFI: Gaussian OT (OTExplainer) and Entropic OT (EOTExplainer)
  • 🌊 Flow-DFI: FlowExplainer with CPI and SCPI methods for non-Gaussian data
  • 📊 Rich Visualizations: Summary, waterfall, force, and dependence plots
  • 🔧 Easy to Use: Simple API similar to SHAP
  • 🚀 Extensible: Built with modularity in mind for future enhancements

Installation

From Source

git clone https://github.com/jaydu1/FDFI.git
cd FDFI
pip install -e .

Dependencies

Optional dependency groups are declared as extras in pyproject.toml:

pip install -e ".[dev]"
pip install -e ".[plots]"
pip install -e ".[flow]"

Quick Start

import numpy as np
from fdfi.explainers import OTExplainer

# Define your model
def model(X):
    return X.sum(axis=1)

# Create background data
X_background = np.random.randn(100, 10)

# Create an explainer
explainer = OTExplainer(model, data=X_background, nsamples=50)

# Explain test instances
X_test = np.random.randn(10, 10)
results = explainer(X_test)

# Confidence intervals (post-hoc)
ci = explainer.conf_int(alpha=0.05, target="X", alternative="two-sided")
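OTExplainer is built on the optimal-transport map between Gaussians. As background only, here is a minimal NumPy sketch of the standard closed-form Monge map between two Gaussian distributions; this is the textbook formula, not the library's actual implementation:

```python
import numpy as np

def psd_sqrt(S):
    """Symmetric PSD square root via eigendecomposition."""
    w, V = np.linalg.eigh(S)
    return (V * np.sqrt(np.clip(w, 0.0, None))) @ V.T

def gaussian_ot_map(m1, S1, m2, S2):
    """Closed-form optimal transport (Monge) map from N(m1, S1) to N(m2, S2)."""
    r1 = psd_sqrt(S1)
    r1_inv = np.linalg.inv(r1)
    A = r1_inv @ psd_sqrt(r1 @ S2 @ r1) @ r1_inv
    return lambda x: m2 + (x - m1) @ A.T

# 1-D sanity check: mapping N(0, 1) to N(2, 4) gives T(x) = 2 + 2x
T = gaussian_ot_map(np.zeros(1), np.eye(1), np.full(1, 2.0), 4.0 * np.eye(1))
```

The map pushes the background distribution onto a target distribution; in 1-D it reduces to the familiar mean-shift-and-rescale transform.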

CI Defaults in v0.0.2

By default, conf_int() now uses:

  • var_floor_method="mixture"
  • margin_method="mixture"

This improves stability for weak effects and avoids ad hoc thresholding in many use cases. You can still override both methods explicitly if needed.

EOT Options (Entropic OT)

EOTExplainer supports adaptive epsilon, stochastic transport sampling, and Gaussian/empirical targets:

from fdfi.explainers import EOTExplainer

explainer = EOTExplainer(
    model.predict,  # prediction function of a fitted model
    X_background,
    auto_epsilon=True,
    stochastic_transport=True,
    n_transport_samples=10,
    target="gaussian",  # or "empirical"
)
results = explainer(X_test)

Flow-DFI with FlowExplainer

FlowExplainer uses normalizing flows for non-Gaussian data, supporting both CPI (Conditional Permutation Importance) and SCPI (Sobol-CPI):

  • CPI: Average predictions first, then squared difference: $(Y - E[f(\tilde{X})])^2$
  • SCPI: Squared differences first, then average: $E[(Y - f(\tilde{X}_b))^2]$

from fdfi.explainers import FlowExplainer

# Create explainer with CPI (default)
explainer = FlowExplainer(
    model.predict,
    X_background,
    fit_flow=True,
    method='cpi',     # 'cpi', 'scpi', or 'both'
    num_steps=200,    # flow training steps
    nsamples=50,      # counterfactual samples
    sampling_method='resample',  # 'resample', 'permutation', 'normal', 'condperm'
)

results = explainer(X_test)
# results['phi_Z']: Z-space importance
# results['phi_X']: same as phi_Z (Z-space methods)

# Confidence intervals
ci = explainer.conf_int(alpha=0.05, target="Z", alternative="two-sided")
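The order of averaging and squaring is the whole difference between the two estimators. A minimal numeric sketch with synthetic numbers (not using the library) shows that they differ by exactly the variance of the counterfactual predictions, so SCPI is always at least as large as CPI:

```python
import numpy as np

rng = np.random.default_rng(0)
y = 1.0                                   # prediction on the original instance
f_tilde = rng.normal(0.5, 1.0, size=50)   # predictions on 50 counterfactual samples

cpi = (y - f_tilde.mean()) ** 2           # CPI: average first, then square
scpi = ((y - f_tilde) ** 2).mean()        # SCPI: square first, then average

# By the bias-variance identity, E[(y - F)^2] = (y - E[F])^2 + Var(F),
# so scpi == cpi + f_tilde.var() and SCPI >= CPI always.
```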

Explainer diagnostics (new in v0.0.2)

Disentangled explainers (OTExplainer, EOTExplainer, and FlowExplainer) report two diagnostics with qualitative labels (GOOD / MODERATE / POOR) using consistent [FDFI][DIAG] logging:

  • Latent independence (median dCor): lower is better (thresholds: <0.10 good, <0.25 moderate).
  • Distribution fidelity (MMD): lower is better (thresholds: <0.05 good, <0.15 moderate).

Example log:

[FDFI][DIAG] Flow Model Diagnostics
[FDFI][DIAG] Latent independence (median dCor): 0.0421 [GOOD]  → lower is better
[FDFI][DIAG] Distribution fidelity (MMD):       0.0187 [GOOD]  → lower is better

Access diagnostics directly:

diag = explainer.diagnostics
print(diag["latent_independence_median"], diag["latent_independence_label"])
print(diag["distribution_fidelity_mmd"], diag["distribution_fidelity_label"])
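To build intuition for the distribution-fidelity diagnostic, here is the standard biased squared-MMD estimator with an RBF kernel (the library's exact kernel and bandwidth choices are not documented here, so `sigma=1.0` is an illustrative assumption):

```python
import numpy as np

def mmd2_rbf(X, Y, sigma=1.0):
    """Biased squared-MMD estimate between samples X and Y with a Gaussian kernel."""
    def gram(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))
    return gram(X, X).mean() + gram(Y, Y).mean() - 2.0 * gram(X, Y).mean()

rng = np.random.default_rng(0)
base = rng.normal(size=(200, 2))
same = mmd2_rbf(base, rng.normal(size=(200, 2)))               # same distribution: near 0
shifted = mmd2_rbf(base, rng.normal(3.0, 1.0, size=(200, 2)))  # mean shift: clearly larger
```

Values near zero indicate that the generated samples match the data distribution, which is why lower MMD earns a GOOD label.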

For advanced users, flow models can be trained separately:

from fdfi.models import FlowMatchingModel

# Train flow model externally
flow_model = FlowMatchingModel(X_background, dim=X_background.shape[1])
flow_model.fit(num_steps=500, verbose='final')

# Set pre-trained flow
explainer = FlowExplainer(model.predict, X_background, fit_flow=False)
explainer.set_flow(flow_model)

Project Structure

FDFI/
├── fdfi/                  # Main package directory
│   ├── __init__.py        # Package initialization
│   ├── explainers.py      # Explainer classes
│   ├── plots.py           # Visualization functions
│   └── utils.py           # Utility functions
├── tests/                 # Test suite
│   ├── test_explainers.py
│   ├── test_plots.py
│   └── test_utils.py
├── docs/                  # Documentation & tutorials
│   └── tutorials/         # Jupyter notebook tutorials
├── pyproject.toml         # Package configuration
└── README.md              # This file

Development Status

🚧 This is starter code for DFI development. The core structure and API are in place, but full implementations are coming soon.

Current status:

  • ✅ Package structure established
  • ✅ Base classes and interfaces defined
  • ✅ Testing framework set up
  • ✅ Documentation structure created
  • 🚧 Core algorithms (in development)
  • 🚧 Visualization functions (in development)

Testing

Run the test suite:

# Install development dependencies
pip install -e ".[dev]"

# Run tests
pytest

# Run tests with coverage
pytest --cov=fdfi --cov-report=html

Documentation

Full documentation and tutorials are available in the docs/ directory.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

This project is licensed under the MIT License - see the LICENSE file for details.

References

FDFI is based on:

  • Du, J.-H., Roeder, K., & Wasserman, L. (2025). Disentangled Feature Importance. arXiv preprint arXiv:2507.00260.
  • Chen, X., Guo, Y., & Du, J.-H. (2026). Flow-Disentangled Feature Importance. In The Thirteenth International Conference on Learning Representations (ICLR).

Related work:

  • SHAP: A game theoretic approach to explain machine learning models

Citation

If you use DFI in your research, please cite:

@software{dfi2026,
  title={DFI: Python Library for Disentangled Feature Importance},
  author={DFI Team},
  year={2026},
  url={https://github.com/jaydu1/FDFI}
}

@article{du2025disentangled,
  title={Disentangled Feature Importance},
  author={Du, Jin-Hong and Roeder, Kathryn and Wasserman, Larry},
  journal={arXiv preprint arXiv:2507.00260},
  year={2025}
}

@inproceedings{chen2026flow,
  title={Flow-Disentangled Feature Importance},
  author={Chen, Xin and Guo, Yifan and Du, Jin-Hong},
  booktitle={The Thirteenth International Conference on Learning Representations},
  year={2026}
}

Contact

For questions and issues, please use the GitHub issue tracker.

