FDFI - Flow-Disentangled Feature Importance
A Python library for computing feature importance using disentangled methods, inspired by SHAP.
Current release: 0.0.2
Overview
FDFI (Flow-Disentangled Feature Importance) is a Python module that provides interpretable machine learning explanations through disentangled feature importance methods. This package implements both DFI (Disentangled Feature Importance) and FDFI (Flow-DFI) methods. Similar to SHAP, FDFI helps you understand which features are driving your model's predictions.
Features
- Multiple Explainer Types: Tree, Linear, and Kernel explainers for different model types
- OT-Based DFI: Gaussian OT (OTExplainer) and Entropic OT (EOTExplainer)
- Flow-DFI: FlowExplainer with CPI and SCPI methods for non-Gaussian data
- Rich Visualizations: Summary, waterfall, force, and dependence plots
- Easy to Use: Simple API similar to SHAP
- Extensible: Built with modularity in mind for future enhancements
Installation
From Source
```shell
git clone https://github.com/jaydu1/FDFI.git
cd FDFI
pip install -e .
```
Dependencies
Optional dependency groups are defined as extras in pyproject.toml:

```shell
pip install -e ".[dev]"     # development tools
pip install -e ".[plots]"   # plotting support
pip install -e ".[flow]"    # flow-based explainers
```
Quick Start
```python
import numpy as np
from fdfi.explainers import OTExplainer

# Define your model
def model(X):
    return X.sum(axis=1)

# Create background data
X_background = np.random.randn(100, 10)

# Create an explainer
explainer = OTExplainer(model, data=X_background, nsamples=50)

# Explain test instances
X_test = np.random.randn(10, 10)
results = explainer(X_test)

# Confidence intervals (post-hoc)
ci = explainer.conf_int(alpha=0.05, target="X", alternative="two-sided")
```
CI Defaults in v0.0.2
By default, conf_int() now uses:
var_floor_method="mixture" and margin_method="mixture"
This improves stability for weak effects and avoids ad hoc thresholding in many use cases. You can still override both methods explicitly if needed.
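The interval itself follows the usual normal-approximation recipe. A minimal self-contained sketch of a two-sided CI for a mean importance score (the function name normal_ci is illustrative, not part of the FDFI API, and the mixture-based variance floor and margin are not reproduced here):

```python
import numpy as np
from statistics import NormalDist

def normal_ci(scores, alpha=0.05):
    """Two-sided normal-approximation CI for the mean of per-sample scores."""
    scores = np.asarray(scores, dtype=float)
    mean = scores.mean()
    se = scores.std(ddof=1) / np.sqrt(scores.size)   # plug-in standard error
    z = NormalDist().inv_cdf(1 - alpha / 2)          # ~1.96 for alpha=0.05
    return mean - z * se, mean + z * se
```

A variance floor, as the name suggests, would simply bound se below before forming the margin.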
EOT Options (Entropic OT)
EOTExplainer supports adaptive epsilon, stochastic transport sampling, and
Gaussian/empirical targets:
```python
from fdfi.explainers import EOTExplainer

explainer = EOTExplainer(
    model.predict,
    X_background,
    auto_epsilon=True,
    stochastic_transport=True,
    n_transport_samples=10,
    target="gaussian",  # or "empirical"
)
results = explainer(X_test)
```
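Entropic OT problems of this kind are classically solved with Sinkhorn's alternating-scaling iterations. The sketch below is a generic illustration of that idea, not FDFI's internal solver; sinkhorn_plan is a name invented here:

```python
import numpy as np

def sinkhorn_plan(X, Y, epsilon=1.0, n_iter=500):
    """Entropic-OT coupling between uniform measures on the rows of X and Y."""
    C = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)  # squared-distance cost
    K = np.exp(-C / epsilon)                            # Gibbs kernel
    a = np.full(len(X), 1.0 / len(X))                   # source marginal
    b = np.full(len(Y), 1.0 / len(Y))                   # target marginal
    u = np.ones_like(a)
    for _ in range(n_iter):                             # alternating scaling
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]                  # coupling matrix
```

Larger epsilon gives a smoother (more entropic) plan; as epsilon shrinks, the plan approaches the unregularized OT coupling, which motivates the auto_epsilon option above.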
Flow-DFI with FlowExplainer
FlowExplainer uses normalizing flows for non-Gaussian data, supporting both CPI (Conditional Permutation Importance) and SCPI (Sobol-CPI):
- CPI: Average predictions first, then squared difference: $(Y - E[f(\tilde{X})])^2$
- SCPI: Squared differences first, then average: $E[(Y - f(\tilde{X}_b))^2]$
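The distinction between the two estimators can be made concrete with a small numpy sketch. This uses plain permutation as a stand-in for the flow-based counterfactual sampler, and cpi_scpi is an illustrative name, not the FDFI API:

```python
import numpy as np

def cpi_scpi(model, X, y, j, n_draws=20, seed=0):
    """CPI and SCPI for feature j, with permutation as the counterfactual sampler."""
    rng = np.random.default_rng(seed)
    preds = np.empty((n_draws, len(X)))
    for b in range(n_draws):
        Xb = X.copy()
        Xb[:, j] = rng.permutation(Xb[:, j])       # counterfactual draw for feature j
        preds[b] = model(Xb)
    cpi = np.mean((y - preds.mean(axis=0)) ** 2)   # average predictions, then square
    scpi = np.mean((y[None, :] - preds) ** 2)      # square each draw, then average
    return cpi, scpi
```

By Jensen's inequality, SCPI is always at least as large as CPI for the same counterfactual draws.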
```python
from fdfi.explainers import FlowExplainer

# Create explainer with CPI (default)
explainer = FlowExplainer(
    model.predict,
    X_background,
    fit_flow=True,
    method='cpi',                # 'cpi', 'scpi', or 'both'
    num_steps=200,               # flow training steps
    nsamples=50,                 # counterfactual samples
    sampling_method='resample',  # 'resample', 'permutation', 'normal', 'condperm'
)
results = explainer(X_test)
# results['phi_Z']: Z-space importance
# results['phi_X']: same as phi_Z (Z-space methods)

# Confidence intervals
ci = explainer.conf_int(alpha=0.05, target="Z", alternative="two-sided")
```
Explainer diagnostics (new in v0.0.2)
Disentangled explainers (OTExplainer, EOTExplainer, and FlowExplainer) report two diagnostics with qualitative labels (GOOD / MODERATE / POOR) using consistent [FDFI][DIAG] logging:
- Latent independence (median dCor): lower is better (thresholds: <0.10 good, <0.25 moderate).
- Distribution fidelity (MMD): lower is better (thresholds: <0.05 good, <0.15 moderate).

Example log:

```
[FDFI][DIAG] Flow Model Diagnostics
[FDFI][DIAG] Latent independence (median dCor): 0.0421 [GOOD] (lower is better)
[FDFI][DIAG] Distribution fidelity (MMD): 0.0187 [GOOD] (lower is better)
```
Access diagnostics directly:
```python
diag = explainer.diagnostics
print(diag["latent_independence_median"], diag["latent_independence_label"])
print(diag["distribution_fidelity_mmd"], diag["distribution_fidelity_label"])
```
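The qualitative labels follow directly from the thresholds listed above. A minimal sketch of that labeling rule (label_diagnostic is a name invented here, not part of the FDFI API):

```python
def label_diagnostic(value, good, moderate):
    """Map a lower-is-better diagnostic value to GOOD / MODERATE / POOR."""
    if value < good:
        return "GOOD"
    if value < moderate:
        return "MODERATE"
    return "POOR"

# Thresholds from this README: dCor (<0.10, <0.25), MMD (<0.05, <0.15)
dcor_label = label_diagnostic(0.0421, 0.10, 0.25)  # "GOOD"
mmd_label = label_diagnostic(0.08, 0.05, 0.15)     # "MODERATE"
```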
For advanced users, flow models can be trained separately:
```python
from fdfi.models import FlowMatchingModel

# Train flow model externally
flow_model = FlowMatchingModel(X_background, dim=X_background.shape[1])
flow_model.fit(num_steps=500, verbose='final')

# Set pre-trained flow
explainer = FlowExplainer(model.predict, X_background, fit_flow=False)
explainer.set_flow(flow_model)
```
Project Structure
```
FDFI/
├── fdfi/                  # Main package directory
│   ├── __init__.py        # Package initialization
│   ├── explainers.py      # Explainer classes
│   ├── plots.py           # Visualization functions
│   └── utils.py           # Utility functions
├── tests/                 # Test suite
│   ├── test_explainers.py
│   ├── test_plots.py
│   └── test_utils.py
├── docs/                  # Documentation & tutorials
│   └── tutorials/         # Jupyter notebook tutorials
├── pyproject.toml         # Package configuration
└── README.md              # This file
```
Development Status
🚧 This is starter code for DFI development. The core structure and API are in place, but full implementations are coming soon.

Current status:
- ✅ Package structure established
- ✅ Base classes and interfaces defined
- ✅ Testing framework set up
- ✅ Documentation structure created
- 🚧 Core algorithms (in development)
- 🚧 Visualization functions (in development)
Testing
Run the test suite:
```shell
# Install development dependencies
pip install -e ".[dev]"

# Run tests
pytest

# Run tests with coverage
pytest --cov=fdfi --cov-report=html
```
Documentation
Full documentation and tutorials are available in the docs/ directory:
- Quickstart Tutorial
- OT Explainer Tutorial
- EOT Explainer Tutorial
- Flow Explainer Tutorial
- Confidence Intervals
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
This project is licensed under the MIT License - see the LICENSE file for details.
References
FDFI is based on:
- Du, J.-H., Roeder, K., & Wasserman, L. (2025). Disentangled Feature Importance. arXiv preprint arXiv:2507.00260.
- Chen, X., Guo, Y., & Du, J.-H. (2026). Flow-Disentangled Feature Importance. In The Thirteenth International Conference on Learning Representations (ICLR).
Related work:
- SHAP: A game theoretic approach to explain machine learning models
Citation
If you use DFI in your research, please cite:
```bibtex
@software{dfi2026,
  title={DFI: Python Library for Disentangled Feature Importance},
  author={DFI Team},
  year={2026},
  url={https://github.com/jaydu1/FDFI}
}

@article{du2025disentangled,
  title={Disentangled Feature Importance},
  author={Du, Jin-Hong and Roeder, Kathryn and Wasserman, Larry},
  journal={arXiv preprint arXiv:2507.00260},
  year={2025}
}

@inproceedings{chen2026flow,
  title={Flow-Disentangled Feature Importance},
  author={Chen, Xin and Guo, Yifan and Du, Jin-Hong},
  booktitle={The Thirteenth International Conference on Learning Representations},
  year={2026}
}
```
Contact
For questions and issues, please use the GitHub issue tracker.
File details

fdfi-0.0.2.tar.gz (source distribution, 38.5 kB), uploaded via Trusted Publishing with twine/6.1.0 on CPython/3.13.7.

| Algorithm | Hash digest |
|---|---|
| SHA256 | 892f243cb8f3edf39a54f3caad00199628b157e64bdbd134ebb89b5dfa4e05a3 |
| MD5 | b1784623ecbb85d94d2e4d0eb4432255 |
| BLAKE2b-256 | b6e5cc8053c0b3aa00e5065ab0882f98c2503177a16610242cff8f5f8ea8a01c |
Provenance: fdfi-0.0.2.tar.gz was attested via publish.yml on jaydu1/FDFI (commit ce7880dc44ccb465cedd0370d455bff26533cc19, tag refs/tags/0.0.2, release trigger, github-hosted runner, public access, token issuer https://token.actions.githubusercontent.com). Statement type https://in-toto.io/Statement/v1, predicate type https://docs.pypi.org/attestations/publish/v1, subject digest 892f243cb8f3edf39a54f3caad00199628b157e64bdbd134ebb89b5dfa4e05a3, Sigstore transparency entry 961114942.
File details

fdfi-0.0.2-py3-none-any.whl (built distribution, Python 3, 28.7 kB), uploaded via Trusted Publishing with twine/6.1.0 on CPython/3.13.7.

| Algorithm | Hash digest |
|---|---|
| SHA256 | e5fbce1d72759dbf2770f82437f5929e2892bee5d00b4aa544f8ffeaffb97a52 |
| MD5 | 25836eae49b5ac36c0d91462ff3d0692 |
| BLAKE2b-256 | ba034ae13c1e9cc95261ec28a6211c7177448f8cd5ad402cf968d25849205624 |
Provenance: fdfi-0.0.2-py3-none-any.whl was attested via publish.yml on jaydu1/FDFI (commit ce7880dc44ccb465cedd0370d455bff26533cc19, tag refs/tags/0.0.2, release trigger, github-hosted runner, public access, token issuer https://token.actions.githubusercontent.com). Statement type https://in-toto.io/Statement/v1, predicate type https://docs.pypi.org/attestations/publish/v1, subject digest e5fbce1d72759dbf2770f82437f5929e2892bee5d00b4aa544f8ffeaffb97a52, Sigstore transparency entry 961114980.