
fedxai-lib

A library of Fed-XAI algorithms

fedxai-lib is a Python library for Federated Learning (FL) of eXplainable Artificial Intelligence (XAI) models. The library provides privacy-preserving implementations of interpretable machine learning algorithms, enabling distributed training while maintaining data privacy and model transparency.

The current version of the framework includes the implementation of federated clustering algorithms (Fuzzy C-Means and C-Means for both horizontal and vertical data partitioning) [3], a federated Fuzzy Regression Tree (FRT) algorithm for interpretable regression tasks [2], a federated Rule-Based Classifier (FRBC) for explainable classification [4], and a federated SHAP implementation for consistent post-hoc explainability [5]. These algorithms are designed to operate in distributed environments where data cannot be centralized due to privacy, regulatory, or operational constraints.
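To give a concrete feel for the horizontal federated setting these algorithms operate in, the sketch below illustrates the core aggregation step behind federated (fuzzy) c-means clustering: each client computes membership-weighted statistics on its local data, and the server combines them into global cluster centroids without ever seeing raw records. This is a minimal, self-contained illustration of the general technique, not the library's actual API; all function and variable names are illustrative.

```python
import numpy as np

def local_fcm_stats(X, centroids, m=2.0):
    """Client side: membership-weighted sums and weights over local data X."""
    # Distance from each local point to each global centroid
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)  # guard against division by zero
    # Standard fuzzy c-means membership update: u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))
    u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0)), axis=2)
    w = u ** m
    return w.T @ X, w.sum(axis=0)  # per-cluster weighted sums and total weights

def server_aggregate(stats):
    """Server side: merge per-client statistics into global centroids."""
    num = sum(s[0] for s in stats)
    den = sum(s[1] for s in stats)
    return num / den[:, None]

rng = np.random.default_rng(0)
# Two clients, each holding a local shard drawn around a different center
clients = [rng.normal(loc=c, size=(50, 2)) for c in (-3.0, 3.0)]
centroids = rng.normal(size=(2, 2))  # initial global centroids
for _ in range(20):  # federated rounds: clients report stats, server aggregates
    centroids = server_aggregate([local_fcm_stats(X, centroids) for X in clients])
```

Because only aggregated sums and weights leave each client, the raw data points never need to be centralized; the library's implementations additionally support obfuscation of these exchanged statistics.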

This work has been developed by the Artificial Intelligence R&D Group at the Department of Information Engineering, University of Pisa. fedxai-lib has supported research, development, and demonstration activities concerning the FL of XAI models.



Repository Structure

fedxai-lib/
├── src/
│   ├── fedxai_lib/                     # Main library package
│   │   ├── algorithms/                 # Federated algorithm implementations
│   │   │   ├── federated_fcmeans_horizontal/  # Fuzzy C-Means (Horizontal)
│   │   │   ├── federated_fcmeans_vertical/    # Fuzzy C-Means (Vertical)
│   │   │   ├── federated_cmeans_horizontal/   # C-Means (Horizontal)
│   │   │   ├── federated_cmeans_vertical/     # C-Means (Vertical)
│   │   │   ├── federated_frt/                 # Fuzzy Regression Tree
│   │   │   ├── federated_frbc/                # Rule-Based Classifier
│   │   │   └── federated_shap/                # Federated SHAP
│   │   ├── descriptors/                # Federated learning plan descriptors
│   │   │   └── definitions/            # JSON-based execution plans
│   │   └── __init__.py                 # Public API
│   ├── tests/                          # Unit tests and local simulations
│   └── scripts/                        # Federation execution scripts
│       ├── run_federation.sh           # Shell script to execute federations
│       └── executions/                 # Example federation configurations
├── datasets/                           # Raw datasets
├── datasets_splits/                    # Dataset partitions for clients
├── models/                             # Trained model outputs
├── dist/                               # Built package distributions
├── Dockerfile.fedxai_lib               # Docker image for clients/director
├── Dockerfile.requester                # Docker image for requester node
├── docker-compose-director.yml         # Director service configuration
├── docker-compose-clients.yml          # Client services configuration
├── docker-compose-requester.yml        # Requester service configuration
├── pyproject.toml                      # Poetry dependencies and metadata
├── requirements.txt                    # pip-compatible requirements
├── LICENSE                             # Apache 2.0 license
└── README.md                           # This file

Prerequisites

fedxai-lib requires the following dependencies:

Python Package Dependencies

The library depends on these key packages, which are installed automatically when you install fedxai-lib (via pip or Poetry):

  • fedlang-py == 0.0.1
  • pandas >= 2.3.3
  • numpy >= 2.3.4
  • numba >= 0.62.1
  • scikit-learn >= 1.7.2
  • simpful >= 2.12.0
  • shap >= 0.46.0

Installation

pip install fedxai-lib

Usage

fedxai-lib supports two execution modes: local federation (for testing and development) and distributed federation (using Docker containers).

Local Federation Execution

Local execution simulates federated learning by running all clients and the server in a single process. This is ideal for algorithm testing and debugging.

Example: Federated Fuzzy Regression Tree

from fedxai_lib import run_fedxai_experiment, FedXAIAlgorithm
from fedxai_lib.algorithms.federated_frt.client import FedFRTClient
from fedxai_lib.algorithms.federated_frt.server import FedFRTServer

# Define algorithm parameters
parameters = {
    "gain_threshold": 0.0001,
    "max_number_rounds": 100,
    "num_fuzzy_sets": 5,
    "max_depth": None,
    "min_samples_split_ratio": 0.1,
    "min_num_clients": 20,
    "obfuscate": True,
    "features_names": ["Max_temperature", "Min_temperature", "Dewpoint",
                       "Precipitation", "Sea_level_pressure", "Standard_pressure",
                       "Visibility", "Wind_speed", "Max_wind_speed"],
    "target": "Mean_temperature",
    "dataset_X_train": "/dataset/X_train.csv",
    "dataset_y_train": "/dataset/y_train.csv",
    "dataset_X_test": "/dataset/X_test.csv",
    "dataset_y_test": "/dataset/y_test.csv",
    "model_output_file": "/models/frt_weather_izimir.pickle"
}

# Create clients with their local data partitions
# (scaler_x, scaler_y, dataset_by_client, and num_clients are prepared by the
#  caller; see the test scripts in src/tests/ for complete, runnable examples)
clients = [
    FedFRTClient(type='client', id=idx,
                 scaler_X=scaler_x, scaler_y=scaler_y,
                 X_train=dataset_by_client[idx]['X_train'],
                 y_train=dataset_by_client[idx]['y_train'],
                 X_test=dataset_by_client[idx]['X_test'],
                 y_test=dataset_by_client[idx]['y_test'])
    for idx in range(num_clients)
]

# Create server
server = FedFRTServer(type='server')

# Run federated experiment
run_fedxai_experiment(FedXAIAlgorithm.FED_FRT_HORIZONTAL, server, clients, parameters)

Running test scripts:

cd src
poetry run python tests/test_fed_frt_weather_izimir.py
poetry run python tests/test_fed_fcmeans_horizontal_xclara.py
poetry run python tests/test_fed_rbc_rmi_demo_fedxai_lib.py
poetry run python tests/test_fed_shap_rmi.py

Additional examples for all implemented algorithms can be found in the src/tests/ directory.
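The local-simulation example above presupposes that the per-client partitions (`dataset_by_client`), scalers, and `num_clients` have already been prepared by the caller. The sketch below shows one plausible way to build such horizontal partitions from a single in-memory dataset; the helper name and toy data are illustrative, not part of the fedxai-lib API.

```python
import numpy as np

def make_horizontal_partitions(X, y, num_clients, test_ratio=0.2, seed=42):
    """Shuffle rows and split them evenly across clients, carving a local
    train/test split out of each client's shard."""
    rng = np.random.default_rng(seed)
    order = rng.permutation(len(X))
    shards = np.array_split(order, num_clients)
    partitions = {}
    for idx, shard in enumerate(shards):
        n_test = max(1, int(len(shard) * test_ratio))
        test_idx, train_idx = shard[:n_test], shard[n_test:]
        partitions[idx] = {
            "X_train": X[train_idx], "y_train": y[train_idx],
            "X_test": X[test_idx], "y_test": y[test_idx],
        }
    return partitions

# Toy data standing in for a real regression dataset
X = np.arange(200, dtype=float).reshape(100, 2)
y = X.sum(axis=1)
dataset_by_client = make_horizontal_partitions(X, y, num_clients=4)
```

Each client then receives only its own entry of `dataset_by_client`, mirroring a deployment where the shards live on separate machines.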

Docker-Based Distributed Federation

For realistic federated scenarios with Docker containers distributed across multiple machines, please refer to the Illustrative Example, which provides:

  • Step-by-step Docker infrastructure setup
  • Environment configuration guidelines
  • Federation execution and monitoring
  • Troubleshooting common issues

Quick Start:

# Build Docker images
docker build --progress=plain -f Dockerfile.fedxai_lib -t fedxai .
docker build --progress=plain -f Dockerfile.requester -t fedlang-requester .

# Launch federation infrastructure
# IMPORTANT: Start the director first, then the clients
docker compose -f docker-compose-director.yml up -d  # On director machine (start first)
docker compose -f docker-compose-clients.yml up -d   # On client machines (start after director)
docker compose -f docker-compose-requester.yml up -d  # On requester machine

# Execute federation
docker exec -it requester /bin/bash
cd scripts
./run_federation.sh ./executions/federated_frt_weather_izimir.json

For detailed instructions, configuration examples, and troubleshooting, see Illustrative_Example.md.


Algorithm Hyperparameters

Detailed documentation of all hyperparameters for each implemented algorithm is available in:

Algorithm_Hyperparameters.md

This reference provides:

  • Complete hyperparameter descriptions for all algorithms
  • Parameter types, default values, and valid ranges
  • Usage examples and best practices
  • Privacy-related parameter configurations

License

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.

By contributing to this project, you agree that your contributions will be licensed under the Apache License 2.0.

Citations

If you use fedxai-lib in your research, please cite the relevant papers:

[1] Middleware Support for Federated Learning

@article{bechini2025devising,
  title={Devising an actor-based middleware support to federated learning experiments and systems},
  author={Bechini, Alessio and Corcuera Barcena, Jose Luis},
  journal={Future Generation Computer Systems},
  volume={166},
  pages={107646},
  year={2025},
  publisher={Elsevier},
  doi={10.1016/j.future.2024.107646}
}

[2] Federated Fuzzy Regression Trees

@article{barcena2025increasing,
  title={Increasing trust in AI through privacy preservation and model explainability: Federated Learning of Fuzzy Regression Trees},
  author={B{\'a}rcena, Jos{\'e} Luis Corcuera and Ducange, Pietro and Marcelloni, Francesco and Renda, Alessandro},
  journal={Information Fusion},
  volume={113},
  pages={102598},
  year={2025},
  publisher={Elsevier},
  doi={10.1016/j.inffus.2024.102598}
}

[3] Federated C-Means and Fuzzy C-Means Clustering

@article{federated_cmeans,
  title={Federated C-Means and Fuzzy C-Means Clustering Algorithms for Horizontally and Vertically Partitioned Data},
  author={[Authors to be specified]},
  journal={[To be published]},
  year={2025}
}

[4] Federated Rule-Based Classifier (FRBC)

@inproceedings{daole2024trustworthy,
  title={Trustworthy AI in heterogeneous settings: federated learning of explainable classifiers},
  author={Daole, M. and Ducange, P. and Marcelloni, F. and Renda, A.},
  booktitle={2024 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE)},
  pages={1--9},
  year={2024},
  organization={IEEE}
}

[5] Federated SHAP: Consistent Post-hoc Explainability

@inproceedings{ducange2024consistent,
  title={Consistent post-hoc explainability in federated learning through federated fuzzy clustering},
  author={Ducange, Pietro and Marcelloni, Francesco and Renda, Alessandro and Ruffini, Fabrizio},
  booktitle={2024 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE)},
  pages={1--10},
  year={2024},
  organization={IEEE}
}

Acknowledgments

fedxai-lib was developed by the Artificial Intelligence R&D Group at the Department of Information Engineering, University of Pisa, and has supported research, development, and demonstration activities concerning the FL of XAI models. This work has been funded by:

  • Bando FAIR Trasferimento Tecnologico: "Sviluppo di una libreria modulare per l'apprendimento di modelli di Explainable Artificial Intelligence in ambienti di Federated Learning" (Development of a modular library for learning Explainable Artificial Intelligence models in Federated Learning environments)

We would like to acknowledge the invaluable support of Professors Nicola Tonellotto, Tiberio Uricchio, and Alberto Landi, members of the Fed-XAI Library project working group.


For questions, issues, or contributions, please visit the GitHub repository or contact the contributors directly.
