NEBULA: A Platform for Decentralized Federated Learning
🌌 About NEBULA
NEBULA (previously known as Fedstellar[^1]) is a cutting-edge platform designed to facilitate the training of federated models within both centralized and decentralized architectures. It streamlines the development, deployment, and management of federated applications across physical and virtualized devices.
NEBULA is developed by Enrique Tomás Martínez Beltrán in collaboration with the University of Murcia, armasuisse, and the University of Zurich.
🚀 Key Components
NEBULA boasts a modular architecture that consists of three core elements:
- Frontend: A user-friendly interface for setting up experiments and monitoring progress.
- Controller: An orchestrator that ensures efficient operation management.
- Core: The fundamental component deployed on each device to handle federated learning processes.
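The sketch below is a conceptual illustration only, not NEBULA's actual classes or API: it shows how the three components above relate, assuming a hypothetical ExperimentConfig assembled through the Frontend, a Controller that fans it out, and a Core instance that runs the federated rounds on each device.

```python
# Conceptual sketch only -- hypothetical names, not NEBULA's actual API.
from dataclasses import dataclass, field


@dataclass
class ExperimentConfig:
    """Experiment settings a user would assemble through the Frontend."""
    topology: str = "ring"
    rounds: int = 2
    devices: list[str] = field(default_factory=lambda: ["dev-0", "dev-1", "dev-2"])


class Core:
    """Runs on each participating device and executes the federated rounds."""

    def __init__(self, device_id: str, config: ExperimentConfig) -> None:
        self.device_id = device_id
        self.config = config

    def run_round(self, round_idx: int) -> None:
        # In the real platform this step would train locally and exchange model updates.
        print(f"[{self.device_id}] round {round_idx}: train locally, share updates")


class Controller:
    """Orchestrates the experiment by driving every Core through each round."""

    def __init__(self, config: ExperimentConfig) -> None:
        self.config = config
        self.cores = [Core(device, config) for device in config.devices]

    def start(self) -> None:
        for round_idx in range(self.config.rounds):
            for core in self.cores:
                core.run_round(round_idx)


if __name__ == "__main__":
    Controller(ExperimentConfig()).start()
```

The point of the split is that the Frontend and Controller stay lightweight, while all training and communication logic lives in the Core deployed on each device.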
🌟 Main Features
- Decentralized: Train models without a central server, leveraging decentralized federated learning (a minimal illustrative sketch follows this list).
- Privacy-preserving: Maintain data privacy by training on-device and only sharing model updates.
- Topology-agnostic: Support for various network topologies including star, ring, and mesh.
- Model-agnostic: Compatible with a wide range of machine learning algorithms, from deep learning to traditional methods.
- Network communication: Secure and efficient device communication with features like compression, network failure tolerance, and condition simulation.
- Trustworthiness: Ensure the integrity of the learning process by verifying the reliability of the federation.
- Blockchain integration: Support for blockchain technologies to enhance security and transparency.
- Security: Implement security mechanisms to protect the learning process from adversarial attacks.
- Real-time monitoring: Provides live performance metrics and visualizations during the learning process.
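To make the decentralized, privacy-preserving, and topology-agnostic features concrete, here is a minimal, self-contained sketch of a decentralized FedAvg-style round in plain PyTorch. It is illustrative only: the helper names (`ring_topology`, `local_train`, `aggregate`) are hypothetical and not part of NEBULA's API, but the flow matches the idea of training on-device and sharing only model parameters with topology neighbours.

```python
# Illustrative sketch only: hypothetical helper names, not NEBULA's API.
import copy

import torch
import torch.nn as nn


def ring_topology(n: int) -> dict[int, list[int]]:
    """Each node exchanges model updates with its two ring neighbours."""
    return {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}


def local_train(model: nn.Module, x: torch.Tensor, y: torch.Tensor,
                epochs: int = 1, lr: float = 0.01) -> None:
    """On-device training: raw data never leaves the node."""
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        optimizer.zero_grad()
        loss_fn(model(x), y).backward()
        optimizer.step()


def aggregate(own: nn.Module, received: list[nn.Module]) -> None:
    """FedAvg-style step: average parameters with models received from neighbours."""
    models = [own] + received
    with torch.no_grad():
        for name, param in own.named_parameters():
            stacked = torch.stack([dict(m.named_parameters())[name] for m in models])
            param.copy_(stacked.mean(dim=0))


# Toy federation: four nodes with private data, connected in a ring (no central server).
n_nodes = 4
topology = ring_topology(n_nodes)
nodes = [nn.Linear(3, 1) for _ in range(n_nodes)]
datasets = [(torch.randn(16, 3), torch.randn(16, 1)) for _ in range(n_nodes)]

for federated_round in range(3):
    for model, (x, y) in zip(nodes, datasets):
        local_train(model, x, y)                       # 1) train locally
    snapshots = [copy.deepcopy(m) for m in nodes]      # 2) models exchanged this round
    for i in range(n_nodes):
        aggregate(nodes[i], [snapshots[j] for j in topology[i]])  # 3) aggregate with neighbours
```

In NEBULA itself, training, aggregation, and communication are handled by the Core component on each device, with the topology and other experiment settings configured from the Frontend.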
🌍 Scenario Applications
- 🏥 Healthcare: Train models on medical devices such as wearables, smartphones, and sensors.
- 🏭 Industry 4.0: Implement on industrial devices like robots, drones, and constrained devices.
- 📱 Mobile services: Optimize for mobile devices including smartphones, tablets, and laptops.
- 🛡️ Military: Apply to military equipment such as drones, robots, and sensors.
- 🚗 Vehicular scenarios: Utilize in vehicles including cars, trucks, and drones.
[^1]: Fedstellar was our first version of the platform. We have redesigned the previous functionalities and added new capabilities based on our research. The platform is now called NEBULA and is available as an open-source project.
🎯 Get Started
To start using NEBULA, follow our detailed Installation Guide and User Manual. For any queries or contributions, check out our Contribution Guide.
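Assuming you install the PyPI distribution listed below (for example with `pip install nebula-dfl`; the name is inferred from the nebula_dfl-0.0.1 files in the file details section), a quick way to confirm the installation is to query the installed version:

```python
# Assumes the distribution name "nebula-dfl", inferred from the
# nebula_dfl-0.0.1 files listed under "File details" below.
from importlib.metadata import version

print(version("nebula-dfl"))  # expected to print "0.0.1" for this release
```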
🤝 Contributing
We welcome contributions from the community to enhance NEBULA. If you are interested in contributing, please follow these steps:
- Fork the repository.
- Create a new branch with your feature or bug fix (`git checkout -b feature/your-feature`).
- Commit your changes (`git commit -am 'Add new feature'`).
- Push to the branch (`git push origin feature/your-feature`).
- Create a new Pull Request.
📚 Citation
If you use NEBULA (or Fedstellar) in a scientific publication, we would appreciate citations to the following works:
@article{MartinezBeltran:DFL:2023,
title = {{Decentralized Federated Learning: Fundamentals, State of the Art, Frameworks, Trends, and Challenges}},
author = {Mart{\'i}nez Beltr{\'a}n, Enrique Tom{\'a}s and Quiles P{\'e}rez, Mario and S{\'a}nchez S{\'a}nchez, Pedro Miguel and L{\'o}pez Bernal, Sergio and Bovet, G{\'e}r{\^o}me and Gil P{\'e}rez, Manuel and Mart{\'i}nez P{\'e}rez, Gregorio and Huertas Celdr{\'a}n, Alberto},
year = 2023,
volume = {25},
number = {4},
pages = {2983-3013},
journal = {IEEE Communications Surveys \& Tutorials},
doi = {10.1109/COMST.2023.3315746},
preprint = {https://arxiv.org/abs/2211.08413}
}
@article{MartinezBeltran:fedstellar:2024,
title = {{Fedstellar: A Platform for Decentralized Federated Learning}},
author = {Mart{\'i}nez Beltr{\'a}n, Enrique Tom{\'a}s and Perales G{\'o}mez, {\'A}ngel Luis and Feng, Chao and S{\'a}nchez S{\'a}nchez, Pedro Miguel and L{\'o}pez Bernal, Sergio and Bovet, G{\'e}r{\^o}me and Gil P{\'e}rez, Manuel and Mart{\'i}nez P{\'e}rez, Gregorio and Huertas Celdr{\'a}n, Alberto},
year = 2024,
volume = {242},
issn = {0957-4174},
pages = {122861},
journal = {Expert Systems with Applications},
doi = {10.1016/j.eswa.2023.122861},
preprint = {https://arxiv.org/abs/2306.09750}
}
@inproceedings{MartinezBeltran:fedstellar_demo:2023,
title = {{Fedstellar: A Platform for Training Models in a Privacy-preserving and Decentralized Fashion}},
author = {Mart{\'i}nez Beltr{\'a}n, Enrique Tom{\'a}s and S{\'a}nchez S{\'a}nchez, Pedro Miguel and L{\'o}pez Bernal, Sergio and Bovet, G{\'e}r{\^o}me and Gil P{\'e}rez, Manuel and Mart{\'i}nez P{\'e}rez, Gregorio and Huertas Celdr{\'a}n, Alberto},
year = 2023,
month = aug,
booktitle = {Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence, {IJCAI-23}},
publisher = {International Joint Conferences on Artificial Intelligence Organization},
pages = {7154--7157},
doi = {10.24963/ijcai.2023/838},
note = {Demo Track},
editor = {Edith Elkind}
}
@article{MartinezBeltran:DFL_mitigating_threats:2023,
title = {{Mitigating Communications Threats in Decentralized Federated Learning through Moving Target Defense}},
author = {Mart{\'i}nez Beltr{\'a}n, Enrique Tom{\'a}s and S{\'a}nchez S{\'a}nchez, Pedro Miguel and L{\'o}pez Bernal, Sergio and Bovet, G{\'e}r{\^o}me and Gil P{\'e}rez, Manuel and Mart{\'i}nez P{\'e}rez, Gregorio and Huertas Celdr{\'a}n, Alberto},
year = 2024,
journal = {Wireless Networks},
doi = {10.1007/s11276-024-03667-8},
preprint = {https://arxiv.org/abs/2307.11730}
}
📝 License
Distributed under the GNU GPLv3 License. See LICENSE for more information.
🙏 Acknowledgements
We would like to thank the following projects, whose contributions have helped shape NEBULA:
- PyTorch Lightning for the training loop and model management
- Tensorboard and Aim for the visualization tools and monitoring capabilities
- Different datasets (nebula/core/datasets) and models (nebula/core/models) for testing and validation purposes
- FastAPI for the RESTful API
- Web3 for the blockchain integration
- Fedstellar platform and p2pfl library
- Adversarial Robustness Toolbox (ART) for the implementation of adversarial attacks
- D3.js for the network visualizations
Download files
File details
Details for the file nebula_dfl-0.0.1.tar.gz.
File metadata
- Download URL: nebula_dfl-0.0.1.tar.gz
- Upload date:
- Size: 8.3 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.4 CPython/3.11.7 Darwin/24.1.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | 18a157159fb5f9b5c97a0409009f1976e8dd1a81181f04cccbfea24e6dae0eb9
MD5 | 2e1945a1f5c39fdaf3352f0fbd3fd86f
BLAKE2b-256 | a709d440f1d7057f457518f3abb5fa5d1ba3ccaf45ac9bae928c5336e9cadd93
File details
Details for the file nebula_dfl-0.0.1-py3-none-any.whl.
File metadata
- Download URL: nebula_dfl-0.0.1-py3-none-any.whl
- Upload date:
- Size: 8.4 MB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.4 CPython/3.11.7 Darwin/24.1.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | 981c6ad0af6b8888cc25f67c36c97e17a0d10d77dc7af045f4350b90a0294ff1
MD5 | 5efcb15d9d3af0a1d297d7452b27367e
BLAKE2b-256 | c25c72e21af22469641bacd9d2ef9bf79d7d45b7b5af0c928b2a0f9519170613