FL-Launching (Fling) is a research platform for Federated Learning that uses PyTorch as its backend.
Its goal is to simulate the distributed training process of Federated Learning on a single machine or across multiple machines, providing a fair platform for testing the performance of various federated learning algorithms on different datasets. It is written mainly in Python and uses PyTorch as the deep learning backend, supporting a variety of federated learning algorithms and commonly used federated learning datasets.
It mainly supports:
- Generic Federated Learning methods, such as FedAvg.
- Personalized Federated Learning methods, such as FedPer.
- Attacking methods, such as DLG.
First, it is recommended to manually install a suitable version of PyTorch (1.1.0 or higher; 2.0.0 or later is preferred for its better computational efficiency). Installation instructions can be found on the official PyTorch website.
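Before installing Fling, you can quickly confirm which PyTorch version is active in your environment with a plain PyTorch check (nothing Fling-specific is assumed here):

```python
import torch

# Check the installed PyTorch version and CUDA availability before installing Fling.
# Fling recommends PyTorch >= 1.1.0, and >= 2.0.0 is preferred for efficiency.
print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
```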
After that, you can install the latest version of Fling from source using Git:
```bash
git clone https://github.com/FLAIR-Community/Fling
cd Fling
pip install -e .
```
Finally, you can run

```bash
fling -v
```

to check whether Fling is successfully installed.
After successfully installing Fling, users can start their first Fling experiment with the following command. An example of generic federated learning:
```bash
python flzoo/mnist/mnist_fedavg_cnn_toy_config.py
```
Or use our CLI utility:
```bash
fling run -c flzoo/mnist/mnist_fedper_cnn_toy_config.py -p personalized_model_pipeline
```
This config is a simplified version that conducts FedAvg on the MNIST dataset and iterates for 4 communication rounds.
For other algorithms and datasets, users can refer to flzoo/ or customize their own configuration files.
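To give a rough idea of what such a configuration file looks like, below is an illustrative sketch in the style of the flzoo/ examples: a Python file that builds a nested dict of experiment arguments and hands it to a pipeline function. The specific key names (data, learn, model, client, server, group, other) and the generic_model_pipeline entry point are assumptions modeled on typical Fling example configs, not an authoritative reference; start from an actual file under flzoo/ when writing your own.

```python
# Illustrative sketch of a Fling experiment config; the key names and the
# pipeline import below are assumptions modeled on the example configs in flzoo/.
from easydict import EasyDict

exp_args = dict(
    data=dict(dataset='mnist', data_path='./data/mnist'),       # dataset selection
    learn=dict(device='cuda:0', global_eps=4, local_eps=2,      # 4 communication rounds
               batch_size=32, optimizer=dict(name='sgd', lr=0.02)),
    model=dict(name='cnn', input_channel=1, class_number=10),   # model architecture
    client=dict(name='base_client', client_num=5),              # number of simulated clients
    server=dict(name='base_server'),
    group=dict(name='base_group', aggregation_method='avg'),    # FedAvg-style aggregation
    other=dict(logging_path='./logging/mnist_fedavg_cnn_toy'),
)
exp_args = EasyDict(exp_args)

if __name__ == '__main__':
    from fling.pipeline import generic_model_pipeline  # assumed entry point
    generic_model_pipeline(exp_args, seed=0)
```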
For visualization utilities, please refer to the README for visualization.
For attacking methods, please refer to our examples in the demo for attack.
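To give a feel for what gradient-leakage attacks such as DLG do, here is a minimal, self-contained PyTorch sketch of the core idea, independent of Fling's own attack code: optimize a dummy input and a dummy (soft) label so that the gradients they produce on the model match the gradients intercepted from a client. The function name and arguments below are purely illustrative.

```python
import torch
import torch.nn.functional as F

def dlg_reconstruct(model, target_grads, input_shape, num_classes, steps=100):
    """Minimal DLG-style sketch (not Fling's implementation): recover training data
    from shared gradients by matching dummy-data gradients to the observed ones."""
    dummy_x = torch.randn(1, *input_shape, requires_grad=True)   # dummy input to optimize
    dummy_y = torch.randn(1, num_classes, requires_grad=True)    # dummy soft label to optimize
    optimizer = torch.optim.LBFGS([dummy_x, dummy_y])

    for _ in range(steps):
        def closure():
            optimizer.zero_grad()
            pred = model(dummy_x)
            # Cross-entropy between the prediction and the (softmaxed) dummy label.
            loss = torch.sum(-F.softmax(dummy_y, dim=-1) * F.log_softmax(pred, dim=-1))
            grads = torch.autograd.grad(loss, model.parameters(), create_graph=True)
            # L2 distance between dummy gradients and the intercepted target gradients.
            grad_diff = sum(((g - tg) ** 2).sum() for g, tg in zip(grads, target_grads))
            grad_diff.backward()
            return grad_diff
        optimizer.step(closure)

    return dummy_x.detach(), dummy_y.detach()
```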
Tutorials:
- Overall Framework of Fling
- Meaning of Each Configuration Key
- How to Add a New FL Algorithm
- How to Add a New Dataset
- CLI Usage in Fling
- Support for a variety of algorithms and datasets.
- Support for multiprocessing training on each client for better efficiency.
- Support for simulating the Federated Learning process on a single GPU (a multi-GPU version will be released soon).
- Strong visualization utilities. See the README for detailed information. There are also demos for reference.
Supported algorithms:
- FedAvg: Communication-Efficient Learning of Deep Networks from Decentralized Data
- FedProx: Federated Optimization in Heterogeneous Networks
- FedMOON: Model-Contrastive Federated Learning
- SCAFFOLD: SCAFFOLD: Stochastic Controlled Averaging for Federated Learning
- FedPart: Why Go Full? Elevating Federated Learning Through Partial Network Updates
- FedPer: Federated Learning with Personalization Layers
- FedBN: FedBN: Federated Learning on Non-IID Features via Local Batch Normalization
- FedRoD: On Bridging Generic and Personalized Federated Learning for Image Classification
- pFedSD: Personalized Edge Intelligence via Federated Self-Knowledge Distillation
- DLG: Deep Leakage from Gradients
- iDLG: iDLG: Improved Deep Leakage from Gradients
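To make the first entry above (FedAvg) concrete: at each communication round the server averages the clients' locally updated parameters, weighted by local dataset size. Below is a minimal sketch of that aggregation step, independent of Fling's own group/server implementation; the function and variable names are illustrative.

```python
from typing import Dict, List
import torch

def fedavg_aggregate(client_states: List[Dict[str, torch.Tensor]],
                     client_sizes: List[int]) -> Dict[str, torch.Tensor]:
    """Weighted average of client state_dicts; weights are proportional to local data size."""
    total = float(sum(client_sizes))
    global_state = {}
    for key in client_states[0]:
        global_state[key] = sum(
            state[key].float() * (size / total)
            for state, size in zip(client_states, client_sizes)
        )
    return global_state

# Hypothetical usage:
# new_global = fedavg_aggregate([c.model.state_dict() for c in clients],
#                               [len(c.train_set) for c in clients])
```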
- For any bugs, questions, or feature requests, feel free to raise them in an issue.
- For any contributions that can improve Fling (more algorithms or better system design), we warmly welcome you to propose them in a pull request.
Special thanks to @kxzxvbk, @chuchugloria, @shonnyx, @XinghaoWu,
If you use Fling in your research, please cite it as follows:

```latex
@misc{Fling,
    title={Fling: Framework for Federated Learning},
    author={Fling Contributors},
    publisher={GitHub},
    howpublished={\url{https://github.com/FLAIR-Community/Fling}},
    year={2023},
}
```
Fling is released under the Apache 2.0 license.