
FedCCFA: Classifier Clustering and Feature Alignment for Federated Learning under Distributed Concept Drift

This is the implementation of our paper: Classifier Clustering and Feature Alignment for Federated Learning under Distributed Concept Drift (NeurIPS 2024). In this paper, we propose a federated learning framework with classifier clustering and feature alignment (FedCCFA).

Figure: an overview of the FedCCFA framework.

📦 Algorithms

This repository also provides implementations of methods from the following categories:

Traditional Federated Learning

Personalized Federated Learning

Clustered Federated Learning

Concept Drift in Federated Learning

📄 Experiments

Hyperparameters

All hyperparameters can be set in ./configs/*.yaml. The common hyperparameters in our experiments are listed as follows:

| Hyperparameter | Values | Description |
| --- | --- | --- |
| client_num | 20 or 100 | 20 clients with full participation or 100 clients with 20% participation |
| sample_ratio | 1 or 0.2 | full participation or 20% participation |
| dataset | Fashion-MNIST, CIFAR-10 or CINIC-10 | the three datasets used in our experiments |
| alpha | 0.1 or 0.5 | concentration parameter of the Dirichlet distribution |
| drift_pattern | false / sudden / incremental / recurrent | concept drift pattern |

The numerical values above (for client_num, sample_ratio and alpha) are the ones used in our experiments; you can choose any values you like.

For descriptions of the other hyperparameters, please refer to our paper.
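For example, a config can be inspected or overridden programmatically as in the following sketch. It assumes the YAML keys match the hyperparameter names in the table above and that PyYAML is installed; the repository's scripts may read the config differently.

import yaml  # PyYAML

with open("configs/FedCCFA.yaml") as f:
    args = yaml.safe_load(f)

# Illustrative overrides; the key names are assumed to match the table above.
args["client_num"] = 20            # 20 clients with full participation
args["sample_ratio"] = 1
args["alpha"] = 0.5                # Dirichlet concentration parameter
args["drift_pattern"] = "sudden"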

Datasets

Three datasets (Fashion-MNIST, CIFAR-10 and CINIC-10) are used in our experiments. You can download these datasets and put them in ./data.

For other datasets, download them and implement the data partitioning in ./utils/gen_dataset.py.
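As an illustration of how a non-IID partition controlled by alpha can be generated, here is a minimal class-wise Dirichlet sketch. It is not the code in ./utils/gen_dataset.py; the function name dirichlet_partition and its signature are made up for this example.

import numpy as np

def dirichlet_partition(labels, client_num, alpha, seed=0):
    """Split sample indices among clients using a class-wise Dirichlet distribution.

    labels: 1-D array of integer class labels for the whole training set.
    alpha:  concentration parameter; smaller values yield more skewed splits.
    """
    rng = np.random.default_rng(seed)
    client_indices = [[] for _ in range(client_num)]
    for c in np.unique(labels):
        idx = np.flatnonzero(labels == c)
        rng.shuffle(idx)
        # Draw the proportion of class c assigned to each client.
        proportions = rng.dirichlet(alpha * np.ones(client_num))
        cut_points = (np.cumsum(proportions)[:-1] * len(idx)).astype(int)
        for client_id, part in enumerate(np.split(idx, cut_points)):
            client_indices[client_id].extend(part.tolist())
    return client_indices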

Example: FedCCFA under the sudden drift setting

Edit configs/FedCCFA.yaml (e.g., set "drift_pattern" to "sudden").

cd methods
export PYTHONPATH=../
python3 FedCCFA.py

💡 Other Implementations

You can easily develop other FL algorithms on top of this repository.

First, create a new Python script, entities/XXX.py:

from entities.base import Client, Server


class XXXClient(Client):
    def __init__(self, client_id, args, train_set, test_set, global_test_id):
        super().__init__(client_id, args, train_set, test_set, global_test_id)

    def train(self):
        """Override the local training procedure if needed."""
        pass


class XXXServer(Server):
    def __init__(self, args):
        super().__init__(args)

    def aggregate_by_params(self, clients):
        """Override the aggregation procedure if needed."""
        pass

Then, create a new Python script, methods/XXX.py. Please refer to methods/FedAvg.py when implementing the new FL process; a rough outline is sketched below.
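The outline below only shows the general shape of such a script, not the actual FedAvg.py. The config keys ("rounds", "client_num", "sample_ratio") and the set_params/get_params calls are assumptions made for illustration, and the dataset loading is omitted; consult methods/FedAvg.py and entities/base.py for the real interfaces.

# methods/XXX.py -- illustrative outline only, not the actual FedAvg.py.
import random

import yaml

from entities.XXX import XXXClient, XXXServer


def main():
    with open("../configs/XXX.yaml") as f:
        args = yaml.safe_load(f)

    server = XXXServer(args)
    # In practice, build per-client train/test sets with ./utils/gen_dataset.py;
    # train_sets, test_sets and global_test_id are placeholders here.
    clients = [
        XXXClient(cid, args, train_sets[cid], test_sets[cid], global_test_id)
        for cid in range(args["client_num"])
    ]

    for rnd in range(args["rounds"]):  # "rounds" is an assumed config key
        # Sample participating clients according to sample_ratio.
        selected = random.sample(clients, int(len(clients) * args["sample_ratio"]))
        for client in selected:
            client.set_params(server.get_params())  # assumed broadcast API
            client.train()                          # local training
        server.aggregate_by_params(selected)        # aggregate local updates


if __name__ == "__main__":
    main()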

🧑🏻‍💻 Bugs or Questions?

If you have any questions related to the code or the paper, feel free to email Junbao ([email protected]). If you encounter any problems when using the code, or want to report a bug, you can open an issue.

📝 Citation

If you find FedCCFA useful for your research, please consider citing our paper:

@inproceedings{chen2024classifier,
    title={Classifier Clustering and Feature Alignment for Federated Learning under Distributed Concept Drift},
    author={Chen, Junbao and Xue, Jingfeng and Wang, Yong and Liu, Zhenyan and Huang, Lu},
    booktitle={The Thirty-eighth Annual Conference on Neural Information Processing Systems},
    year={2024}
}
