A Federated Learning research library - FedML: https://fedml.ai
A curated list of federated learning publications, reorganized mostly from arXiv.
Last update: October 15th, 2020.
If your publication is not included here, please email [email protected].
We are thrilled to share that Advances and Open Problems in Federated Learning has been accepted by FnTML (Foundations and Trends in Machine Learning; editor-in-chief: Michael Jordan).
Title | Team/Authors | Venue and Year | Targeting Problem | Method |
---|---|---|---|---|
Federated Learning with Only Positive Labels | Google Research | ICML 2020 | label deficiency in multi-class classification | regularization |
SCAFFOLD: Stochastic Controlled Averaging for Federated Learning | EPFL, Google Research | ICML 2020 | heterogeneous data (non-I.I.D) | nonconvex/convex optimization with variance reduction |
FedBoost: A Communication-Efficient Algorithm for Federated Learning | Google Research, NYU | ICML 2020 | communication cost | ensemble algorithm |
FetchSGD: Communication-Efficient Federated Learning with Sketching | UC Berkeley, JHU, Amazon | ICML 2020 | communication cost | compress model updates with Count Sketch |
From Local SGD to Local Fixed-Point Methods for Federated Learning | KAUST | ICML 2020 | communication cost | Optimization |
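As a concrete illustration of the "compress model updates with Count Sketch" entry above (FetchSGD), here is a minimal Count Sketch in numpy. This is not the authors' implementation: the row/bucket counts and hashing scheme are arbitrary choices, and FetchSGD adds further machinery (e.g., momentum and error accumulation in sketch space) on top of this data structure.

```python
import numpy as np

class CountSketch:
    """Minimal Count Sketch for compressing a dense update vector.

    Illustrative only: rows/cols trade accuracy for size, and the random
    hash functions are simple numpy index/sign tables.
    """

    def __init__(self, rows, cols, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.bucket = rng.integers(0, cols, size=(rows, dim))   # coordinate -> bucket
        self.sign = rng.choice([-1.0, 1.0], size=(rows, dim))   # coordinate -> sign
        self.table = np.zeros((rows, cols))

    def accumulate(self, vec):
        """Add a model update into the sketch (sketching is linear, so a
        server can simply sum the sketches it receives from clients)."""
        for r in range(self.table.shape[0]):
            np.add.at(self.table[r], self.bucket[r], self.sign[r] * vec)

    def estimate(self):
        """Unbiased per-coordinate estimate: median over rows."""
        rows = self.table.shape[0]
        est = [self.sign[r] * self.table[r, self.bucket[r]] for r in range(rows)]
        return np.median(np.stack(est), axis=0)
```

In a FetchSGD-like round, each client would sketch its local update, the server would sum the sketches, recover the largest coordinates from the estimate, and apply only those.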
Title | Team/Authors | Venue and Year | Targeting Problem | Method |
---|---|---|---|---|
Lower Bounds and Optimal Algorithms for Personalized Federated Learning | KAUST | NeurIPS 2020 | non-I.I.D, personalization | |
Personalized Federated Learning with Moreau Envelopes | The University of Sydney | NeurIPS 2020 | non-I.I.D, personalization | |
Personalized Federated Learning with Theoretical Guarantees: A Model-Agnostic Meta-Learning Approach | MIT | NeurIPS 2020 | non-I.I.D, personalization | |
Differentially-Private Federated Contextual Bandits | MIT | NeurIPS 2020 | Contextual Bandits | |
Federated Principal Component Analysis | Cambridge | NeurIPS 2020 | PCA | |
FedSplit: an algorithmic framework for fast federated optimization | UCB | NeurIPS 2020 | Acceleration | |
Federated Bayesian Optimization via Thompson Sampling | MIT | NeurIPS 2020 | ||
Robust Federated Learning: The Case of Affine Distribution Shifts | MIT | NeurIPS 2020 | Privacy, Robustness | |
An Efficient Framework for Clustered Federated Learning | UCB | NeurIPS 2020 | heterogeneous data (non-I.I.D) | |
Distributionally Robust Federated Averaging | PSU | NeurIPS 2020 | Privacy, Robustness | |
Group Knowledge Transfer: Federated Learning of Large CNNs at the Edge | USC | NeurIPS 2020 | Efficient Training of Large DNN at Edge | |
A Scalable Approach for Privacy-Preserving Collaborative Machine Learning | USC | NeurIPS 2020 | Scalability | |
Tackling the Objective Inconsistency Problem in Heterogeneous Federated Optimization | CMU | NeurIPS 2020 | local update step heterogeneity | |
Attack of the Tails: Yes, You Really Can Backdoor Federated Learning | Wisconsin | NeurIPS 2020 | Privacy, Robustness |
Federated Accelerated Stochastic Gradient Descent | Stanford | NeurIPS 2020 | Acceleration | |
Inverting Gradients - How easy is it to break privacy in federated learning? | University of Siegen | NeurIPS 2020 | Privacy, Robustness | |
Ensemble Distillation for Robust Model Fusion in Federated Learning | EPFL | NeurIPS 2020 | Privacy, Robustness | |
Optimal Topology Design for Cross-Silo Federated Learning | Inria | NeurIPS 2020 | Topology Optimization | |
Distributed Training with Heterogeneous Data: Bridging Median- and Mean-Based Algorithms | University of Minnesota | NeurIPS 2020 | ||
Distributed Distillation for On-Device Learning | Stanford | NeurIPS 2020 | ||
Byzantine Resilient Distributed Multi-Task Learning | Vanderbilt University | NeurIPS 2020 | ||
Distributed Newton Can Communicate Less and Resist Byzantine Workers | UCB | NeurIPS 2020 | ||
Minibatch vs Local SGD for Heterogeneous Distributed Learning | TTIC | NeurIPS 2020 | ||
Election Coding for Distributed Learning: Protecting SignSGD against Byzantine Attacks | | NeurIPS 2020 | | |
(according to https://neurips.cc/Conferences/2020/AcceptedPapersInitial)
Note: most of the accepted papers are still preparing their camera-ready versions, so the details of their proposed methods are not yet confirmed.
- Distributed Optimization
- Non-IID and Model Personalization
- Semi-Supervised Learning
- Vertical Federated Learning
- Decentralized FL
- Hierarchical FL
- Neural Architecture Search
- Transfer Learning
- Continual Learning
- Domain Adaptation
- Reinforcement Learning
- Bayesian Learning
System Challenges: communication and computation resource constraints, software and hardware heterogeneity, and FL system design (141)
- Communication-Efficiency
- Straggler Problem
- Computation Efficiency
- Wireless Communication and Cloud Computing
- FL System Design
- Models
- Natural language Processing
- Computer Vision
- Health Care
- Transportation
- Recommendation System
- Speech
- Finance
- Smart City
- Robotics
- Networking
- Blockchain
- Other
- Benchmark and Dataset (7)
- Survey (20)
Useful Federated Optimizer Baselines:
FedAvg: Communication-Efficient Learning of Deep Networks from Decentralized Data. 2016-02. AISTATS 2017.
FedOpt: Adaptive Federated Optimization. ICLR 2021 (Under Review). 2020-02-29
FedNova: Tackling the Objective Inconsistency Problem in Heterogeneous Federated Optimization. NeurIPS 2020
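For reference, the FedAvg server step listed above reduces to a sample-size-weighted average of client parameters. A minimal sketch, assuming each client reports its parameters as a dict of numpy arrays together with its local sample count:

```python
def fedavg_aggregate(client_weights, client_num_samples):
    """FedAvg server update: average client parameters weighted by the
    number of local training samples each client holds."""
    total = float(sum(client_num_samples))
    return {
        name: sum((n / total) * w[name]
                  for w, n in zip(client_weights, client_num_samples))
        for name in client_weights[0]
    }
```

Roughly speaking, FedOpt generalizes this by treating the averaged update as a pseudo-gradient fed to a server-side optimizer, and FedNova normalizes each client's contribution by its number of local steps.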
Federated Optimization: Distributed Optimization Beyond the Datacenter. NIPS 2016 workshop.
Federated Optimization: Distributed Machine Learning for On-Device Intelligence
Stochastic, Distributed and Federated Optimization for Machine Learning. PhD thesis by Jakub Konečný.
Collaborative Deep Learning in Fixed Topology Networks
LAG: Lazily Aggregated Gradient for Communication-Efficient Distributed Learning
Proxy Experience Replay: Federated Distillation for Distributed Reinforcement Learning
Exact Support Recovery in Federated Regression with One-shot Communication
DEED: A General Quantization Scheme for Communication Efficiency in Bits Researcher: Ruoyu Sun, UIUC
Robust Federated Learning: The Case of Affine Distribution Shifts
Personalized Federated Learning with Moreau Envelopes
Towards Flexible Device Participation in Federated Learning for Non-IID Data Keywords: inactive or return incomplete updates in non-IID dataset
A Primal-Dual SGD Algorithm for Distributed Nonconvex Optimization
FedPD: A Federated Learning Framework with Optimal Rates and Adaptivity to Non-IID Data Researcher: Wotao Yin, UCLA
FedSplit: An algorithmic framework for fast federated optimization
Distributed Stochastic Non-Convex Optimization: Momentum-Based Variance Reduction
On the Outsized Importance of Learning Rates in Local Update Methods Highlight: local model learning rate optimization + automation Researcher: Jakub
Federated Learning with Compression: Unified Analysis and Sharp Guarantees Highlight: non-IID, gradient compression + local SGD Researcher: Mehrdad Mahdavi, Rong Jin's PhD student http://www.cse.psu.edu/~mzm616/
From Local SGD to Local Fixed-Point Methods for Federated Learning
Federated Residual Learning. 2020-03
Acceleration for Compressed Gradient Descent in Distributed and Federated Optimization. ICML 2020.
LASG: Lazily Aggregated Stochastic Gradients for Communication-Efficient Distributed Learning
Distributed Optimization over Block-Cyclic Data
Distributed Non-Convex Optimization with Sublinear Speedup under Intermittent Client Availability
Federated Learning with Matched Averaging
Federated Learning of a Mixture of Global and Local Models
Faster On-Device Training Using New Federated Momentum Algorithm
FedDANE: A Federated Newton-Type Method
Distributed Fixed Point Methods with Compressed Iterates
Primal-dual methods for large-scale and distributed convex optimization and data analytics
Representation of Federated Learning via Worst-Case Robust Optimization Theory
On the Convergence of Local Descent Methods in Federated Learning
SCAFFOLD: Stochastic Controlled Averaging for Federated Learning
Central Server Free Federated Learning over Single-sided Trust Social Networks
Accelerating Federated Learning via Momentum Gradient Descent
Gradient Descent with Compressed Iterates
First Analysis of Local GD on Heterogeneous Data
(*) On the Convergence of FedAvg on Non-IID Data. ICLR 2020.
Robust Federated Learning in a Heterogeneous Environment
Scalable and Differentially Private Distributed Aggregation in the Shuffled Model
Variational Federated Multi-Task Learning
Bayesian Nonparametric Federated Learning of Neural Networks. ICLR 2019.
Differentially Private Learning with Adaptive Clipping
Semi-Cyclic Stochastic Gradient Descent
Asynchronous Federated Optimization
Federated Optimization in Heterogeneous Networks
Partitioned Variational Inference: A unified framework encompassing federated and continual learning
Learning Rate Adaptation for Federated and Differentially Private Learning
Communication-Efficient Robust Federated Learning Over Heterogeneous Datasets
An Efficient Framework for Clustered Federated Learning
Adaptive Federated Learning in Resource Constrained Edge Computing Systems Citation: 146
Adaptive Federated Optimization
Local SGD converges fast and communicates little
Don’t Use Large Mini-Batches, Use Local SGD
Overlap Local-SGD: An Algorithmic Approach to Hide Communication Delays in Distributed SGD
Local SGD With a Communication Overhead Depending Only on the Number of Workers
Federated Accelerated Stochastic Gradient Descent
Tighter Theory for Local SGD on Identical and Heterogeneous Data
STL-SGD: Speeding Up Local SGD with Stagewise Communication Period
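Several of the heterogeneity-oriented entries above (notably "Federated Optimization in Heterogeneous Networks", commonly known as FedProx) modify the client objective with a proximal term that keeps local iterates close to the current global model. A minimal sketch of one such local step, assuming numpy parameter vectors and a user-supplied stochastic gradient `grad_fn` (both placeholders of this example):

```python
def proximal_local_step(w_local, w_global, grad_fn, lr=0.01, mu=0.1):
    """One local SGD step on f_k(w) + (mu/2) * ||w - w_global||^2.

    grad_fn, lr, and mu are illustrative placeholders, not values from the
    paper; mu = 0 recovers a plain FedAvg-style local step.
    """
    grad = grad_fn(w_local) + mu * (w_local - w_global)
    return w_local - lr * grad
```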
Understanding Unintended Memorization in Federated Learning
The Non-IID Data Quagmire of Decentralized Machine Learning. 2019-10
Federated Learning with Non-IID Data
FedCD: Improving Performance in non-IID Federated Learning. 2020
Lifelong Learning: FedFMC: Sequential Efficient Federated Learning on Non-iid Data. 2020
Robust Federated Learning: The Case of Affine Distribution Shifts. 2020
Personalized Federated Learning with Moreau Envelopes. 2020
Ensemble Distillation for Robust Model Fusion in Federated Learning. 2020 Researcher: Tao Lin, ZJU, EPFL https://tlin-tao-lin.github.io/index.html
Proxy Experience Replay: Federated Distillation for Distributed Reinforcement Learning. 2020
Towards Flexible Device Participation in Federated Learning for Non-IID Data. 2020 Keywords: inactive or return incomplete updates in non-IID dataset
XOR Mixup: Privacy-Preserving Data Augmentation for One-Shot Federated Learning. 2020
NeurIPS 2020 submission: An Efficient Framework for Clustered Federated Learning. 2020 Researcher: AVISHEK GHOSH, UCB, PhD
Continual Local Training for Better Initialization of Federated Models. 2020
FedPD: A Federated Learning Framework with Optimal Rates and Adaptivity to Non-IID Data. 2020 Researcher: Wotao Yin, UCLA
Global Multiclass Classification from Heterogeneous Local Models. 2020 Researcher: Stanford https://stanford.edu/~pilanci/
Multi-Center Federated Learning. 2020
Federated Learning with Only Positive Labels. 2020 Researcher: Felix Xinnan Yu, Google New York Keywords: positive labels Limited Labels
Federated Semi-Supervised Learning with Inter-Client Consistency. 2020
(*) Adaptive Personalized Federated Learning
Survey of Personalization Techniques for Federated Learning. 2020-03-19
Device Heterogeneity in Federated Learning: A Superquantile Approach. 2020-02
Personalized Federated Learning for Intelligent IoT Applications: A Cloud-Edge based Framework
Three Approaches for Personalization with Applications to Federated Learning
Personalized Federated Learning: A Meta-Learning Approach
Towards Federated Learning: Robustness Analytics to Data Heterogeneity Highlight: non-IID + adversarial attacks
Salvaging Federated Learning by Local Adaptation Highlight: an experimental paper evaluating whether FL can help improve local accuracy
FOCUS: Dealing with Label Quality Disparity in Federated Learning. 2020-01
Overcoming Noisy and Irrelevant Data in Federated Learning. ICPR 2020.
Federated Learning with Personalization Layers
Federated Adversarial Domain Adaptation
Federated Evaluation of On-device Personalization
Federated Learning with Unbiased Gradient Aggregation and Controllable Meta Updating
Overcoming Forgetting in Federated Learning on Non-IID Data
Robust and Communication-Efficient Federated Learning From Non-i.i.d. Data
Improving Federated Learning Personalization via Model Agnostic Meta Learning
Measure Contribution of Participants in Federated Learning
(*) Measuring the Effects of Non-Identical Data Distribution for Federated Visual Classification
Multi-hop Federated Private Data Augmentation with Sample Compression
Distributed Training with Heterogeneous Data: Bridging Median- and Mean-Based Algorithms
Hybrid-FL for Wireless Networks: Cooperative Learning Mechanism Using Non-IID Data
Robust and Communication-Efficient Federated Learning from Non-IID Data
Client Selection for Federated Learning with Heterogeneous Resources in Mobile Edge
Federated Meta-Learning with Fast Convergence and Efficient Communication
Robust Federated Learning Through Representation Matching and Adaptive Hyper-parameters
Client Adaptation improves Federated Learning with Simulated Non-IID Clients
Tackling the Objective Inconsistency Problem in Heterogeneous Federated Optimization
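Among the personalization entries above, "Federated Learning with Personalization Layers" keeps a designated subset of layers private to each client and federates only the shared base. A minimal sketch of that server-side split, assuming models are dicts of numpy arrays and `personal_keys` names the layers chosen to stay local (an assumption of this example):

```python
def aggregate_shared_layers(client_models, client_num_samples, personal_keys):
    """Weighted-average only the shared (base) parameters; parameters named
    in personal_keys are excluded and remain on each client's device."""
    total = float(sum(client_num_samples))
    shared = {}
    for name in client_models[0]:
        if name in personal_keys:
            continue  # personalization layer: never leaves the client
        shared[name] = sum((n / total) * m[name]
                           for m, n in zip(client_models, client_num_samples))
    return shared
```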
SecureBoost: A Lossless Federated Learning Framework
A Quasi-Newton Method Based Vertical Federated Learning Framework for Logistic Regression
Entity Resolution and Federated Learning get a Federated Resolution.
Multi-Participant Multi-Class Vertical Federated Learning
A Communication-Efficient Collaborative Learning Framework for Distributed Features
Asymmetrical Vertical Federated Learning Researcher: Tencent Cloud, Libin Wang
VAFL: a Method of Vertical Asynchronous Federated Learning, ICML workshop on FL, 2020
Central Server Free Federated Learning over Single-sided Trust Social Networks
Multi-consensus Decentralized Accelerated Gradient Descent
Decentralized Bayesian Learning over Graphs. 2019-05
BrainTorrent: A Peer-to-Peer Environment for Decentralized Federated Learning
Biscotti: A Ledger for Private and Secure Peer-to-Peer Machine Learning
Matcha: Speeding Up Decentralized SGD via Matching Decomposition Sampling
Client-Edge-Cloud Hierarchical Federated Learning
Hierarchical Federated Learning Across Heterogeneous Cellular Networks
Enhancing Privacy via Hierarchical Federated Learning
Federated Hierarchical Hybrid Networks for Clickbait Detection
Matcha: Speeding Up Decentralized SGD via Matching Decomposition Sampling (in above section as well)
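The hierarchical entries above (e.g., "Client-Edge-Cloud Hierarchical Federated Learning") aggregate in two stages: each edge server averages its own clients frequently, and the cloud averages the edge models less often. A minimal sketch, assuming `edge_groups` is a list of (client_models, client_sample_counts) pairs, an assumed data layout for illustration:

```python
def weighted_average(models, counts):
    """Sample-size-weighted average of models given as dicts of arrays."""
    total = float(sum(counts))
    return {name: sum((n / total) * m[name] for m, n in zip(models, counts))
            for name in models[0]}

def hierarchical_fedavg(edge_groups):
    """Two-level aggregation: edges average their clients, the cloud then
    averages the edge models weighted by total client sample counts."""
    edge_models = [weighted_average(models, counts) for models, counts in edge_groups]
    edge_counts = [sum(counts) for _, counts in edge_groups]
    return weighted_average(edge_models, edge_counts)
```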
[FedNAS: Federated Deep Learning via Neural Architecture Search. CVPR 2020. 2020-04-18](https://arxiv.org/pdf/2004.08546.pdf)
Real-time Federated Evolutionary Neural Architecture Search. 2020-03
Federated Neural Architecture Search. 2020-06-14
Differentially-private Federated Neural Architecture Search. 2020-06
Secure Federated Transfer Learning. IEEE Intelligent Systems 2018.
FedMD: Heterogenous Federated Learning via Model Distillation
Secure and Efficient Federated Transfer Learning
Wireless Federated Distillation for Distributed Edge Learning with Heterogeneous Data
Decentralized Differentially Private Segmentation with PATE. 2020-04
Highlights: applies the ICLR 2017 paper "Semi-supervised Knowledge Transfer for Deep Learning from Private Training Data"
Proxy Experience Replay: Federated Distillation for Distributed Reinforcement Learning. 2020
Cooperative Learning via Federated Distillation over Fading Channels
(*) Cronus: Robust and Heterogeneous Collaborative Learning with Black-Box Knowledge Transfer
Federated Reinforcement Distillation with Proxy Experience Memory
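Several distillation-based entries above (e.g., FedMD, federated distillation over fading channels) exchange model outputs on a shared public/reference dataset instead of model weights. A minimal sketch of that exchange, assuming every client can score a common public set; the unweighted averaging and temperature here are simplifications of this example, not details taken from any single paper:

```python
import numpy as np

def consensus_logits(client_logits):
    """Average each client's class scores on the shared public dataset;
    heterogeneous model architectures only need to agree on the output space."""
    return np.mean(np.stack(client_logits), axis=0)   # (num_samples, num_classes)

def soft_targets(consensus, temperature=1.0):
    """Softmax of the consensus scores: the targets each client distills
    its local model toward in the next round."""
    z = consensus / temperature
    z = z - z.max(axis=1, keepdims=True)              # numerical stability
    p = np.exp(z)
    return p / p.sum(axis=1, keepdims=True)
```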
Federated Continual Learning with Adaptive Parameter Communication. 2020-03
Federated Semi-Supervised Learning with Inter-Client Consistency. 2020
Semi-supervised knowledge transfer for deep learning from private training data. ICLR 2017
Scalable private learning with PATE. ICLR 2018.
Federated Adversarial Domain Adaptation. ICLR 2020.
Federated Deep Reinforcement Learning
Differentially Private Federated Variational Inference. NeurIPS 2019 FL Workshop. 2019-11-24.
An Overview of Federated Deep Learning Privacy Attacks and Defensive Strategies. 2020-04-01 Citation: 0
How To Backdoor Federated Learning. 2018-07-02. AISTATS 2020 Citation: 128
Can You Really Backdoor Federated Learning? NeurIPS 2019. 2019-11-18 Highlight: by Google Citation: 9
DBA: Distributed Backdoor Attacks against Federated Learning. ICLR 2020. Citation: 66
CRFL: Certifiably Robust Federated Learning against Backdoor Attacks. ICML 2021.
Deep Models Under the GAN: Information Leakage from Collaborative Deep Learning. ACM CCS 2017. 2017-02-14 Citation: 284
Byzantine-Robust Distributed Learning: Towards Optimal Statistical Rates Citation: 112
Deep Leakage from Gradients. NeurIPS 2019 Citation: 31
Beyond Inferring Class Representatives: User-Level Privacy Leakage From Federated Learning. INFOCOM 2019 Citation: 56 Highlight: server-side attack
Analyzing Federated Learning through an Adversarial Lens. ICML 2019. Citation: 60 Highlight: client attack
Mitigating Sybils in Federated Learning Poisoning. 2018-08-14. RAID 2020 Citation: 41 Highlight: defense
RSA: Byzantine-Robust Stochastic Aggregation Methods for Distributed Learning from Heterogeneous Datasets, AAAI 2019 Citation: 34
(*) A Framework for Evaluating Gradient Leakage Attacks in Federated Learning. 2020-04-22 Researcher: Wenqi Wei, Ling Liu, GaTech
(*) Local Model Poisoning Attacks to Byzantine-Robust Federated Learning. 2019-11-26
NeurIPS 2020 Submission: Backdoor Attacks on Federated Meta-Learning Researcher: Chien-Lun Chen, USC
Towards Realistic Byzantine-Robust Federated Learning. 2020-04-10
Data Poisoning Attacks on Federated Machine Learning. 2020-04-19
Exploiting Defenses against GAN-Based Feature Inference Attacks in Federated Learning. 2020-04-27
Byzantine-Resilient High-Dimensional SGD with Local Iterations on Heterogeneous Data. 2020-06-22 Researcher: Suhas Diggavi, UCLA (https://scholar.google.com/citations?hl=en&user=hjTzNuQAAAAJ&view_op=list_works&sortby=pubdate)
FDA3: Federated Defense Against Adversarial Attacks for Cloud-Based IIoT Applications. 2020-06-28
Privacy-preserving Weighted Federated Learning within Oracle-Aided MPC Framework. 2020-05-17 Citation: 0
BASGD: Buffered Asynchronous SGD for Byzantine Learning. 2020-03-02
Stochastic-Sign SGD for Federated Learning with Theoretical Guarantees. 2020-02-25 Citation: 1
Learning to Detect Malicious Clients for Robust Federated Learning. 2020-02-01
Robust Aggregation for Federated Learning. 2019-12-31 Citation: 9
Towards Deep Federated Defenses Against Malware in Cloud Ecosystems. 2019-12-27
Attack-Resistant Federated Learning with Residual-based Reweighting. 2019-12-23
Cronus: Robust and Heterogeneous Collaborative Learning with Black-Box Knowledge Transfer. 2019-12-24 Citation: 1
Free-riders in Federated Learning: Attacks and Defenses. 2019-11-28
Robust Federated Learning with Noisy Communication. 2019-11-01 Citation: 4
Abnormal Client Behavior Detection in Federated Learning. 2019-10-22 Citation: 3
Eavesdrop the Composition Proportion of Training Labels in Federated Learning. 2019-10-14 Citation: 0
Byzantine-Robust Federated Machine Learning through Adaptive Model Averaging. 2019-09-11
Secure Distributed On-Device Learning Networks With Byzantine Adversaries. 2019-06-03 Citation: 3
Robust Federated Training via Collaborative Machine Teaching using Trusted Instances. 2019-05-03 Citation: 2
Dancing in the Dark: Private Multi-Party Machine Learning in an Untrusted Setting. 2018-11-23 Citation: 4
Inverting Gradients - How easy is it to break privacy in federated learning? 2020-03-31 Citation: 3
Quantification of the Leakage in Federated Learning. 2019-10-12 Citation: 1
Practical Secure Aggregation for Federated Learning on User-Held Data. NIPS 2016 workshop Highlight: cryptology
Differentially Private Federated Learning: A Client Level Perspective. NIPS 2017 Workshop
Exploiting Unintended Feature Leakage in Collaborative Learning. S&P 2019. 2018-05-10 Citation: 105
(x) Gradient-Leaks: Understanding and Controlling Deanonymization in Federated Learning. 2018-05
A Hybrid Approach to Privacy-Preserving Federated Learning. AISec 2019. 2018-12-07 Citation: 35
A generic framework for privacy preserving deep learning. PPML 2018. 2018-11-09 Citation: 36
Federated Generative Privacy. IJCAI 2019 FL workshop. 2019-10-08 Citation: 4
Enhancing the Privacy of Federated Learning with Sketching. 2019-11-05 Citation: 0
Federated Learning with Bayesian Differential Privacy. 2019-11-22 Citation: 5
HybridAlpha: An Efficient Approach for Privacy-Preserving Federated Learning. AISec 2019. 2019-12-12 https://aisec.cc/
Private Federated Learning with Domain Adaptation. NeurIPS 2019 FL workshop. 2019-12-13
iDLG: Improved Deep Leakage from Gradients. 2020-01-08 Citation: 3
Anonymizing Data for Privacy-Preserving Federated Learning. 2020-02-21
Practical and Bilateral Privacy-preserving Federated Learning. 2020-02-23 Citation: 0
Decentralized Policy-Based Private Analytics. 2020-03-14 Citation: 0
Learn to Forget: User-Level Memorization Elimination in Federated Learning. 2020-03-24
LDP-Fed: Federated Learning with Local Differential Privacy. EdgeSys 2020. 2020-04-01 Researcher: Ling Liu, GaTech Citation: 1
Local Differential Privacy based Federated Learning for Internet of Things. 2020-04-09 Citation: 0
Decentralized Differentially Private Segmentation with PATE. MICCAI 2020 Under Review. 2020-04
Highlights: applies the ICLR 2017 paper "Semi-supervised Knowledge Transfer for Deep Learning from Private Training Data"
Enhancing Privacy via Hierarchical Federated Learning. 2020-04-23
Privacy Preserving Distributed Machine Learning with Federated Learning. 2020-04-25 Citation: 0
Exploring Private Federated Learning with Laplacian Smoothing. 2020-05-01 Citation: 0
Efficient Privacy Preserving Edge Computing Framework for Image Classification. 2020-05-10 Citation: 0
A Distributed Trust Framework for Privacy-Preserving Machine Learning. 2020-06-03 Citation: 0
Secure Byzantine-Robust Machine Learning. 2020-06-08
ARIANN: Low-Interaction Privacy-Preserving Deep Learning via Function Secret Sharing. 2020-06-08
GS-WGAN: A Gradient-Sanitized Approach for Learning Differentially Private Generators. 2020-06-15 Citation: 0
Federated Learning with Differential Privacy: Algorithms and Performance Analysis Citation: 2
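The differential-privacy entries above (client-level DP, adaptive clipping) share a common server-side recipe: clip each client update to a fixed L2 norm, aggregate, and add Gaussian noise calibrated to that norm. A minimal sketch; the clip norm and noise multiplier are placeholder values, and a real deployment also needs a privacy accountant to track the resulting (epsilon, delta):

```python
import numpy as np

def dp_aggregate(client_updates, clip_norm=1.0, noise_multiplier=1.0, seed=None):
    """Clip-and-noise aggregation of client model updates (dicts of arrays)."""
    rng = np.random.default_rng(seed)
    clipped = []
    for update in client_updates:
        norm = np.sqrt(sum(float(np.sum(v ** 2)) for v in update.values()))
        scale = min(1.0, clip_norm / (norm + 1e-12))       # L2 clipping
        clipped.append({k: v * scale for k, v in update.items()})
    aggregated = {}
    for name in clipped[0]:
        total = sum(u[name] for u in clipped)
        noise = rng.normal(0.0, noise_multiplier * clip_norm, size=total.shape)
        aggregated[name] = (total + noise) / len(clipped)   # noisy mean
    return aggregated
```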
Fair Resource Allocation in Federated Learning. ICLR 2020.
Hierarchically Fair Federated Learning
Towards Fair and Privacy-Preserving Federated Deep Models
Interpret Federated Learning with Shapley Values.
FMore: An Incentive Scheme of Multi-dimensional Auction for Federated Learning in MEC. ICDCS 2020
Toward an Automated Auction Framework for Wireless Federated Learning Services Market
Federated Learning for Edge Networks: Resource Optimization and Incentive Mechanism
Motivating Workers in Federated Learning: A Stackelberg Game Perspective
Incentive Design for Efficient Federated Learning in Mobile Networks: A Contract Theory Approach
A Learning-based Incentive Mechanism for Federated Learning
A Crowdsourcing Framework for On-Device Federated Learning
System Challenges: communication and computation resource constraints, software and hardware heterogeneity, and FL over wireless communication systems
Federated Learning: Strategies for Improving Communication Efficiency Highlights: optimization
Deep Gradient Compression: Reducing the Communication Bandwidth for Distributed Training. ICLR 2018. 2017-12-05 Highlights: gradient compression Citation: 298
NeurIPS 2020 submission: Artemis: tight convergence guarantees for bidirectional compression in Federated Learning. 2020-06-25 Highlights: bidirectional gradient compression
Scheduling Policy and Power Allocation for Federated Learning in NOMA Based MEC. 2020-06-21
(x) Federated Mutual Learning. 2020-06-27 Highlights: duplicates the idea of Deep Mutual Learning (CVPR 2018)
A Better Alternative to Error Feedback for Communication-Efficient Distributed Learning. 2020-06-19 Researcher: Peter Richtárik
Federated Learning With Quantized Global Model Updates. 2020-06-18 Researcher: Mohammad Mohammadi Amiri, Princeton, Information Theory and Machine Learning Highlights: model compression
Federated Learning with Compression: Unified Analysis and Sharp Guarantees. 2020-07-02 Highlight: non-IID, gradient compression + local SGD Researcher: Mehrdad Mahdavi, Rong Jin's PhD student http://www.cse.psu.edu/~mzm616/
Evaluating the Communication Efficiency in Federated Learning Algorithm. 2020-04-06
Dynamic Sampling and Selective Masking for Communication-Efficient Federated Learning. 2020-05-21
Ternary Compression for Communication-Efficient Federated Learning. 2020-05-07
Gradient Statistics Aware Power Control for Over-the-Air Federated Learning. 2020-05-04
(*) RPN: A Residual Pooling Network for Efficient Federated Learning. ECAI 2020.
Hyper-Sphere Quantization: Communication-Efficient SGD for Federated Learning. 2019-11-12
L-FGADMM: Layer-Wise Federated Group ADMM for Communication Efficient Decentralized Deep Learning
Gradient Sparsification for Asynchronous Distributed Training. 2019-10-24
High-Dimensional Stochastic Gradient Quantization for Communication-Efficient Edge Learning
SAFA: a Semi-Asynchronous Protocol for Fast Federated Learning with Low Overhead
Detailed comparison of communication efficiency of split learning and federated learning
Decentralized Federated Learning: A Segmented Gossip Approach
Multi-objective Evolutionary Federated Learning
Expanding the Reach of Federated Learning by Reducing Client Resource Requirements
Partitioned Variational Inference: A unified framework encompassing federated and continual learning
FedOpt: Towards communication efficiency and privacy preservation in federated learning
A performance evaluation of federated learning algorithms
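Many of the communication-efficiency entries above combine sparsification or quantization of updates with some form of error feedback, so that whatever is dropped in one round is carried over to the next. A minimal top-k-with-error-feedback sketch; k and the flattened layout are arbitrary choices of this example:

```python
import numpy as np

def topk_with_error_feedback(grad, residual, k):
    """Transmit only the k largest-magnitude coordinates of the
    error-corrected gradient; keep the untransmitted remainder locally."""
    corrected = grad + residual
    flat = corrected.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]       # top-k by magnitude
    sparse = np.zeros_like(flat)
    sparse[idx] = flat[idx]
    sparse = sparse.reshape(grad.shape)
    return sparse, corrected - sparse                  # (to send, new residual)
```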
Coded Federated Learning. Presented at the Wireless Edge Intelligence Workshop, IEEE GLOBECOM 2019
Turbo-Aggregate: Breaking the Quadratic Aggregation Barrier in Secure Federated Learning
Coded Federated Computing in Wireless Networks with Straggling Devices and Imperfect CSI
Information-Theoretic Perspective of Federated Learning
NeurIPS 2020 Submission: Distributed Learning on Heterogeneous Resource-Constrained Devices
SplitFed: When Federated Learning Meets Split Learning
Lottery Hypothesis based Unsupervised Pre-training for Model Compression in Federated Learning
Secure Federated Learning in 5G Mobile Networks. 2020/04
ELFISH: Resource-Aware Federated Learning on Heterogeneous Edge Devices
Asynchronous Online Federated Learning for Edge Devices
(*) Secure Federated Submodel Learning
Federated Neuromorphic Learning of Spiking Neural Networks for Low-Power Edge Intelligence
Model Pruning Enables Efficient Federated Learning on Edge Devices
Towards Effective Device-Aware Federated Learning
Accelerating DNN Training in Wireless Federated Edge Learning System
Split learning for health: Distributed deep learning without sharing raw patient data
SmartPC: Hierarchical pace control in real-time federated learning system
DeCaf: Iterative collaborative processing over the edge
Researcher: H. Vincent Poor https://ee.princeton.edu/people/h-vincent-poor
Hao Ye https://scholar.google.ca/citations?user=ok7OWEAAAAAJ&hl=en
Ye Li http://liye.ece.gatech.edu/
Mix2FLD: Downlink Federated Learning After Uplink Federated Distillation With Two-Way Mixup Researcher: Mehdi Bennis, Seong-Lyun Kim
Wireless Communications for Collaborative Federated Learning in the Internet of Things
Democratizing the Edge: A Pervasive Edge Computing Framework
UVeQFed: Universal Vector Quantization for Federated Learning
Federated Deep Learning Framework For Hybrid Beamforming in mm-Wave Massive MIMO
Efficient Federated Learning over Multiple Access Channel with Differential Privacy Constraints
A Secure Federated Learning Framework for 5G Networks
Federated Learning and Wireless Communications
Lightwave Power Transfer for Federated Learning-based Wireless Networks
Towards Ubiquitous AI in 6G with Federated Learning
Optimizing Over-the-Air Computation in IRS-Aided C-RAN Systems
Network-Aware Optimization of Distributed Learning for Fog Computing
On the Design of Communication Efficient Federated Learning over Wireless Networks
Federated Machine Learning for Intelligent IoT via Reconfigurable Intelligent Surface
A Blockchain-based Decentralized Federated Learning Framework with Committee Consensus
Scheduling for Cellular Federated Edge Learning with Importance and Channel. 2020-04
Differentially Private Federated Learning for Resource-Constrained Internet of Things. 2020-03
Gradient Estimation for Federated Learning over Massive MIMO Communication Systems
Adaptive Federated Learning With Gradient Compression in Uplink NOMA
Performance Analysis and Optimization in Privacy-Preserving Federated Learning
Energy-Efficient Federated Edge Learning with Joint Communication and Computation Design
Federated Over-the-Air Subspace Learning and Tracking from Incomplete Data
Decentralized Federated Learning via SGD over Wireless D2D Networks
Federated Learning in the Sky: Joint Power Allocation and Scheduling with UAV Swarms
Wireless Federated Learning with Local Differential Privacy
Cooperative Learning via Federated Distillation over Fading Channels
Learning from Peers at the Wireless Edge
Convergence of Update Aware Device Scheduling for Federated Learning at the Wireless Edge
Communication Efficient Federated Learning over Multiple Access Channels
Convergence Time Optimization for Federated Learning over Wireless Networks
Asynchronous Federated Learning with Differential Privacy for Edge Intelligence
Federated learning with multichannel ALOHA
Federated Learning with Autotuned Communication-Efficient Secure Aggregation
Bandwidth Slicing to Boost Federated Learning in Edge Computing
Energy Efficient Federated Learning Over Wireless Communication Networks
Device Scheduling with Fast Convergence for Wireless Federated Learning
Energy-Aware Analog Aggregation for Federated Learning with Redundant Data
Age-Based Scheduling Policy for Federated Learning in Mobile Edge Networks
Federated Learning over Wireless Networks: Convergence Analysis and Resource Allocation
Federated Learning over Wireless Networks: Optimization Model Design and Analysis
Reliable Federated Learning for Mobile Networks
FedPAQ: A Communication-Efficient Federated Learning Method with Periodic Averaging and Quantization
Cell-Free Massive MIMO for Wireless Federated Learning
A Joint Learning and Communications Framework for Federated Learning over Wireless Networks
On Safeguarding Privacy and Security in the Framework of Federated Learning
Hierarchical Federated Learning Across Heterogeneous Cellular Networks
Federated Learning for Wireless Communications: Motivation, Opportunities and Challenges
Scheduling Policies for Federated Learning in Wireless Networks
Federated Learning with Additional Mechanisms on Clients to Reduce Communication Costs
Federated Learning over Wireless Fading Channels
Energy-Efficient Radio Resource Allocation for Federated Edge Learning
Active Learning Solution on Distributed Edge Computing
Fast Uplink Grant for NOMA: a Federated Learning based Approach
Machine Learning at the Wireless Edge: Distributed Stochastic Gradient Descent Over-the-Air
Federated Learning via Over-the-Air Computation
Broadband Analog Aggregation for Low-Latency Federated Edge Learning
Federated Echo State Learning for Minimizing Breaks in Presence in Wireless Virtual Reality Networks
Joint Service Pricing and Cooperative Relay Communication for Federated Learning
In-Edge AI: Intelligentizing Mobile Edge Computing, Caching and Communication by Federated Learning
Asynchronous Task Allocation for Federated and Parallelized Mobile Edge Learning
CoLearn: enabling federated learning in MUD-compliant IoT edge networks
Towards Federated Learning at Scale: System Design
FedML: A Research Library and Benchmark for Federated Machine Learning
A Survey on Federated Learning Systems: Vision, Hype and Reality for Data Privacy and Protection
FLeet: Online Federated Learning via Staleness Awareness and Performance Prediction Researcher: Georgios Damaskinos, MLSys, https://people.epfl.ch/georgios.damaskinos?lang=en
Heterogeneity-Aware Federated Learning Researcher: Mengwei Xu, PKU
Responsive Web User Interface to Recover Training Data from User Gradients in Federated Learning https://ldp-machine-learning.herokuapp.com/
Decentralised Learning from Independent Multi-Domain Labels for Person Re-Identification
[startup] Industrial Federated Learning -- Requirements and System Design
Quantifying the Performance of Federated Transfer Learning
ELFISH: Resource-Aware Federated Learning on Heterogeneous Edge Devices
Privacy is What We Care About: Experimental Investigation of Federated Learning on Edge Devices
Substra: a framework for privacy-preserving, traceable and collaborative Machine Learning
BAFFLE : Blockchain Based Aggregator Free Federated Learning
Edge AIBench: Towards Comprehensive End-to-end Edge Computing Benchmarking
Functional Federated Learning in Erlang (ffl-erl)
HierTrain: Fast Hierarchical Edge AI Learning With Hybrid Parallelism in Mobile-Edge-Cloud Computing
Generative Models for Effective ML on Private, Decentralized Datasets. Google. ICLR 2020 Citation: 8
MD-GAN: Multi-Discriminator Generative Adversarial Networks for Distributed Datasets. 2018-11-09
(GAN) Federated Generative Adversarial Learning. 2020-05-07 Citation: 0
Differentially Private Data Generative Models
GRAFFL: Gradient-free Federated Learning of a Bayesian Generative Model
(VAE) An On-Device Federated Learning Approach for Cooperative Anomaly Detection
Secure Federated Matrix Factorization
Privacy Threats Against Federated Matrix Factorization
Practical Federated Gradient Boosting Decision Trees. AAAI 2020.
Federated Extra-Trees with Privacy Preserving
SecureGBM: Secure Multi-Party Gradient Boosting
The Tradeoff Between Privacy and Accuracy in Anomaly Detection Using Federated XGBoost
Privacy Preserving QoE Modeling using Collaborative Learning
Federated pretraining and fine tuning of BERT using clinical notes from multiple silos
Federated Learning for Mobile Keyboard Prediction
Federated Learning for Keyword Spotting
generative sequence models (e.g., language models)
Pretraining Federated Text Models for Next Word Prediction
FedNER: Privacy-preserving Medical Named Entity Recognition with Federated Learning. MSRA. 2020-03.
Federated Learning of N-gram Language Models. Google. ACL 2019.
Federated User Representation Learning
Two-stage Federated Phenotyping and Patient Representation Learning
Federated Learning for Emoji Prediction in a Mobile Keyboard
Federated AI lets a team imagine together: Federated Learning of GANs
Federated Learning Of Out-Of-Vocabulary Words
Learning Private Neural Language Modeling with Attentive Aggregation
Applied Federated Learning: Improving Google Keyboard Query Suggestions
Federated Learning for Ranking Browser History Suggestions
(*) Federated Visual Classification with Real-World Data Distribution. MIT. ECCV 2020. 2020-03
FedVision: An Online Visual Object Detection Platform Powered by Federated Learning
A Federated Learning Framework for Healthcare IoT devices Keywords: Split Learning + Sparsification
Federated Transfer Learning for EEG Signal Classification
The Future of Digital Health with Federated Learning
Anonymizing Data for Privacy-Preserving Federated Learning. ECAI 2020.
Federated machine learning with Anonymous Random Hybridization (FeARH) on medical records
Stratified cross-validation for unbiased and privacy-preserving federated learning
Learn Electronic Health Records by Fully Decentralized Federated Learning
Preserving Patient Privacy while Training a Predictive Model of In-hospital Mortality
Federated Learning for Healthcare Informatics
Federated and Differentially Private Learning for Electronic Health Records
A blockchain-orchestrated Federated Learning architecture for healthcare consortia
Federated Uncertainty-Aware Learning for Distributed Hospital EHR Data
Stochastic Channel-Based Federated Learning for Medical Data Privacy Preserving
Differential Privacy-enabled Federated Learning for Sensitive Health Data
Privacy Preserving Stochastic Channel-Based Federated Learning with Neural Network Pruning
Privacy-preserving Federated Brain Tumour Segmentation
HHHFL: Hierarchical Heterogeneous Horizontal Federated Learning for Electroencephalography
FedHealth: A Federated Transfer Learning Framework for Wearable Healthcare
LoAdaBoost: Loss-Based AdaBoost Federated Machine Learning on Medical Data
FADL: Federated-Autonomous Deep Learning for Distributed Electronic Health Record
Federated Learning for Vehicular Networks
Federated Learning Meets Contract Theory: Energy-Efficient Framework for Electric Vehicle Networks
Beyond privacy regulations: an ethical approach to data usage in transportation. TomTom. 2020-04-01
Privacy-preserving Traffic Flow Prediction: A Federated Learning Approach
Practical Privacy Preserving POI Recommendation
Federated Learning for Localization: A Privacy-Preserving Crowdsourcing Method
Federated Transfer Reinforcement Learning for Autonomous Driving
Energy Demand Prediction with Federated Learning for Electric Vehicle Networks
Distributed Federated Learning for Ultra-Reliable Low-Latency Vehicular Communications
Federated Learning for Ultra-Reliable Low-Latency V2V Communications
Federated Learning in Vehicular Edge Computing: A Selective Model Aggregation Approach
(*) Federated Multi-view Matrix Factorization for Personalized Recommendations
Robust Federated Recommendation System
Federated Recommendation System via Differential Privacy
FedRec: Privacy-Preserving News Recommendation with Federated Learning. MSRA. 2020-03
Federating Recommendations Using Differentially Private Prototypes
Meta Matrix Factorization for Federated Rating Predictions
Federated Hierarchical Hybrid Networks for Clickbait Detection
Federated Collaborative Filtering for Privacy-Preserving Personalized Recommendation System
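The recommendation entries above (federated collaborative filtering, secure federated matrix factorization) typically keep each user's factor on-device and ship only item-factor gradients to the server. A minimal sketch of one on-device pass; the learning rate, regularization, and plain (unencrypted) gradient upload are simplifying assumptions of this example:

```python
import numpy as np

def local_mf_step(user_vec, item_mat, item_ids, ratings, lr=0.05, reg=0.01):
    """One SGD pass over a user's local ratings for matrix factorization.

    The user factor is updated in place and never leaves the device; only
    the gradients of the touched item factors are returned for the server."""
    item_grads = {}
    for item_id, rating in zip(item_ids, ratings):
        q = item_mat[item_id]
        err = rating - float(user_vec @ q)
        user_vec += lr * (err * q - reg * user_vec)        # stays local
        item_grads[item_id] = -err * user_vec + reg * q    # sent to server
    return user_vec, item_grads
```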
Training Keyword Spotting Models on Non-IID Data with Federated Learning
FedCoin: A Peer-to-Peer Payment System for Federated Learning
Towards Federated Graph Learning for Collaborative Financial Crimes Detection
Cloud-based Federated Boosting for Mobile Crowdsensing
Exploiting Unlabeled Data in Smart Cities using Federated Learning
A Federated Learning Approach for Mobile Packet Classification
Blockchained On-Device Federated Learning
Boosting Privately: Privacy-Preserving Federated Extreme Boosting for Mobile Crowdsensing
Self-supervised audio representation learning for mobile devices
PMF: A Privacy-preserving Human Mobility Prediction Framework via Federated Learning
Federated Multi-task Hierarchical Attention Model for Sensor Analytics
DÏoT: A Federated Self-learning Anomaly Detection System for IoT
The OARF Benchmark Suite: Characterization and Implications for Federated Learning Systems
Evaluation Framework For Large-scale Federated Learning
(*) PrivacyFL: A simulator for privacy-preserving and secure federated learning. MIT CSAIL.
Revocable Federated Learning: A Benchmark of Federated Forest
Real-World Image Datasets for Federated Learning
LEAF: A Benchmark for Federated Settings
Functional Federated Learning in Erlang (ffl-erl)
A Survey on Federated Learning Systems: Vision, Hype and Reality for Data Privacy and Protection
Researchers: Bingsheng He (NUS); Qinbin Li (PhD student, NUS/HKUST)
SECure: A Social and Environmental Certificate for AI Systems
Federated Learning for 6G Communications: Challenges, Methods, and Future Directions
A Review of Privacy Preserving Federated Learning for Private IoT Analytics
Survey of Personalization Techniques for Federated Learning. 2020-03-19
Threats to Federated Learning: A Survey
Towards Utilizing Unlabeled Data in Federated Learning: A Survey and Prospective
Federated Learning for Resource-Constrained IoT Devices: Panoramas and State-of-the-art
Advances and Open Problems in Federated Learning
Privacy-Preserving Blockchain Based Federated Learning with Differential Data Sharing
An Introduction to Communication Efficient Edge Machine Learning
Federated Learning for Healthcare Informatics
Federated Learning for Coalition Operations
Federated Learning in Mobile Edge Networks: A Comprehensive Survey
Federated Learning: Challenges, Methods, and Future Directions
A Survey on Federated Learning Systems: Vision, Hype and Reality for Data Privacy and Protection
Federated Machine Learning: Concept and Applications