PyTorch implementation of Joint Privacy Enhancement and Quantization in Federated Learning (IEEE TSP 2023, IEEE ICASSP 2023, IEEE ISIT 2022)

Joint Privacy Enhancement and Quantization in Federated Learning

Introduction

In this work, we propose a method for joint privacy enhancement and quantization (JoPEQ), which unifies lossy compression and privacy enhancement for federated learning. This repository contains a basic PyTorch implementation of JoPEQ; please refer to our paper for details. Note that explicit use of the characteristic functions described in equations (14) and (15) of the paper leads to numerical stability issues, with performance depending heavily on how the frequency domain is sampled. For this reason, the provided code uses Laplacian PPN while accounting for the quantization distortion in the second-order moments. It can be verified numerically that the resulting distribution is close to the desired Laplace distribution.
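To illustrate the idea of accounting for quantization distortion in the noise's second-order moments, the following is a minimal one-dimensional sketch (not the repository's implementation, which uses lattice quantization): with subtractive dithered uniform quantization of step `delta`, the quantization error has variance `delta**2 / 12`, so the Laplace noise scale is reduced so that the overall perturbation still meets a target variance. All names (`jopeq_scalar_sketch`, `target_var`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def jopeq_scalar_sketch(x, delta=0.5, target_var=0.4):
    """Illustrative 1-D sketch (not the repo's lattice implementation):
    subtractive dithered quantization plus Laplacian noise whose scale
    is shrunk to absorb the quantization distortion, so the overall
    perturbation variance approximately equals target_var."""
    # Subtractive dither: uniform on [-delta/2, delta/2)
    dither = rng.uniform(-delta / 2, delta / 2, size=x.shape)
    q = delta * np.round((x + dither) / delta) - dither
    # Dithered quantization error variance is delta**2 / 12;
    # Laplace(0, b) has variance 2*b**2, so solve for b accordingly.
    lap_var = max(target_var - delta**2 / 12, 0.0)
    b = np.sqrt(lap_var / 2)
    return q + rng.laplace(0.0, b, size=x.shape)

x = rng.normal(size=100_000)
y = jopeq_scalar_sketch(x)
print(np.var(y - x))  # empirically close to target_var = 0.4
```

The quantization error and the added Laplace noise are independent, so their variances add; shrinking the Laplace scale by the distortion term keeps the total perturbation at the target second moment.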

Usage

This code has been tested with Python 3.7.3, PyTorch 1.8.0, and CUDA 11.1.

Prerequisite

  1. PyTorch 1.8.0: https://pytorch.org
  2. scipy
  3. tqdm
  4. matplotlib
  5. torchinfo
  6. TensorboardX: https://github.com/lanpa/tensorboardX

Training

python main.py --exp_name=jopeq --quantization --lattice_dim 2 --R 1 --privacy --privacy_noise jopeq_vector --epsilon 4 --sigma_squared 0.2 --nu 4

Testing

python main.py --exp_name=jopeq --eval
