
An Information Theoretic Approach to Unlearning


This is the code for the paper An Information Theoretic Approach to Machine Unlearning.

Usage

Experiments can be run via the shell scripts. The first argument is the GPU index (change it on a multi-GPU setup); the second is the random seed (set it to any integer).

./cifar100_fullclass_exps_vgg.sh 0 1
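
For example, to run the same experiments on GPU 1 with seed 42:

./cifar100_fullclass_exps_vgg.sh 1 42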

You might encounter issues executing this file due to the different line endings used by Windows and Unix. Run dos2unix on the script to fix them.
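
For example:

dos2unix cifar100_fullclass_exps_vgg.sh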

Setup

You will need to either download the weights we used for our models from here, or train VGG16 and Vision Transformer models from scratch using pretrain_model.py, then copy the paths of the trained models into the respective .sh files. Important: due to neural networks' propensity to learn smooth functions first, make sure to train the models for a sufficient amount of time. Stopping prematurely significantly harms unlearning performance.

# fill in _ with your desired parameters as described in pretrain_model.py
python pretrain_model.py -net _ -dataset _ -classes _ -gpu _
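
For instance, to pretrain VGG16 on CIFAR-100 (the values below are illustrative; consult pretrain_model.py for the exact options it accepts):

python pretrain_model.py -net vgg16 -dataset cifar100 -classes 100 -gpu 0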

We used https://hub.docker.com/layers/tensorflow/tensorflow/latest-gpu-py3-jupyter/images/sha256-901b827b19d14aa0dd79ebbd45f410ee9dbfa209f6a4db71041b5b8ae144fea5 as our base image and installed the relevant packages on top:

datetime
wandb
sklearn
torch
copy
tqdm
transformers
matplotlib
scipy
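
Note that datetime and copy are part of the Python standard library and need no installation, and sklearn is distributed on PyPI as scikit-learn. A minimal install of the remaining third-party packages:

pip install wandb scikit-learn torch tqdm transformers matplotlib scipy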

You will need a wandb.ai account to use the implemented logging. Feel free to replace it with any other logger of your choice.
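
As an illustration of what swapping the logger involves, wandb logging boils down to calls like the following (the project and metric names here are hypothetical, not the repository's); replacing these three calls with your logger's equivalents is all that is needed:

import wandb

wandb.init(project="it-unlearning")                   # hypothetical project name
wandb.log({"forget_acc": 0.12, "retain_acc": 0.91})   # example metric names/values
wandb.finish()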

Modifying JiT unlearning

The JiT functions are in Lipschitz.py; the method is referred to throughout the code as lipschitz_forgetting. To change sigma and eta, set them in the respective forget_..._main.py file for each unlearning task.
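
For orientation, below is a minimal sketch of a Lipschitz-smoothing step in the spirit of JiT, based on our reading of the paper: perturb a forget sample with Gaussian noise of scale sigma, estimate the local Lipschitz constant from the output change, and take a gradient step of size eta to shrink it. The function name, signature, and n_perturbations value are illustrative assumptions; Lipschitz.py contains the actual implementation.

import torch

def lipschitz_forgetting_step(model, x_forget, sigma, eta, n_perturbations=8):
    # Sketch only -- see Lipschitz.py for the repository's implementation.
    # Takes one gradient step (size eta) that reduces an estimate of the
    # local Lipschitz constant of the model around x_forget.
    optimiser = torch.optim.SGD(model.parameters(), lr=eta)
    out = model(x_forget)
    lipschitz_estimate = 0.0
    for _ in range(n_perturbations):
        delta = sigma * torch.randn_like(x_forget)  # noise of scale sigma
        out_perturbed = model(x_forget + delta)
        # ||f(x) - f(x + delta)|| / ||delta|| approximates the local Lipschitz constant
        lipschitz_estimate = lipschitz_estimate + torch.norm(out - out_perturbed) / torch.norm(delta)
    loss = lipschitz_estimate / n_perturbations
    optimiser.zero_grad()
    loss.backward()
    optimiser.step()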

Authors

For our newest research, feel free to follow our socials:

Jack Foster: LinkedIn, Twitter, Scholar

Kyle Fogarty: LinkedIn, Twitter, Scholar

Stefan Schoepf: LinkedIn, Twitter, Scholar

Cengiz Öztireli: LinkedIn, Scholar

Alexandra Brintrup: LinkedIn, Scholar

Supply Chain AI Lab: LinkedIn
