This repository is an unofficial PyTorch implementation of:

**Anti-Distillation Backdoor Attacks** (MM '21)

whose official code is closed source. :fu:

Requirements

  • This codebase is written for Python 3 (developed with Python 3.6).
  • We use PyTorch 1.8.2 with CUDA 11.4.
  • To install the necessary Python packages:

```
conda env create -f environment.yml
pip install -r requirements.txt
```

How to Run the Code

Local Training

```
python3 main.py --type=pretrain --lr=0.001 --model=res --dataset=cifar10 --partition=iid --seed=1 --local_ep=200 --alpha_backdoor=0.4 --txtpath=./saved/adba_result.txt
```
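The `--alpha_backdoor` flag presumably controls how strongly the backdoor trigger is blended into poisoned training inputs. A minimal sketch of one common blending scheme, assuming a fixed patch trigger (the function `apply_trigger` and the patch layout are illustrative assumptions, not this repo's actual code):

```python
import torch

def apply_trigger(images: torch.Tensor, trigger: torch.Tensor,
                  alpha: float = 0.4) -> torch.Tensor:
    """Blend a fixed trigger pattern into a batch of images.

    alpha controls trigger strength: 0 leaves images untouched,
    1 replaces them with the trigger entirely.
    (Hypothetical interpretation of --alpha_backdoor.)
    """
    return (1 - alpha) * images + alpha * trigger

# Example: poison a batch of CIFAR-10-shaped inputs
images = torch.rand(8, 3, 32, 32)    # clean batch
trigger = torch.zeros(3, 32, 32)
trigger[:, -4:, -4:] = 1.0           # white patch in the bottom-right corner
poisoned = apply_trigger(images, trigger, alpha=0.4)
```

In ADBA-style attacks, each poisoned sample would additionally have its label flipped to the attacker's target class before local training.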

Global Distillation

```
python3 main.py --type=distillation --lr=0.001 --model=res --dataset=cifar10 --partition=iid --seed=1 --epochs=200 --alpha_backdoor=0.4 --txtpath=./saved/adba_result.txt
```

Acknowledgement

This repository builds on the code of DENSE (NeurIPS '22), from which I also learned a lot while coding. 🥰