Dynamics-Aware-Robust-Training

The codebase for the paper "Exploring and Exploiting Decision Boundary Dynamics for Adversarial Robustness" (ICLR 2023, https://arxiv.org/abs/2302.03015) by Yuancheng Xu, Yanchao Sun, Micah Goldblum, Tom Goldstein and Furong Huang.

This repository provides the implementation of DyART (Dynamics-Aware Robust Training).

Overview

  • During training, the decision boundary moves in the input space. Our framework provides a closed-form expression for the relative speed of the decision boundary w.r.t. any data point, which characterizes the decision boundary dynamics.
  • Margin, the distance from a data point to the decision boundary in the input space, is a fundamental quantity in machine learning. We provide a closed-form expression that explicitly computes the margin gradients w.r.t. the neural network parameters (see the sketch after this list).
  • DyART achieves adversarial robustness by directly following the margin gradients during training, in contrast to previous state-of-the-art adversarial training methods based on the min-max framework.
  • With 10M additional synthetic images, DyART achieves 93.69% clean accuracy and 63.89% Linf robust accuracy using WRN-28-10 on CIFAR-10, ranking 2nd on the RobustBench leaderboard under the same neural architecture as of May 2023.
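
For intuition, the following is a minimal PyTorch-style sketch of the margin-gradient computation referenced above; it is illustrative only and not the training code in this repository. It assumes the closest boundary point x_star has already been found for a sample with label y, and uses the logit-margin function phi (the correct-class logit minus the largest other logit), which vanishes on the decision boundary.

```python
import torch

def margin_gradient_sketch(model, x_star, y):
    """Illustrative sketch: gradient of the margin w.r.t. the parameters for
    one sample, given its closest boundary point x_star.  With phi the logit
    margin, the closed-form expression reads
        grad_theta R = - grad_theta phi(x_star) / || grad_x phi(x_star) ||.
    """
    x_star = x_star.detach().clone().requires_grad_(True)
    logits = model(x_star.unsqueeze(0)).squeeze(0)

    # phi: correct-class logit minus the largest competing logit.
    mask = torch.ones_like(logits, dtype=torch.bool)
    mask[y] = False
    phi = logits[y] - logits[mask].max()

    # Gradient of phi w.r.t. the input; its norm normalizes the parameter gradient.
    (grad_x,) = torch.autograd.grad(phi, x_star, retain_graph=True)
    speed = grad_x.norm()

    # Gradient of phi w.r.t. the parameters, rescaled to give the margin gradient.
    grad_theta = torch.autograd.grad(phi, tuple(model.parameters()))
    return [-g / speed for g in grad_theta]
```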

Environment set-up

  • Create a new environment using the .yml file
conda env create -f environment.yml
  • Install AutoAttack for evaluation
pip install git+https://github.com/fra31/auto-attack

Data Preparation (optional)

Tiny-ImageNet

Download the Tiny-ImageNet dataset by running

bash data/TinyImageNet-200.sh

Additional data for CIFAR-10

Rebuffi et al. (2021), Gowal et al. (2021) and Wang et al. (2023) use samples generated by diffusion models to improve robustness; the generative model is trained solely on the original training data. You can download the generated data for CIFAR-10 here (generated by DDPM) or here (generated by EDM). Put the downloaded file cifar10_ddpm.npz under the folder data/.
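
As a rough illustration (not part of the provided scripts), the archive can be inspected with NumPy as below; the key names 'image' and 'label' are an assumption about the .npz layout.

```python
import numpy as np

# Hypothetical check of the synthetic CIFAR-10 data; the keys 'image' and
# 'label' are assumed, not taken from this repository's code.
data = np.load("data/cifar10_ddpm.npz")
images, labels = data["image"], data["label"]
print(images.shape, labels.shape)  # e.g. (N, 32, 32, 3) and (N,)
```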

Running DyART

The scripts/ folder includes bash scripts for running DyART on CIFAR-10 and Tiny-ImageNet.

CIFAR-10

First, run 10 epochs of clean training for the burn-in period.

bash scripts/Cifar10_clean_training.sh

Then run DyART without additional data

bash scripts/Cifar10_DyART.sh

Or run DyART with additional data

bash scripts/Cifar10_DyART_additional_data.sh

Tiny-ImageNet

First, run 20 epochs of clean training for the burn-in period.

bash scripts/TinyImgNet_clean_training.sh

Then run DyART

bash scripts/TinyImgNet_DyART.sh

Resuming from previous checkpoints

Each experiment generates a folder containing the experiment parameters, the best checkpoint, the most recent checkpoint and a log. To resume an unfinished experiment, put the folder path into scripts/resume.sh and run

bash scripts/resume.sh

AutoAttack Evaluation

Provide the experiment folder path in scripts/eval.sh and run

bash scripts/eval.sh
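
The evaluation relies on AutoAttack; for reference, a minimal standalone usage of its standard Linf setting at epsilon = 8/255 looks like the following. The model and data here are placeholders so the snippet runs; in practice the script loads the trained checkpoint and the test set.

```python
import torch
import torch.nn as nn
from autoattack import AutoAttack

# Placeholder model and data so the snippet is self-contained; substitute the
# trained checkpoint and the CIFAR-10 test images/labels in practice.
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10)).eval()
x_test = torch.rand(8, 3, 32, 32)    # images in [0, 1]
y_test = torch.randint(0, 10, (8,))  # integer labels

# Standard Linf AutoAttack evaluation at epsilon = 8/255.
adversary = AutoAttack(model, norm='Linf', eps=8 / 255, version='standard')
x_adv = adversary.run_standard_evaluation(x_test, y_test, bs=8)
```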

Citation

@inproceedings{
xu2023exploring,
title={Exploring and Exploiting Decision Boundary Dynamics for Adversarial Robustness},
author={Yuancheng Xu and Yanchao Sun and Micah Goldblum and Tom Goldstein and Furong Huang},
booktitle={International Conference on Learning Representations},
year={2023},
url={https://arxiv.org/abs/2302.03015}
}
