On the Byzantine-Resilience of Distillation-Based Federated Learning

Authors: Christophe Roux, Max Zimmer, Sebastian Pokutta

This repository contains the code to reproduce the experiments from the paper "On the Byzantine-Resilience of Distillation-Based Federated Learning". The code is built on PyTorch 1.9 and uses Weights & Biases for experiment tracking.

Structure and Usage

Structure

Experiments are started from the following file:

  • main.py: Entry point that starts experiments, configured via a Weights & Biases-style defaults dictionary (see Usage below).

The rest of the project is structured as follows:

  • byzantine: Contains the attacks and defenses used in the paper.
  • runners: Contains classes to control the training and collection of metrics.
  • models: Contains all model architectures used.
  • utilities.py: Contains auxiliary functions and classes.
  • config.py: Configuration for the datasets used in the experiments.
  • public_config.py: Contains the configuration for the public datasets.
  • metrics.py: Contains the metrics used in the experiments.
  • strategies.py: Contains the different strategies used, such as FedAVG and FedDistill (a minimal sketch of the distillation step follows this list).
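
At its core, distillation-based federated learning exchanges predictions on a public dataset instead of model parameters. The actual implementation lives in strategies.py and byzantine/; the following is only a minimal, hypothetical sketch of that idea, with all function names, arguments, and the optional filtering hook made up for illustration:

import torch
import torch.nn.functional as F

def aggregate_predictions(client_logits, robust_filter=None):
    # client_logits: list of tensors, each of shape [num_public_samples, num_classes].
    stacked = torch.stack(client_logits)        # [num_clients, num_public_samples, num_classes]
    if robust_filter is not None:
        stacked = robust_filter(stacked)        # e.g. a Byzantine-robust filtering defense
    return stacked.mean(dim=0)                  # consensus soft labels on the public dataset

def distill_step(model, public_inputs, consensus_logits, optimizer, temperature=1.0):
    # One knowledge-distillation step: fit the model to the aggregated soft labels.
    model.train()
    optimizer.zero_grad()
    student_logits = model(public_inputs)
    loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(consensus_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2
    loss.backward()
    optimizer.step()
    return loss.item()

Byzantine clients enter this picture by sending corrupted logits, and the defenses in byzantine/ correspond to choices of the filtering step above; the sketch is only meant to locate where attacks and defenses act, not to reproduce the repository's interfaces.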

Usage

Define the parameters in the defaults dictionary in main.py and run the script with the --debug flag. Alternatively, configure a sweep in Weights & Biases and run it from there (without the flag).
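
For orientation, main.py follows the usual Weights & Biases pattern of seeding a run with a defaults dictionary and reading the (possibly sweep-overridden) values back from wandb.config. The sketch below is hypothetical: the actual keys, values, and flag handling are defined in main.py and may differ.

import argparse
import wandb

# Hypothetical defaults -- see the defaults dictionary in main.py for the real keys.
defaults = dict(
    dataset="cifar10",
    strategy="FedDistill",
    n_clients=20,
    n_byzantine=4,
)

parser = argparse.ArgumentParser()
parser.add_argument("--debug", action="store_true",
                    help="Run locally with the defaults dictionary instead of a sweep.")
args = parser.parse_args()

# In a sweep, the W&B agent overrides the defaults; with --debug the defaults are used as-is.
wandb.init(config=defaults, mode="offline" if args.debug else "online")
config = wandb.config

Invoked locally this would be python main.py --debug; in a sweep, the W&B agent launches the script without the flag.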

Citation

If you find the paper or this implementation useful for your own research, please consider citing:

@article{roux2024byzantine,
  author = {Roux, Christophe and Zimmer, Max and Pokutta, Sebastian},
  title = {On the Byzantine-Resilience of Distillation-Based Federated Learning},
  year = {2024},
  journal = {arXiv preprint arXiv:2402.12265},
}
