Wasserstein Dropout

This code repository accompanies our publication "Wasserstein dropout". Wasserstein dropout (W-dropout) is a novel technique for quantifying uncertainty in regression networks; it is fully non-parametric and yields accurate uncertainty estimates even under data shifts.
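
For orientation only, the following is a minimal, generic sketch of how dropout-based predictive uncertainty is typically obtained at inference time: dropout is kept active, several stochastic forward passes are drawn, and their empirical mean and standard deviation serve as prediction and uncertainty estimate. The class name DropoutRegressor, the helper predict_with_uncertainty, and the choice of 50 samples are illustrative assumptions and not part of this repository; the W-dropout training objective itself is described in the paper and implemented in the subdirectories below.

# Minimal, generic sketch (not the authors' implementation) of dropout-based
# predictive uncertainty at inference time: keep dropout active, draw several
# stochastic forward passes, and use their empirical mean and std.
import torch
import torch.nn as nn


class DropoutRegressor(nn.Module):
    """Small fully connected regression network with dropout layers (illustrative)."""

    def __init__(self, in_dim: int = 1, hidden: int = 64, p: float = 0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Dropout(p),
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(p),
            nn.Linear(hidden, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


@torch.no_grad()
def predict_with_uncertainty(model: nn.Module, x: torch.Tensor, n_samples: int = 50):
    """Draw n_samples stochastic forward passes (dropout active) and return
    the per-point empirical mean and standard deviation."""
    model.train()  # keep dropout layers stochastic at inference time
    samples = torch.stack([model(x) for _ in range(n_samples)], dim=0)
    return samples.mean(dim=0), samples.std(dim=0)


if __name__ == "__main__":
    model = DropoutRegressor()
    x = torch.linspace(-3, 3, 100).unsqueeze(1)
    mean, std = predict_with_uncertainty(model, x)
    print(mean.shape, std.shape)  # torch.Size([100, 1]) torch.Size([100, 1])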

Overview

The repository is structured as follows: code for training, evaluation, and benchmarking of our Wasserstein dropout technique on standard 1D regression datasets can be found in the subdirectory 01_wdropout_on_standard_1d_regression_datasets. For a teaser plot illustrating the benefits of Wasserstein dropout on 1D toy data, see the figure below.

The second subdirectory 02_wdropout_for_object_detection contains code for training and evaluating a W-dropout-enhanced version of the object detection (OD) architecture SqueezeDet. This W-SqueezeDet model is furthermore compared against an MC-dropout-enhanced version of the same OD network.

Comparison of Wasserstein dropout and MC dropout on 1D toy data

Figure: Comparison of Wasserstein dropout (left-hand side) and MC dropout (right-hand side) on two 1D toy datasets.

License

The Wasserstein dropout code is released under the MIT license.

Citing Wasserstein Dropout

If you use or reference Wasserstein dropout in your research, please use the following BibTeX entry:

@article{wdropout,
author = {Sicking, Joachim and Akila, Maram and Pintz, Maximilian and Wirtz, Tim and Wrobel, Stefan and Fischer, Asja},
title = {Wasserstein Dropout},
journal = {Accepted for publication in Machine Learning},
publisher = {Springer},
year = {2022}
}
