"Wasserstein Dropout" - Predict Uncertainties in Regression Neural Net
This paper proposes a new approach, called "Wasserstein dropout", for quantifying uncertainty in neural-network regression tasks. Unlike state-of-the-art methods, it is purely non-parametric and captures aleatoric uncertainty through the distribution of outputs produced by dropout sub-networks. Training minimizes the Wasserstein distance between the label distribution and this model distribution. An empirical analysis shows that Wasserstein dropout yields more accurate and stable uncertainty estimates than state-of-the-art methods, both on standard test data and under distributional shift.
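To make the objective concrete, below is a minimal PyTorch sketch of the core idea, assuming a Gaussian approximation of the dropout output distribution and using the closed-form squared 2-Wasserstein distance between 1-D Gaussians, W2^2 = (mu1 - mu2)^2 + (sigma1 - sigma2)^2. Since the label noise is unobserved, it is estimated here from the single residual |y - mu|; this simplification, as well as the names `DropoutRegressor`, `wasserstein_dropout_loss`, and `n_samples`, are illustrative assumptions, not the paper's reference implementation.

```python
import torch
import torch.nn as nn

class DropoutRegressor(nn.Module):
    """Small MLP whose dropout layers stay active at inference time,
    so each stochastic forward pass samples one dropout sub-network."""
    def __init__(self, in_dim=1, hidden=64, p=0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Dropout(p),
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(p),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x)

def wasserstein_dropout_loss(model, x, y, n_samples=5):
    """Illustrative surrogate: match mean and spread of the dropout
    sub-network outputs to the observed labels via the closed-form
    W2^2 between 1-D Gaussians.

    The unknown label-noise scale is approximated by the one-sample
    residual |y - mu| (an assumption for this sketch, not the paper's
    exact estimator)."""
    # n_samples stochastic forward passes = n_samples dropout sub-networks
    preds = torch.stack([model(x) for _ in range(n_samples)], dim=0)
    mu = preds.mean(dim=0)
    sigma = preds.std(dim=0)
    sigma_label = (y - mu).abs().detach()  # crude one-sample noise estimate
    # W2^2 between N(mu, sigma^2) and N(y, sigma_label^2)
    return ((mu - y) ** 2 + (sigma - sigma_label) ** 2).mean()

# Minimal usage: fit y = x^3 with heteroscedastic noise
x = torch.linspace(-1, 1, 256).unsqueeze(1)
y = x ** 3 + 0.1 * x.abs() * torch.randn_like(x)
model = DropoutRegressor()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(1000):
    opt.zero_grad()
    loss = wasserstein_dropout_loss(model, x, y)
    loss.backward()
    opt.step()
```

After training, the spread of repeated dropout forward passes serves directly as the (aleatoric) uncertainty estimate, with no separate variance head or parametric likelihood needed.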