"Wasserstein Dropout" - Predict Uncertainties in Regression Neural Net

Performance comparison from "Wasserstein Dropout" (https://arxiv.org/pdf/2012.12687.pdf)


This paper proposes a new approach, "Wasserstein dropout," to quantify uncertainty in neural-network regression. Unlike state-of-the-art parametric methods, it is purely non-parametric: aleatoric uncertainty is captured by the distribution of outputs across dropout sub-networks. Training minimizes the Wasserstein distance between the label distribution and this dropout-induced model distribution. An empirical analysis shows that Wasserstein dropout yields more accurate and stable uncertainty estimates than state-of-the-art methods, both on standard test data and under distributional shift.
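To make the objective concrete, here is a minimal PyTorch sketch of the core idea: run several stochastic forward passes with dropout left on, fit a Gaussian to the resulting outputs, and penalize the closed-form squared 2-Wasserstein distance between that Gaussian and a point mass at the observed label, W2^2 = (mu - y)^2 + sigma^2. This is a simplified reading of the W2-matching idea, not the authors' reference implementation; the architecture, dropout rate, and number of forward passes (n_samples) are illustrative assumptions, and the paper derives a more careful estimator for the single-label-per-input setting.

```python
# Simplified sketch (not the paper's reference code): match the dropout output
# distribution to the label via a closed-form 1-D squared 2-Wasserstein distance.
import torch
import torch.nn as nn


class DropoutRegressor(nn.Module):
    """Small MLP with dropout after each hidden layer (illustrative architecture)."""

    def __init__(self, in_dim: int, hidden: int = 64, p: float = 0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Dropout(p),
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(p),
            nn.Linear(hidden, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x).squeeze(-1)


def wasserstein_dropout_loss(model, x, y, n_samples: int = 5, eps: float = 1e-6):
    """Squared W2 between a Gaussian fit to n_samples dropout forward passes
    and a point mass at the label: (mu - y)^2 + sigma^2, averaged over the batch."""
    model.train()  # keep dropout active so each pass samples a different sub-network
    preds = torch.stack([model(x) for _ in range(n_samples)], dim=0)   # (L, B)
    mu = preds.mean(dim=0)                                             # (B,)
    sigma = (preds.var(dim=0, unbiased=False) + eps).sqrt()            # (B,)
    return ((mu - y) ** 2 + sigma ** 2).mean()


# Usage sketch on random data
model = DropoutRegressor(in_dim=8)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x, y = torch.randn(32, 8), torch.randn(32)
loss = wasserstein_dropout_loss(model, x, y)
opt.zero_grad()
loss.backward()
opt.step()
```

At test time the same dropout ensemble provides the uncertainty estimate: run several forward passes per input and report the sample mean as the prediction and the sample standard deviation as the predicted aleatoric spread.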

