Papers of the day

How Good is the Bayes Posterior in Deep Neural Networks Really?

Comments

Feb 07 2020 Guodong Zhang

A comprehensive study of Bayesian inference in DNNs. I guess only within Google can you conduct such careful experiments; an interesting read! Take-away: the Bayes posterior is rather poor, and the prior seems to be a big problem (it doesn't scale to large nets). https://arxiv.org/abs/2002.02405
6 replies, 490 likes


Feb 07 2020 Ilya Sutskever

https://arxiv.org/abs/2002.02405 — careful and expensive MCMC Bayesian inference over NN parameters is *worse* than point estimates or low temperature posteriors. Supports @carlesgelada and @jacobmbuckman’s view that Bayesian NNs are not meaningful probably because the prior is wrong.
11 replies, 412 likes
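
For readers unfamiliar with the term, the "low temperature" (or "cold") posterior mentioned above is the tempered distribution the paper samples from. A minimal sketch, following the tempered-posterior definition the paper studies, with posterior energy U(θ):

    p(\theta \mid D) \propto \exp\!\big(-U(\theta)/T\big), \qquad
    U(\theta) = -\sum_{i=1}^{n} \log p(y_i \mid x_i, \theta) - \log p(\theta)

T = 1 recovers the exact Bayes posterior; T < 1 ("cold") concentrates the distribution around its modes, and the paper's central observation is that these cold posteriors predict better than the T = 1 posterior.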


Feb 08 2020 hardmaru

How Good is the Bayes Posterior in Deep Neural Networks Really? “Despite its promise of improved uncertainty quantification and sample efficiency there are—as of early 2020—no publicized deployments of Bayesian neural networks in industrial practice.” https://arxiv.org/abs/2002.02405 https://t.co/3gvxj2BnWf
8 replies, 321 likes


Feb 15 2020 Sebastian Raschka

"In this work we cast doubt on the current understanding of Bayes posteriors in popular deep neural networks: [...] the Bayes posterior yields systematically worse predictions compared to simpler methods including point estimates obtained from SGD." https://arxiv.org/abs/2002.02405
2 replies, 74 likes


Feb 07 2020 Statistics Papers

How Good is the Bayes Posterior in Deep Neural Networks Really? http://arxiv.org/abs/2002.02405
0 replies, 23 likes


Feb 08 2020 Rémi Louf 👾🛸✨

This is how debates around BNNs should happen: not by waving hands, but by presenting hard facts. Papers like this open a discussion instead of fueling a war. Kudos to the authors! https://arxiv.org/abs/2002.02405
0 replies, 10 likes


Feb 07 2020 Zhaoran Wang

No wonder SGLD hardly works for exploration in RL.
0 replies, 8 likes
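
Since SGLD comes up here, a minimal sketch of a single SGLD update with a temperature knob, to make the connection to the cold-posterior discussion concrete. This is illustrative only, not the paper's implementation; grad_log_posterior is a hypothetical stand-in for a (minibatch) gradient estimator of log p(θ | D).

    # Illustrative SGLD step with a temperature parameter (not the paper's code).
    # temperature = 1.0 targets the Bayes posterior p(theta | D);
    # temperature < 1.0 targets the sharper "cold" posterior, proportional to exp(-U(theta)/T).
    import numpy as np

    def sgld_step(theta, grad_log_posterior, step_size, temperature=1.0, rng=None):
        rng = np.random.default_rng() if rng is None else rng
        # Drift: half-step of gradient ascent on log p(theta | D).
        drift = 0.5 * step_size * grad_log_posterior(theta)
        # Diffusion: Gaussian noise whose variance scales with the temperature.
        noise = rng.normal(size=theta.shape) * np.sqrt(step_size * temperature)
        return theta + drift + noise

As the temperature goes to zero the injected noise vanishes and the update degenerates toward plain gradient ascent on log p(θ | D), i.e. toward a point (MAP-style) estimate rather than posterior sampling.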


Feb 07 2020 reza mahmoudi

Another fantastic paper: How Good is the Bayes Posterior in Deep Neural Networks Really? http://arxiv.org/abs/2002.02405 #MachineLearning #artificalintelligence #DeepLearning
0 replies, 7 likes


Feb 07 2020 Pranav Shyam

"Our work questions the goal of accurate posterior approximations in Bayesian deep learning: If the true Bayes posterior is poor, what is the use of more accurate approximations?" 🔥🔥🔥 https://arxiv.org/abs/2002.02405
0 replies, 6 likes


Feb 07 2020 Hugh Harvey

I have tried to read this twice, but my mathematical understanding is not good enough. That said, I think you can just sum this paper up as "no-one really understands Bayes"
1 replies, 4 likes


Feb 08 2020 Machine Learning Tweet Feed

How Good is the Bayes Posterior in Deep Neural Networks Really? https://arxiv.org/abs/2002.02405 An excellent, in-depth empirical paper on understanding why "cold posteriors" perform better than the Bayes posterior.
0 replies, 2 likes


Feb 07 2020 Piotr Sokol

How Good is the Bayes Posterior in Deep Neural Networks Really? (arXiv:2002.02405v1 [stat.ML]) http://arxiv.org/abs/2002.02405
0 replies, 2 likes


Feb 07 2020 Sebastian Bodenstein

This is an important, if disappointing, conclusion.
0 replies, 2 likes


Feb 08 2020 Jason M Pittman

[R] How Good is the Bayes Posterior in Deep Neural Networks Really? https://arxiv.org/abs/2002.02405 #MachineLearning
0 replies, 1 likes


Feb 07 2020 Raphael cohen

"Posteriors are best served cold"
0 replies, 1 likes

