
On Variational Bounds of Mutual Information

Comments

Jun 25 2019 Ben Poole

Want to estimate or optimize mutual information using neural networks and the latest variational bounds? Check out our Colab notebook for implementations and experiments! Colab: https://colab.research.google.com/github/google-research/google-research/blob/master/vbmi/vbmi_demo.ipynb Paper: https://arxiv.org/abs/1905.06922 https://t.co/WcvYzSoZfX
1 reply, 314 likes
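
For the flavor of what the notebook implements, here is a minimal sketch (assumed NumPy, not the notebook's actual code) of one bound from the paper, the NWJ lower bound I(X;Y) >= E_{p(x,y)}[f] - e^{-1} E_{p(x)p(y)}[e^f], evaluated on correlated Gaussians where the optimal critic is known in closed form; in practice f would be a trained neural critic:

import numpy as np

# Correlated Gaussian pair with known ground truth: I(X;Y) = -0.5*log(1 - rho^2).
rng = np.random.default_rng(0)
rho, n = 0.8, 100_000
x = rng.normal(size=n)
y = rho * x + np.sqrt(1 - rho**2) * rng.normal(size=n)

def critic(x, y):
    # Optimal NWJ critic f*(x, y) = 1 + log p(y|x)/p(y) for this toy problem;
    # in the paper's setting this would be a neural network.
    return (1.0 - 0.5 * np.log(1 - rho**2)
            - 0.5 * (y - rho * x)**2 / (1 - rho**2) + 0.5 * y**2)

joint_term = np.mean(critic(x, y))                    # (x, y) ~ p(x, y)
# Shuffling y gives approximate samples from the product of marginals.
marginal_term = np.mean(np.exp(critic(x, rng.permutation(y)) - 1.0))
nwj_estimate = joint_term - marginal_term             # e^{-1} E[e^f] = E[e^{f-1}]
print(f"NWJ: {nwj_estimate:.3f}  true MI: {-0.5 * np.log(1 - rho**2):.3f}")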


May 17 2019 Sherjil Ozair

Our ICML'19 paper is out on arXiv! "On Variational Bounds of Mutual Information". Link: https://arxiv.org/abs/1905.06922 We unify various existing and new variational bounds of mutual information in a single framework and analyze the tradeoffs between them. https://t.co/H0KoZBtg3z
1 reply, 309 likes
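
For reference, three of the standard bounds the paper brings under one framework (restated here from the literature, with f a critic function and K the batch size):

I(X;Y) >= E_{p(x,y)}[f(x,y)] - log E_{p(x)p(y)}[e^{f(x,y)}]                       (Donsker-Varadhan)
I(X;Y) >= E_{p(x,y)}[f(x,y)] - e^{-1} E_{p(x)p(y)}[e^{f(x,y)}]                    (NWJ)
I(X;Y) >= E[ (1/K) \sum_i log( e^{f(x_i,y_i)} / ((1/K) \sum_j e^{f(x_i,y_j)}) ) ] (InfoNCE)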


Jun 12 2019 Ben Poole

Come learn about variational bounds of mutual information tomorrow (Thursday) at #ICML2019, 4:40pm in the Grand Ballroom or drop by poster #86 at 6:30pm! Joint work w/awesome collaborators @sherjilozair @avdnoord @alemi @georgejtucker https://arxiv.org/abs/1905.06922 https://t.co/czgCB0naaN
0 replies, 167 likes


May 17 2019 Miles Brundage

"On Variational Bounds of Mutual Information," @poolio et al.: https://arxiv.org/abs/1905.06922
1 reply, 76 likes


May 31 2019 Loic Matthey

Highly recommend reading this paper! Thoroughly enjoyed it; it made many concepts much clearer. It really seems like we have an amazing toolbox of bounds available now, can't wait to use it! Thanks @poolio @sherjilozair for the insights.
0 replies, 58 likes


May 17 2019 Statistics Papers

On Variational Bounds of Mutual Information. http://arxiv.org/abs/1905.06922
0 replies, 26 likes


Jun 12 2019 Mario Lucic

Great work relevant for anyone interested in representation learning.
0 replies, 14 likes


Nov 06 2019 Shane Gu

A great summary + new insight paper from Ben, George et al on mutual information! https://arxiv.org/abs/1905.06922
0 replies, 7 likes


May 24 2019 Daisuke Okanohara

This paper nicely summarizes many variants of variational bounds of mutual information. All existing estimators are either high-bias or high-variance, and it remains an open problem to find a low-bias, low-variance estimator (under some reasonable conditions). https://arxiv.org/abs/1905.06922
0 replies, 6 likes
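
A minimal numerical illustration of the bias side of that tradeoff (an assumed NumPy/SciPy sketch, not code from the paper): on correlated Gaussians, even with the analytically optimal critic, the InfoNCE estimate can never exceed log(batch size K), so it is badly biased whenever the true MI is larger.

import numpy as np
from scipy.special import logsumexp

# Correlated Gaussians with known ground truth I(X;Y) = -0.5*log(1 - rho^2).
rng = np.random.default_rng(0)
rho = 0.999
true_mi = -0.5 * np.log(1 - rho**2)   # about 3.11 nats

def infonce(x, y):
    # Use the exact log density-ratio as the critic, so any gap to true_mi
    # is estimator bias rather than a poorly trained critic.
    scores = (-0.5 * (y.T - rho * x)**2 / (1 - rho**2)
              + 0.5 * y.T**2 - 0.5 * np.log(1 - rho**2))
    # InfoNCE = log K + average log-softmax along the positive diagonal; each
    # log-softmax term is <= 0, so the estimate is capped at log K.
    return np.log(len(x)) + np.mean(
        np.diag(scores - logsumexp(scores, axis=1, keepdims=True)))

for K in (4, 16, 256):
    x = rng.normal(size=(K, 1))
    y = rho * x + np.sqrt(1 - rho**2) * rng.normal(size=(K, 1))
    print(f"K={K:4d}  log K={np.log(K):.2f}  "
          f"InfoNCE={infonce(x, y):.2f}  true MI={true_mi:.2f}")

The high-variance side of the tradeoff shows up in unnormalized bounds such as NWJ, whose exp(f) term makes single-batch estimates fluctuate wildly when the true MI is large.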

