Papers of the day

Efficient Graph Generation with Graph Recurrent Attention Networks

Comments

Oct 03 2019 David Duvenaud

I heard you like graphs, so we put a graph neural net in your graph generative model, so you can be invariant to order while you add edges to your graph. Scales to 5000 nodes. Paper: https://arxiv.org/abs/1910.00760 Code: https://github.com/lrjconan/GRAN https://t.co/FFyuYqjy3q
3 replies, 619 likes
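The mechanism the tweet above describes — an autoregressive generator that grows a graph node by node while a graph neural net, invariant to node order, scores the candidate edges — can be illustrated with a toy sketch. This is an assumption-laden illustration only, not the GRAN implementation: the embedding update, the logistic edge score, and the threshold are all made up for demonstration.

```python
import math

def node_embeddings(adj, rounds=2):
    """Toy message passing: one scalar per node via neighbor averaging.

    Uniform initialization plus symmetric aggregation makes the result
    independent of how the nodes happen to be numbered, which is the
    order-invariance property the tweet refers to.
    """
    n = len(adj)
    h = [1.0] * n
    for _ in range(rounds):
        h = [
            h[i] + sum(h[j] for j in range(n) if adj[i][j]) / n
            for i in range(n)
        ]
    return h

def add_node(adj, threshold=0.5):
    """Grow the graph by one node, linking it to high-scoring existing nodes.

    The logistic scoring rule below is an arbitrary illustrative choice,
    not the attention mechanism used in the paper.
    """
    h = node_embeddings(adj)
    n = len(adj)
    scores = [1.0 / (1.0 + math.exp(n - 2.0 * h[i])) for i in range(n)]
    new_row = [1 if s > threshold else 0 for s in scores]
    adj = [row + [new_row[i]] for i, row in enumerate(adj)]
    adj.append(new_row + [0])  # no self-loop on the new node
    return adj

# Grow a graph from a single node; each step conditions on the partial graph.
g = [[0]]
for _ in range(4):
    g = add_node(g)
print(len(g))  # prints 5
```

GRAN itself generates a *block* of nodes per step and mixes Bernoulli distributions over their edges, which is where its quality/efficiency trade-off comes from; the sketch collapses that to one node and one deterministic threshold per step.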


Oct 03 2019 Renjie Liao

Happy to share our #NeurIPS2019 paper on generating graphs (~5K nodes) with graph recurrent attention networks (GRAN). It scales much better and achieves SOTA performance with very impressive sample quality. https://arxiv.org/abs/1910.00760 Code: https://github.com/lrjconan/GRAN https://t.co/jIFWvGlgqn
1 replies, 117 likes


Oct 03 2019 Yujia Li

New paper at #NeurIPS2019 on generative models of graphs. We explored many quality-efficiency trade-offs in this work and came up with a new model that gets good graph generation quality with much better efficiency. Paper: https://arxiv.org/abs/1910.00760 Code: https://github.com/lrjconan/GRAN
1 replies, 112 likes


Oct 03 2019 Charlie Nash

Sampling efficiency is a pain point for autoregressive models. GRAN is a generative model of graphs that enables faster sampling while achieving great performance. Impressive looking samples!
0 replies, 23 likes


Oct 22 2019 Connected Data LDN

Efficient Graph Generation with Graph Recurrent Attention Networks. #BigData #Analytics #DataScience #AI #MachineLearning #IoT #IIoT #PyTorch #Python #RStats #TensorFlow #Java #CloudComputing #Serverless #DataScience via @gp_pulipaka https://arxiv.org/pdf/1910.00760.pdf https://t.co/RG4vfPywQQ
0 replies, 10 likes


Nov 01 2019 MONTREAL.AI

Efficient Graph Generation with Graph Recurrent Attention Networks Liao et al.: https://arxiv.org/abs/1910.00760 Code: https://github.com/lrjconan/GRAN #Graph #NeuralNetworks #NeurIPS #NeurIPS2019 https://t.co/zATrVR42d6
0 replies, 6 likes


Oct 03 2019 Johan Ugander

Interesting progress on circumventing challenges with node orderings in NNs processing graph data. "Relational pooling" at ICML19: https://arxiv.org/pdf/1903.02541.pdf and identifying useful "canonical node orderings" (including ordering by core number): https://arxiv.org/pdf/1910.00760.pdf
1 replies, 1 likes


Oct 03 2019 Brundage Bot

Efficient Graph Generation with Graph Recurrent Attention Networks. Renjie Liao, Yujia Li, Yang Song, Shenlong Wang, Charlie Nash, William L. Hamilton, David Duvenaud, Raquel Urtasun, and Richard S. Zemel http://arxiv.org/abs/1910.00760
1 replies, 0 likes
