Papers of the day

Efficient Graph Generation with Graph Recurrent Attention Networks

Comments

David Duvenaud: I heard you like graphs, so we put a graph neural net in your graph generative model, so you can be invariant to order while you add edges to your graph. Scales to 5000 nodes. Paper: https://arxiv.org/abs/1910.00760 Code: https://github.com/lrjconan/GRAN https://t.co/FFyuYqjy3q

3 replies, 643 likes
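The tweet above describes the core idea: score candidate edges for each newly added node with a graph neural net run over the partial graph, so the scores depend on structure rather than on any particular node ordering. The toy sketch below illustrates that pattern only; it is not the GRAN implementation (which uses attention, mixtures of Bernoullis, and block-wise generation), and all names, dimensions, and the one-round message-passing step are simplified assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def gnn_round(A, H, W):
    # One message-passing round: each node averages neighbor features,
    # mixes them with a weight matrix, and keeps a residual connection.
    deg = A.sum(1, keepdims=True) + 1e-8
    return np.tanh((A @ H) / deg @ W + H)

def add_node(A, H, W, w_edge):
    """Append one node and sample its edges to all existing nodes.

    Because edge scores come from a GNN over the partial adjacency
    matrix, they are invariant to how existing nodes are numbered
    (hypothetical simplification of the idea in the paper).
    """
    n = A.shape[0]
    H = gnn_round(A, H, W)
    h_new = H.mean(0)                 # init new node from graph context
    logits = (H * h_new) @ w_edge     # score each candidate edge
    probs = 1.0 / (1.0 + np.exp(-logits))
    edges = (rng.random(n) < probs).astype(float)
    A2 = np.zeros((n + 1, n + 1))
    A2[:n, :n] = A
    A2[n, :n] = edges                 # keep adjacency symmetric
    A2[:n, n] = edges
    return A2, np.vstack([H, h_new])

# Grow a small random graph one node at a time.
d = 8
W = rng.normal(size=(d, d)) * 0.1
w_edge = rng.normal(size=d)
A = np.zeros((1, 1))
H = rng.normal(size=(1, d))
for _ in range(9):
    A, H = add_node(A, H, W, w_edge)
```

GRAN's actual efficiency gain comes from generating a whole block of nodes per step rather than one node (or one edge) at a time, which this sketch does not attempt to show.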


Renjie Liao: Happy to share our #NeurIPS2019 paper on generating graphs (~5K nodes) with graph recurrent attention networks (GRAN). It scales much better and achieves SOTA performance with very impressive sample quality. https://arxiv.org/abs/1910.00760 Code: https://github.com/lrjconan/GRAN https://t.co/jIFWvGlgqn

1 reply, 117 likes


Yujia Li: New paper at #NeurIPS2019 on generative models of graphs. We explored many quality-efficiency trade-offs in this work and came up with a new model that gets good graph generation quality with much better efficiency. Paper: https://arxiv.org/abs/1910.00760 Code: https://github.com/lrjconan/GRAN

1 reply, 112 likes


Charlie Nash: Sampling efficiency is a pain point for autoregressive models. GRAN is a generative model of graphs that enables faster sampling while achieving great performance. Impressive looking samples!

0 replies, 24 likes


Connected Data LDN: Efficient Graph Generation with Graph Recurrent Attention Networks. #BigData #Analytics #DataScience #AI #MachineLearning #IoT #IIoT #PyTorch #Python #RStats #TensorFlow #Java #CloudComputing #Serverless #DataScience via @gp_pulipaka https://arxiv.org/pdf/1910.00760.pdf https://t.co/RG4vfPywQQ

0 replies, 10 likes


MONTREAL.AI: Efficient Graph Generation with Graph Recurrent Attention Networks Liao et al.: https://arxiv.org/abs/1910.00760 Code: https://github.com/lrjconan/GRAN #Graph #NeuralNetworks #NeurIPS #NeurIPS2019 https://t.co/zATrVR42d6

0 replies, 9 likes


Johan Ugander: Interesting progress on circumventing challenges with node orderings in NNs processing graph data. "Relational pooling" at ICML19: https://arxiv.org/pdf/1903.02541.pdf and identifying useful "canonical node orderings" (including ordering by core number): https://arxiv.org/pdf/1910.00760.pdf

1 reply, 1 like


Brundage Bot: Efficient Graph Generation with Graph Recurrent Attention Networks. Renjie Liao, Yujia Li, Yang Song, Shenlong Wang, Charlie Nash, William L. Hamilton, David Duvenaud, Raquel Urtasun, and Richard S. Zemel http://arxiv.org/abs/1910.00760

1 reply, 0 likes


Content

Found on Oct 03 2019 at https://arxiv.org/pdf/1910.00760.pdf

PDF content of a computer science paper: Efficient Graph Generation with Graph Recurrent Attention Networks