
Scalable Gradients for Stochastic Differential Equations

Comments

David Duvenaud: Training Neural SDEs: We worked out how to do scalable reverse-mode autodiff for stochastic differential equations. This lets us fit SDEs defined by neural nets with black-box adaptive higher-order solvers. https://arxiv.org/pdf/2001.01328.pdf With @lxuechen, @rtqichen and @wongtkleonard. https://t.co/qlUwMxezjO

6 replies, 863 likes
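The tweet above concerns fitting SDEs of the form dY = f(Y) dt + g(Y) dW, where f and g may be neural networks. As background for what is being differentiated, here is a minimal Euler–Maruyama forward simulation in plain NumPy (an illustrative sketch, not the authors' code; the Ornstein–Uhlenbeck drift and diffusion stand in for learned networks):

```python
import numpy as np

def euler_maruyama(f, g, y0, ts, rng):
    """Simulate dY = f(Y) dt + g(Y) dW on the time grid ts."""
    ys = [np.asarray(y0, dtype=float)]
    for t0, t1 in zip(ts[:-1], ts[1:]):
        dt = t1 - t0
        # Brownian increment over [t0, t1] has variance dt
        dW = rng.normal(0.0, np.sqrt(dt), size=ys[-1].shape)
        ys.append(ys[-1] + f(ys[-1]) * dt + g(ys[-1]) * dW)
    return np.stack(ys)

# Ornstein-Uhlenbeck process as a stand-in for learned drift/diffusion.
theta, sigma = 1.0, 0.5
path = euler_maruyama(lambda y: -theta * y,
                      lambda y: sigma * np.ones_like(y),
                      y0=[1.0], ts=np.linspace(0.0, 1.0, 101),
                      rng=np.random.default_rng(0))
```

In practice one would use an adaptive solver (e.g. via the authors' torchsde library) rather than a fixed grid; the fixed-step version just makes the discretization explicit.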


David Duvenaud: Includes stochastic variational inference for fitting latent SDE time series models. Uses virtual Brownian trees for constant memory cost. This adds overhead, but scales to large state spaces and dynamics models. Paper at https://arxiv.org/abs/2001.01328

1 replies, 51 likes
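The "virtual Brownian tree" mentioned above answers queries W(t) without storing the sampled path: the interval is bisected recursively, each midpoint is drawn from the Brownian-bridge conditional N((W(t0)+W(t1))/2, (t1-t0)/4), and every node reseeds its RNG deterministically so repeated queries are reproducible at constant memory. A minimal NumPy sketch of the idea (all names and the seeding scheme are illustrative, not the paper's implementation):

```python
import numpy as np

def virtual_brownian(t, seed=0, t0=0.0, t1=1.0, depth=20):
    """Query W(t) for t in (t0, t1) via seeded Brownian-bridge bisection.

    Every node's sample is regenerated from a seed derived from its
    position in the tree, so no path is stored (constant memory) and
    repeated queries return identical values."""
    # Endpoint W(t1) ~ N(0, t1 - t0), drawn from the root seed.
    w0 = 0.0
    w1 = np.random.default_rng([seed, 0]).normal(0.0, np.sqrt(t1 - t0))
    idx = 1  # binary-tree node index: children are 2*idx and 2*idx + 1
    for _ in range(depth):
        tm = 0.5 * (t0 + t1)
        # Bridge midpoint: mean (w0 + w1)/2, std sqrt(t1 - t0)/2.
        wm = (0.5 * (w0 + w1)
              + np.random.default_rng([seed, idx]).normal(
                    0.0, 0.5 * np.sqrt(t1 - t0)))
        if t < tm:
            t1, w1, idx = tm, wm, 2 * idx
        else:
            t0, w0, idx = tm, wm, 2 * idx + 1
    # Linear interpolation on the final sub-interval.
    return w0 + (w1 - w0) * (t - t0) / (t1 - t0)
```

This is what lets the backward pass re-query the same noise the forward pass saw, at the cost of O(depth) extra RNG calls per query (the overhead the tweet refers to).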


Daisuke Okanohara: The gradient of the solution of a stochastic differential equation (SDE) can be computed efficiently by a stochastic adjoint sensitivity method, as in Neural ODEs; it requires only constant memory and scalable vector-Jacobian products. https://arxiv.org/abs/2001.01328

0 replies, 15 likes
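The two ingredients in the tweet above, vector-Jacobian products and replayable noise, can be shown in a discrete-time toy version (a sketch, not the paper's continuous-time adjoint SDE): simulate dY = -θY dt + σ dW with Euler–Maruyama, then run a reverse pass that accumulates dy_T/dθ using only vjps, regenerating the Brownian increments from the seed instead of storing them.

```python
import numpy as np

def simulate(theta, sigma, y0, dt, n, seed):
    """Forward Euler-Maruyama for dY = -theta*Y dt + sigma dW."""
    rng = np.random.default_rng(seed)  # noise is replayable from the seed
    ys = [y0]
    for _ in range(n):
        dW = rng.normal(0.0, np.sqrt(dt))
        ys.append(ys[-1] - theta * ys[-1] * dt + sigma * dW)
    return ys

def grad_theta(theta, sigma, y0, dt, n, seed):
    """d y_T / d theta via a reverse (adjoint-style) pass.

    Only vector-Jacobian products appear:
      grad += a * d y_{k+1}/d theta,  then  a *= d y_{k+1}/d y_k.
    The forward states are recomputed from the seed, not checkpointed."""
    ys = simulate(theta, sigma, y0, dt, n, seed)
    a, grad = 1.0, 0.0
    for k in reversed(range(n)):
        grad += a * (-ys[k] * dt)   # vjp with respect to the parameter
        a *= (1.0 - theta * dt)     # vjp with respect to the state
    return grad
```

Because the same seed reproduces the same increments, the reverse-pass gradient can be checked against finite differences of the forward simulation, which is also how one would sanity-check an adjoint implementation.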


Paul Portesi ن​: https://twitter.com/DavidDuvenaud/status/1215347970159382534?s=20

0 replies, 2 likes


注目の最新arXiv【毎日更新】 (Trending arXiv papers, updated daily): Submitted 2020/01/05, ranked #1 in LG (Machine Learning): Scalable Gradients for Stochastic Differential Equations https://arxiv.org/abs/2001.01328 7 Tweets 24 Retweets 137 Favorites

0 replies, 1 likes


Piotr Sokol: Scalable Gradients for Stochastic Differential Equations. (arXiv:2001.01328v1 [cs.LG]) http://arxiv.org/abs/2001.01328

0 replies, 1 likes


Content

Found on Jan 09 2020 at https://arxiv.org/pdf/2001.01328.pdf
