
Scalable Gradients for Stochastic Differential Equations

Comments

Jan 09 2020 David Duvenaud

Training Neural SDEs: We worked out how to do scalable reverse-mode autodiff for stochastic differential equations. This lets us fit SDEs defined by neural nets with black-box adaptive higher-order solvers. https://arxiv.org/pdf/2001.01328.pdf With @lxuechen, @rtqichen and @wongtkleonard. https://t.co/qlUwMxezjO
6 replies, 856 likes
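
For readers who want to see what "fit SDEs defined by neural nets" means concretely, here is a minimal PyTorch sketch of a neural SDE trained by backpropagating through a fixed-step Euler-Maruyama solver. The network sizes, step count, and toy loss are illustrative assumptions, not the paper's code; note that this naive approach stores every solver step in memory, whereas the paper's stochastic adjoint computes the same gradients with constant memory by solving a backward SDE.

    import torch
    import torch.nn as nn

    class NeuralSDE(nn.Module):
        """SDE dy = f(y) dt + g(y) dW, with drift f and diagonal diffusion g given by small nets."""
        def __init__(self, dim, hidden=32):
            super().__init__()
            self.f = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, dim))
            self.g = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, dim))

    def euler_maruyama(sde, y0, t0=0.0, t1=1.0, steps=100):
        """Fixed-step Euler-Maruyama simulation; gradients flow through every step."""
        dt = (t1 - t0) / steps
        y = y0
        for _ in range(steps):
            dw = torch.randn_like(y) * dt ** 0.5
            y = y + sde.f(y) * dt + sde.g(y) * dw
        return y

    sde = NeuralSDE(dim=2)
    opt = torch.optim.Adam(sde.parameters(), lr=1e-3)
    y0 = torch.zeros(64, 2)
    target = torch.ones(64, 2)  # toy regression target for the terminal state

    yT = euler_maruyama(sde, y0)
    loss = (yT - target).pow(2).mean()
    loss.backward()   # memory grows with the number of solver steps
    opt.step()
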


Jan 09 2020 Daisuke Okanohara

The gradients of solutions of stochastic differential equations (SDEs) can be computed efficiently with a stochastic adjoint sensitivity method, analogous to Neural ODEs; it requires only constant memory and scalable vector-Jacobian products. https://arxiv.org/abs/2001.01328
0 replies, 15 likes
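
The "vector-Jacobian products" mentioned above are the standard reverse-mode autodiff primitive. A hedged sketch of how one is computed with plain torch.autograd follows; the helper name vjp is illustrative, not the paper's code.

    import torch

    def vjp(func, y, v):
        """Compute v^T (d func(y) / d y) without materializing the Jacobian.
        Costs one extra backward pass, independent of the Jacobian's size."""
        y = y.detach().requires_grad_(True)
        out = func(y)
        (grad,) = torch.autograd.grad(out, y, grad_outputs=v)
        return grad

    # Example: drift net f, state y, adjoint vector a.
    f = torch.nn.Linear(3, 3)
    y = torch.randn(5, 3)
    a = torch.randn(5, 3)
    print(vjp(f, y, a).shape)  # torch.Size([5, 3])
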


Jan 09 2020 Paul Portesi ن​

https://twitter.com/DavidDuvenaud/status/1215347970159382534?s=20
0 replies, 2 likes


Jan 07 2020 Piotr Sokol

Scalable Gradients for Stochastic Differential Equations. (arXiv:2001.01328v1 [cs.LG]) http://arxiv.org/abs/2001.01328
0 replies, 1 likes

