David Duvenaud: Neural ODEs are slow. We speed them up by regularizing their higher derivatives, learning ODEs that are easy to solve:
with @jacobjinkelly @jessebett @SingularMattrix https://t.co/yy6K2JgYD9
7 replies, 645 likes
Simone Scardapane: *Learning differential equations that are easy to solve*
by @DavidDuvenaud @jacobjinkelly @jessebett @SingularMattrix
Regularizing higher derivatives in a neural ODE makes it faster to solve.
Implementation extends #JAX for Taylor-mode AD (links below).
1 replies, 38 likes
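The Taylor-mode AD these tweets mention ships in JAX as `jax.experimental.jet`. Below is a minimal sketch of how it can recursively compute Taylor coefficients of an ODE solution, which is the ingredient needed to penalize higher derivatives; the toy dynamics `f` and the `taylor_coeffs` helper are my illustration, not the authors' code.

```python
import jax.numpy as jnp
from jax.experimental.jet import jet  # Taylor-mode AD in JAX


def f(y):
    # Toy dynamics dy/dt = -y; exact solution y(t) = y(0) * exp(-t).
    return -y


def taylor_coeffs(f, y0, order):
    """Taylor coefficients y_k of the solution of y' = f(y) at t = 0,
    computed recursively with jet. Convention: y(t) = sum_k y_k t^k."""
    coeffs = [f(y0)]  # y_1 = f(y_0)
    for k in range(1, order):
        # Expand f(y_0 + y_1 t + ... + y_k t^k); matching powers of t in
        # y'(t) = f(y(t)) gives (k + 1) * y_{k+1} = (t^k coefficient of f).
        _, f_series = jet(f, (y0,), (list(coeffs),))
        coeffs.append(f_series[-1] / (k + 1))
    return [y0] + coeffs


coeffs = taylor_coeffs(f, jnp.array(1.0), order=4)
# For this f the coefficients match the series of exp(-t):
# 1, -1, 1/2, -1/6, 1/24
```

Each extra order costs one more `jet` call, which is what makes regularizing high-order derivatives tractable compared with nested `jax.grad`.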
Vector Institute: "Neural ODEs are slow," says Faculty Member @DavidDuvenaud. But in the new paper "Learning Differential Equations that are Easy to Solve," he and his co-authors "speed them up by regularizing their higher derivatives, learning ODEs that are easy to solve"
0 replies, 29 likes
arxiv: Learning Differential Equations that are Easy to Solve. http://arxiv.org/abs/2007.04504 https://t.co/pGTDfC5ITq
1 replies, 18 likes
Jesse Bettencourt: Today at #JuliaCon I'll be presenting a poster on higher-order automatic differentiation with Taylor mode, and our JAX implementation from our recent work on learning neural ODEs that are easy to solve!
Come chat about AD and how to call JAX from Julia ;)
1 replies, 15 likes
Daisuke Okanohara: Neural ODEs tend to learn unnecessarily complex dynamics, which slows down the ODE solver. To address this, they regularize 1) the speed and the norm of the Jacobian https://arxiv.org/abs/2002.02798 2) the higher-order derivatives estimated by Taylor-mode AD https://arxiv.org/abs/2007.04504
0 replies, 11 likes
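The higher-derivative regularizer these threads describe can be sketched as a penalty added to the training loss; this is my paraphrase of the idea, with $\lambda$ a regularization weight and $K$ the derivative order being penalized (the exact estimator used along the solve is in the paper):

```latex
% Regularized objective: task loss plus a penalty on the K-th order
% time derivatives of the ODE state z(t) over the integration interval.
\mathcal{L}(\theta) = \mathcal{L}_{\mathrm{task}}(\theta)
  + \lambda\, \mathcal{R}_K(\theta),
\qquad
\mathcal{R}_K(\theta) = \int_{t_0}^{t_1}
  \left\lVert \frac{d^K z(t)}{dt^K} \right\rVert_2^2 \, dt
```

Small $\mathcal{R}_K$ means the learned trajectory is well approximated by low-order polynomials, so adaptive solvers can take larger steps.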
Jacob Kelly: Excited to be sharing my first paper! It was such a journey and I can't thank @jessebett @SingularMattrix @DavidDuvenaud enough for being such supportive and overall amazing collaborators 😄
1 replies, 9 likes
HotComputerScience: Most popular computer science paper of the day:
"Learning differential equations that are easy to solve"
0 replies, 1 likes
Found on Jul 17 2020 at https://arxiv.org/pdf/2007.04504.pdf