Weekly Seminar
On Neural Differential Equations
Patrick Kidger (University of Oxford, U.K.)
Thu, 14 Oct 2021 • 10:30-11:30h • 10:15 online coffee, 10:30 talk (Zoom link will be published via the newsletter)

Abstract

Neural Differential Equations (NDEs) demonstrate that neural networks and differential equations are two sides of the same coin. Traditional parameterised differential equations are a special case. Many popular neural network architectures (e.g. residual networks, recurrent networks, StyleGAN2, coupling layers) are discretisations of differential equations. By treating differential equations as a learnt component of a differentiable computation graph, NDEs extend current physical modelling techniques whilst integrating tightly with current deep learning practice.
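
To make the residual-network correspondence concrete: a residual connection y + f(y) is exactly one explicit Euler step, with unit step size, of the ODE dy/dt = f_theta(y), so stacking residual blocks discretises an ODE over depth. The following minimal PyTorch sketch illustrates this; the class name ResidualBlock and the two-layer vector field are arbitrary illustrative choices, not code from the talk:

    import torch
    import torch.nn as nn

    class ResidualBlock(nn.Module):
        """One residual block, read as an Euler step of dy/dt = f_theta(y)."""
        def __init__(self, dim):
            super().__init__()
            # f_theta: a small learnt vector field; any architecture would do.
            self.f = nn.Sequential(nn.Linear(dim, dim), nn.Tanh(), nn.Linear(dim, dim))

        def forward(self, y):
            # y_{n+1} = y_n + f_theta(y_n): explicit Euler with step size 1.
            return y + self.f(y)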

NDEs offer high-capacity function approximation, strong priors on model space, the ability to handle irregular data, memory efficiency, and a wealth of existing theory on both the neural network and differential equation sides. They are particularly suitable for tackling dynamical systems, time series problems, and generative problems.

This talk will offer a dedicated introduction to the topic, with examples including neural ordinary differential equations (e.g. to model unknown physics), neural controlled differential equations ("continuous recurrent networks"; e.g. to model functions of time series), and neural stochastic differential equations (e.g. to model time series themselves). If time allows, I will discuss other recent work, such as novel numerical solvers for neural differential equations. This talk includes joint work with Ricky T. Q. Chen, Xuechen Li, James Foster, and James Morrill.
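
As an illustrative taste of the first example, a neural ODE puts a neural network in place of the vector field and solves the resulting ODE inside the computation graph. The minimal sketch below uses the torchdiffeq library by co-author Ricky T. Q. Chen; the network sizes and the name VectorField are assumptions made for illustration, not code from the talk:

    import torch
    import torch.nn as nn
    from torchdiffeq import odeint  # pip install torchdiffeq

    class VectorField(nn.Module):
        """Learnt vector field f_theta(t, y) of the neural ODE dy/dt = f_theta(t, y)."""
        def __init__(self, dim):
            super().__init__()
            # Hypothetical architecture; the hidden width 64 is arbitrary.
            self.net = nn.Sequential(nn.Linear(dim, 64), nn.Tanh(), nn.Linear(64, dim))

        def forward(self, t, y):
            return self.net(y)

    func = VectorField(dim=2)
    y0 = torch.randn(8, 2)            # batch of initial conditions
    t = torch.linspace(0.0, 1.0, 10)  # times at which to read out the solution

    # Gradients flow through the solver, so f_theta trains like any other layer.
    ys = odeint(func, y0, t)          # shape (10, 8, 2)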