You May Also Enjoy
Let’s take stock of where we are on our journey to add ODE capabilities to PyMC3.
OK, I will keep this one short and sweet, no math. We have a sampling notebook.
A little while ago, I wrote a post on doing gradient descent for ODEs. In that post, I used autograd to do the automatic differentiation. While neat, it was really a way for me to get familiar with some math I would later use for GSoC. After taking some time to learn more about theano, I’ve reimplemented that blog post, this time using theano to perform the automatic differentiation. If you’ve read the previous post, then skip right to the code.
Gradient descent usually isn’t used to fit Ordinary Differential Equations (ODEs) to data (at least, that isn’t how the Applied Mathematics departments I have been a part of have done it). Nevertheless, that doesn’t mean it can’t be done. For some of my recent GSoC work, I’ve been investigating how to compute gradients of solutions to ODEs without access to the solution’s analytical form. In this blog post, I describe how these gradients can be computed and how they can be used to fit ODEs to synthetic data with gradient descent.
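The idea in that excerpt can be illustrated with a toy sketch (the model, numbers, and variable names here are illustrative, not taken from the full post): for dy/dt = -k·y, the forward sensitivity s = ∂y/∂k obeys its own ODE, ds/dt = -y - k·s, so integrating the augmented system gives an exact gradient of the squared-error loss with respect to k, no analytical solution required.

```python
import numpy as np
from scipy.integrate import odeint

# Illustrative example: fit the decay rate k in dy/dt = -k*y
# using the forward sensitivity equation ds/dt = -y - k*s,
# where s = dy/dk, to obtain gradients for gradient descent.
def augmented_rhs(state, t, k):
    y, s = state
    return [-k * y, -y - k * s]

t = np.linspace(0.0, 5.0, 50)
true_k = 1.3
data = np.exp(-true_k * t)  # synthetic, noiseless observations of y(t)

k = 0.5   # initial guess
lr = 2.0  # step size (chosen by hand for this toy problem)
for _ in range(500):
    # integrate state and sensitivity together from y(0)=1, s(0)=0
    y, s = odeint(augmented_rhs, [1.0, 0.0], t, args=(k,)).T
    grad = np.mean(2.0 * (y - data) * s)  # d(MSE)/dk via sensitivities
    k -= lr * grad

# k should end up near the true decay rate used to generate the data
```

The same sensitivity-equation trick generalizes to systems of ODEs with several parameters, at the cost of one extra ODE per state-parameter pair.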