Kirill Zubov / Aug 20 2019

# JSoC 2019: Simple ODE solver using neural network

## Introduction

Hi, I'm Kirill Zubov.

I have been accepted to the Julia Season of Contributions. I will spend the following months developing a new package for solving high-dimensional partial differential equations (PDEs) using neural networks.

The primary goal of this project is to implement a Julia package for solving high-dimensional partial differential equations (PDEs) using neural networks. PDEs are widely used, but solving them with traditional methods in high dimensions suffers from the curse of dimensionality. Newly emerging deep learning techniques are promising here because of their success on many other high-dimensional problems. The main idea of the method is that the PDEs are reformulated using backward stochastic differential equations (BSDEs), and the gradient of the unknown solution is approximated by a neural network. Such neural-network methods are the key to the numerical solution of high-dimensional problems, in particular important nonlinear PDEs such as the Black–Scholes equation, the Hamilton–Jacobi–Bellman equation, the Allen–Cahn equation, and the Schrödinger equation. Implementing these methods and their interfaces in Julia is an important and timely task that will give students, engineers, and scientists easy-to-use tools for solving high-dimensional PDEs.

My first task was to implement a simple ODE solver using ideas from the research paper "Artificial Neural Networks for Solving Ordinary and Partial Differential Equations" by I. E. Lagaris, A. Likas, and D. I. Fotiadis [1].

## Simple ODE solver using neural network

I have written a simple ODE solver that uses a neural network approximation for equations of the form y'(x) = f(x) with initial condition y(0) = u0 (Fig. 1).

We look for an approximate solution in the form of a trial solution phi(x) = u0 + x*N(x, P), where N is a neural network with parameters P (Fig. 2). By construction phi(0) = u0, so the initial condition is satisfied exactly, and at every step of training we obtain an increasingly accurate solution.

The neural network tunes its weights by backpropagating gradients of the loss function, the sum over training points of the squared ODE residual (dphi/dx - f(x))^2 (Fig. 3). Computing the loss at each step, we update the weights of our neural network model and thereby train it.
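The trial-solution construction can be sketched in a few lines of Julia. The stand-in "network" here (a single random sigmoid layer, and the names `N` and `phi`) is my own illustration, not code from the solver; the point is only that phi(0) = u0 holds for any weights:

```julia
# Trial solution phi(x) = u0 + x * N(x): the initial condition phi(0) = u0
# holds by construction, regardless of the network weights.
sigm(x) = 1 ./ (1 .+ exp.(.-x))

# A stand-in "network": one hidden sigmoid layer with random weights.
w, b, v = randn(5, 1), zeros(5, 1), randn(1, 5)
N(x) = (v * sigm(w * x .+ b))[1]

u0 = 0.3                  # initial condition y(0) = u0
phi(x) = u0 + x * N(x)    # trial solution

phi(0.0) == u0            # true: the initial condition is exact
```

Training only has to make the derivative of phi match f; the initial condition never needs to be learned.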

## Result

First we need to install and load all the necessary packages.

```julia
using Pkg
Pkg.add("Plots")
Pkg.add("ForwardDiff")
using ForwardDiff
using Plots
```

Below you can find the source code of the prototype I've been working on, and an example of how to run it.

```julia
# Uniform grid of N training points on [low, high]
function generate_data(low, high, N)
    x = range(low, stop = high, length = N) |> collect
    return x
end

# Random initialisation of a single-hidden-layer network:
# P[1] = hidden weights, P[2] = hidden biases, P[3] = output weights
function init_params(hl_width)
    P = Array{Any}(undef, 3)
    P[1] = randn(hl_width, 1)
    P[2] = zeros(hl_width, 1)
    P[3] = randn(1, hl_width)
    return P
end

sigm(x) = 1 ./ (1 .+ exp.(.-x))

# Network output N(x) for a scalar input x
function predict(w, b, v, x)
    h = sigm(w*x .+ b)
    return v*h
end

# Trial solution phi(x) = u0 + x*N(x); satisfies the initial condition exactly
phi(w, b, v, x) = u0 .+ x*predict(w, b, v, x)
dydx(w, b, v, x) = ForwardDiff.derivative(x -> phi(w, b, v, x)[1], x)
# Squared residual of the ODE y' = f(x) at the point x
loss(w, b, v, x) = sum(abs2, dydx(w, b, v, x) - f(x))

# Gradients of the loss with respect to each parameter array
loss∇w(w, b, v, x) = ForwardDiff.gradient(w -> loss(w, b, v, x), w)
loss∇b(w, b, v, x) = ForwardDiff.gradient(b -> loss(w, b, v, x), b)
loss∇v(w, b, v, x) = ForwardDiff.gradient(v -> loss(w, b, v, x), v)

# Mean loss over the training points
function test(P, x)
    sumloss = numloss = 0
    for i in x
        sumloss += loss(P[1], P[2], P[3], i)
        numloss += 1
    end
    return sumloss/numloss
end

# Stochastic gradient descent: one update per training point, per epoch
function train(P, x, epochCount, lr)
    for i = 1:epochCount
        for j in x
            P[1] -= lr * loss∇w(P[1], P[2], P[3], j)
            P[2] -= lr * loss∇b(P[1], P[2], P[3], j)
            P[3] -= lr * loss∇v(P[1], P[2], P[3], j)
        end
        if test(P, x) < 1e-8
            break
        end
    end
    return P
end

# ODE solver using neural network approximation
# (relies on the globals weightCount, u0 and f defined below)
function ODENetSolver(f, N, u0, x0, xn, epochCount, lr)
    x = generate_data(x0, xn, N)
    P = init_params(weightCount)
    P = train(P, x, epochCount, lr)
    u = [phi(P[1], P[2], P[3], i)[1] for i in x]
    return (x, u)
end
```
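The update rule inside `train` is plain gradient descent, P ← P − lr·∇L, applied per training point. A minimal sketch of the same rule on a one-parameter toy problem, in base Julia (the quadratic loss `toyloss`, its hand-written gradient, and the constants are my own illustration, not part of the solver, which differentiates with ForwardDiff instead):

```julia
# Gradient descent with the same update rule as train():
# p -= lr * gradient, repeated for a fixed number of steps.
toyloss(p) = (p - 3.0)^2       # toy loss, minimised at p = 3
dtoyloss(p) = 2 * (p - 3.0)    # its gradient, written by hand

p, lr = 0.0, 0.1
for _ in 1:100
    global p -= lr * dtoyloss(p)
end
abs(p - 3.0) < 1e-6            # true: converged to the minimiser
```

The solver does exactly this, except that `p` is three parameter arrays and the gradient comes from `ForwardDiff.gradient`.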
```julia
# Problem setup: solve y' = cos(2pi*x), y(0) = 0 on [0, 1]
N = 20
weightCount = 5
u0 = 0
x0 = 0
xn = 1
epochCount = 100
lr = 0.1
f(x) = cos(2pi*x)

solve = ODENetSolver(f, N, u0, x0, xn, epochCount, lr)

# Analytic solution for comparison
f2(x) = sin(2pi*x) / (2*pi)
x = solve[1]
analyticSolve = (x, f2.(x))
plot(solve, label = "y")
plot!(analyticSolve, label = "true y", fmt = :png)
xlabel!("Solution for y' = cos(2pi*x)")
```
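The reference curve follows from integrating the right-hand side: y(x) = ∫₀ˣ cos(2πt) dt = sin(2πx)/(2π), which satisfies y(0) = 0. A quick numeric sanity check of that antiderivative in base Julia (the trapezoid helper `trapz` is my own name, not part of the solver):

```julia
# Check that f2(x) = sin(2pi*x)/(2pi) really is the integral of
# f(x) = cos(2pi*x) from 0 to x, using the trapezoid rule.
f(x) = cos(2pi * x)
f2(x) = sin(2pi * x) / (2pi)

# Trapezoid rule with n panels on [a, b]
function trapz(f, a, b, n)
    h = (b - a) / n
    xs = range(a, stop = b, length = n + 1)
    return h * (sum(f, xs) - (f(a) + f(b)) / 2)
end

maximum(abs(trapz(f, 0.0, x, 1000) - f2(x)) for x in 0.1:0.1:1.0) < 1e-5  # true
```

This is only a check on the reference curve; the quality of the neural approximation itself is judged from the plot above.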

## Conclusion

At this moment, I'm rewriting the old code so that it is compatible with Julia v1.0, and also rewriting the backpropagation part of the algorithm to use the Flux library.