Using neural networks for approximating functions and equations

Bibliographic Details
Main Author: Li, Yongming
Other Authors: Hoang Viet Ha
Format: Final Year Project
Language: English
Published: Nanyang Technological University 2019
Subjects:
Online Access:https://hdl.handle.net/10356/136490
Institution: Nanyang Technological University
Description
Summary: In this report, we develop approximation rates of ReLU neural networks for solutions to elliptic two-scale problems, stochastic parabolic initial boundary value problems, and parametric elliptic problems. We obtain bounds on the network complexity, in terms of the depth and the number of non-zero weights, of the ReLU neural network approximations to the problem solutions. In Chapter 2, we review recent results on neural network approximation theory and the operations used to construct neural networks. In Chapter 3, we employ the sparse tensor product interpolation method to construct ReLU neural networks that approximate solutions of the two-scale homogenized elliptic equations with essentially optimal network size for a prescribed accuracy. Numerical experiments illustrate the theoretical results of Chapters 2 and 3 for solving the elliptic problems. In Chapter 4, we assume that the random coefficients of the stochastic parabolic problem admit an infinite affine representation and reduce the problem to an infinite parametric problem. We express the parametric solution as a Taylor generalized polynomial chaos (gpc) expansion and perform an adaptive discretization of both the spatial-temporal and parameter domains. Using this optimized discretization, we show that for a prescribed accuracy there is a ReLU neural network approximating the parametric solution with essentially optimal network complexity. Lastly, in Chapter 5, we consider parametric elliptic problems in which the random coefficients depend on the parameters in a Lipschitz manner (a weaker assumption than that of Chapter 4). We employ the hierarchical finite element method to construct ReLU neural networks approximating the solutions to the parametric problem. Our work illustrates the expressive power and approximation capabilities of deep neural networks for approximating functions and solutions to PDE problems.
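As a minimal illustration of the kind of complexity accounting discussed in the summary (depth and number of non-zero weights of a ReLU approximant), the following Python sketch builds a one-hidden-layer ReLU network that exactly reproduces the piecewise-linear interpolant of x² on [0, 1]. It is an illustrative example only, not a construction from the report; the node count and target function are arbitrary choices.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

# Target function and interpolation nodes on [0, 1] (illustrative choices)
g = lambda x: x ** 2
nodes = np.linspace(0.0, 1.0, 9)   # 9 breakpoints -> 8 linear pieces
vals = g(nodes)

# Slope of the piecewise-linear interpolant on each sub-interval
slopes = np.diff(vals) / np.diff(nodes)

# One-hidden-layer ReLU network: f(x) = b + sum_k w_k * relu(x - t_k).
# The first unit carries the initial slope; each later unit adds the
# slope change at its breakpoint, so the sum telescopes to the interpolant.
weights = np.concatenate(([slopes[0]], np.diff(slopes)))
biases = -nodes[:-1]               # unit k switches on at node t_k
b_out = vals[0]

def net(x):
    # x: 1-D array of evaluation points; broadcasting gives shape (units, points)
    return b_out + weights @ relu(x + biases[:, None])

xs = np.linspace(0.0, 1.0, 1001)
err = np.max(np.abs(net(xs) - g(xs)))
print(f"hidden units: {len(weights)}, max error: {err:.2e}")
```

With 8 hidden units (depth 2, about 17 non-zero weights) the interpolation error is of order h²; halving the mesh width quarters the error, the trade-off between network size and accuracy that the bounds in the report quantify for PDE solutions.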