Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/136490
Title: Using neural networks for approximating functions and equations
Authors: Li, Yongming
Keywords: Science::Mathematics::Analysis
Science::Mathematics::Applied mathematics::Numerical analysis
Issue Date: 2019
Publisher: Nanyang Technological University
Abstract: In this report, we develop approximation rates of ReLU neural networks for solutions to elliptic two-scale problems, stochastic parabolic initial boundary value problems, and parametric elliptic problems. We obtain bounds on the network complexities (the depth and the number of non-zero weights) of the ReLU neural network approximations to the problem solutions. In Chapter 2, we review recent results on neural network approximation theory and the operations used to construct neural networks. In Chapter 3, we employ the sparse tensor product interpolation method to construct ReLU neural networks approximating solutions of the two-scale homogenized elliptic equations, with essentially optimal network size for a prescribed accuracy. Numerical experiments illustrate the theoretical results on solving the elliptic problems of Chapters 2 and 3. In Chapter 4, we assume that the random coefficients of the stochastic parabolic problem have an infinite affine representation, and we reduce the problem to an infinite parametric problem. We express the parametric solution as a Taylor generalized polynomial chaos (gpc) expansion and perform an adaptive discretization on both the space-time and parameter domains. Using this optimized discretization, we show that for a prescribed accuracy there is a ReLU neural network approximating the parametric solution with essentially optimal network complexity. Lastly, in Chapter 5, we consider parametric elliptic problems in which the random coefficients depend on the parameters in a Lipschitz manner (a weaker assumption than in Chapter 4). We employ the hierarchical finite element method to construct ReLU neural networks approximating the solutions of the parametric problem. Our work illustrates the expressive power and approximation capabilities of deep neural networks for approximating functions and solutions to PDE problems.
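A small self-contained illustration of the basic fact such approximation results build on: a one-hidden-layer ReLU network can realize any continuous piecewise-linear interpolant exactly, with the hidden weights given by slope changes at the knots. This sketch is not taken from the thesis; the function `relu_interpolant` and the knot setup are our own illustrative assumptions.

```python
import numpy as np

def relu(x):
    # ReLU activation: max(x, 0), applied elementwise.
    return np.maximum(x, 0.0)

def relu_interpolant(f, knots):
    """Return g(x) = f(knots[0]) + sum_k c_k * relu(x - knots[k]),
    a one-hidden-layer ReLU network equal to the piecewise-linear
    interpolant of f at the given (sorted) knots."""
    y = f(knots)
    slopes = np.diff(y) / np.diff(knots)          # slope on each interval
    c = np.concatenate([[slopes[0]], np.diff(slopes)])  # slope changes
    def g(x):
        x = np.asarray(x, dtype=float)
        return y[0] + sum(ck * relu(x - xk) for ck, xk in zip(c, knots[:-1]))
    return g

# Example: interpolate f(x) = x^2 on [0, 1] with 9 knots (h = 1/8).
# For a C^2 function the interpolation error is O(h^2), consistent with
# the first-order approximation rates of shallow ReLU networks.
knots = np.linspace(0.0, 1.0, 9)
g = relu_interpolant(lambda x: x**2, knots)
```

The network here has one hidden unit per interior knot; the approximation theory surveyed in Chapter 2 improves on such shallow constructions by composing deeper networks.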
URI: https://hdl.handle.net/10356/136490
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:SPMS Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: Using neural networks for approximating functions and equations.pdf
Description: Restricted Access
Size: 2.33 MB
Format: Adobe PDF

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.