|Title:||Numerical analysis of some Bayesian inverse problems|
|Authors:||Quek, Jia Hao|
|Keywords:||DRNTU::Science::Mathematics|
|Issue Date:||31-Dec-2018|
|Source:||Quek, J. H. (2018). Numerical analysis of some Bayesian inverse problems. Doctoral thesis, Nanyang Technological University, Singapore.|
|Abstract:||Bayesian inverse problems for partial differential equations arise from many important real-world applications. We seek the physical properties of a medium, in the form of an unknown coefficient of a partial differential equation, given limited noisy observations of the solution. The noise follows a known probability distribution, and the unknown coefficient, or the parameters on which it depends, belongs to a prior probability space. We aim to find the posterior probability, which is the conditional probability of the unknown given the observations. The first part of the thesis is devoted to Bayesian inverse problems of finding the unknown locally periodic coefficient of a two-scale elliptic equation. In the remaining part, we develop the Multilevel Markov Chain Monte Carlo method for approximating the posterior expectation of a quantity of interest in Bayesian inverse problems for partial differential equations. The method achieves an approximation within a prescribed accuracy while using only an optimal number of total degrees of freedom. We develop the method for elliptic forward equations with a Gaussian prior distribution for the log-normal coefficients, and for parabolic equations with both uniform and Gaussian prior distributions for the coefficients.

In Chapter 2, we consider the problem of finding the coefficients of locally periodic two-scale elliptic problems. We consider both a uniform prior and a Gaussian prior in the space of locally periodic coefficients. In the first case, the coefficient is uniformly coercive and bounded for all realizations. In the second case, the coefficient is positive but can get arbitrarily large and arbitrarily close to $0$.
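As a concrete illustration of the Gaussian-prior case, the following is a minimal sketch (not the thesis's construction) of one log-normal coefficient realization on a hypothetical one-dimensional domain; the sine basis functions and the $1/j^2$ decay of the expansion are assumptions for illustration. Each realization is strictly positive, yet across realizations the coefficient has no uniform upper bound and no uniform lower bound away from zero.

```python
import numpy as np

def lognormal_coefficient(y, x):
    """One realization a(x) = exp(sum_j y_j * psi_j(x)) of a log-normal
    coefficient. y are standard-normal parameters drawn from the Gaussian
    prior; psi_j are hypothetical sine basis functions with 1/j^2 decay."""
    log_a = sum(y_j * np.sin((j + 1) * np.pi * x) / (j + 1) ** 2
                for j, y_j in enumerate(y))
    return np.exp(log_a)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 101)         # grid on the unit interval
y = rng.standard_normal(20)            # one draw from the Gaussian prior
a = lognormal_coefficient(y, x)
assert np.all(a > 0)                   # positive for every realization
```

Because the parameters `y_j` are unbounded Gaussians, no single coercivity or boundedness constant works for all draws, which is exactly why the uniform-prior convergence arguments break down in this setting.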
We approximate the posterior by the probability measure obtained from the solution of the two-scale homogenized equation. We show an explicit error for this approximation, with respect to the Hellinger distance of the two measures, in terms of the microscopic scale. This error estimate holds when the solution of the two-scale homogenized equation is sufficiently regular. The two-scale homogenized equation provides all the information we need: the solution to the homogenized equation, which approximates the solution to the two-scale forward equation macroscopically, and the corrector term, which encodes the microscopic behavior. Although this equation is posed in a high-dimensional tensorized domain, the sparse tensor product finite element method developed in V. H. Hoang and Ch. Schwab, Multiscale Model. Simul., Vol. 3, pp. 168-194 (2005), solves it with essentially optimal complexity. We then approximate the posterior measure by the measure obtained from the sparse tensor product finite element solution of the two-scale homogenized equation, with an explicit error in terms of the finite element mesh and the microscopic scale. We show numerically that observations on the macroscopic behavior alone are not sufficient to infer the microstructure; we also need observations on the corrector. Solving the two-scale homogenized equation, we obtain both the solution to the homogenized equation and the corrector, so our method is particularly suitable for sampling the posterior measure of two-scale coefficients.

Chapter 3 reviews the Multilevel Markov Chain Monte Carlo method for approximating the posterior expectation of a quantity of interest in Bayesian inverse problems for forward elliptic equations with a uniform prior probability. This chapter serves as a basis for the development in the subsequent chapters. We define the uniform prior probability space and then outline the Multilevel Markov Chain Monte Carlo method developed in V. H. Hoang, Ch. Schwab and A. M. Stuart, Inverse Problems, Vol. 29, 085010, 37pp (2013). The method achieves an approximation of the posterior expectation of a quantity of interest within a prescribed accuracy, using only an optimal number of degrees of freedom (up to a possible logarithmic factor). We show that the logarithmic factor in the error can be reduced by slightly increasing the Markov Chain Monte Carlo sample size. The paper by Hoang et al. does not include numerical examples, so we perform new numerical examples to illustrate the theory.

The Multilevel Markov Chain Monte Carlo method by Hoang et al. is only valid for a uniform prior probability. Their proof of convergence is not valid for a Gaussian prior probability on the coefficient of a forward elliptic equation, as it relies essentially on the uniform boundedness of the solution. In Chapter 4, we develop a new Multilevel Markov Chain Monte Carlo method for elliptic equations with a Gaussian prior probability. We show theoretically the convergence of the method, together with an explicit optimal convergence rate. We provide numerical examples using both the independence sampler and the preconditioned Crank-Nicolson (pCN) sampler to verify the rigorously justified theoretical convergence rate.

In Chapter 5, we consider Bayesian inverse problems of finding the coefficient of forward parabolic equations with a uniform prior probability. We take the coefficient of the parabolic equation to be an expansion in random variables, each uniformly distributed in a compact interval, and assume that the coefficient is uniformly coercive and bounded for all realizations. We approximate the posterior measure via the approximate solution of the forward parabolic equation, solved by the backward Euler finite element method with the coefficient truncated to a finite number of terms of the expansion. We then develop the Multilevel Markov Chain Monte Carlo method to estimate the posterior expectation of a quantity of interest.
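The pCN sampler mentioned above is a standard dimension-robust MCMC method for Gaussian priors, and a generic sketch is easy to state. The following is such a sketch, not the thesis's actual forward problem: the prior is $N(0, I_d)$ on $\mathbb{R}^d$ and the negative log-likelihood `phi` is a hypothetical quadratic potential standing in for the PDE forward map.

```python
import numpy as np

def pcn_sampler(phi, d, n_steps, beta=0.5, seed=0):
    """Preconditioned Crank-Nicolson MCMC for a posterior with Gaussian
    prior N(0, I_d) and negative log-likelihood phi. The proposal
    v = sqrt(1 - beta^2) * u + beta * xi, with xi ~ N(0, I_d), preserves
    the prior, so the acceptance ratio involves only phi, not the prior
    density -- this is what makes pCN robust to the dimension d."""
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(d)               # start from a prior draw
    samples = np.empty((n_steps, d))
    for n in range(n_steps):
        xi = rng.standard_normal(d)
        v = np.sqrt(1.0 - beta**2) * u + beta * xi
        if np.log(rng.uniform()) < phi(u) - phi(v):   # accept/reject
            u = v
        samples[n] = u
    return samples

# Hypothetical potential: datum 1 observed in each coordinate with unit
# noise, so phi(u) = ||u - 1||^2 / 2 and the posterior is N(1/2, 1/2 I).
phi = lambda u: 0.5 * np.sum((u - 1.0) ** 2)
samples = pcn_sampler(phi, d=4, n_steps=20000)
posterior_mean = samples[5000:].mean(axis=0)   # should be near 0.5
```

In the multilevel method, chains of this kind are run at a hierarchy of discretization levels, with the sample size at each level chosen to balance the discretization error against the sampling error.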
As in the paper by Hoang, Schwab and Stuart, the key point in achieving optimal convergence is to judiciously balance the level of resolution and the sample size of each run of the Markov Chain Monte Carlo algorithm. We establish the convergence rate rigorously and present numerical examples that support the theory.

In Chapter 6, we consider a Gaussian prior probability where the coefficient of the parabolic equation takes the log-normal form. We use the backward Euler finite element method to approximate the truncated forward parabolic equation, whose coefficient retains only a finite number of terms of the expansion. This leads to an approximation of the posterior measure in which the forward functional is determined from the approximate solution of the forward equation. We then approximate the posterior expectation of a quantity of interest by this approximate posterior measure, and develop the Multilevel Markov Chain Monte Carlo method to estimate the posterior expectation in this case of forward parabolic equations with log-normal coefficients. We show theoretically the convergence of the method, together with an explicit optimal convergence rate. We present numerical examples for both the independence sampler and the preconditioned Crank-Nicolson sampler; the results in both cases verify the theoretical prediction.|
|URI:||https://hdl.handle.net/10356/82985|
|DOI:||https://doi.org/10.32657/10220/47555|
|Fulltext Permission:||open|
|Fulltext Availability:||With Fulltext|
|Appears in Collections:||SPMS Theses|
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.