Seminar:
Spring 2011, Thursdays, Milner 216, 4:00-4:50 PM
Date: April 7
Kainan Wang
Abstract
Inverse problems are very often ill-posed: the solution may be non-unique or may depend unstably on the data. Even for well-posed problems, the numerical approximations can be ill-conditioned, i.e., highly sensitive to perturbations. In the real world, measurements are always contaminated by perturbations, or 'noise', so solving ill-posed problems from noisy data can be very difficult. Mathematicians have developed a variety of clever techniques for dealing with such problems. One of these ideas is to enlist the help of statistics, an approach often called Bayesian inversion. Instead of producing a single estimate, the statistical approach focuses on estimating a probability distribution over the parameters. In this talk, I will try to explain some of the general principles behind the statistical inversion method. The focus will be on Markov chain Monte Carlo (MCMC) sampling methods, and a simple example will be given to illustrate the use of the technique. Little background is needed beyond the material of the first half of an undergraduate probability course. A quick three-minute review of Bayes' formula (e.g., on Wikipedia) will help in following the talk.
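For readers who want a concrete feel for the technique before the talk, the following is a minimal sketch (not taken from the talk itself) of random-walk Metropolis sampling, one of the simplest MCMC methods, applied to a toy Bayesian inverse problem: recovering a scalar parameter x from noisy observations y_i = f(x) + noise. By Bayes' formula, the posterior is proportional to likelihood times prior. The forward model f, noise level, prior, and step size below are all illustrative assumptions.

```python
import math
import random

random.seed(0)

def forward(x):
    """Hypothetical forward model mapping the unknown parameter to an observation."""
    return x ** 3 + x

# Synthetic measurements: the (assumed) true parameter plus Gaussian noise.
x_true, noise_std = 1.5, 0.3
data = [forward(x_true) + random.gauss(0.0, noise_std) for _ in range(5)]

def log_posterior(x):
    """Unnormalized log posterior = Gaussian log likelihood + Gaussian log prior."""
    log_like = sum(-0.5 * ((y - forward(x)) / noise_std) ** 2 for y in data)
    log_prior = -0.5 * (x / 2.0) ** 2     # N(0, 2^2) prior on x (an assumption)
    return log_like + log_prior

# Random-walk Metropolis: propose x' = x + step, accept with probability
# min(1, posterior(x') / posterior(x)); otherwise keep the current state.
samples, x, step = [], 0.0, 0.05
for _ in range(20000):
    x_prop = x + random.gauss(0.0, step)
    delta = log_posterior(x_prop) - log_posterior(x)
    if random.random() < math.exp(min(0.0, delta)):   # Metropolis acceptance rule
        x = x_prop
    samples.append(x)

kept = samples[5000:]                     # discard burn-in
print(f"posterior mean ~ {sum(kept) / len(kept):.2f} (true value {x_true})")
```

The output of the chain is not a single answer but a collection of samples approximating the posterior distribution over the parameter, from which a mean, a credible interval, or other summaries can be computed.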