Please use this identifier to cite or link to this item:
http://theses.ncl.ac.uk/jspui/handle/10443/5886
Title: Computational methods for automatic Bayesian inference
Authors: Fisher, Matthew Alexander
Issue Date: 2022
Publisher: Newcastle University
Abstract: Bayesian inference provides a principled approach for combining prior knowledge with mathematical models and observational data. Driven in part by the advent of modern computing technologies, Bayesian methodology is now widely used in diverse fields, facilitating scientific analyses, applications of machine learning and artificial intelligence, and even serving as the basis of probabilistic methods for numerical computation. The computational cost associated with Bayesian methodology is still considered to be high, however, with key challenges including the numerical approximation of posterior distributions (Green et al., 2015) and experimental design (Kleinegesse & Gutmann, 2019). In many cases, the computational cost of Bayesian inference places a practical limit on the phenomena that can be modelled. This thesis attempts to address these challenges through three main contributions.

Firstly, we consider the application of Bayesian methods to numerical integration, where Bayesian cubature methods have generated interest but have been criticised for their lack of adaptivity relative to classical cubature methods. By developing non-stationary, adaptive approaches to Bayesian cubature, together with appropriate computational methodology, we demonstrate how improved cubature performance can be achieved relative to existing Bayesian cubature approaches. The insight and methodology developed extend beyond cubature to other numerical tasks, such as the solution of differential equations, where adaptation is known to be crucial. We also consider directly interpreting a classical non-probabilistic cubature method through the Bayesian lens and explore the theoretical ramifications.

Secondly, in an attempt to automate aspects of the preceding methodology, we present GaussED, a simple probabilistic programming language coupled to a powerful experimental design engine. Together, these components allow a user to automate sequential experimental design for approximating a (possibly nonlinear) quantity of interest in Gaussian process models. With a few lines of code, a user can quickly emulate a computer model, perform Bayesian cubature, solve linear partial differential equations, perform tomographic reconstruction from integral data and perform Bayesian optimisation with gradient data, all with sequential adaptation automatically built in. This is achieved by formulating experimental design in a decision-theoretic framework where, instead of specifying a bespoke acquisition function, the user specifies a loss function relevant to their quantity of interest.

Finally, we explore methods for extending the aforementioned methodology to the non-Gaussian and, consequently, non-conjugate case. The methods we explore are variants of a modern approach to variational inference called measure transport. Measure transport underpins several recent algorithms for posterior approximation in the Bayesian context, wherein a transport map is sought to minimise the Kullback–Leibler divergence (KLD) from the posterior to the approximation. In this part, we propose to minimise a kernel Stein discrepancy (KSD) instead. The consistency of the associated posterior approximation is established, and empirical results suggest that KSD is a competitive and more flexible alternative to KLD for measure transport.
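To make the cubature contribution concrete, the following is a minimal, self-contained sketch of standard (non-adaptive) Bayesian cubature in one dimension, not the adaptive method developed in the thesis: a Gaussian process prior with an RBF kernel is placed on the integrand, and the posterior mean and variance of its integral against a standard Gaussian measure then follow in closed form via the kernel mean embedding. The node locations, lengthscale and test integrand are illustrative choices.

```python
# A sketch of basic Bayesian cubature for I = E_{x ~ N(0,1)}[f(x)],
# under a GP prior f ~ GP(0, k) with k(x, x') = exp(-(x - x')^2 / (2 l^2)).
import numpy as np

def bayesian_cubature(f, nodes, lengthscale=1.0):
    """Posterior mean and variance of the integral of f against N(0, 1)."""
    x = np.asarray(nodes, dtype=float)
    y = f(x)
    # Gram matrix of the RBF kernel, with a small jitter for stability.
    K = np.exp(-0.5 * (x[:, None] - x[None, :])**2 / lengthscale**2)
    K += 1e-8 * np.eye(len(x))
    s2 = lengthscale**2
    # Kernel mean embedding z_i = E_{x ~ N(0,1)}[k(x, x_i)] (closed form).
    z = np.sqrt(s2 / (s2 + 1.0)) * np.exp(-0.5 * x**2 / (s2 + 1.0))
    # Initial variance E[k(x, x')] under independent N(0,1) arguments.
    c = lengthscale / np.sqrt(s2 + 2.0)
    w = np.linalg.solve(K, z)           # cubature weights
    return w @ y, c - z @ w             # posterior mean, posterior variance

# Example: E[x^2] under N(0,1) equals 1.
mean, var = bayesian_cubature(lambda x: x**2, np.linspace(-3.0, 3.0, 15))
print(f"estimate = {mean:.4f}, posterior variance = {var:.2e}")
```

The posterior variance is what distinguishes the Bayesian approach from a classical cubature rule with the same weights: it quantifies residual uncertainty about the integral, and in sequential designs it can be used to choose the next evaluation point.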
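The contrast drawn in the final contribution can be stated in generic notation (not necessarily the thesis's own). A transport map T pushes a simple reference distribution ρ forward to an approximation q = T#ρ of the posterior π; classical measure transport minimises a Kullback–Leibler objective, whereas the proposal here is to minimise a kernel Stein discrepancy, which requires π only up to its normalising constant:

```latex
% Sketch in generic notation; T_{\#}\rho denotes the pushforward of \rho under T.
T^{\star}_{\mathrm{KL}}
  = \operatorname*{arg\,min}_{T} \mathrm{KL}\bigl(T_{\#}\rho \,\big\|\, \pi\bigr),
\qquad
T^{\star}_{\mathrm{KSD}}
  = \operatorname*{arg\,min}_{T} \mathrm{KSD}\bigl(T_{\#}\rho \,\big\|\, \pi\bigr),
\qquad
\mathrm{KSD}^{2}\bigl(q \,\big\|\, \pi\bigr)
  = \mathbb{E}_{x, x' \sim q}\bigl[k_{\pi}(x, x')\bigr],
```

where k_π is a Stein kernel built from a base kernel k and the score ∇ log π, so the KSD objective depends on π only through ∇ log π and hence avoids the normalising constant.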
Description: PhD Thesis
URI: http://hdl.handle.net/10443/5886
Appears in Collections: School of Mathematics, Statistics and Physics
Files in This Item:
File | Description | Size | Format
---|---|---|---
FisherMA2022.pdf | Thesis | 8.79 MB | Adobe PDF
dspacelicence.pdf | Licence | 43.82 kB | Adobe PDF