Licentiate thesis 2023-002

Integrating Prior Knowledge into Machine Learning Models with Applications in Physics

Philipp Pilar

20 September 2023

Abstract:

At the extremes, two antithetical approaches to describing natural processes exist. Theoretical models can be derived from first principles, allowing for clear interpretability; on the downside, this approach may be infeasible or inefficient for complex systems. Alternatively, methods from statistical machine learning can be employed to learn black-box models from large amounts of data; such models, however, offer little or no insight into their inner workings.

Since the two approaches have complementary strengths and weaknesses, it is natural to ask how they may be combined to create better models. This is the question that the field of physics-informed machine learning is concerned with, and which we will consider in this thesis. More precisely, we investigate ways of integrating additional prior knowledge into machine learning models.

In Paper I, we consider multitask Gaussian processes and devise a way to include so-called sum constraints into the model, where a nonlinear sum of the outputs is required to equal a known value. In Paper II, we consider the task of determining unknown parameters from data when solving partial differential equations (PDEs) with physics-informed neural networks. Given the prior knowledge that the measurement noise is homogeneous but otherwise unknown, we demonstrate that it is possible to learn the solution and parameters of the PDE jointly with the noise distribution. In Paper III, we consider generative adversarial networks, which may produce realistic-looking samples yet fail to reproduce the true data distribution. In our work, we mitigate this issue by matching the true and generated distributions of statistics extracted from the data.
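The joint-estimation idea behind Paper II can be illustrated with a deliberately simplified toy example (a sketch only, not the method of the paper, which uses physics-informed neural networks): for the ODE du/dx = λu with u(0) = 1, both the unknown physical parameter λ and the unknown homogeneous noise scale σ can be recovered from noisy observations by maximizing a Gaussian log-likelihood over both quantities at once. The grid search, the specific ODE, and all numerical values below are illustrative choices.

```python
import math
import random

# Generate noisy observations of the ODE solution u(x) = exp(lam_true * x)
random.seed(0)
lam_true, sigma_true = 1.5, 0.05
xs = [i / 49 for i in range(50)]
ys = [math.exp(lam_true * x) + random.gauss(0.0, sigma_true) for x in xs]

def neg_log_lik(lam, sigma):
    # Gaussian negative log-likelihood (up to constants) of the data
    # under candidate parameter lam and noise scale sigma.
    rss = sum((y - math.exp(lam * x)) ** 2 for x, y in zip(xs, ys))
    return len(xs) * math.log(sigma) + 0.5 * rss / sigma ** 2

# Crude joint grid search over (lam, sigma); a real implementation
# would use gradient-based optimization instead.
best = min(
    (neg_log_lik(l, s), l, s)
    for l in [0.5 + 0.01 * i for i in range(201)]
    for s in [0.01 + 0.001 * j for j in range(191)]
)
_, lam_hat, sigma_hat = best
```

Because the likelihood couples the residuals and the noise scale, both unknowns are identified together: the recovered `lam_hat` and `sigma_hat` land close to the true values 1.5 and 0.05.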
