Previously, we focused on general methods for approximating posterior distributions, such as MCMC and Variational Inference. Now, we turn our attention to Gaussian Processes (GPs), a powerful framework within Bayesian non-parametric modeling. Unlike models defined by a fixed number of parameters, GPs allow the model complexity to adapt to the data by placing priors directly over functions.
In this chapter, we will explore the theory behind these concepts and apply them through hands-on coding exercises. The chapter is organized as follows:
4.1 Introduction to Bayesian Non-parametric Modeling
4.2 Defining Gaussian Processes: Priors Over Functions
4.3 Covariance Functions (Kernels): Properties and Selection
4.4 Gaussian Process Regression Formulation
4.5 Hyperparameter Optimization via the Marginal Likelihood
4.6 Gaussian Process Classification Techniques
4.7 Approximations for Scalable Gaussian Processes
4.8 Hands-on Practical: Gaussian Process Modeling
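As a preview of the ideas developed in the sections above, placing a prior over functions means that any finite set of input points induces a multivariate Gaussian over the corresponding function values, whose covariance matrix is built from a kernel. The sketch below (a minimal illustration using NumPy; the RBF kernel, its hyperparameter values, and the jitter term are illustrative choices, not the chapter's canonical implementation) draws sample functions from a GP prior:

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    # Squared-exponential (RBF) covariance between two sets of 1-D inputs.
    sqdist = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / lengthscale**2)

rng = np.random.default_rng(0)
x = np.linspace(-5.0, 5.0, 100)          # finite set of evaluation points
K = rbf_kernel(x, x)                     # prior covariance over f(x)
# A small "jitter" on the diagonal keeps the covariance numerically
# positive definite before sampling (a common practical assumption).
samples = rng.multivariate_normal(np.zeros(len(x)), K + 1e-8 * np.eye(len(x)), size=3)
print(samples.shape)  # (3, 100): three sampled functions on 100 points
```

Each row of `samples` is one draw from the prior; nearby inputs receive similar function values because the RBF kernel assigns them high covariance, which is the sense in which the kernel encodes assumptions about the functions.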
© 2025 ApX Machine Learning