Probabilistic Machine Learning Basics
Sep 23, 2024
·
1 min read
This note covers a selection of classical models in probabilistic machine learning. Most of the content is drawn from Pattern Recognition and Machine Learning and the Cambridge MLMI lectures (2023–24).
However, it may not be very friendly to ML beginners: familiarity with basic machine learning, *e.g.* linear regression, logistic regression, Bayesian inference, MLE and MAP, is recommended. I apologize for any typos in this note; I will fix them as time allows.
This note is outlined as follows:
- Preliminary: basic distributions and mathematical techniques that may be used
- Linear Models for Regression: a review from both the decision-making perspective and the statistical one
- Linear Models for Classification:
- Generative models: Fisher’s Discriminant model
- Discriminative models: Logistic regression, Iterative Reweighted Least Square, Multiclass Logistic regression, Probit regression and Bayesian Logistic regression
- Kernel methods: kernels, RKHS and kernel regression
- Gaussian Processes: GP regression, GP classification and large-scale kernel approximation
- Kernel Machines: SVM and RVM
- Graphical Models: Bayesian Network and Markov Random Field
- Expectation-Maximization: a review from both the approximate-inference perspective and the KL-divergence one
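
As a quick warm-up on the prerequisites mentioned above, here is a minimal sketch (my own illustration, not from the note itself) contrasting the MLE and MAP estimates for linear regression with a zero-mean Gaussian prior, which reduces to ordinary least squares vs. ridge regression; the data and regularization strength are toy choices:

```python
import numpy as np

# Toy data: y = 0.5 + 2x + Gaussian noise (assumed setup for illustration)
rng = np.random.default_rng(0)
X = np.hstack([np.ones((20, 1)), rng.normal(size=(20, 1))])  # bias column + feature
true_w = np.array([0.5, 2.0])
y = X @ true_w + 0.1 * rng.normal(size=20)

# MLE under Gaussian noise = ordinary least squares:
#   w_mle = (X^T X)^{-1} X^T y
w_mle = np.linalg.solve(X.T @ X, X.T @ y)

# MAP with a zero-mean Gaussian prior N(0, alpha^{-1} I) = ridge regression:
#   w_map = (X^T X + lam I)^{-1} X^T y,  where lam = sigma^2 * alpha
lam = 1.0  # arbitrary regularization strength for this sketch
w_map = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

print("MLE:", w_mle)
print("MAP:", w_map)
```

Note how the Gaussian prior shrinks the MAP estimate toward zero relative to the MLE; the same MLE-vs-MAP pattern recurs throughout the models in this note.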
The full note is provided in