STAT 339: Slides


Part I: Machine Learning Ideas

01: Introduction
02: K-Nearest Neighbors / Evaluation
03: Cross-Validation
04: Simple Linear Regression (Linear Regression I)
05: Multiple Linear Regression (Linear Regression II)
06: Regularized Regression (Linear Regression III)

Part II: Probability and Statistics / Models for Supervised Learning

07-08: Probability / Random Variables (Probability I and II)
09-10: Expectation / Random Vectors (Probability III and IV)
11: Maximum Likelihood and the Linear Model
12: Bias-Variance Tradeoff in Supervised Learning
13: Bias and Variance in Linear Regression
14: Bayesian Inference I
15: Bayesian Inference II
16: Bayesian Inference III
17-18: Naive Bayes Classification
19: Bayesian Regression / Bayesian Occam’s Razor

Part III: Approximation Methods

20: Logistic Regression With Newton-Raphson (Approximate Inference I)
21: Laplace Approximation (Approximate Inference II)
22: Sampling (Approximate Inference III)

Part IV: Models with Latent Variables (Unsupervised Learning)

23: K-Means (Clustering I)
24: Mixture of Gaussians and EM (Clustering II)
25: Theory Behind EM (Clustering III)
26: Bayesian Clustering / Gibbs Sampling (Clustering IV)
27-28: Markov Chain Monte Carlo
29: Belief Networks / Graphical Models I
30: Belief Networks / Graphical Models II
31: Forward-Backward Algorithm (Hidden Markov Models I)
32: EM for HMMs (HMMs II)
33: Gibbs Sampling for HMMs (HMMs III)

Part V: Bayesian Nonparametric Models

34-35: Gaussian Process Regression
36: Gaussian Process Classification
37-38: Infinite Mixture Model (Nonparametric Clustering)