STAT 339 (Spring 2020): Slides

Slides

01 (2/03): Introduction
02 (2/05): Classification / Evaluating a Classifier
03 (2/07): Cross Validation
04 (2/10): Linear Regression I: One Predictor
05-06 (2/12-2/14): Linear Regression II: Multiple Predictors
07-08 (2/17-2/19): Linear Regression III: Polynomials and Regularization
09-10 (2/21-2/24): Probability I: Probability Spaces / Random Variables
11-12 (2/26-2/28): Probability II: Discrete Random Variables / Joint and Conditional Probability
13-14 (3/02-3/04): Probability III: Continuous Random Variables / Expectation and Variance (UPDATED 3/5)
15 (3/06): Maximum Likelihood Estimation
16 (3/09): Bias-Variance Tradeoff
17 (3/11): Bias and Variance in Regression
18 (3/30): Bayesian Inference I: Fundamentals
19 (4/01): Bayesian Inference II: Conjugate Priors
20 (4/03): Bayesian Inference III: Conjugate Priors and Exponential Families
21 (4/06): Belief Networks
22 (4/08): Bayesian Inference IV: Prediction and Model Selection

Archive of Slides from Spring 2017

Part I: Machine Learning Ideas

01: Introduction
02: K-nearest neighbors / Evaluation
03: Cross-Validation
04: Simple Linear Regression (Linear Regression I)
05: Multiple Linear Regression (Linear Regression II)
06: Regularized Regression (Linear Regression III)

Part II: Probability and Statistics / Models for Supervised Learning

07-08: Probability / Random Variables (Probability I and II)
09-10: Expectation / Random Vectors (Probability III and IV)
11: Maximum Likelihood and the Linear Model
12: Bias-Variance Tradeoff in Supervised Learning
13: Bias and Variance in Linear Regression
14: Bayesian Inference I
15: Bayesian Inference II
16: Bayesian Inference III
17-18: Naive Bayes Classification
19: Bayesian Regression / Bayesian Occam’s Razor

Part III: Approximation Methods

20: Logistic Regression With Newton-Raphson (Approximate Inference I)
21: Laplace Approximation (Approximate Inference II)
22: Sampling (Approximate Inference III)

Part IV: Models with Latent Variables (Unsupervised Learning)

23: K-Means (Clustering I)
24: Mixture of Gaussians and EM (Clustering II)
25: Theory Behind EM (Clustering III)
26: Bayesian Clustering / Gibbs Sampling (Clustering IV)
27-28: Markov Chain Monte Carlo
29: Belief Networks / Graphical Models
30: Belief Networks / Graphical Models II
31: Forward-Backward Algorithm (Hidden Markov Models I)
32: EM for HMMs (HMMs II)
33: Gibbs Sampling for HMMs (HMMs III)

Part V: Bayesian Nonparametric Models

34-35: Gaussian Process Regression
36: Gaussian Process Classification
37-38: Infinite Mixture Model (Nonparametric Clustering)