CS 189/289A: Intro to Machine Learning

UC Berkeley, Spring 2026

Wheeler 150, Tuesdays and Thursdays 2pm-3:30pm


Jennifer Listgarten

Course staff email: cs189-instructors@berkeley.edu. This email is monitored by the instructors, the head TAs, and a few lead TAs.

Welcome to Week 8 of CS 189/289A!

Schedule

Each week lists the lecture (with slides/video links), recommended reading, discussion section materials, and homework.
Week 1
Tue Jan 20: Lecture 1. Introduction + ML problem framing (PDF)
Reading: 1.1 (The Impact of Deep Learning), 1.2 (A Tutorial Example), 1.3 (A Brief History of Machine Learning)
Section: No section
Thu Jan 22: Lecture 2. Data Tools (PDF / Video)
Week 2
Tue Jan 27: Lecture 3. Machine Learning Mechanics: Terminology and Techniques (PDF / Video)
Reading: 1.1 (The Impact of Deep Learning), 1.2 (A Tutorial Example), 1.2.1 (Synthetic data), 1.2.2 (Linear models), 1.2.3 (Error function), 1.2.4 (Model complexity), 1.2.5 (Regularization), 1.2.6 (Model selection), 3.5.3 (Nearest-neighbours), 4.1 (Linear Regression), 4.1.1 (Basis functions), 5.4.3 (Logistic regression), 9.1 (Inductive Bias), 9.1.2 (No free lunch theorem)
Section: Discussion 1 (PDF / Solutions / Walkthrough)
Homework 1, due Fri Feb 20: Part 1: Written / Part 1: Coding / Part 2: Coding
Thu Jan 29: Lecture 4. Clustering, Probability Review (PDF / Video)
Reading: 2.1 (The Rules of Probability), 2.1.2 (The sum and product rules), 2.1.3 (Bayes' theorem), 2.1.6 (Independent variables), 2.2 (Probability Densities), 15.1 (K-means Clustering)
Week 3
Tue Feb 03: Lecture 5. Intro to Maximum Likelihood Estimation, Multivariate Gaussians, Mixture of Gaussians (PDF / Video)
Reading: 2.3.2 (Likelihood function), 3.1.3 (Multinomial distribution), 3.2 (Multivariate Gaussian), 3.2.1 (Geometry of Gaussian), 3.2.7 (Maximum likelihood), Appendix C (Lagrange multipliers); optional: 2.2.2 (Expectations and covariances), 3.2.9 (Mixtures of Gaussians)
Section: Discussion 2 (PDF / Solutions / Walkthrough)
Thu Feb 05: Lecture 6. Multivariate Gaussians & Mixture of Gaussians (PDF / Video)
Reading: 2.2.2 (Expectations and covariances), 3.1.3 (Multinomial distribution), 3.2 (Multivariate Gaussian), 3.2.1 (Geometry of Gaussian), 3.2.9 (Mixtures of Gaussians), 15.2 (Mixtures of Gaussians), 15.3 (Mixture model log-likelihood), Appendix C (Lagrange multipliers)
Week 4
Tue Feb 10: Lecture 7. Mixture of Gaussians & Linear Regression (PDF / Video)
Reading: 1.2 (A Tutorial Example), 2.3.4 (Linear regression), 3.2.9 (Mixtures of Gaussians), 4.1.1 (Basis functions), 4.1.2 (Likelihood function), 4.1.3 (Maximum likelihood), 4.1.4 (Geometry of least squares), 15.2 (Mixtures of Gaussians), 15.2.1 (Likelihood function), Appendix A.3 (Matrix Derivatives)
Section: Discussion 3 (PDF / Solutions / Walkthrough)
Thu Feb 12: Lecture 8. Linear Regression (PDF / Video)
Reading: 1.2.2 (Linear models), 1.2.3 (Error function), 1.2.5 (Regularization), 1.2.6 (Model selection), 2.1.3 (Bayes' theorem), 2.3.4 (Linear regression), 2.6.2 (Regularization), 4.1.1 (Basis functions), 4.1.2 (Likelihood function), 4.1.3 (Maximum likelihood), 4.1.4 (Geometry of least squares), 4.1.6 (Regularized least squares), 9.2 (Weight Decay), 9.2.2 (Generalized weight decay), Appendix A.3 (Matrix Derivatives)
Week 5
Tue Feb 17: Lecture 9. Linear Regression & Regularization (PDF / Video)
Section: Discussion 4 (PDF / Solutions / Walkthrough)
Thu Feb 19: Lecture 10. Finish Linear Regression & Regularization (PDF / Video)
Reading: 1.2.6 (Model selection), 4.1.2 (Likelihood function), 4.1.6 (Regularized least squares), 5.3 (Generative Classifiers), 5.3.1 (Continuous inputs), 5.3.2 (Maximum likelihood solution), 5.3.3 (Discrete features), 9.2 (Weight Decay), 9.2.2 (Generalized weight decay)
Week 6
Tue Feb 24: Lecture 11. Classification (PDF / Video)
Reading: 5.0 (Introduction), 5.1 (Discriminant Functions), 5.1.1 (Two classes), 5.1.2 (Multiple classes), 5.1.3 (1-of-K coding), 5.1.4 (Least squares for classification), 5.3 (Generative Classifiers), 5.3.1 (Continuous inputs), 5.3.2 (Maximum likelihood solution), 5.3.3 (Discrete features)
Section: Discussion 5 (PDF / Solutions / Walkthrough)
Homework 2, due Fri Mar 13, 11:59 PM PT: Assignment
Thu Feb 26: Lecture 12. Logistic Regression, Classifier Accuracy (PDF / Video)
Reading: 5.2.2 (Expected loss), 5.2.5 (Classifier accuracy), 5.2.6 (ROC curve), 5.4 (Discriminative Classifiers), 5.4.1 (Activation functions), 5.4.2 (Fixed basis functions), 5.4.3 (Logistic regression), 5.4.4 (Multi-class logistic regression)
Week 7
Tue Mar 03: Lecture 13. Conv. + Momentum + Adam + Stochastic Gradient Descent (Notes / Video)
Reading: 7.1 (Error Surfaces), 7.1.1 (Local quadratic approximation), 7.2 (Gradient Descent Optimization), 7.2.2 (Batch gradient descent), 7.2.3 (Stochastic gradient descent), 7.2.4 (Mini-batches), 7.3 (Convergence), 7.3.1 (Momentum), 7.3.2 (Learning rate schedule), 7.3.3 (RMSProp and Adam), Appendix A.4 (Eigenvectors)
Section: Discussion 6 (PDF / Solutions / Walkthrough)
Thu Mar 05: Lecture 14. MLE, MAP and Bias-Variance Trade-off (Notes / Video)
Reading: 2.6.1 (Model parameters), 2.6.2 (Regularization), 3.1.1 (Bernoulli distribution), 4.1.2 (Likelihood function), 4.1.6 (Regularized least squares), 4.3 (The Bias-Variance Trade-off), 5.4.3 (Logistic regression)
Week 8
Tue Mar 10: Lecture 15. Learning with Gradient Descent (Notes / Video)
Reading: 7.1 (Error Surfaces), 7.1.1 (Local quadratic approximation), 7.2.2 (Batch gradient descent), 7.2.3 (Stochastic gradient descent), 7.2.4 (Mini-batches), 7.3 (Convergence), 7.3.1 (Momentum), 7.3.2 (Learning rate schedule), 7.3.3 (RMSProp and Adam), Appendix A.4 (Eigenvectors)
Section: Discussion 7 (PDF / Solutions / Walkthrough)
Thu Mar 12: Lecture 16. Entropy, Information and Logistic Regression (Notes / Video)
Reading: 2.5.1 (Entropy), 2.5.5 (Kullback-Leibler divergence), 5.3.1 (Continuous inputs), 5.4.3 (Logistic regression), 5.4.4 (Multi-class logistic regression)
Week 9
Tue Mar 17: Midterm, 7pm-9pm