Module I: Preliminaries of Nonparametric Regression
- Introduction: course overview; example tasks
- Optimal Predictions and Measures of Accuracy: loss functions; predictive risk; bias-variance trade-off
- Linear Smoothers: definition; basic examples
- A First Look at Shrinkage Methods: ridge regression; lasso
- Choosing the Smoothing Parameter: analytic approaches; cross validation
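As a hedged illustration of two topics above (ridge regression and choosing a tuning parameter by cross validation), here is a minimal numpy sketch on synthetic data; the data, fold count, and lambda grid are my own choices, not the course's.

```python
import numpy as np

# Synthetic data: sparse linear signal plus Gaussian noise (illustrative only)
rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = [3.0, -2.0, 1.0]
y = X @ beta + rng.normal(scale=0.5, size=n)

def ridge_fit(X, y, lam):
    # Closed-form ridge solution: (X'X + lam I)^{-1} X'y
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

def cv_error(X, y, lam, k=5):
    # Mean squared prediction error over k folds
    idx = np.arange(len(y))
    errs = []
    for test in np.array_split(idx, k):
        train = np.setdiff1d(idx, test)
        b = ridge_fit(X[train], y[train], lam)
        errs.append(np.mean((y[test] - X[test] @ b) ** 2))
    return np.mean(errs)

lams = [0.01, 0.1, 1.0, 10.0, 100.0]
best = min(lams, key=lambda lam: cv_error(X, y, lam))
print("lambda chosen by CV:", best)
```

The same CV loop works for any smoother with a single tuning parameter; only `ridge_fit` changes.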
Lectures:
- 1. Apr 2: Intro. Optimal predictions, predictive performance, bias-variance tradeoff.
[Intro, predictions, bias-variance tradeoff slides] [Intro, predictions, bias-variance tradeoff annotated slides]
- 2. Apr 4: Linear smoothers, ridge regression, LASSO.
[Linear smoothers, ridge regression, LASSO slides] [Linear smoothers, ridge regression, LASSO annotated slides]
- 3. Apr 9: LASSO cont'd.
[LASSO cont'd slides] [LASSO cont'd annotated slides]
Module II: Splines and Kernel Methods
- Introduction: brief overview
- Spline Methods: piecewise polynomials; natural cubic splines; smoothing splines; B-splines; penalized regression splines
- Kernel Methods: kernel density estimation; the Nadaraya-Watson kernel estimator; local polynomial regression
- Inference for Linear Smoothers: variance estimation; confidence bands
- Spline and Kernel Methods for GLMs: extensions of spline and kernel methods to binomial, Poisson, gamma, and other non-Gaussian data
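As a hedged sketch of one topic above, here is the Nadaraya-Watson kernel estimator with a Gaussian kernel on synthetic data; the bandwidth and test function are my own choices, not the course's.

```python
import numpy as np

def nadaraya_watson(x0, x, y, h):
    # Weighted average of y with weights K((x0 - x_i) / h), Gaussian K
    w = np.exp(-0.5 * ((x0 - x) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

# Synthetic data: noisy sine curve (illustrative only)
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 2 * np.pi, 200))
y = np.sin(x) + rng.normal(scale=0.3, size=x.size)

# Evaluate on an interior grid (boundary bias is worst at the edges)
grid = np.linspace(0.5, 2 * np.pi - 0.5, 50)
fhat = np.array([nadaraya_watson(g, x, y, h=0.3) for g in grid])
print("max abs error vs sin:", np.max(np.abs(fhat - np.sin(grid))))
```

Shrinking `h` lowers bias but raises variance, the trade-off Module I's smoothing-parameter discussion addresses.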
Lectures:
- 4. Apr 11: Smoothing parameters, spline intro
[Smoothing parameters, spline slides] [Smoothing parameters, spline annotated slides]
- 5. Apr 16: B-splines, penalized regression splines, kernel methods intro
[B-splines, penalized regression splines, kernel methods intro slides] [B-splines, penalized regression splines, kernel methods intro annotated slides]
- 6. Apr 18: Local polynomial regression, KDE, confidence bands
[Local polynomial regression, KDE, confidence bands slides] [Local polynomial regression, KDE, confidence bands annotated slides]
- BONUS. May 7: Nonparametrics for GLMs.
[Nonparametrics for GLMs slides] [Nonparametrics for GLMs annotated slides]
Module III: Bayesian Nonparametrics
- Introduction: principles of Bayesian nonparametrics
- Regression via Gaussian processes
- Density estimation via Dirichlet process mixture of Gaussians
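As a hedged sketch of Gaussian process regression, here is the posterior mean under a squared-exponential kernel on synthetic data; the kernel length scale, noise level, and test function are my own assumptions, not the course's.

```python
import numpy as np

def sq_exp_kernel(a, b, length=1.0):
    # k(a, b) = exp(-(a - b)^2 / (2 length^2)) on 1-D inputs
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

# Synthetic data: noisy sine curve (illustrative only)
rng = np.random.default_rng(2)
x = np.linspace(0, 5, 30)
y = np.sin(x) + rng.normal(scale=0.1, size=x.size)

xs = np.linspace(0, 5, 100)                    # test inputs
sigma2 = 0.1 ** 2                              # assumed noise variance
K = sq_exp_kernel(x, x) + sigma2 * np.eye(x.size)
Ks = sq_exp_kernel(xs, x)
post_mean = Ks @ np.linalg.solve(K, y)         # E[f(xs) | data]
print("max abs error vs sin:", np.max(np.abs(post_mean - np.sin(xs))))
```

The posterior mean is itself a linear smoother in `y`, which connects this module back to Module I.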
Lectures:
Module IV: Nonparametrics with Multiple Predictors
- Introduction: issues when considering multiple predictors
- Generalized Additive Models: GAMs; the backfitting algorithm
- Spline Methods in Several Variables: natural thin plate splines; thin plate regression splines; tensor product splines
- Kernel Methods in Several Variables: extending kernel methods to multidimensional covariates
- Smoothing Parameter Estimation: how to choose level of smoothing in more than one dimension
- Regression Trees: partitioning the covariate space
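As a hedged sketch of the backfitting algorithm above, here is an additive fit of y = f1(x1) + f2(x2) + noise using a simple Gaussian-kernel smoother; the smoother, bandwidth, and data are my own choices, not the course's.

```python
import numpy as np

def smooth(x, r, h=0.3):
    # Kernel-weighted running mean of partial residuals r at each x_i
    w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    return (w @ r) / w.sum(axis=1)

# Synthetic additive data (illustrative only)
rng = np.random.default_rng(3)
n = 300
x1 = rng.uniform(-2, 2, n)
x2 = rng.uniform(-2, 2, n)
y = np.sin(x1) + x2 ** 2 + rng.normal(scale=0.2, size=n)

alpha = y.mean()
f1 = np.zeros(n)
f2 = np.zeros(n)
for _ in range(10):                           # backfitting iterations
    f1 = smooth(x1, y - alpha - f2)           # fit f1 to partial residuals
    f1 -= f1.mean()                           # center for identifiability
    f2 = smooth(x2, y - alpha - f1)
    f2 -= f2.mean()
resid = y - alpha - f1 - f2
print("residual sd:", resid.std())
```

Any univariate smoother from Module II (smoothing spline, local polynomial) can replace `smooth` in the inner loop.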
Lectures:
Module V: Classification
- Logistic Regression
- Bayes Classifiers: linear and quadratic classifiers; naive Bayes classifiers using KDE
- Perceptrons for online learning and SVMs
- Boosting
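As a hedged sketch of one topic above, here is the perceptron's mistake-driven update on linearly separable synthetic 2-D data; the data and margin threshold are my own choices, not the course's.

```python
import numpy as np

# Synthetic separable data; drop points too close to the boundary so a
# margin exists (illustrative only)
rng = np.random.default_rng(4)
X = rng.normal(size=(400, 2))
X = X[np.abs(X[:, 0] + X[:, 1]) > 0.3]
labels = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)

w = np.zeros(2)
b = 0.0
for _ in range(100):                      # passes over the data
    mistakes = 0
    for xi, yi in zip(X, labels):
        if yi * (w @ xi + b) <= 0:        # misclassified (or on boundary)
            w += yi * xi                  # perceptron update
            b += yi
            mistakes += 1
    if mistakes == 0:                     # converged: training data separated
        break

preds = np.where(X @ w + b > 0, 1, -1)
print("training accuracy:", np.mean(preds == labels))
```

Replacing the inner products with kernel evaluations gives the kernelized perceptron covered in lecture 19.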
Lectures:
- 15. May 21: MARS, classification trees
[MARS, classification trees slides] [MARS, classification trees annotated slides]
- 16. May 23: Classification intro, logistic regression
[Classification intro, logistic regression slides] [Classification intro, logistic regression annotated slides]
- 17. May 28: LDA, QDA, KDE for classification, and Naive Bayes
[LDA, QDA, KDE, and Naive Bayes slides] [LDA, QDA, KDE, and Naive Bayes annotated slides]
- 18. May 30: Mixture models, online learning, and perceptron algorithm
[Online learning and perceptron slides] [Online learning and perceptron annotated slides]
- 19. June 4: Kernelized perceptron and SVMs
[Kernelized perceptron and SVM slides] [Kernelized perceptron and SVM annotated slides]
- 20. June 6: Multiclass SVMs and boosting
[Multiclass SVMs and boosting slides] [Multiclass SVMs and boosting annotated slides]