Module I: Preliminaries of Nonparametric Regression

  • Introduction: course overview; example tasks
  • Optimal Predictions and Measures of Accuracy: loss functions; predictive risk; bias-variance trade-off
  • Linear Smoothers: definition; basic examples
  • A First Look at Shrinkage Methods: ridge regression; lasso
  • Choosing the Smoothing Parameter: analytic approaches; cross validation (a cross-validation sketch follows this list)
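
A minimal illustration of the cross-validation topic above, written for this outline rather than taken from the course notes: ridge regression with the penalty level chosen by 5-fold cross-validation. The simulated data, fold count, and penalty grid are assumptions made purely for this sketch.

    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 100, 10
    X = rng.normal(size=(n, p))
    beta = np.zeros(p)
    beta[:3] = [2.0, -1.0, 0.5]                       # a few nonzero "true" coefficients
    y = X @ beta + rng.normal(size=n)

    def ridge_fit(X, y, lam):
        """Closed-form ridge solution (X'X + lam*I)^{-1} X'y."""
        return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

    def cv_error(X, y, lam, k=5):
        """Average held-out squared error over k folds."""
        folds = np.array_split(np.arange(len(y)), k)
        errs = []
        for test in folds:
            train = np.setdiff1d(np.arange(len(y)), test)
            b = ridge_fit(X[train], y[train], lam)
            errs.append(np.mean((y[test] - X[test] @ b) ** 2))
        return np.mean(errs)

    lams = np.logspace(-3, 3, 25)                     # candidate penalty levels
    cv = [cv_error(X, y, lam) for lam in lams]
    best = lams[int(np.argmin(cv))]
    print(f"lambda chosen by 5-fold CV: {best:.3g}")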

Module II: Splines and Kernel Methods

  • Introduction: brief overview
  • Spline Methods: piecewise polynomials; natural cubic splines; smoothing splines; B-splines; penalized regression splines
  • Kernel Methods: kernel density estimation; the Nadaraya-Watson kernel estimator; local polynomial regression (the Nadaraya-Watson estimator is sketched after this list)
  • Inference for Linear Smoothers: variance estimation; confidence bands
  • Spline and Kernel Methods for GLMs: extensions of spline and kernel methods to binomial, Poisson, gamma, and other response distributions
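
A short sketch of the Nadaraya-Watson estimator listed above, not copied from the lectures; the Gaussian kernel, the bandwidth h = 0.05, and the simulated data are illustrative assumptions.

    import numpy as np

    def nadaraya_watson(x0, x, y, h):
        """Nadaraya-Watson estimate at points x0, Gaussian kernel with bandwidth h."""
        # Weight matrix: K((x0_i - x_j) / h) for every evaluation/data pair.
        w = np.exp(-0.5 * ((x0[:, None] - x[None, :]) / h) ** 2)
        return (w @ y) / w.sum(axis=1)                # weighted average of the responses

    rng = np.random.default_rng(1)
    x = np.sort(rng.uniform(0, 1, 200))
    y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=x.size)
    grid = np.linspace(0, 1, 50)
    fhat = nadaraya_watson(grid, x, y, h=0.05)        # fitted curve on the grid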

Module III: Bayesian Nonparametrics

  • Introduction: principles of Bayesian nonparametrics
  • Regression via Gaussian processes (sketched after this list)
  • Density estimation via Dirichlet process mixture of Gaussians
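
A compact sketch of Gaussian process regression as listed above; the squared-exponential kernel, its length-scale, and the assumed noise variance are illustrative choices, not the course's.

    import numpy as np

    def rbf(a, b, length=0.2, var=1.0):
        """Squared-exponential covariance between 1-d point sets a and b."""
        return var * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

    rng = np.random.default_rng(2)
    x = np.sort(rng.uniform(0, 1, 30))
    y = np.sin(2 * np.pi * x) + rng.normal(scale=0.2, size=x.size)
    grid = np.linspace(0, 1, 100)

    sigma2 = 0.2 ** 2                                 # assumed noise variance
    K = rbf(x, x) + sigma2 * np.eye(x.size)           # covariance of the noisy observations
    Ks = rbf(grid, x)                                 # cross-covariance, grid vs. data
    post_mean = Ks @ np.linalg.solve(K, y)            # posterior mean of f on the grid
    post_var = rbf(grid, grid).diagonal() - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)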

Module IV: Nonparametrics with Multiple Predictors

  • Introduction: issues when considering multiple predictors
  • Generalized Additive Models: GAMs; the backfitting algorithm (sketched after this list)
  • Spline Methods in Several Variables: natural thin plate splines; thin plate regression splines; tensor product splines
  • Kernel Methods in Several Variables: extending kernel methods to multidimensional covariates
  • Smoothing Parameter Estimation: how to choose the level of smoothing in more than one dimension
  • Regression Trees: partitioning the covariate space
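
A rough sketch of the backfitting algorithm for a two-covariate additive model; the Gaussian-kernel smoother, its bandwidth, and the simulated data are assumptions made only for illustration.

    import numpy as np

    def smooth(x, r, h=0.2):
        """Gaussian-kernel smoother of partial residuals r against one covariate x."""
        w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
        return (w @ r) / w.sum(axis=1)

    rng = np.random.default_rng(3)
    n = 300
    X = rng.uniform(-1, 1, size=(n, 2))
    y = np.sin(np.pi * X[:, 0]) + X[:, 1] ** 2 + rng.normal(scale=0.3, size=n)

    # Backfitting for y = alpha + f1(x1) + f2(x2) + noise: cycle through the
    # coordinates, smoothing the partial residuals against each one in turn.
    alpha = y.mean()
    f = np.zeros((n, 2))                              # f_1 and f_2 evaluated at the data
    for sweep in range(20):
        for j in range(2):
            partial = y - alpha - f[:, 1 - j]         # remove the other component (2 covariates)
            f[:, j] = smooth(X[:, j], partial)
            f[:, j] -= f[:, j].mean()                 # center each f_j for identifiability
    fitted = alpha + f.sum(axis=1)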

Module V: Classification

  • Logistic Regression
  • Bayes Classifiers: linear and quadratic classifiers; naive Bayes classifiers using KDE
  • Perceptrons for Online Learning and SVMs (the perceptron update is sketched after this list)
  • Boosting
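
A small sketch of the classic perceptron mistake-driven update for online learning; the simulated linearly separable data and the number of passes are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(4)
    n, p = 500, 2
    X = rng.normal(size=(n, p))
    y = np.sign(X @ np.array([1.5, -2.0]) + 0.3)      # labels in {-1, +1}, linearly separable

    w, b = np.zeros(p), 0.0
    for sweep in range(10):                           # each pass mimics one online sweep
        for i in rng.permutation(n):
            if y[i] * (X[i] @ w + b) <= 0:            # mistake-driven perceptron update
                w += y[i] * X[i]
                b += y[i]
    train_err = np.mean(np.sign(X @ w + b) != y)
    print(f"training error after 10 passes: {train_err:.3f}")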
