M1 course, ENS Lyon: Machine Learning

Machine Learning

Project:

  1. Intermediate delivery: March 19th (you provide a first solution for an early feedback)
  2. 10' videos due Sunday April 11th at the latest (please send us a link valid up to April 15th at least); the video should be inspired by the report writing guidelines
  3. Defenses: Wednesday April 14th pm, in presence in salle Condorcet or online, depending on the latest COVID measures (we use the BBB of "portail des études" by default, and Zoom in case of a technical problem)
    Schedule:
Have a look at the others' videos (links are above), and don't forget to fill in the form to give your feedback on the course!

Remote access to online lectures:

BBB link (warning: it has changed and now points to the BBB of "portail des études") for the TDs and for the lectures by Aurélien Garivier
Zoom link for the lectures by Yohann de Castro (19.01, 02.02, 23.02, 9.03 and 23.03)
Temporary BBB backup: Discord server — please create an account in case of a problem!
Lectures: Tuesday 10:15-12:15
Hands-on sessions: Thursday 15:45-17:45

Homework 1
Homework 2 due March 26th: please send to Yohann de Castro
Homework 3 due April 9th: please send to Aurélien Garivier

Lecturers

Yohann de Castro, Aurélien Garivier

Course description

The aim of this Master 1 Informatique Fondamentale course is to introduce the basic theory and algorithms of Machine Learning. Topics to be taught (subject to change); ~20h of lectures + ~20h of lab sessions.

  • General introduction to Machine Learning: learning settings, curse of dimensionality, overfitting/underfitting, etc.
  • Overview of Supervised Learning Theory: True risk versus empirical risk, loss functions, regularization, bias/variance trade-off, complexity measures, generalization bounds.
  • Linear/Logistic/Polynomial Regression: batch/stochastic gradient descent, closed-form solution.
  • Sparsity in Convex Optimization.
  • Support Vector Machines: large margin, primal problem, dual problem, kernelization, etc.
  • Neural Networks, Deep Learning.
  • Theory of boosting: Ensemble methods, Adaboost, theoretical guarantees.
  • Non-parametric Methods (K-Nearest-Neighbors)
  • Domain Adaptation
  • Metric Learning
  • Optimal Transport
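
As a small taste of the regression topic above, here is a minimal sketch (not course material, and any constants are illustrative) comparing batch gradient descent on the empirical squared loss with the closed-form normal-equations solution for linear regression:

```python
import numpy as np

# Synthetic regression data (illustrative): y = Xw* + small noise
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=100)

# Closed-form solution of the normal equations: (X^T X) w = X^T y
w_closed = np.linalg.solve(X.T @ X, X.T @ y)

# Batch gradient descent on the empirical risk (1/2n)||Xw - y||^2
w = np.zeros(3)
lr = 0.1          # step size, chosen by hand for this well-conditioned problem
for _ in range(500):
    grad = X.T @ (X @ w - y) / len(y)   # gradient of the empirical squared loss
    w -= lr * grad

print(np.allclose(w, w_closed, atol=1e-4))  # both solvers agree
```

Stochastic gradient descent, covered in the lectures, replaces the full-batch gradient with the gradient on a single (or a few) sampled examples per step.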

Class Notes and Exercises

Prerequisite

Basic knowledge of probability theory, linear algebra, and real analysis.

Evaluation

50% final exam, 50% project and in-class exercises.

References

  • Statistical Learning Theory, V. Vapnik, Wiley, 1998
  • Machine Learning, Tom Mitchell, McGraw-Hill, 1997
  • Pattern Recognition and Machine Learning, C. M. Bishop, Springer, 2006
  • Convex Optimization, Stephen Boyd & Lieven Vandenberghe, Cambridge University Press, 2004
  • On-line Machine Learning courses: https://www.coursera.org/