{"id":391,"date":"2017-11-09T13:31:42","date_gmt":"2017-11-09T13:31:42","guid":{"rendered":"http:\/\/people.irisa.fr\/Remi.Gribonval\/?page_id=391"},"modified":"2020-11-20T16:20:46","modified_gmt":"2020-11-20T16:20:46","slug":"hdl","status":"publish","type":"page","link":"https:\/\/perso.ens-lyon.fr\/remi.gribonval\/talks-and-tutorials\/hdl\/","title":{"rendered":"HDL Course (High Dimensional Statistical Learning) &#8211; M2 SIF"},"content":{"rendered":"<h1>High Dimensional Statistical Learning (HDL)<\/h1>\n<h2><a href=\"http:\/\/people.rennes.inria.fr\/Aline.Roumy\/roumy_teaching_HDL.html\">Click here for up-to-date information on this course<\/a><\/h2>\n<h2>Description<\/h2>\n<p>This module provides a detailed overview of the mathematical foundations of modern statistical learning by describing the theoretical basis and the conceptual tools needed to analyze and justify the algorithms.\u00a0 The emphasis is on problems involving high volumes of high-dimensional datasets, and on dimension reduction techniques that make it possible to tackle them. The course includes detailed proofs of the main results and associated exercises.<\/p>\n<h2>Keywords<\/h2>\n<p>PAC (probably approximately correct), random projection, PCA (principal component analysis), concentration inequalities, measures of statistical complexity<\/p>\n<h2>Prerequisites<\/h2>\n<p>The prerequisites for this course include previous coursework in linear algebra, multivariate calculus, basic probability (continuous and discrete), and statistics.<br \/>\nPrevious coursework in convex analysis, information theory, and optimization theory would be helpful but is not required. 
Students are expected to be able to follow a rigorous proof.<\/p>\n<h2>Content<\/h2>\n<ul>\n<li>The PAC framework (probably approximately correct) for statistical learning<\/li>\n<li>Measuring the complexity of a statistical learning problem<\/li>\n<li>Dimension reduction<\/li>\n<li>Sparsity and convex optimization for large-scale learning (time allowing)<\/li>\n<li>Notion of algorithmic stability (time allowing)<\/li>\n<\/ul>\n<h2>Acquired skills<\/h2>\n<ul>\n<li>Understanding the links between complexity and overfitting<\/li>\n<li>Knowing the mathematical tools to measure learning complexity<\/li>\n<li>Understanding the statistical and algorithmic stakes of large-scale learning<\/li>\n<li>Understanding dimension reduction tools for learning<\/li>\n<\/ul>\n<h2>Teachers<\/h2>\n<p><a href=\"http:\/\/people.irisa.fr\/Remi.Gribonval\/\">R\u00e9mi Gribonval<\/a> (coordinator until 2019), <a href=\"http:\/\/people.rennes.inria.fr\/Aline.Roumy\/\">Aline Roumy<\/a> (current coordinator; see the <a href=\"http:\/\/people.rennes.inria.fr\/Aline.Roumy\/roumy_teaching_HDL.html\">new web page of the course<\/a>)<\/p>\n<p><a href=\"http:\/\/master.irisa.fr\/studentsarea\/people\/listes\/HDL.htm\">Students (password required)<\/a><\/p>\n<hr \/>\n<h2>Course schedule (2018-2019): see detailed times and rooms on <a href=\"https:\/\/planning.univ-rennes1.fr\/direct\/index.jsp?data=bd72d825015315fe4ffcef9380da990504b949237193e14b0d99039ab0257573373c564c3a1a1c734e9d01407a0f0c3d28cbe883c81d9ddeef015f9604cfa310f1dca9528ca7c323bedb6aab57edb34a3c6dd706b48748c3274a8c092faa7db8c365775eed9c7bed8aaf3320397f18f68241fc38732002142b121e958672a72d\">ENT (click ISTIC&gt;M2 SIF)<\/a><\/h2>\n<ul>\n<li>20-21\/11, <span style=\"color: #ff0000;\">28\/11, 4\/12, (<del>5\/12<\/del>)<\/span>, 11-12\/12 R\u00e9mi Gribonval<\/li>\n<li>19\/12 Oral exam: chapter presentation<\/li>\n<li>8-9\/01, 15-16\/01 Aline Roumy<\/li>\n<li>22\/1 Written exam<\/li>\n<\/ul>\n<h2>Evaluation modalities (details to 
come)<\/h2>\n<ul>\n<li><strong>Chapter presentation: oral evaluation on 19\/12\/2018<\/strong> (each group of students will have to present the content of one chapter from the book by Shai Shalev-Shwartz &amp; Shai Ben-David linked below):\n<ul>\n<li>chapter 19 Nearest Neighbor; chapter 14 Stochastic Gradient Descent; chapter 20 Neural Networks; chapter 11 Model Selection and Validation; chapter 6 The VC Dimension<\/li>\n<li>or a more advanced chapter: chapter 26 Rademacher Complexities; chapter 29 Multiclass Learnability<\/li>\n<\/ul>\n<\/li>\n<li><strong>Written exam on 22\/1\/2019<\/strong><\/li>\n<\/ul>\n<h2>Some references<\/h2>\n<ul>\n<li><a href=\"https:\/\/www.stat.berkeley.edu\/~mjwain\/stat210b\/Chap2_TailBounds_Jan22_2015.pdf\">Chapter of the forthcoming book by Martin Wainwright<\/a> (concentration inequalities)<\/li>\n<li>Book by Shai Shalev-Shwartz &amp; Shai Ben-David, <a href=\"http:\/\/www.cs.huji.ac.il\/~shais\/UnderstandingMachineLearning\/copy.html\">Understanding Machine Learning<\/a><\/li>\n<li>Book by Roman Vershynin, <a href=\"https:\/\/www.math.uci.edu\/~rvershyn\/papers\/HDP-book\/HDP-book.html\">High-Dimensional Probability &#8211; An Introduction with Applications in Data Science<\/a><\/li>\n<\/ul>\n","protected":false},"excerpt":{"rendered":"<p>High Dimensional Statistical Learning (HDL) Click here for up-to-date information on this course Description This module provides a detailed overview of the mathematical foundations of modern statistical learning by describing the theoretical basis and the conceptual tools needed to analyze and justify the algorithms.\u00a0 The emphasis is on problems involving high volumes of &hellip; <\/p>\n<p><a class=\"more-link btn\" href=\"https:\/\/perso.ens-lyon.fr\/remi.gribonval\/talks-and-tutorials\/hdl\/\">Continue 
reading<\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"parent":44,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-391","page","type-page","status-publish","hentry","nodate","item-wrap"],"_links":{"self":[{"href":"https:\/\/perso.ens-lyon.fr\/remi.gribonval\/wp-json\/wp\/v2\/pages\/391","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/perso.ens-lyon.fr\/remi.gribonval\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/perso.ens-lyon.fr\/remi.gribonval\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/perso.ens-lyon.fr\/remi.gribonval\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/perso.ens-lyon.fr\/remi.gribonval\/wp-json\/wp\/v2\/comments?post=391"}],"version-history":[{"count":18,"href":"https:\/\/perso.ens-lyon.fr\/remi.gribonval\/wp-json\/wp\/v2\/pages\/391\/revisions"}],"predecessor-version":[{"id":620,"href":"https:\/\/perso.ens-lyon.fr\/remi.gribonval\/wp-json\/wp\/v2\/pages\/391\/revisions\/620"}],"up":[{"embeddable":true,"href":"https:\/\/perso.ens-lyon.fr\/remi.gribonval\/wp-json\/wp\/v2\/pages\/44"}],"wp:attachment":[{"href":"https:\/\/perso.ens-lyon.fr\/remi.gribonval\/wp-json\/wp\/v2\/media?parent=391"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}