
Complete Statistical Theory of Learning (Vladimir Vapnik) | MIT Deep Learning Series

Published: 2020-02-15 15:11:09

Summary

Lecture by Vladimir Vapnik in January 2020, part of the MIT Deep Learning Lecture Series.

Slides: http://bit.ly/2ORVofC
Associated podcast conversation: https://www.youtube.com/watch?v=bQa7hpUpMzM
Series website: https://deeplearning.mit.edu
Playlist: http://bit.ly/deep-learning-playlist

OUTLINE:
0:00 - Introduction
0:46 - Overview: Complete Statistical Theory of Learning
3:47 - Part 1: VC Theory of Generalization
11:04 - Part 2: Target Functional for Minimization
27:13 - Part 3: Selection of Admissible Set of Functions
37:26 - Part 4: Complete Solution in Reproducing Kernel Hilbert Space (RKHS)
53:16 - Part 5: LUSI Approach in Neural Networks
59:28 - Part 6: Examples of Predicates
1:10:39 - Conclusion
1:16:10 - Q&A: Overfitting
1:17:18 - Q&A: Language

CONNECT:
- If you enjoyed this video, please subscribe to this channel.
- Twitter: https://twitter.com/lexfridman
- LinkedIn: https://www.linkedin.com/in/lexfridman
- Facebook: https://www.facebook.com/lexfridman
- Instagram: https://www.instagram.com/lexfridman


Chinese and English Transcript