Instructor: Mingrui Liu
Time/Location:
Office Hours:
Contact Information:
Office: Research Hall 355
Recommended Textbooks:
Tong Zhang. Mathematical Analysis of Machine Learning Algorithms.
Ian Goodfellow, Yoshua Bengio, and Aaron Courville. Deep Learning.
Matus Telgarsky. Deep Learning Theory Lecture Notes.
Sanjeev Arora et al. Deep Learning Theory Lecture Notes.
Week 1 (Jan 16, Jan 18): Course Overview, Challenges in Modern Machine Learning (Deep Learning)
Week 2 (Jan 23, Jan 25): Generalization in Machine Learning: Concentration Inequalities
Week 3 (Jan 30, Feb 1): Generalization in Machine Learning: Uniform Convergence, Complexity Measure
Week 4 (Feb 6, Feb 8): Nonconvex Optimization, Deep Learning (I)
Week 5 (Feb 13, Feb 15): Nonconvex Optimization, Deep Learning (II)
Week 6 (Feb 20, Feb 22): Implicit Regularization (I)
Week 7 (Feb 27, Feb 29): Implicit Regularization (II)
Week 8 (Mar 5, Mar 7): Spring Break, No Classes
Week 9 (Mar 12, Mar 14): Benign Overfitting (I)
Week 10 (Mar 19, Mar 21): Benign Overfitting (II)
Week 11 (Mar 26, Mar 28): Benign Overfitting (III)
Week 12 (Apr 2, Apr 4): Representation Learning (I)
Week 13 (Apr 9, Apr 11): Representation Learning (II)
Week 14 (Apr 16, Apr 18): Project Presentation
Week 15 (Apr 23, Apr 25): Project Presentation
Grades:
A: [93, 100]
A-: [90, 93)
B+: [87, 90)
B: [83, 87)
B-: [80, 83)
C+: [77, 80)
C: [73, 77)
C-: [68, 73)
D: [60, 68)
Fail: below 60
Weights:
Paper Presentation/Reading Report: 30%
Class Participation: 10%
Homework: 30% (5 assignments in total)
Project Report/Project Presentation: 30%
Late penalty:
Late submissions, for whatever reason, will be penalized: 5% of the assignment/project score is deducted per late day, with a maximum tolerance of 3 days. Partial days are rounded up. For example, if an assignment is submitted 2 days and 1 minute after the deadline (counted as 3 days) and receives a grade of 90%, the score after the deduction is 90% - 3 * 5% = 75%.
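For concreteness, below is a minimal Python sketch of this rule. The rounding-up of partial days follows the example above; the treatment of submissions more than 3 days late (no credit) and the function name are assumptions, not part of the stated policy.

import math

def late_score(raw_score: float, hours_late: float) -> float:
    """Score after the late deduction (raw_score on a 0-100 scale)."""
    days_late = math.ceil(hours_late / 24)   # 2 days + 1 minute counts as 3 days
    if days_late <= 0:
        return raw_score                     # on-time submission, no deduction
    if days_late > 3:
        return 0.0                           # assumption: beyond the 3-day tolerance, no credit
    return raw_score - days_late * 5         # 5 points deducted per late day

# Example from the policy: 3 late days on a 90 gives 90 - 3*5 = 75.
print(late_score(90, hours_late=48 + 1 / 60))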