After finishing a memorable summer at MIT (thanks, MIT, for the amazing website hosting service), I have moved my homepage to https://tianle.website/! This page will no longer be maintained. Thanks!
I am a junior majoring in applied mathematics at Peking University (PKU), with a double major in computer science. I do undergraduate research on machine learning theory, especially deep learning theory, advised by Professor Liwei Wang. I am most interested in theory that can inspire better algorithms. In the summer of 2019, I went to MIT as a research intern supervised by Professor Sasha Rakhlin. I also spent a wonderful time working with Professor Jason D. Lee.
I will apply to Ph.D. programs this year!
(Preprint) A Gram-Gauss-Newton Method Learning Overparameterized Deep Neural Networks for Regression Problems
Tianle Cai*, Ruiqi Gao*, Jikai Hou*, Siyu Chen, Dong Wang, Di He, Zhihua Zhang, Liwei Wang
Highlight: A provable second-order optimization method for overparameterized networks on regression problems! Each iteration is as cheap as SGD, yet it converges much faster than SGD in real-world applications.
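To give a flavor of the idea, here is a minimal sketch of the Gram-Gauss-Newton update. The key trick is that, in the overparameterized regime (batch size n much smaller than parameter count p), one can invert the small n×n Gram matrix JJᵀ instead of the p×p matrix JᵀJ used by classical Gauss-Newton. The linear model below is purely illustrative (the paper applies the update to deep networks), and all names here are my own.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 8, 100                       # n samples, p parameters (overparameterized)
X = rng.normal(size=(n, p))
y = rng.normal(size=n)
w = 0.1 * rng.normal(size=p)

def f(w):
    # Illustrative linear "network"; in the paper f is a deep net.
    return X @ w

def jacobian(w):
    # d f / d w; constant for the linear model above.
    return X

for _ in range(3):
    J = jacobian(w)                         # shape (n, p)
    r = f(w) - y                            # residuals, shape (n,)
    G = J @ J.T                             # small n x n Gram matrix
    w = w - J.T @ np.linalg.solve(G, r)     # Gram-Gauss-Newton step

print(np.max(np.abs(f(w) - y)))             # near-zero residual
```

For this toy linear case a single step already interpolates the data exactly, which hints at why the method can converge so quickly when the network is wide enough to fit the batch.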
Highlight: Though robust generalization requires more data, we show, both theoretically and empirically, that additional unlabeled data alone is enough!
Highlight: For overparameterized neural networks, we prove that adversarial training converges to a global minimum (with loss 0).
- A Gram-Gauss-Newton Method Learning Overparameterized Deep Neural Networks for Regression Problems at the PKU Machine Learning Workshop [slides]
- Visiting Research Student at Simons Institute, UC Berkeley
- Program: Foundations of Deep Learning
- June, 2019 - July, 2019
- Visiting Research Internship at MIT
- Advisor: Professor Sasha Rakhlin
- June, 2019 - Sept., 2019
- Visiting Research Student at Princeton
- Host: Professor Jason D. Lee
- Sept., 2019 - Oct., 2019