Hancheng Min
I am a Tenure-track Associate Professor at the Institute of Natural Sciences (INS) and the School of Mathematical Sciences (SMS), Shanghai Jiao Tong University. My research centers on building mathematical principles that facilitate the interplay between machine learning and dynamical systems. Recently, I have been mainly interested in analyzing gradient-based optimization algorithms on overparametrized neural networks from a dynamical systems perspective.
Recent Updates
[May 01, 2026] Our paper Transformers Learn the Optimal DDPM Denoiser for Multi-Token GMMs has been accepted to ICML 2026!
[Feb 11, 2026] Our tutorial paper On the Convergence, Implicit Bias and Edge of Stability of Gradient Descent in Deep Learning has been accepted to IEEE Signal Processing Magazine!
[Dec 10, 2025] I gave a talk, Understanding Incremental Learning with Closed-form Solution to Gradient Flow on Overparameterized Matrix Factorization, at CDC 2025 in Rio de Janeiro
[Nov 23, 2025] I gave a talk, Learning Dynamics in the Feature Learning Regime: Implicit Bias, Neural Collapse, and Robustness, at NYU Shanghai
[Sep 18, 2025] Our papers Neural Collapse under Gradient Flow on Shallow ReLU Networks for Orthogonally Separable Data and Convergence Rates for Gradient Descent on the Edge of Stability for Overparametrised Least Squares have been accepted to NeurIPS 2025!
Recent Publications
- Transformers Learn the Optimal DDPM Denoiser for Multi-Token GMMs, International Conference on Machine Learning (ICML), Jul 2026. To appear.
- On the Convergence, Implicit Bias and Edge of Stability of Gradient Descent in Deep Learning, IEEE Signal Processing Magazine (IEEE SPM), May 2026. To appear.