Hancheng Min

I am a Tenure-track Associate Professor at the Institute of Natural Sciences (INS) and the School of Mathematics (SMS), Shanghai Jiao Tong University. My research centers on building mathematical principles that facilitate the interplay between machine learning and dynamical systems. Recently, I have been mainly interested in analyzing gradient-based optimization algorithms on overparametrized neural networks from a dynamical systems perspective.
Recent Updates
[Sep 18, 2025] Our papers Neural Collapse under Gradient Flow on Shallow ReLU Networks for Orthogonally Separable Data and Convergence Rates for Gradient Descent on the Edge of Stability for Overparametrised Least Squares have been accepted to NeurIPS 2025!
[Aug 29, 2025] I officially joined INS and SMS at Shanghai Jiao Tong University!
[Jul 16, 2025] Our paper Understanding Incremental Learning with Closed-form Solution to Gradient Flow on Overparameterized Matrix Factorization has been accepted to CDC 2025!
[Jun 25, 2025] Our paper Voyaging into Unbounded Dynamic Scenes from a Single View has been accepted to ICCV 2025!
[May 01, 2025] Our paper Gradient Flow Provably Learns Robust Classifiers for Orthonormal GMMs has been accepted to ICML 2025! See you in Vancouver!
Recent Publications
- Neural Collapse under Gradient Flow on Shallow ReLU Networks for Orthogonally Separable Data. Conference on Neural Information Processing Systems (NeurIPS), 2025. To appear.
- Convergence Rates for Gradient Descent on the Edge of Stability for Overparametrised Least Squares. Conference on Neural Information Processing Systems (NeurIPS), 2025. To appear.
Selected Publications
- Neural Collapse under Gradient Flow on Shallow ReLU Networks for Orthogonally Separable Data. Conference on Neural Information Processing Systems (NeurIPS), 2025. To appear.