Hancheng Min

Tenure-track Associate Professor
Institute of Natural Sciences & School of Mathematical Sciences
Shanghai Jiao Tong University

I am a Tenure-track Associate Professor at the Institute of Natural Sciences (INS) and the School of Mathematical Sciences (SMS), Shanghai Jiao Tong University. My research centers on building mathematical principles that facilitate the interplay between machine learning and dynamical systems. My recent work focuses on analyzing gradient-based optimization algorithms on overparametrized neural networks from a dynamical systems perspective.
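As a minimal illustration of this viewpoint (standard background, not tied to any particular paper): gradient descent on a training loss $L(\theta)$ in the small-step-size limit is modeled by the gradient flow ODE

$$\dot{\theta}(t) = -\nabla L(\theta(t)),$$

so questions about training, such as where the parameters converge and how fast, become questions about the trajectories of a dynamical system.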

Recent Updates

[Sep 18, 2025] Our papers Neural Collapse under Gradient Flow on Shallow ReLU Networks for Orthogonally Separable Data and Convergence Rates for Gradient Descent on the Edge of Stability for Overparametrised Least Squares have been accepted to NeurIPS 2025!
[Aug 29, 2025] I officially joined INS and SMS at Shanghai Jiao Tong University!
[Jul 16, 2025] Our paper Understanding Incremental Learning with Closed-form Solution to Gradient Flow on Overparameterized Matrix Factorization has been accepted to CDC 2025!
[Jun 25, 2025] Our paper Voyaging into Unbounded Dynamic Scenes from a Single View has been accepted to ICCV 2025!
[May 1, 2025] Our paper Gradient Flow Provably Learns Robust Classifiers for Orthonormal GMMs has been accepted to ICML 2025! See you in Vancouver!

Recent Publications

  1. Neural Collapse under Gradient Flow on Shallow ReLU Networks for Orthogonally Separable Data
    H. Min, Z. Zhu, and R. Vidal
    Conference on Neural Information Processing Systems (NeurIPS), 2025 (to appear)
  2. Convergence Rates for Gradient Descent on the Edge of Stability for Overparametrised Least Squares
    L. MacDonald, L. Palma, Z. Xu, H. Min, S. Tarmoun, and R. Vidal
    Conference on Neural Information Processing Systems (NeurIPS), 2025 (to appear)
  3. Understanding Incremental Learning with Closed-form Solution to Gradient Flow on Overparameterized Matrix Factorization
    H. Min and R. Vidal
    IEEE Conference on Decision and Control (CDC), 2025
  4. Gradient Flow Provably Learns Robust Classifiers for Orthonormal GMMs
    H. Min and R. Vidal
    International Conference on Machine Learning (ICML), 2025

Selected Publications

  1. Neural Collapse under Gradient Flow on Shallow ReLU Networks for Orthogonally Separable Data
    H. Min, Z. Zhu, and R. Vidal
    Conference on Neural Information Processing Systems (NeurIPS), 2025 (to appear)
  2. Gradient Flow Provably Learns Robust Classifiers for Orthonormal GMMs
    H. Min and R. Vidal
    International Conference on Machine Learning (ICML), 2025
  3. A Frequency Domain Analysis of Slow Coherency in Networked Systems
    H. Min, R. Pates, and E. Mallada
    Automatica, 2025
  4. Early Neuron Alignment in Two-layer ReLU Networks with Small Initialization
    H. Min, E. Mallada, and R. Vidal
    International Conference on Learning Representations (ICLR), 2024
  5. On the Convergence of Gradient Flow on Multi-layer Linear Models
    H. Min, R. Vidal, and E. Mallada
    International Conference on Machine Learning (ICML), 2023