Welcome to Kaiyi Ji's Homepage


About Me

I am an assistant professor in the Department of Computer Science and Engineering at the University at Buffalo. I was a postdoctoral research fellow in the Electrical Engineering and Computer Science Department at the University of Michigan, Ann Arbor, in 2022, working with Prof. Lei Ying. I received my Ph.D. from the Electrical and Computer Engineering Department at The Ohio State University in December 2021, advised by Prof. Yingbin Liang. I was a visiting student research collaborator in the Department of Electrical Engineering at Princeton University, working with Prof. H. Vincent Poor. Previously, I obtained my B.S. degree from the University of Science and Technology of China in 2016.


I work at the intersection of optimization, machine learning, and networked systems, on both the theory and application sides. My main research interests include:

  • Bilevel optimization and its application in deep learning

  • Meta-learning and continual learning

  • Large-scale stochastic optimization

  • Federated learning and communication networks

To Prospective Students

I am looking for highly motivated students with strong mathematical backgrounds and/or programming skills in machine learning and optimization to work with me.

Interns and visiting students are also highly welcome!

One PhD position is available immediately. Please fill out the following form if you are interested. I will contact you if there is a good match!

News

  • 09/2023 Five papers accepted at NeurIPS 2023, with one spotlight presentation! The topics span Hessian-free bilevel optimization, federated learning, continual learning, and multi-objective learning. Big congratulations to my students Yifan, Peiyao, and Hao, and many thanks to my collaborators!

  • 06/2023 Gave an invited talk on bilevel optimization and continual learning at the SIAM Conference on Optimization.

  • 02/2023 Selected as a Top Reviewer for AISTATS 2023.

  • 10/2022 Will serve as a TPC member of ACM MobiHoc 2023. Please consider submitting your best work to the conference!

  • 10/2022 Gave an invited talk in the session “Bilevel Stochastic Methods for Optimization and Learning” at the 2022 INFORMS Annual Meeting, Indianapolis, IN.

  • New manuscript out: Will Bilevel Optimizers Benefit from Loops. We develop a unified convergence theory for AID- and ITD-based bilevel optimization that covers all implementation choices of loops. Check it out!

  • Prof. Yingbin Liang (from OSU) and I are organizing an invited session on “Bilevel Machine Learning” for the 2022 Conference on Information Sciences and Systems (CISS), held virtually from March 9 to 11, 2022. See <website> for more details.

  • 12/2021 Invited talk at the Next Generation Transportation Systems (NGTS) Seminar, University of Michigan.

  • I am honored to receive an outstanding reviewer award from NeurIPS 2021 (top 8% of reviewers)!

  • Our recent paper on momentum-based bilevel optimization has been accepted by NeurIPS 2021 with a spotlight presentation (3% acceptance rate)! Check out our paper and code. More results will be added.

  • I was recognized as an expert reviewer and a top 10% reviewer for ICML 2021 <Link>! Really honored!

  • Our paper on bilevel optimization has been accepted by ICML 2021. We provide a comprehensive nonasymptotic analysis and a stochastic solver, stocBiO, for this area. Check out our paper and code!

  • Our paper on the theoretical analysis of MAML has been accepted by JMLR in 2021.

  • Our paper on the generalization error of GANs has been accepted by IEEE Transactions on Information Theory (TIT) in 2021.

  • I am really honored to receive the 2020/21 Presidential Fellowship at OSU, the highest award for graduate students!

  • I was recognized as a top 10% high-scoring reviewer for NeurIPS 2020. Thanks!