Welcome to Kaiyi Ji's Homepage
About Me
I am an assistant professor in the Department of Computer Science and Engineering at the University at Buffalo, The State University of New York. I received my Ph.D. from the Department of Electrical and Computer Engineering at The Ohio State University in December 2021, advised by Prof. Yingbin Liang. In 2022, I was a postdoctoral research fellow in the Department of Electrical Engineering and Computer Science at the University of Michigan, Ann Arbor, working with Prof. Lei Ying. I was also a visiting student research collaborator in the Department of Electrical Engineering at Princeton University.
Previously, I received my B.S. degree from the University of Science and Technology of China in 2016.
Prospective students: I do not have open Ph.D. positions at the moment. However, interns and visiting students are welcome! Please send me an email with your CV and transcript, or fill out this form.
Research
I work at the intersection of optimization, machine learning, and wireless networking. In particular, I study bilevel optimization, multi-task learning, transfer learning (meta-learning, continual learning, machine unlearning), distributed learning over networks, and stochastic optimization and control, together with their applications in signal processing, communication, and image processing. Here are selected publications showcasing my current interests:
Continual Learning/Machine Unlearning
Multi-Objective/Task Learning
Bilevel Optimization: Theory and Applications
Imperative Learning: A Self-supervised Neural-Symbolic Learning Framework for Robot Autonomy Chen Wang, Kaiyi Ji, Junyi Geng, Zhongqiang Ren, Taimeng Fu, Fan Yang, Yifan Guo, Haonan He, Xiangyu Chen, Zitong Zhan, Qiwei Du, Shaoshu Su, Bowen Li, Yuheng Qiu, Yi Du, Qihang Li, Yifan Yang, Xiao Lin, Zhipeng Zhao.
Achieving O(\epsilon^{-1.5}) Complexity in Hessian/Jacobian-free Stochastic Bilevel Optimization Yifan Yang, Peiyao Xiao, Kaiyi Ji. Conference on Neural Information Processing Systems (NeurIPS) 2023.
Will Bilevel Optimizers Benefit from Loops Kaiyi Ji, Mingrui Liu, Yingbin Liang, Lei Ying. Conference on Neural Information Processing Systems (NeurIPS) 2022. (Spotlight)
Lower Bounds and Accelerated Algorithms for Bilevel Optimization Kaiyi Ji, Yingbin Liang. Journal of Machine Learning Research (JMLR) 2022.
Bilevel Optimization: Nonasymptotic Analysis and Faster Algorithms [Code] Kaiyi Ji, Junjie Yang, Yingbin Liang. International Conference on Machine Learning (ICML) 2021.
Distributed Learning over Networks
Recent News!
[Manuscript] 02/2024 Our manuscript “Fair Resource Allocation in Multi-Task Learning” is available online. We connect fair resource allocation in wireless communication with multi-task learning and propose an optimization method named FairGrad. FairGrad implements multiple notions of fairness and achieves state-of-the-art performance among gradient-manipulation MTL methods, with performance guarantees. The idea has also been incorporated into existing MTL methods, with significant improvements observed. Our code is available online.
[Manuscript] 02/2024 Our manuscript “Discriminative Adversarial Unlearning” is available online. We introduce a novel machine unlearning framework built on an attacker network and a defender network: the attacker teases out information about the data to be unlearned, and the defender unlearns so as to protect the network against the attack. We also incorporate a self-supervised objective to address the feature-space discrepancies between the forget and validation sets. The method closely approximates the ideal benchmark of retraining from scratch across various scenarios. Code is available online.
[Talk] 10-11/2023 I was glad to give multiple invited talks at INFORMS 2023 (Phoenix), Asilomar 2023 (Pacific Grove), and MobiHoc 2023 (Washington, DC) on our recent progress in bilevel optimization for continual learning and network resource allocation.