Shi Pu’s Homepage
Welcome! I am an assistant professor in the School of Data Science at The Chinese University of Hong Kong, Shenzhen.
Research Interests
Distributed optimization, machine learning, multi-agent networks.
See here for our research topics and here for our publications.
Openings
We are always looking for self-motivated students with a solid mathematical background and research interests in optimization, machine learning, distributed algorithms, game theory, etc. Some information can be found here (in Chinese).
Recent News
- September 2024: Our paper, CEDAS: A Compressed Decentralized Stochastic Gradient Method with Improved Convergence (with Kun Huang), has been accepted for publication in IEEE Transactions on Automatic Control as a full paper!
- September 2024: Our paper, B-ary Tree Push-Pull Method is Provably Efficient for Decentralized Learning on Heterogeneous Data (with Runze You), has been accepted by NeurIPS 2024!
- August 2024: Our paper, A Robust Compressed Push-Pull Method for Decentralized Nonconvex Optimization (with Yiwei Liao, Zhuorui Li and Tsung-Hui Chang), is online!
- April 2024: Our paper, Distributed Stochastic Optimization under a General Variance Condition (with Kun Huang and Xiao Li), has been accepted for publication in IEEE Transactions on Automatic Control as a full paper!
- April 2024: Our paper, B-ary Tree Push-Pull Method is Provably Efficient for Decentralized Learning on Heterogeneous Data (with Runze You), is online!
- February 2024: Our paper, An Accelerated Distributed Stochastic Gradient Method with Momentum (with Kun Huang and Angelia Nedić), is online!
- December 2023: Our paper, Provably Accelerated Decentralized Gradient Method Over Unbalanced Directed Graphs (with Zhuoqing Song, Lei Shi and Ming Yan), has been accepted for publication in SIAM Journal on Optimization!
- August 2023: I became an IEEE Senior Member.
- August 2023: I gave a talk titled “Distributed Stochastic Gradient Methods over Networks” at the 14th International Conference on Numerical Optimization and Numerical Linear Algebra.
- July 2023: Our paper, A Linearly Convergent Robust Compressed Push-Pull Method for Decentralized Optimization (with Yiwei Liao and Zhuorui Li), has been accepted by the 2023 IEEE Conference on Decision and Control!
- June 2023: Our new paper, Distributed Random Reshuffling Methods with Improved Convergence (with Linli Zhou and Kun Huang), is online!
- June 2023: Our paper, Optimal Gradient Tracking for Decentralized Optimization (with Zhuoqing Song, Lei Shi and Ming Yan), has been accepted for publication in Mathematical Programming!
- May 2023: I gave an invited talk titled “Asymptotic Network Independence in Distributed Stochastic Gradient Methods” at MOS2023. Thanks to Prof. Xiangfeng Wang, Prof. Hongjin He and Prof. Wenxing Zhang for organizing!
- March 2023: Our paper, A Linearly Convergent Robust Compressed Push-Pull Method for Decentralized Optimization (with Yiwei Liao and Zhuorui Li), is online!
- March 2023: Our paper, Distributed Random Reshuffling over Networks (with Kun Huang, Xiao Li, Andre Milzarek, and Junwen Qiu), has been accepted for publication in IEEE Transactions on Signal Processing!
- January 2023: Our paper, Distributed Stochastic Optimization under a General Variance Condition (with Kun Huang and Xiao Li), is online!
- January 2023: Our paper, CEDAS: A Compressed Decentralized Stochastic Gradient Method with Improved Convergence (with Kun Huang), is online!
- December 2022: Our paper, Private and Accurate Decentralized Optimization via Encrypted and Structured Functional Perturbation (with Yijie Zhou), has been accepted for publication in IEEE Control Systems Letters!
- November 2022: On November 12, I was invited to give an online talk titled “Asymptotic Network Independence in Distributed Stochastic Gradient Methods” at the Forum on Collaborative Optimization and Game Theory for Intelligent Unmanned Systems (智能无人系统协同优化与博弈论坛). Thanks to Prof. Yanan Zhu!
- September 2022: Our paper, Private and Accurate Decentralized Optimization via Encrypted and Structured Functional Perturbation (with Yijie Zhou), is online!
- August 2022: Congratulations to Kun Huang for winning the presentation runner-up prize at The Second Doctoral and Postdoctoral Daoyuan Academic Forum!
- August 2022: Our paper, Improving the Transient Times for Distributed Stochastic Gradient Methods (with Kun Huang), has been accepted for publication as a full paper in IEEE Transactions on Automatic Control!
- July 2022: I became a member of the Conference Editorial Board (CEB) of the IEEE Control Systems Society (CSS).
- July 2022: We organized an invited session titled “Recent Advances in Distributed Optimization and Learning” at the Chinese Control Conference (CCC2022) (with Jinming Xu and Huan Li)!
- May 2022: Our paper, A Compressed Gradient Tracking Method for Decentralized Optimization with Linear Convergence (with Yiwei Liao, Zhuorui Li and Kun Huang), has been accepted for publication in IEEE Transactions on Automatic Control!
- March 2022: Our paper, Compressed Gradient Tracking for Decentralized Optimization over General Directed Networks (with Zhuoqing Song, Lei Shi and Ming Yan), has been accepted for publication in IEEE Transactions on Signal Processing!
- December 2021: Our paper, Distributed Random Reshuffling over Networks (with Kun Huang, Xiao Li, Andre Milzarek, and Junwen Qiu), is online!
- November 2021: On November 14, I was invited to give an online talk titled “Asymptotic Network Independence in Distributed Stochastic Gradient Methods” at the CAA YeS Forum. Thanks to Prof. Xiuxian Li for hosting!
- October 2021: Our paper, A Sharp Estimate on the Transient Time of Distributed Stochastic Gradient Descent (with A. Olshevsky and I.C. Paschalidis), has been accepted for publication in IEEE Transactions on Automatic Control!
- October 2021: Our paper, Optimal Gradient Tracking for Decentralized Optimization (with Zhuoqing Song, Lei Shi and Ming Yan), is online! In this paper, we propose an Optimal Gradient Tracking (OGT) method that simultaneously achieves the optimal gradient computation complexity and the optimal communication complexity for smooth and strongly convex objective functions!
- July 2021: I gave a talk titled “Asymptotic Network Independence in Distributed Stochastic Gradient Methods” at the SIAM Conference on Optimization (OP21).
- July 2021: Our paper, Provably Accelerated Decentralized Gradient Method Over Unbalanced Directed Graphs (with Zhuoqing Song, Lei Shi and Ming Yan), is online!
- June 2021: Our paper, Compressed Gradient Tracking for Decentralized Optimization over General Directed Networks (with Zhuoqing Song, Lei Shi and Ming Yan), is online!
- May 2021: Our paper, Improving the Transient Times for Distributed Stochastic Gradient Methods (with Kun Huang), is online!
- May 2021: We are organizing a minisymposium for the SIAM Conference on Optimization (OP21) this July (with Jinming Xu, Jie Lu and Hoi To Wai).
- March 2021: Our paper, Compressed Gradient Tracking Methods for Decentralized Optimization with Linear Convergence (with Yiwei Liao, Zhuorui Li and Kun Huang), is online!
- March 2021: We are organizing an invited session for CDC 2021 this December in Austin, Texas (with Jinming Xu and Ming Yan).
- February 2021: On February 3, I was invited to give an online talk at Peking University. Thanks to Prof. Zhongkui Li for hosting!