Preprints
Yiwei Liao*, Zhuorui Li, Shi Pu and Tsung-Hui Chang, A Robust Compressed Push-Pull Method for Decentralized Nonconvex Optimization, submitted.
Kun Huang*, Shi Pu and Angelia Nedić, An Accelerated Distributed Stochastic Gradient Method with Momentum, submitted.
Kun Huang*, Linli Zhou* and Shi Pu, Distributed Random Reshuffling Methods with Improved Convergence, submitted.
Journal Papers
Kun Huang* and Shi Pu, CEDAS: A Compressed Decentralized Stochastic Gradient Method with Improved Convergence, IEEE Transactions on Automatic Control (Full Paper), accepted.
Kun Huang*, Xiao Li and Shi Pu, Distributed Stochastic Optimization under a General Variance Condition, IEEE Transactions on Automatic Control (Full Paper), 69(9):6105-6120, 2024.
Zhuoqing Song*, Lei Shi, Shi Pu and Ming Yan, Provably Accelerated Decentralized Gradient Method Over Unbalanced Directed Graphs, SIAM Journal on Optimization, 34(1):1131-1156, 2024.
Zhuoqing Song*, Lei Shi, Shi Pu and Ming Yan, Optimal Gradient Tracking for Decentralized Optimization, Mathematical Programming, 207:1-53, 2024.
Kun Huang*, Xiao Li, Andre Milzarek, Shi Pu and Junwen Qiu, Distributed Random Reshuffling over Networks, IEEE Transactions on Signal Processing, 71:1143-1158, 2023.
Yijie Zhou* and Shi Pu, Private and Accurate Decentralized Optimization via Encrypted and Structured Functional Perturbation, IEEE Control Systems Letters, 7:1339-1344, 2023.
Kun Huang* and Shi Pu, Improving the Transient Times for Distributed Stochastic Gradient Methods, IEEE Transactions on Automatic Control (Full Paper), 68(7):4127-4142, 2023.
Zhuoqing Song*, Lei Shi, Shi Pu and Ming Yan, Compressed Gradient Tracking for Decentralized Optimization over General Directed Networks, IEEE Transactions on Signal Processing, 70:1775-1787, 2022.
Yiwei Liao*, Zhuorui Li*, Kun Huang* and Shi Pu, A Compressed Gradient Tracking Method for Decentralized Optimization with Linear Convergence, IEEE Transactions on Automatic Control, 67(10):5622-5629, 2022.
Shi Pu, Alex Olshevsky and Ioannis Ch. Paschalidis, A Sharp Estimate on the Transient Time of Distributed Stochastic Gradient Descent, IEEE Transactions on Automatic Control (Full Paper), 67(11):5900-5915, 2022.
Shi Pu and Angelia Nedić, Distributed Stochastic Gradient Tracking Methods, Mathematical Programming, 187(1):409-457, 2021. [arXiv]
Shi Pu, Wei Shi (co-first), Jinming Xu and Angelia Nedić, Push-Pull Gradient Methods for Distributed Optimization in Networks, IEEE Transactions on Automatic Control (Full Paper), 66(1):1-16, 2021. [arXiv]
Ran Xin, Shi Pu, Angelia Nedić and Usman Khan, A General Framework for Decentralized Optimization with First-order Methods, Proceedings of the IEEE, 108(11):1869-1889, 2020.
Shi Pu, Alex Olshevsky and Ioannis Ch. Paschalidis, Asymptotic Network Independence in Distributed Stochastic Optimization for Machine Learning: Examining Distributed and Centralized Stochastic Gradient Descent, IEEE Signal Processing Magazine, 37(3):114-122, 2020. [arXiv]
Shi Pu, J. Joaquin Escudero-Garzás, Alfredo Garcia and Shahin Shahrampour, An Online Mechanism for Resource Allocation in Networks, IEEE Transactions on Control of Network Systems, 7(3):1140-1150, 2020.
Shi Pu and Alfredo Garcia, Swarming for Faster Convergence in Stochastic Optimization, SIAM Journal on Control and Optimization, 56(4):2997-3020, 2018. [arXiv]
Shi Pu and Alfredo Garcia, A Flocking-based Approach for Distributed Stochastic Optimization, Operations Research, 66(1):267-281, 2018. [arXiv]
Shi Pu, Alfredo Garcia and Zongli Lin, Noise Reduction by Swarming in Social Foraging, IEEE Transactions on Automatic Control, 61(12):4007-4013, 2016.
Book Chapter
Alfredo Garcia, Bingyu Wang* and Shi Pu, Distributed Optimization, in: Pardalos, P.M. and Prokopyev, O.A. (eds), Encyclopedia of Optimization, Springer, Cham.
Conference Papers
Runze You* and Shi Pu, B-ary Tree Push-Pull Method is Provably Efficient for Decentralized Learning on Heterogeneous Data, Advances in Neural Information Processing Systems (NeurIPS), 2024.
Yiwei Liao*, Zhuorui Li and Shi Pu, A Linearly Convergent Robust Compressed Push-Pull Method for Decentralized Optimization, 2023 IEEE 62nd Conference on Decision and Control (CDC).
Shi Pu, A Robust Gradient Tracking Method for Distributed Optimization over Directed Networks, 2020 IEEE 59th Conference on Decision and Control (CDC).
Shi Pu and Angelia Nedić, A Distributed Stochastic Gradient Tracking Method, 2018 IEEE 57th Conference on Decision and Control (CDC).
Shi Pu, Wei Shi, Jinming Xu and Angelia Nedić, A Push-Pull Gradient Method for Distributed Optimization in Networks, 2018 IEEE 57th Conference on Decision and Control (CDC).
* (Co-)supervised student/postdoc.