Luo Luo
Assistant Professor
School of Data Science
Fudan University
Email: luoluo (AT) fudan (DOT) edu (DOT) cn
My research interests include machine learning, optimization and linear algebra.
Teaching
- Fall 2023: Multivariate Statistical Analysis (DATA 130044)
- Spring 2023: Optimization Theory (DATA 620020)
- Fall 2022: Multivariate Statistical Analysis (DATA 130044)
- Spring 2022: Multivariate Statistics (MATH 620156)
- Fall 2020: Calculus IB (MATH 1013)
Preprints
- Haikuo Yang, Luo Luo, Chris Junchi Li and Michael I. Jordan.
Accelerating Inexact HyperGradient Descent for Bilevel Optimization.
arXiv preprint arXiv:2307.00126, 2023.
[pdf]
- Chengchang Liu, Cheng Chen and Luo Luo.
Symmetric Rank-k Methods.
arXiv preprint arXiv:2303.16188, 2023.
[pdf]
- Chengchang Liu and Luo Luo.
Regularized Newton Methods for Monotone Variational Inequalities with Hölder Continuous Jacobians.
arXiv preprint arXiv:2212.07824, 2022.
[pdf]
- Lesi Chen and Luo Luo.
Near-Optimal Algorithms for Making the Gradient Small in Stochastic Minimax Optimization.
arXiv preprint arXiv:2208.05925, 2022.
[pdf]
Conference Publications
- Chengchang Liu, Cheng Chen, Luo Luo and John C.S. Lui.
Block Broyden's Methods for Solving Nonlinear Equations.
Advances in Neural Information Processing Systems (NeurIPS), 2023.
- Chengchang Liu, Lesi Chen, Luo Luo and John C.S. Lui.
Communication Efficient Distributed Newton Method with Fast Convergence Rates.
ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD), 2023.
[pdf]
- Lesi Chen, Jing Xu and Luo Luo.
Faster Gradient-Free Algorithms for Nonsmooth Nonconvex Stochastic Optimization.
International Conference on Machine Learning (ICML), 2023.
[pdf]
- Chengchang Liu and Luo Luo.
Quasi-Newton Methods for Saddle Point Problems.
Advances in Neural Information Processing Systems (NeurIPS), 2022.
[pdf] [Longer Version]
- Lesi Chen, Boyuan Yao and Luo Luo.
Faster Stochastic Algorithms for Minimax Optimization under Polyak-Łojasiewicz Condition.
Advances in Neural Information Processing Systems (NeurIPS), 2022.
[pdf]
- Luo Luo, Yujun Li and Cheng Chen.
Finding Second-Order Stationary Points in Nonconvex-Strongly-Concave Minimax Optimization.
Advances in Neural Information Processing Systems (NeurIPS), 2022.
[pdf]
- Chengchang Liu, Shuxian Bi, Luo Luo and John C.S. Lui.
Partial-Quasi-Newton Methods: Efficient Algorithms for Minimax Optimization Problems with Unbalanced Dimensionality.
ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD), 2022. Best Paper Runner-Up.
[pdf]
- Luo Luo, Cheng Chen, Guangzeng Xie and Haishan Ye.
Revisiting Co-Occurring Directions: Sharper Analysis and Efficient Algorithm for Sparse Matrices.
AAAI Conference on Artificial Intelligence (AAAI), 2021.
[pdf]
- Luo Luo, Haishan Ye, Zhichao Huang and Tong Zhang.
Stochastic Recursive Gradient Descent Ascent for Stochastic Nonconvex-Strongly-Concave Minimax Problems.
Advances in Neural Information Processing Systems (NeurIPS), 2020.
[pdf]
- Cheng Chen, Luo Luo, Weinan Zhang and Yong Yu.
Efficient Projection-Free Algorithms for Saddle Point Problems.
Advances in Neural Information Processing Systems (NeurIPS), 2020.
[pdf]
- Haishan Ye, Ziang Zhou, Luo Luo and Tong Zhang.
Decentralized Accelerated Proximal Gradient Descent.
Advances in Neural Information Processing Systems (NeurIPS), 2020.
[pdf]
- Guangzeng Xie, Luo Luo, Yijiang Lian and Zhihua Zhang.
Lower Complexity Bounds for Finite-Sum Convex-Concave Minimax Optimization Problems.
International Conference on Machine Learning (ICML), 2020.
[pdf]
- Cheng Chen, Luo Luo, Weinan Zhang, Yong Yu and Yijiang Lian.
Efficient and Robust High-Dimensional Linear Contextual Bandits.
International Joint Conference on Artificial Intelligence (IJCAI), 2020.
[pdf]
- Luo Luo, Wenpeng Zhang, Zhihua Zhang, Wenwu Zhu, Tong Zhang and Jian Pei.
Sketched Follow-The-Regularized-Leader for Online Factorization Machine.
ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD), 2018.
[pdf]
- Haishan Ye, Luo Luo and Zhihua Zhang.
Approximate Newton Methods and Their Local Convergence.
International Conference on Machine Learning (ICML), 2017.
[pdf]
- Zihao Chen, Luo Luo and Zhihua Zhang.
Communication Lower Bounds for Distributed Convex Optimization: Partition Data on Features.
AAAI Conference on Artificial Intelligence (AAAI), 2017.
[pdf]
- Tianfan Fu, Luo Luo and Zhihua Zhang.
Quasi-Newton Hamiltonian Monte Carlo.
Conference on Uncertainty in Artificial Intelligence (UAI), 2016.
[pdf]
- Qiaomin Ye, Luo Luo and Zhihua Zhang.
Frequent Direction Algorithms for Approximate Matrix Multiplication with Applications in CCA.
International Joint Conference on Artificial Intelligence (IJCAI), 2016.
[pdf]
- Luo Luo, Yubo Xie, Zhihua Zhang and Wu-Jun Li.
Support Matrix Machines.
International Conference on Machine Learning (ICML), 2015.
[pdf]
- Zhiquan Liu, Luo Luo and Wu-Jun Li.
Robust Crowdsourced Learning.
IEEE International Conference on Big Data, 2013.
[pdf]
Journal Publications
- Haishan Ye, Luo Luo, Ziang Zhou and Tong Zhang.
Multi-Consensus Decentralized Accelerated Gradient Descent.
Journal of Machine Learning Research (JMLR), 2023.
- Haishan Ye, Luo Luo and Zhihua Zhang.
Approximate Newton Methods.
Journal of Machine Learning Research (JMLR), 22(66):1-41, 2021.
[pdf]
- Haishan Ye, Luo Luo and Zhihua Zhang.
Accelerated Proximal Sub-Sampled Newton Method.
IEEE Transactions on Neural Networks and Learning Systems (TNNLS), 2020.
[pdf]
- Haishan Ye, Luo Luo and Zhihua Zhang.
Nesterov's Acceleration for Approximate Newton.
Journal of Machine Learning Research (JMLR), 21(142):1-37, 2020.
[pdf]
- Luo Luo, Cheng Chen, Zhihua Zhang, Wu-Jun Li and Tong Zhang.
Robust Frequent Directions with Application in Online Learning.
Journal of Machine Learning Research (JMLR), 20(45):1-41, 2019.
[pdf]
- Haishan Ye, Guangzeng Xie, Luo Luo and Zhihua Zhang.
Fast Stochastic Second-Order Method Logarithmic in Condition Number.
Pattern Recognition, 88:629-642, 2019.
[pdf]
- Shusen Wang, Luo Luo and Zhihua Zhang.
SPSD Matrix Approximation via Column Selection: Theories, Algorithms and Extensions.
Journal of Machine Learning Research (JMLR), 17(49):1-49, 2016.
[pdf]