DTC Seminar Series
Distributed Stochastic Optimization with Variance Reduction and Gradient Tracking
Tuesday, October 15, 2019
Seminar 1:00 p.m.
Reception 2:00 p.m.
401/402 Walter Library
There is an increasing need to perform large-scale machine learning and optimization over distributed networks, e.g., in the context of multi-agent learning and federated optimization. It is well recognized that a careful balance of local computation and global communication is necessary in the distributed setting for empirical risk minimization. In this talk, we first consider a natural framework for distributing popular stochastic variance reduced methods in the master/slave setting, and establish its convergence guarantees under simple and intuitive assumptions that capture the effect of local data heterogeneity. Local regularization will be introduced to ensure convergence when data are highly unbalanced. Next, we move to the decentralized network setting, where each agent aggregates information only from its neighbors over a network topology. We first develop a communication-efficient approximate Newton-type algorithm, obtained by adjusting the global gradient estimate of the popular DANE algorithm (Shamir et al., 2014) via a tracking term. The same idea can be applied in a systematic manner to obtain decentralized versions of other master/slave distributed algorithms, such as those employing variance reduction. Theoretical convergence guarantees and numerical evidence are provided to demonstrate the appealing performance of our algorithms over competitive baselines, in terms of both communication and computation efficiency.
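To give a flavor of the gradient-tracking idea mentioned in the abstract, the sketch below implements a generic decentralized gradient-tracking update (in the style of DIGing/NEXT-type methods), not the speaker's exact algorithm. The toy problem, agent count, step size, and ring-topology mixing matrix are all illustrative assumptions; each agent maintains a local iterate and a tracker of the network-average gradient.

```python
import numpy as np

# Illustrative setup (not from the talk): 4 agents, each holding a
# local least-squares objective f_i(x) = 0.5 * ||A_i x - b_i||^2.
rng = np.random.default_rng(0)
n_agents, d = 4, 3
A = [rng.standard_normal((5, d)) for _ in range(n_agents)]
b = [rng.standard_normal(5) for _ in range(n_agents)]

def grad(i, x):
    """Local gradient of agent i's objective."""
    return A[i].T @ (A[i] @ x - b[i])

# Doubly stochastic mixing matrix for a ring topology: each agent
# averages with its two neighbors.
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, i] = 0.5
    W[i, (i - 1) % n_agents] = 0.25
    W[i, (i + 1) % n_agents] = 0.25

x = np.zeros((n_agents, d))                              # local iterates
y = np.stack([grad(i, x[i]) for i in range(n_agents)])   # gradient trackers
alpha = 0.02                                             # assumed step size

for _ in range(3000):
    x_new = W @ x - alpha * y   # mix with neighbors, step along the tracker
    g_old = np.stack([grad(i, x[i]) for i in range(n_agents)])
    g_new = np.stack([grad(i, x_new[i]) for i in range(n_agents)])
    y = W @ y + g_new - g_old   # tracker follows the average gradient
    x = x_new

# Compare against the centralized minimizer of sum_i f_i.
A_all, b_all = np.vstack(A), np.concatenate(b)
x_star = np.linalg.lstsq(A_all, b_all, rcond=None)[0]
print(np.max(np.abs(x - x_star)))  # every agent ends near the global solution
```

The correction term `g_new - g_old` is what distinguishes gradient tracking from plain decentralized gradient descent: it lets each tracker `y[i]` converge to the average gradient across agents, so all agents can reach the exact global minimizer despite heterogeneous local data.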
Dr. Yuejie Chi received the Ph.D. degree in Electrical Engineering from Princeton University in 2012, and the B.E. (Hon.) degree in Electrical Engineering from Tsinghua University, Beijing, China, in 2007. Since 2018, she has been the Robert E. Doherty Career Development Professor and Associate Professor in the Department of Electrical and Computer Engineering at Carnegie Mellon University, after spending five years at The Ohio State University. She is interested in the mathematics of data science, which takes advantage of structure and geometry to minimize complexity and improve performance in decision making. Specific topics include mathematical and statistical signal processing, machine learning, large-scale optimization, and sampling theory, with applications in sensing, imaging, and data science. She is a recipient of the PECASE Award, NSF CAREER Award, AFOSR YIP Award, ONR YIP Award, and IEEE SPS Young Author Paper Award.
- "Array processing and beamforming with Kronecker products," Israel Cohen, Technion – Israel Institute of Technology, Haifa, Israel, Oct. 14, 3:00 pm, 401/402 Walter Library
- "Distributed Stochastic Optimization with Variance Reduction and Gradient Tracking," Yuejie Chi, Carnegie Mellon, Oct. 15, 1:00 pm, 401/402 Walter Library
- "Speech technologies at Google: An overview," Pedro J. Moreno, Research Engineering Director, Google, Oct. 18, 3:00 pm, 101 Walter Library
DTC Professor Tian He named 2018 ACM Fellow
Professor Tian He has been named a 2018 Association for Computing Machinery (ACM) Fellow. He was selected for his contributions to wireless networks, sensing systems, and the Internet of Things (IoT), areas in which he has made many influential and landmark contributions.
Digital Technology Center
499 Walter Library, 117 Pleasant Street SE, Minneapolis, MN 55455
P: (612) 624-9510