Robust Accelerated Gradient Methods for Smooth Strongly Convex Functions

dc.creator Aybat, Necdet Serhat
dc.creator Fallah, Alireza
dc.creator Gürbüzbalaban, Mert
dc.creator Ozdaglar, Asuman
dc.date 2021-10-27T19:56:30Z
dc.date 2020
dc.date 2021-02-03T16:32:27Z
dc.date.accessioned 2023-03-01T17:59:33Z
dc.date.available 2023-03-01T17:59:33Z
dc.identifier https://hdl.handle.net/1721.1/133761
dc.identifier.uri http://localhost:8080/xmlui/handle/CUHPOERS/278334
dc.description © 2020 Society for Industrial and Applied Mathematics. We study the trade-offs between convergence rate and robustness to gradient errors in designing a first-order algorithm. We focus on gradient descent and accelerated gradient (AG) methods for minimizing strongly convex functions when the gradient has random errors in the form of additive white noise. With gradient errors, the function values of the iterates need not converge to the optimal value; hence, we define the robustness of an algorithm to noise as the ratio of the asymptotic expected suboptimality of the iterate sequence to the input noise power. For this robustness measure, we provide exact expressions for the quadratic case using tools from robust control theory and tight upper bounds for the smooth strongly convex case using Lyapunov functions certified through matrix inequalities. We use these characterizations within an optimization problem that selects the parameters of each algorithm to achieve a particular trade-off between rate and robustness. Our results show that AG can achieve acceleration while being more robust to random gradient errors. This behavior is quite different from that previously reported in the deterministic gradient noise setting. We also establish some connections between the robustness of an algorithm and how quickly it can converge back to the optimal solution if it is perturbed from the optimal point with deterministic noise. Our framework also leads to practical algorithms that can perform better than other state-of-the-art methods in the presence of random gradient noise.
dc.format application/pdf
dc.language en
dc.publisher Society for Industrial & Applied Mathematics (SIAM)
dc.relation 10.1137/19M1244925
dc.relation SIAM Journal on Optimization
dc.rights Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.
dc.source SIAM
dc.title Robust Accelerated Gradient Methods for Smooth Strongly Convex Functions
dc.type Article
dc.type http://purl.org/eprint/type/JournalArticle
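
To make the robustness measure described in the abstract concrete, the following is a minimal, illustrative sketch (not the paper's algorithm, analysis, or parameter-selection procedure): it simulates gradient descent and Nesterov's accelerated gradient method with additive white Gaussian gradient noise on a strongly convex quadratic, and estimates the asymptotic expected suboptimality by averaging f(x_k) - f* after a burn-in period. The problem data, step sizes, momentum parameter, noise level, and iteration counts are arbitrary choices made for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Strongly convex quadratic f(x) = 0.5 * x^T Q x, minimized at x* = 0 with f* = 0.
# mu and L are the strong convexity and smoothness constants (eigenvalue bounds of Q).
n, mu, L = 20, 1.0, 100.0
Q = np.diag(np.linspace(mu, L, n))
f = lambda x: 0.5 * x @ Q @ x
grad = lambda x: Q @ x

sigma = 0.1                    # std. dev. of the additive white gradient noise (illustrative)
iters, burn_in = 20000, 10000

def asymptotic_suboptimality(method):
    """Run GD or Nesterov AG with noisy gradients; average f(x_k) - f* after burn-in."""
    x = rng.standard_normal(n)
    y = x.copy()
    beta = (np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))  # standard AG momentum
    tail = []
    for k in range(iters):
        if method == "gd":
            g = grad(x) + sigma * rng.standard_normal(n)   # noisy gradient
            x = x - (1.0 / L) * g                          # step size 1/L
        else:  # "ag"
            g = grad(y) + sigma * rng.standard_normal(n)
            x_next = y - (1.0 / L) * g
            y = x_next + beta * (x_next - x)
            x = x_next
        if k >= burn_in:
            tail.append(f(x))
    return np.mean(tail)       # crude estimate of lim_k E[f(x_k) - f*]

print("GD:", asymptotic_suboptimality("gd"))
print("AG:", asymptotic_suboptimality("ag"))

For the quadratic case the paper derives exact expressions for this quantity using robust control tools; the Monte Carlo average above is only a numerical stand-in for that limit.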


Files in this item

File            Size      Format
19m1244925.pdf  775.9 KB  application/pdf
