We propose a new asynchronous parallel block-descent algorithmic framework for the minimization of the sum of a smooth nonconvex function and a nonsmooth convex one, subject to both convex and nonconvex constraints. The proposed framework hinges on successive convex approximation techniques and a novel probabilistic model that captures key elements of modern computational architectures and asynchronous implementations in a more faithful way than current state-of-the-art models. Other key features of the framework are: (1) it covers in a unified way several specific solution methods; (2) it accommodates a variety of possible parallel computing architectures; and (3) it can deal with nonconvex constraints. Almost sure convergence to stationary solutions is proved, and theoretical complexity results are provided, showing nearly ideal linear speedup when the number of workers is not too large.
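As a rough illustration only (the notation below is generic and is not taken from the paper), the problem class and one successive-convex-approximation block update can be sketched as follows, assuming, as is common in block methods, that g and the feasible set are block separable; here \tilde{f}_i(\cdot\,; y) denotes a strongly convex surrogate of f in block i built at the point y, x^{k-d^k} is a possibly stale copy of the iterate read by a worker, and \gamma is a stepsize. The nonconvex constraints and the paper's probabilistic asynchrony model are omitted from this sketch.

\begin{align*}
& \min_{x \in \mathcal{X}} \; F(x) \;\triangleq\; f(x) + g(x),
  \qquad f \text{ smooth, possibly nonconvex}, \;\; g \text{ convex, possibly nonsmooth}, \\
& \hat{x}_i \;\in\; \operatorname*{argmin}_{x_i \in \mathcal{X}_i}
  \; \tilde{f}_i\!\left(x_i;\, x^{k-d^k}\right) + g_i(x_i)
  \qquad \text{(convexified subproblem for the selected block } i\text{)}, \\
& x_i^{k+1} \;=\; x_i^{k} + \gamma \left(\hat{x}_i - x_i^{k}\right),
  \qquad x_j^{k+1} = x_j^{k} \;\; \text{for } j \neq i .
\end{align*}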
Publication details
2020, Mathematical Programming, vol. 184, pp. 121-154
Asynchronous parallel algorithms for nonconvex optimization (01a Journal article)
Cannelli L., Facchinei F., Kungurtsev V., Scutari G.
Research group: Continuous Optimization