Generalized Uniformly Optimal Methods for Nonlinear Programming. Ghadimi, S., Lan, G., & Zhang, H. arXiv:1508.07384 [math, stat], September 2015.
Paper: http://arxiv.org/abs/1508.07384
Uniformly optimal convex programming algorithms have been designed to achieve the optimal complexity bounds for convex optimization problems regardless of the level of smoothness of the objective function. In this paper, we present a generic framework to extend such existing algorithms to solve more general nonlinear, possibly nonconvex, optimization problems. The basic idea is to incorporate a local search step (gradient descent or Quasi-Newton iteration) into the uniformly optimal convex programming methods, and then enforce a monotone decreasing property of the function values computed along the trajectory. While optimal methods for nonconvex programming are not generally known, algorithms of these types will achieve the best known complexity for nonconvex problems, and the optimal complexity for convex ones without requiring any problem parameters. As a consequence, we can have a unified treatment for a general class of nonlinear programming problems regardless of their convexity and smoothness level. In particular, we show that the accelerated gradient and level methods, both originally designed for solving convex optimization problems only, can be used for solving both convex and nonconvex problems uniformly. In a similar vein, we show that some well-studied techniques for nonlinear programming, e.g., Quasi-Newton iteration, can be embedded into optimal convex optimization algorithms to possibly further enhance their numerical performance. Our theoretical and algorithmic developments are complemented by some promising numerical results obtained for solving a few important nonconvex and nonlinear data analysis problems in the literature.
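The basic idea described in the abstract, interleaving an accelerated (uniformly optimal) update with a gradient-descent local-search step and keeping whichever candidate has the smaller objective value, can be sketched as follows. This is only an illustrative sketch, not the authors' exact scheme: the function names, the fixed step size 1/L, and the Nesterov-style momentum schedule are assumptions made for the example.

import numpy as np

def monotone_accelerated_gradient(f, grad, x0, L=1.0, max_iter=500, tol=1e-6):
    """Illustrative sketch (assumed, not the paper's exact method): compute both
    a Nesterov-style accelerated candidate and a plain gradient-descent
    'local search' candidate, then keep the one with the lower objective value.
    With a valid Lipschitz constant L, this keeps f(x_k) non-increasing along
    the trajectory even when f is nonconvex."""
    x = np.asarray(x0, dtype=float)
    y = x.copy()        # extrapolation (momentum) point
    t = 1.0             # momentum parameter
    for _ in range(max_iter):
        g = grad(y)
        if np.linalg.norm(g) <= tol:
            break
        x_acc = y - g / L                                # accelerated candidate
        x_ls = x - grad(x) / L                           # gradient-descent local-search candidate
        x_new = x_acc if f(x_acc) <= f(x_ls) else x_ls   # monotone safeguard
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)    # momentum extrapolation
        x, t = x_new, t_new
    return x

# Toy usage on a smooth nonconvex test function (hypothetical example):
# f(x) = 0.5*||x||^2 + sum(cos(3x)), whose gradient is x - 3*sin(3x) and whose
# gradient Lipschitz constant is at most 10.
if __name__ == "__main__":
    f = lambda x: 0.5 * np.dot(x, x) + np.sum(np.cos(3.0 * x))
    grad = lambda x: x - 3.0 * np.sin(3.0 * x)
    x_star = monotone_accelerated_gradient(f, grad, x0=np.ones(5), L=10.0)
    print(f(x_star))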
@article{ghadimi_generalized_2015,
	title = {Generalized {Uniformly} {Optimal} {Methods} for {Nonlinear} {Programming}},
	url = {http://arxiv.org/abs/1508.07384},
	abstract = {Uniformly optimal convex programming algorithms have been designed to achieve the optimal complexity bounds for convex optimization problems regardless of the level of smoothness of the objective function. In this paper, we present a generic framework to extend such existing algorithms to solve more general nonlinear, possibly nonconvex, optimization problems. The basic idea is to incorporate a local search step (gradient descent or Quasi-Newton iteration) into the uniformly optimal convex programming methods, and then enforce a monotone decreasing property of the function values computed along the trajectory. While optimal methods for nonconvex programming are not generally known, algorithms of these types will achieve the best known complexity for nonconvex problems, and the optimal complexity for convex ones without requiring any problem parameters. As a consequence, we can have a unified treatment for a general class of nonlinear programming problems regardless of their convexity and smoothness level. In particular, we show that the accelerated gradient and level methods, both originally designed for solving convex optimization problems only, can be used for solving both convex and nonconvex problems uniformly. In a similar vein, we show that some well-studied techniques for nonlinear programming, e.g., Quasi-Newton iteration, can be embedded into optimal convex optimization algorithms to possibly further enhance their numerical performance. Our theoretical and algorithmic developments are complemented by some promising numerical results obtained for solving a few important nonconvex and nonlinear data analysis problems in the literature.},
	language = {en},
	urldate = {2022-01-30},
	journal = {arXiv:1508.07384 [math, stat]},
	author = {Ghadimi, Saeed and Lan, Guanghui and Zhang, Hongchao},
	month = sep,
	year = {2015},
	note = {arXiv: 1508.07384},
	keywords = {Mathematics - Optimization and Control, Statistics - Machine Learning},
}
