Adaptive sampling line search for local stochastic optimization with integer variables. Ragavan, P. K., Hunter, S. R., Pasupathy, R., & Taaffe, M. R. Mathematical Programming, 196:775–804, 2022. doi:10.1007/s10107-021-01667-6

Abstract: We consider optimization problems with an objective function that is estimable using a Monte Carlo oracle, constraint functions that are known deterministically through a constraint-satisfaction oracle, and integer decision variables. Seeking an appropriately defined local minimum, we propose an iterative adaptive sampling algorithm that, during each iteration, performs a local optimality test using an adaptive statistical procedure, followed by a line search executed along a stochastic descent direction. We prove a number of results. First, the true function values at the iterates generated by the algorithm form an almost-supermartingale process, and the iterates are absorbed with probability one into the set of local minima in finite time. Second, such absorption happens exponentially fast in iteration number and in oracle calls. This result is analogous to non-standard rate guarantees in stochastic continuous optimization contexts involving sharp minima. Third, the oracle complexity of the proposed algorithm increases linearly in the dimensionality of the local neighborhood. As a solver, primarily due to combining line searches that use common random numbers with statistical tests for local optimality, the proposed algorithm is effective on a variety of problems. We illustrate such performance using three problem suites, on problems ranging from 25 to 200 dimensions.
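To make the iteration structure described in the abstract concrete, here is a minimal, heavily simplified sketch of a generic "test neighbors, then line-search along an estimated descent direction with common random numbers" loop on the integer lattice. It is not the authors' algorithm and omits the adaptive sample-size and statistical-testing machinery that the paper analyzes; the toy objective, neighborhood, seeds, and stopping rule are all assumptions made purely for illustration.

```python
# Illustrative sketch only (NOT the paper's algorithm): a generic neighborhood
# test followed by a line search along an estimated descent direction, with
# common random numbers reused across candidate points. All names, the toy
# objective, and the parameter choices below are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

def noisy_f(x, seeds):
    """Monte Carlo oracle: a toy quadratic plus averaged noise driven by the given seeds."""
    true_val = float(np.sum((x - 3) ** 2))
    noise = np.array([np.random.default_rng(int(s)).normal(0.0, 1.0) for s in seeds])
    return true_val + noise.mean()

def feasible(x, lo=-10, hi=10):
    """Constraint-satisfaction oracle: box constraints on the integer lattice."""
    return bool(np.all(x >= lo) and np.all(x <= hi))

def local_search(x0, n_samples=8, max_iters=50):
    x = np.asarray(x0, dtype=int)
    for _ in range(max_iters):
        # Common random numbers: the same seeds evaluate x and all its neighbors.
        seeds = rng.integers(0, 2**31, size=n_samples)
        best_val = noisy_f(x, seeds)
        best_dir = None
        # Crude "optimality test": scan the +/-1 coordinate neighborhood for descent.
        for i in range(len(x)):
            for step in (-1, 1):
                y = x.copy()
                y[i] += step
                if feasible(y):
                    fy = noisy_f(y, seeds)
                    if fy < best_val:
                        best_dir, best_val = y - x, fy
        if best_dir is None:
            return x  # no estimated descent direction: declare a local minimum
        # Line search along the estimated descent direction, doubling the step
        # while the (CRN-based) estimate keeps improving.
        step = 1
        while True:
            y = x + 2 * step * best_dir
            if not feasible(y) or noisy_f(y, seeds) >= best_val:
                break
            step *= 2
        x = x + step * best_dir
    return x

print(local_search(np.zeros(5, dtype=int)))
```

In this sketch the sample size and acceptance rule are fixed; the paper's contribution is precisely in making those choices adaptive and statistically controlled, which is what yields the almost-supermartingale and exponential-absorption guarantees stated above.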
@article{2022raghunetal,
Year = {2022},
Author = {P. K. Ragavan and S. R. Hunter and R. Pasupathy and M. R. Taaffe},
Title = {Adaptive sampling line search for local stochastic optimization with integer variables},
journal = {Mathematical Programming},
volume = {196},
pages = {775--804},
doi = {10.1007/s10107-021-01667-6},
url_Link = {https://rdcu.be/ctZ5z},
url_Paper = {http://web.ics.purdue.edu/~hunter63/PAPERS/pre2021raghunetal.pdf},
abstract = {We consider optimization problems with an objective function that is estimable using a Monte Carlo oracle, constraint functions that are known deterministically through a constraint-satisfaction oracle, and integer decision variables. Seeking an appropriately defined local minimum, we propose an iterative adaptive sampling algorithm that, during each iteration, performs a local optimality test using an adaptive statistical procedure, followed by a line search executed along a stochastic descent direction. We prove a number of results. First, the true function values at the iterates generated by the algorithm form an almost-supermartingale process, and the iterates are absorbed with probability one into the set of local minima in finite time. Second, such absorption happens exponentially fast in iteration number and in oracle calls. This result is analogous to non-standard rate guarantees in stochastic continuous optimization contexts involving sharp minima. Third, the oracle complexity of the proposed algorithm increases linearly in the dimensionality of the local neighborhood. As a solver, primarily due to combining line searches that use common random numbers with statistical tests for local optimality, the proposed algorithm is effective on a variety of problems. We illustrate such performance using three problem suites, on problems ranging from 25 to 200 dimensions. },
keywords = {simulation optimization > single-objective > integer-ordered}}
{"_id":"uL3XS6zXdtQSA9pPS","bibbaseid":"ragavan-hunter-pasupathy-taaffe-adaptivesamplinglinesearchforlocalstochasticoptimizationwithintegervariables-2022","author_short":["Ragavan, P. K.","Hunter, S. R.","Pasupathy, R.","Taaffe, M. R."],"bibdata":{"bibtype":"article","type":"article","year":"2022","author":[{"firstnames":["P.","K."],"propositions":[],"lastnames":["Ragavan"],"suffixes":[]},{"firstnames":["S.","R."],"propositions":[],"lastnames":["Hunter"],"suffixes":[]},{"firstnames":["R."],"propositions":[],"lastnames":["Pasupathy"],"suffixes":[]},{"firstnames":["M.","R."],"propositions":[],"lastnames":["Taaffe"],"suffixes":[]}],"title":"Adaptive sampling line search for local stochastic optimization with integer variables","journal":"Mathematical Programming","volume":"196","pages":"775–804","doi":"10.1007/s10107-021-01667-6","url_link":"https://rdcu.be/ctZ5z","url_paper":"http://web.ics.purdue.edu/~hunter63/PAPERS/pre2021raghunetal.pdf","abstract":"We consider optimization problems with an objective function that is estimable using a Monte Carlo oracle, constraint functions that are known deterministically through a constraint-satisfaction oracle, and integer decision variables. Seeking an appropriately defined local minimum, we propose an iterative adaptive sampling algorithm that, during each iteration, performs a local optimality test using an adaptive statistical procedure, followed by a line search executed along a stochastic descent direction. We prove a number of results. First, the true function values at the iterates generated by the algorithm form an almost-supermartingale process, and the iterates are absorbed with probability one into the set of local minima in finite time. Second, such absorption happens exponentially fast in iteration number and in oracle calls. This result is analogous to non-standard rate guarantees in stochastic continuous optimization contexts involving sharp minima. Third, the oracle complexity of the proposed algorithm increases linearly in the dimensionality of the local neighborhood. As a solver, primarily due to combining line searches that use common random numbers with statistical tests for local optimality, the proposed algorithm is effective on a variety of problems. We illustrate such performance using three problem suites, on problems ranging from 25 to 200 dimensions. ","keywords":"simulation optimization > single-objective > integer-ordered","bibtex":"@article{2022raghunetal,\n\tYear = {2022},\n\tAuthor = {P. K. Ragavan and S. R. Hunter and R. Pasupathy and M. R. Taaffe},\n\tTitle = {Adaptive sampling line search for local stochastic optimization with integer variables},\n\tjournal = {Mathematical Programming},\n\tvolume = {196},\n\tpages = {775--804},\n\tdoi = {10.1007/s10107-021-01667-6}, \n\turl_Link = {https://rdcu.be/ctZ5z},\n\turl_Paper = {http://web.ics.purdue.edu/~hunter63/PAPERS/pre2021raghunetal.pdf},\n\tabstract = {We consider optimization problems with an objective function that is estimable using a Monte Carlo oracle, constraint functions that are known deterministically through a constraint-satisfaction oracle, and integer decision variables. Seeking an appropriately defined local minimum, we propose an iterative adaptive sampling algorithm that, during each iteration, performs a local optimality test using an adaptive statistical procedure, followed by a line search executed along a stochastic descent direction. We prove a number of results. 
First, the true function values at the iterates generated by the algorithm form an almost-supermartingale process, and the iterates are absorbed with probability one into the set of local minima in finite time. Second, such absorption happens exponentially fast in iteration number and in oracle calls. This result is analogous to non-standard rate guarantees in stochastic continuous optimization contexts involving sharp minima. Third, the oracle complexity of the proposed algorithm increases linearly in the dimensionality of the local neighborhood. As a solver, primarily due to combining line searches that use common random numbers with statistical tests for local optimality, the proposed algorithm is effective on a variety of problems. We illustrate such performance using three problem suites, on problems ranging from 25 to 200 dimensions. },\n\tkeywords = {simulation optimization > single-objective > integer-ordered}}\n\n","author_short":["Ragavan, P. K.","Hunter, S. R.","Pasupathy, R.","Taaffe, M. R."],"key":"2022raghunetal","id":"2022raghunetal","bibbaseid":"ragavan-hunter-pasupathy-taaffe-adaptivesamplinglinesearchforlocalstochasticoptimizationwithintegervariables-2022","role":"author","urls":{" link":"https://rdcu.be/ctZ5z"," paper":"http://web.ics.purdue.edu/~hunter63/PAPERS/pre2021raghunetal.pdf"},"keyword":["simulation optimization > single-objective > integer-ordered"],"metadata":{"authorlinks":{}},"downloads":20,"html":""},"bibtype":"article","biburl":"https://web.ics.purdue.edu/~hunter63/PAPERS/srhunterweb.bib","dataSources":["ZEwmdExPMCtzAbo22","PkcXzWbdqPvM6bmCx"],"keywords":["simulation optimization > single-objective > integer-ordered"],"search_terms":["adaptive","sampling","line","search","local","stochastic","optimization","integer","variables","ragavan","hunter","pasupathy","taaffe"],"title":"Adaptive sampling line search for local stochastic optimization with integer variables","year":2022,"downloads":20}