Pei, L., Nelson, B. L., & Hunter, S. R. Parallel Adaptive Survivor Selection. *Operations Research*, 2022. doi:10.1287/opre.2022.2343

@article{2022peinelhun,
	year = {2022},
	author = {L. Pei and B. L. Nelson and S. R. Hunter},
	title = {Parallel {A}daptive {S}urvivor {S}election},
	journal = {Operations Research},
	doi = {10.1287/opre.2022.2343},
	url_Paper = {http://web.ics.purdue.edu/~hunter63/PAPERS/2022peinelhun.pdf},
	abstract = {We reconsider the ranking \& selection (R\&S) problem in stochastic simulation optimization in light of high-performance, parallel computing, where we take ``R\&S'' to mean any procedure that simulates all systems (feasible solutions) to provide some statistical guarantee on the selected systems. We argue that when the number of systems is very large, and the parallel processing capability is also substantial, then neither the standard statistical guarantees such as probability of correct selection, nor the usual observation-saving methods such as elimination via paired comparisons or complex budget allocation, serve the experimenter well. As an alternative, we introduce a guarantee on the expected false elimination rate that avoids the curse of multiplicity, and a method to achieve it that is designed to scale computationally with problem size and parallel computing capacity. To facilitate this approach, we present a new mathematical representation, prove small-sample and asymptotic properties, evaluate variations of the method, and demonstrate a specific implementation on a problem with over 1,100,000 systems using only 20 to 80 parallel processors. Although we focus on inference about the best system here, our parallel adaptive survivor selection (PASS) framework supports many other useful definitions of ``good'' systems.},
	keywords = {simulation optimization > single-objective > ranking and selection > parallel}}
