Improving Automated Hyperparameter Optimization with Case-Based Reasoning. Hoffmann, M. & Bergmann, R. In Keane, M. T. & Wiratunga, N., editors, Case-Based Reasoning Research and Development - 30th International Conference, ICCBR 2022, Nancy, France, September 12-15, 2022, Proceedings, volume 13405 of Lecture Notes in Computer Science, pages 273–288, 2022. Springer. The hyperparameter configuration of machine learning models has a great influence on their performance. These hyperparameters are often set either manually, based on the experience of an expert, or by an Automated Hyperparameter Optimization (HPO) method. However, integrating experiential knowledge into HPO methods is challenging. Therefore, we propose HypOCBR (Hyperparameter Optimization with Case-Based Reasoning), an approach that uses Case-Based Reasoning (CBR) to improve the optimization of hyperparameters. HypOCBR is used as an addition to HPO methods and builds up a case base of sampled hyperparameter vectors with their loss values. Given a query vector, the case base is used to retrieve similar hyperparameter vectors and to decide whether to proceed with trialing this query or to abort and sample another vector. The experimental evaluation investigates the suitability of HypOCBR for two deep learning setups of varying complexity. It shows its potential to improve optimization results, especially in complex scenarios with limited optimization time.
@inproceedings{hoffmann_hyperparameters_2022,
title = {{Improving Automated Hyperparameter Optimization with Case-Based Reasoning}},
author = {Maximilian Hoffmann and Ralph Bergmann},
year = 2022,
booktitle = {Case-Based Reasoning Research and Development - 30th International Conference, {ICCBR} 2022, Nancy, France, September 12-15, 2022, Proceedings},
publisher = {Springer},
series = {Lecture Notes in Computer Science},
volume = 13405,
pages = {273--288},
doi = {10.1007/978-3-031-14923-8\_18},
url = {http://www.wi2.uni-trier.de/shared/publications/2022_ICCBR__Hyperparameter_Optimization_with_CBR.pdf},
editor = {Mark T. Keane and Nirmalie Wiratunga},
abstract = {The hyperparameter configuration of machine learning models has a great influence on their performance. These hyperparameters are often set either manually, based on the experience of an expert, or by an Automated Hyperparameter Optimization (HPO) method. However, integrating experiential knowledge into HPO methods is challenging. Therefore, we propose HypOCBR (Hyperparameter Optimization with Case-Based Reasoning), an approach that uses Case-Based Reasoning (CBR) to improve the optimization of hyperparameters. HypOCBR is used as an addition to HPO methods and builds up a case base of sampled hyperparameter vectors with their loss values. Given a query vector, the case base is used to retrieve similar hyperparameter vectors and to decide whether to proceed with trialing this query or to abort and sample another vector. The experimental evaluation investigates the suitability of HypOCBR for two deep learning setups of varying complexity. It shows its potential to improve optimization results, especially in complex scenarios with limited optimization time.}
}
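The abstract's core mechanism (a case base of sampled hyperparameter vectors with their losses, queried to decide whether a new candidate is worth trialing) can be illustrated with a minimal sketch. This is NOT the authors' implementation; the `CaseBase` class, the k-nearest-neighbour retrieval, and the mean-loss abort heuristic are all illustrative assumptions standing in for the paper's actual retrieval and decision procedure.

```python
import math
import random

def euclidean(a, b):
    """Distance between two hyperparameter vectors (assumed numeric)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class CaseBase:
    """Stores (hyperparameter vector, loss) cases and makes a
    proceed/abort decision for a query vector -- an illustrative
    stand-in for the CBR component described in the abstract."""

    def __init__(self, k=3):
        self.cases = []   # list of (vector, loss) pairs
        self.k = k        # number of neighbours to retrieve

    def add(self, vector, loss):
        self.cases.append((vector, loss))

    def should_trial(self, query):
        """Heuristic: proceed if the k nearest cases' mean loss is no
        worse than the mean loss over all cases seen so far."""
        if len(self.cases) < self.k:
            return True   # too little experience yet: always trial
        nearest = sorted(self.cases,
                         key=lambda c: euclidean(c[0], query))[: self.k]
        mean_near = sum(loss for _, loss in nearest) / self.k
        mean_all = sum(loss for _, loss in self.cases) / len(self.cases)
        return mean_near <= mean_all

# Toy usage: the "loss" of a trial is just the distance to a hidden
# optimum, standing in for an expensive training run.
random.seed(0)
optimum = (0.3, 0.7)
cb = CaseBase(k=3)
for _ in range(20):
    query = (random.random(), random.random())
    if cb.should_trial(query):            # retrieve & decide
        loss = euclidean(query, optimum)  # surrogate for training
        cb.add(query, loss)
print(len(cb.cases), "trials actually run out of 20 sampled")
```

The abort decision skips trials that fall in regions where past experience was poor, which matches the paper's stated goal of saving optimization time; the specific threshold rule used here is a guess for illustration only.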