
Stable Bayesian optimization

Nguyen, Thanh Dai, Gupta, Sunil, Rana, Santu and Venkatesh, Svetha 2017, Stable Bayesian optimization, in PAKDD 2017 : Advances in Knowledge Discovery and Data Mining : Proceedings of the 21st Pacific-Asia Conference, Springer International Publishing, Cham, Switzerland, pp. 578-591, doi: 10.1007/978-3-319-57529-2_45.


Title Stable Bayesian optimization
Author(s) Nguyen, Thanh Dai
Gupta, Sunil (ORCID: orcid.org/0000-0002-3308-1930)
Rana, Santu (ORCID: orcid.org/0000-0003-2247-850X)
Venkatesh, Svetha (ORCID: orcid.org/0000-0001-8675-6631)
Conference name Knowledge Discovery and Data Mining. Pacific-Asia Conference (21st : 2017 : Jeju, South Korea)
Conference location Jeju, South Korea
Conference dates 23-26 May 2017
Title of proceedings PAKDD 2017 : Advances in Knowledge Discovery and Data Mining : Proceedings of the 21st Pacific-Asia Conference
Editor(s) Kim, Jinho
Shim, Kyuseok
Cao, Longbing
Lee, Jae-Gil
Lin, Xuemin
Moon, Yang-Sae
Publication date 2017
Series Lecture Notes in Artificial Intelligence
Conference series Knowledge Discovery and Data Mining Pacific-Asia Conference
Start page 578
End page 591
Total pages 14
Publisher Springer International Publishing
Place of publication Cham, Switzerland
Summary Tuning the hyperparameters of machine learning models is important for their performance. Bayesian optimization has recently emerged as a de facto method for this task. Hyperparameter tuning is usually performed by evaluating model performance on a validation set, and Bayesian optimization is used to find the hyperparameter set corresponding to the best model performance. However, in many cases where the training or validation set has a limited number of data points, the function representing the model performance on the validation set contains several spurious sharp peaks. Bayesian optimization, in such cases, has a tendency to converge to these sharp peaks instead of other, more stable peaks. When a model trained using these hyperparameters is deployed in the real world, its performance suffers dramatically. We address this problem through a novel stable Bayesian optimization framework. We construct a new acquisition function that helps Bayesian optimization avoid convergence to the sharp peaks. We conduct a theoretical analysis and guarantee that Bayesian optimization using the proposed acquisition function prefers stable peaks over unstable ones. Experiments with synthetic function optimization and hyperparameter tuning for Support Vector Machines show the effectiveness of our proposed framework.
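
The mechanism described in the abstract can be sketched in code. The snippet below (Python with numpy, scipy and scikit-learn) is a minimal, illustrative take on a stability-aware Bayesian optimization loop, not the authors' actual method: a plain expected-improvement acquisition is replaced by one that averages EI over small random perturbations of each candidate, so a sharp, isolated peak of the validation surface scores lower than a broad, stable one. The name stable_acquisition, the perturbation radius, and the toy objective are all assumptions made for illustration.

import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expected_improvement(gp, X, y_best):
    # Standard EI for maximization: E[max(f(x) - y_best, 0)] under the GP posterior.
    mu, sigma = gp.predict(X, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - y_best) / sigma
    return (mu - y_best) * norm.cdf(z) + sigma * norm.pdf(z)

def stable_acquisition(gp, X, y_best, radius=0.05, n_perturb=20, rng=None):
    # Illustrative stand-in for the paper's acquisition function: average EI
    # over random perturbations of each candidate. A sharp, isolated peak has
    # high EI at a point but low EI nearby, so it scores worse than a broad peak.
    rng = np.random.default_rng() if rng is None else rng
    scores = np.empty(len(X))
    for i, x in enumerate(X):
        neighbours = np.clip(x + rng.uniform(-radius, radius, (n_perturb, x.size)), 0.0, 1.0)
        scores[i] = expected_improvement(gp, neighbours, y_best).mean()
    return scores

# Toy 1-D objective on [0, 1]: a sharp spurious peak near 0.2 and a broad,
# stable peak near 0.7, standing in for validation performance.
def f(x):
    return np.exp(-((x - 0.2) / 0.01) ** 2) + 0.9 * np.exp(-((x - 0.7) / 0.15) ** 2)

rng = np.random.default_rng(0)
X_obs = rng.uniform(0.0, 1.0, (5, 1))
y_obs = f(X_obs).ravel()
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-6, normalize_y=True)

for _ in range(15):
    gp.fit(X_obs, y_obs)
    candidates = rng.uniform(0.0, 1.0, (500, 1))
    acq = stable_acquisition(gp, candidates, y_obs.max(), rng=rng)
    x_next = candidates[np.argmax(acq)].reshape(1, -1)
    X_obs = np.vstack([X_obs, x_next])
    y_obs = np.append(y_obs, f(x_next).ravel())

print("best x found:", X_obs[np.argmax(y_obs)].item())

A neighbourhood-averaged acquisition is one simple way to encode a preference for stable optima; the paper's own construction and its theoretical guarantee are given in the full text (see the DOI below).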
ISBN 9783319575292
Language eng
DOI 10.1007/978-3-319-57529-2_45
Field of Research 080109 Pattern Recognition and Data Mining
Socio Economic Objective 0 Not Applicable
HERDC Research category E1 Full written paper - refereed
ERA Research output type E Conference publication
Copyright notice ©2017, Springer International Publishing
Persistent URL http://hdl.handle.net/10536/DRO/DU:30094582


Created: Thu, 25 May 2017, 17:34:28 EST
