Budgeted batch Bayesian optimization

Nguyen, Vu, Rana, Santu, Gupta, Sunil, Li, Cheng and Venkatesh, Svetha 2016, Budgeted batch Bayesian optimization, in ICDM 2016: Proceedings of the 16th IEEE International Conference on Data Mining, IEEE, Piscataway, N.J., pp. 1107-1112, doi: 10.1109/ICDM.2016.0144.

Title Budgeted batch Bayesian optimization
Author(s) Nguyen, Vu
Rana, Santu (ORCID: orcid.org/0000-0003-2247-850X)
Gupta, Sunil (ORCID: orcid.org/0000-0002-3308-1930)
Li, Cheng
Venkatesh, Svetha
Conference name IEEE Data Mining. International Conference (16th : 2016 : Barcelona, Spain)
Conference location Barcelona, Spain
Conference dates 12-15 Dec. 2016
Title of proceedings ICDM 2016: Proceedings of the 16th IEEE International Conference on Data Mining
Publication date 2016
Conference series IEEE Data Mining International Conference
Start page 1107
End page 1112
Total pages 6
Publisher IEEE
Place of publication Piscataway, N.J.
Keyword(s) batch Bayesian optimization
parallel global optimization
hyper-parameter tuning
experimental design
Summary Parameter settings profoundly impact the performance of machine learning algorithms and laboratory experiments. Classical trial-and-error methods are exponentially expensive in large parameter spaces, and Bayesian optimization (BO) offers an elegant alternative for global optimization of black-box functions. In situations where the function can be evaluated at multiple points simultaneously, batch Bayesian optimization is used. Current batch BO approaches are restrictive in fixing the number of evaluations per batch, which can be wasteful when the number of specified evaluations is larger than the number of real maxima in the underlying acquisition function. We present budgeted batch Bayesian optimization (B3O) for hyper-parameter tuning and experimental design - we identify the appropriate batch size for each iteration in an elegant way. In particular, we use the infinite Gaussian mixture model (IGMM) to automatically identify the number of peaks in the underlying acquisition function. We solve the intractability of estimating the IGMM directly from the acquisition function by formulating batch generalized slice sampling to efficiently draw samples from the acquisition function. We perform extensive experiments on benchmark functions and two real-world applications - machine learning hyper-parameter tuning and experimental design for alloy hardening. We show empirically that the proposed B3O outperforms existing fixed-batch BO approaches in finding the optimum while requiring fewer evaluations, thus saving cost and time.
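The core idea summarized above - draw samples from the acquisition function, group them into clusters, and let the number of clusters determine the batch size - can be sketched in a few lines. This is a hedged illustration, not the authors' implementation: the toy two-peak acquisition function, rejection sampling, and gap-based clustering below are stand-ins for the paper's actual acquisition function, generalized slice sampling, and IGMM.

```python
import numpy as np

def acquisition(x):
    # Toy 1-D acquisition with two peaks at x = 1 and x = 3 (illustrative only).
    return np.exp(-(x - 1.0) ** 2 / 0.05) + 0.8 * np.exp(-(x - 3.0) ** 2 / 0.05)

def sample_acquisition(n, lo=0.0, hi=4.0, seed=0):
    # Rejection sampling as a simple stand-in for the paper's batch
    # generalized slice sampling: accept x with probability acq(x) / max(acq).
    rng = np.random.default_rng(seed)
    xs = rng.uniform(lo, hi, size=20 * n)
    accept = rng.uniform(0.0, 1.0, size=xs.size) < acquisition(xs)  # max(acq) ~= 1
    return xs[accept][:n]

def cluster_peaks(samples, gap=0.5):
    # Split the sorted samples wherever consecutive points are separated by
    # more than `gap`; each resulting cluster marks one acquisition peak.
    # In B3O this role is played by the IGMM, which infers the number of
    # clusters automatically rather than via a fixed gap threshold.
    xs = np.sort(samples)
    groups = [[xs[0]]]
    for a, b in zip(xs, xs[1:]):
        if b - a > gap:
            groups.append([])
        groups[-1].append(b)
    return [float(np.mean(g)) for g in groups]

# The batch size adapts to the acquisition landscape: one evaluation per peak.
batch = cluster_peaks(sample_acquisition(200))
print(len(batch), [round(b, 2) for b in batch])
```

With a two-peak acquisition function the sketch proposes a batch of two points near the peak centres, whereas a fixed-batch method would spend its full quota of evaluations regardless of how many maxima the acquisition function actually has.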
Notes DOI updated from 10.1109/ICDM.2016.52 (on paper) to 10.1109/ICDM.2016.0144 (on website)
ISBN 9781509054725
ISSN 1550-4786
Language eng
DOI 10.1109/ICDM.2016.0144
Field of Research 080109 Pattern Recognition and Data Mining
Socio Economic Objective 0 Not Applicable
HERDC Research category E1 Full written paper - refereed
ERA Research output type E Conference publication
Copyright notice ©2016, IEEE
Persistent URL http://hdl.handle.net/10536/DRO/DU:30092000


