File(s) under permanent embargo
Filtering Bayesian optimization approach in weakly specified search space
journal contribution
posted on 2019-07-01, 00:00, authored by Tien Vu Nguyen, Sunil Gupta, Santu Rana, Cheng Li, Svetha Venkatesh

Bayesian optimization (BO) has recently emerged as a powerful and flexible tool for hyper-parameter tuning and, more generally, for the efficient global optimization of expensive black-box functions. Systems implementing BO have successfully solved difficult problems in automatic design choices and machine learning hyper-parameter tuning. Many recent advances in the methodologies and theories underlying Bayesian optimization have extended the framework to new applications and provided greater insight into the behavior of these algorithms. Still, these established techniques always require a user-defined space in which to perform the optimization; this pre-defined space specifies the ranges of hyper-parameter values. In many situations, however, it can be difficult to prescribe such a space, as prior knowledge is often unavailable. Setting these regions arbitrarily can lead to inefficient optimization: if a space is too large, we can miss the optimum with a limited budget; if a space is too small, it may not contain the optimum point at all. Since the fully unknown search space problem is intractable in practice, in this paper we narrow down to the setting of a "weakly specified" search space for Bayesian optimization. By a weakly specified space, we mean that the pre-defined space is placed in a sufficiently good region so that the optimization can expand and reach the optimum; however, this pre-defined space need not include the global optimum. We tackle this problem by proposing a filtering expansion strategy for Bayesian optimization. Our approach starts from the initial region and gradually expands the search space. We develop an efficient algorithm for this strategy and derive its regret bound. These theoretical results are complemented by an extensive set of experiments on benchmark functions and two real-world applications, which demonstrate the benefits of our proposed approach.
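The expansion idea described in the abstract can be sketched as a minimal one-dimensional GP-UCB loop whose bounds grow whenever the acquisition maximizer lands on the current boundary. Everything below is an illustrative assumption rather than the paper's actual filtering algorithm: the objective function, kernel length-scale, UCB coefficient, grid size, and fixed expansion step are all hypothetical choices, and the paper's filtering criterion is not reproduced.

```python
import numpy as np

# Hypothetical 1D objective for illustration: its maximiser (x = 2.5)
# lies OUTSIDE the weakly specified initial space [-1, 1].
def objective(x):
    return -(x - 2.5) ** 2

def rbf_kernel(a, b, length_scale=0.5):
    """Squared-exponential kernel between 1D point sets."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length_scale) ** 2)

def gp_posterior(X, y, X_star, noise=1e-6):
    """GP posterior mean and variance at query points X_star."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    K_s = rbf_kernel(X, X_star)
    alpha = np.linalg.solve(K, y)
    mu = K_s.T @ alpha
    v = np.linalg.solve(K, K_s)
    var = 1.0 - np.sum(K_s * v, axis=0)  # k(x, x) = 1 for this kernel
    return mu, np.maximum(var, 1e-12)

def expanding_bo(lo=-1.0, hi=1.0, n_iter=25, step=0.3, beta=2.0, seed=0):
    """GP-UCB that widens [lo, hi] whenever the acquisition maximiser
    sits on the current boundary (a stand-in for gradual expansion)."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, 3)       # initial design inside the weak space
    y = objective(X)
    for _ in range(n_iter):
        grid = np.linspace(lo, hi, 400)
        mu, var = gp_posterior(X, y, grid)
        x_next = grid[np.argmax(mu + beta * np.sqrt(var))]
        X = np.append(X, x_next)
        y = np.append(y, objective(x_next))
        # Gradual expansion: grow the space past whichever boundary
        # the optimizer is pushing against.
        if np.isclose(x_next, hi):
            hi += step
        if np.isclose(x_next, lo):
            lo -= step
    best = np.argmax(y)
    return X[best], y[best]
```

Because the GP mean reverts to the prior beyond the observed data while the posterior variance stays high there, the UCB score tends to peak at an unexplored boundary, so the loop naturally pushes past the initial region toward the external optimum; without the expansion step, the same loop would be confined to the mis-specified box.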
History
Journal: Knowledge and Information Systems
Pagination: 1-29
Publisher: Springer
Location: London, Eng.
Publisher DOI:
ISSN: 0219-1377
Language: English
Grant ID: ARC Australian Laureate Fellowship (FL170100006)
Notes: In press
Publication classification: C Journal article; C1 Refereed article in a scholarly journal
Copyright notice: 2018, Springer-Verlag London Ltd., part of Springer Nature
Keywords: Bayesian optimization; unknown search space; hyper-parameter tuning; experimental design; Science & Technology; Technology; Computer Science, Artificial Intelligence; Computer Science, Information Systems; Computer Science; GAUSSIAN-PROCESSES; Information Systems; Artificial Intelligence and Image Processing; Distributed Computing