Bayesian optimisation is an efficient technique for optimising functions that are expensive to evaluate. In this paper, we propose a novel framework that transfers knowledge from a completed source optimisation task to a new target task in order to overcome the cold-start problem. We model source data as noisy observations of the target function, and the noise level is estimated from the data in a Bayesian setting. This enables flexible knowledge transfer across tasks with differing relatedness, addressing a limitation of existing methods. We evaluate our method on the task of tuning the hyperparameters of two machine learning algorithms. Treating a fraction of the full training data as the source task and the full data as the target task, we show that our method finds the best hyperparameters in less time than both the state-of-the-art and the no-transfer method.
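The sketch below (not the authors' implementation) illustrates the core idea stated in the abstract: source-task observations are folded into a Gaussian process model of the target function, but with an extra noise variance that is estimated from the data by maximising the marginal likelihood, so weakly related source data is automatically down-weighted. The kernel, the 1-D toy tasks, and all function names are illustrative assumptions, not taken from the paper.

# Minimal sketch, assuming an RBF-kernel GP and a single scalar "source noise"
# hyperparameter estimated by maximising the log marginal likelihood.
import numpy as np
from scipy.optimize import minimize_scalar

def rbf_kernel(A, B, lengthscale=0.3, variance=1.0):
    # Squared-exponential kernel between 1-D input arrays A and B.
    d2 = (A[:, None] - B[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def neg_log_marginal_likelihood(log_source_noise, X, y, is_source,
                                target_noise=1e-3):
    # GP negative log marginal likelihood (up to a constant) where source
    # points carry an additional noise variance on the diagonal.
    source_noise = np.exp(log_source_noise)
    noise = np.where(is_source, target_noise + source_noise, target_noise)
    K = rbf_kernel(X, X) + np.diag(noise)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * y @ alpha + np.sum(np.log(np.diag(L)))

# Toy example (assumption): the source task is a shifted version of the target.
rng = np.random.default_rng(0)
target_f = lambda x: np.sin(3 * x)
source_f = lambda x: np.sin(3 * x + 0.4)        # related but not identical

X_src = rng.uniform(0, 2, 15)                   # completed source run
y_src = source_f(X_src)
X_tgt = rng.uniform(0, 2, 3)                    # few cold-start target observations
y_tgt = target_f(X_tgt)

X = np.concatenate([X_src, X_tgt])
y = np.concatenate([y_src, y_tgt])
is_source = np.concatenate([np.ones_like(X_src, dtype=bool),
                            np.zeros_like(X_tgt, dtype=bool)])

# Estimate how "noisy" the source data looks when treated as target observations:
# a large value indicates weakly related tasks, so the source is down-weighted.
res = minimize_scalar(neg_log_marginal_likelihood, bounds=(-10, 5),
                      args=(X, y, is_source), method="bounded")
print("estimated source noise variance:", np.exp(res.x))

In a full Bayesian optimisation loop, the GP fitted with this estimated source noise would then drive an acquisition function over the target task; that loop is omitted here for brevity.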
Volume
9651
Chapter number
9
Pagination
102-114
ISSN
0302-9743
ISBN-13
9783319317533
Language
eng
Publication classification
B Book chapter, B1 Book chapter
Copyright notice
2016, Springer
Extent
47
Editor/Contributor(s)
Bailey J, Khan L, Washio T, Dobbie G, Huang JZ, Wang R
Publisher
Springer
Place of publication
Berlin, Germany
Title of book
Advances in knowledge discovery and data mining: 20th Pacific-Asia Conference, PAKDD 2016, Auckland, New Zealand, April 19-22, 2016, proceedings, part I