File(s) under permanent embargo
Flexible transfer learning framework for Bayesian optimisation
chapter
posted on 2016-04-12, 00:00, authored by T Joy, Santu Rana, Sunil Gupta, Svetha Venkatesh

Bayesian optimisation is an efficient technique for optimising functions that are expensive to compute. In this paper, we propose a novel framework to transfer knowledge from a completed source optimisation task to a new target task in order to overcome the cold-start problem. We model source data as noisy observations of the target function. The level of noise is computed from the data in a Bayesian setting. This enables flexible knowledge transfer across tasks with differing relatedness, addressing a limitation of existing methods. We evaluate our framework on the task of tuning the hyperparameters of two machine learning algorithms. Treating a fraction of the training data as the source task and the whole as the target task, we show that our method finds the best hyperparameters in the least amount of time compared with both the state-of-the-art and the no-transfer method.
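The core idea in the abstract — treating completed source-task observations as extra-noisy observations of the target function inside a Gaussian-process model — can be sketched as follows. This is an illustrative, minimal sketch only, assuming an RBF kernel, a fixed source-noise level (the paper infers this level from data in a Bayesian setting), and a GP-UCB acquisition function over a 1-D grid; the example functions `f` and `g` and all settings are hypothetical, not the paper's implementation.

```python
import numpy as np

def rbf(a, b, ls=0.3):
    # Squared-exponential kernel between two 1-D point sets.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(X, y, noise_var, Xq, ls=0.3):
    """GP posterior mean/variance with a per-point noise variance.

    Source points carry a larger noise variance than target points,
    which is how source data is folded in as 'noisy' target data.
    """
    K = rbf(X, X, ls) + np.diag(noise_var) + 1e-9 * np.eye(len(X))
    Kq = rbf(X, Xq, ls)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Kq.T @ alpha
    v = np.linalg.solve(L, Kq)
    var = 1.0 - np.sum(v * v, axis=0)  # prior variance is 1 for RBF
    return mu, np.maximum(var, 1e-12)

# Hypothetical target task and a related (slightly shifted) source task.
f = lambda x: -(x - 0.6) ** 2          # target function, expensive in practice
g = lambda x: -(x - 0.5) ** 2          # completed source task

rng = np.random.default_rng(0)
X_src = rng.uniform(0.0, 1.0, 8)       # observations left over from the source task
y_src = g(X_src)
X_tgt = np.array([0.1])                # single cold-start target observation
y_tgt = f(X_tgt)

sigma_src = 0.05    # inflated noise for source points (fixed here; inferred in the paper)
sigma_tgt = 1e-4    # near-noiseless target observations

for _ in range(10):
    X = np.concatenate([X_src, X_tgt])
    y = np.concatenate([y_src, y_tgt])
    noise = np.concatenate([np.full(X_src.size, sigma_src),
                            np.full(X_tgt.size, sigma_tgt)])
    grid = np.linspace(0.0, 1.0, 200)
    mu, var = gp_posterior(X, y, noise, grid)
    ucb = mu + 2.0 * np.sqrt(var)      # GP-UCB acquisition
    x_next = grid[np.argmax(ucb)]
    X_tgt = np.append(X_tgt, x_next)
    y_tgt = np.append(y_tgt, f(x_next))

best = X_tgt[np.argmax(f(X_tgt))]      # best target query found so far
```

Because the source points enter the model with inflated noise, they shape the early posterior (avoiding the cold start) without being trusted as exact target values; as target observations accumulate, they dominate the posterior near the optimum.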
History
Title of book: Advances in knowledge discovery and data mining: 20th Pacific-Asia Conference, PAKDD 2016, Auckland, New Zealand, April 19-22, 2016, proceedings, part I
Volume: 9651
Series: Lecture notes in artificial intelligence; v. 9651
Chapter number: 9
Pagination: 102-114
Publisher: Springer
Place of publication: Berlin, Germany
Publisher DOI:
ISSN: 0302-9743
ISBN-13: 9783319317533
Language: eng
Publication classification: B Book chapter; B1 Book chapter
Copyright notice: 2016, Springer
Extent: 47
Editor/Contributor(s): J Bailey, L Khan, T Washio, G Dobbie, J Huang, R Wang