Deakin University

File(s) under permanent embargo

Process-constrained batch Bayesian optimisation

Prevailing batch Bayesian optimisation (BO) methods allow all control variables to be freely altered at each iteration. Real-world experiments, however, often have physical limitations that make it time-consuming to alter all settings for each recommendation in a batch. This gives rise to a unique problem in BO: within a recommended batch, a set of variables that are expensive to change experimentally needs to be fixed, while the remaining control variables can be varied. We formulate this as a process-constrained batch Bayesian optimisation problem. We propose two algorithms, pc-BO(basic) and pc-BO(nested). pc-BO(basic) is simpler but lacks a convergence guarantee. In contrast, pc-BO(nested) is slightly more complex but admits a convergence analysis. We show that the regret of pc-BO(nested) is sublinear. We demonstrate the performance of both pc-BO(basic) and pc-BO(nested) by optimising benchmark test functions, tuning hyper-parameters of an SVM classifier, optimising the heat-treatment process for an Al-Sc alloy to achieve a target hardness, and optimising the short polymer fibre production process.
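To illustrate the process-constrained batch setting described in the abstract, the following minimal Python sketch recommends batches in which one expensive-to-change variable is frozen at a single value while the remaining variable varies freely. It is not the authors' pc-BO(basic) or pc-BO(nested) procedure; the objective function, bounds, UCB acquisition, and use of scikit-learn's Gaussian process regressor are all toy assumptions chosen only to show the constraint structure of a batch.

    # Sketch of process-constrained batch recommendation (assumed toy setup,
    # not the paper's pc-BO code). Every point in a batch shares the same value
    # of the constrained variable x[0]; x[1] varies freely within the batch.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern

    rng = np.random.default_rng(0)

    def objective(x):
        # Toy 2-D objective: x[0] is the expensive-to-change setting
        # (e.g. oven temperature), x[1] the cheap one (e.g. heating time).
        return -((x[0] - 0.3) ** 2 + (x[1] - 0.7) ** 2)

    def ucb(gp, X, kappa=2.0):
        # Upper confidence bound acquisition from the GP posterior.
        mu, sigma = gp.predict(X, return_std=True)
        return mu + kappa * sigma

    def recommend_batch(gp, batch_size=3, n_candidates=2000):
        # Step 1: choose one value for the constrained variable by maximising
        # UCB over random candidates, then freeze it for the whole batch.
        cand = rng.uniform(0.0, 1.0, size=(n_candidates, 2))
        x_c = cand[np.argmax(ucb(gp, cand)), 0]
        # Step 2: with x[0] fixed, pick the unconstrained coordinates
        # (here simply the top-UCB candidates; the paper uses a more
        # principled nested/batch construction).
        cand_fixed = np.column_stack([np.full(n_candidates, x_c),
                                      rng.uniform(0.0, 1.0, n_candidates)])
        idx = np.argsort(ucb(gp, cand_fixed))[-batch_size:]
        return cand_fixed[idx]

    # Initial design and a short optimisation loop.
    X = rng.uniform(0.0, 1.0, size=(5, 2))
    y = np.array([objective(x) for x in X])
    for _ in range(10):
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5),
                                      normalize_y=True).fit(X, y)
        batch = recommend_batch(gp)  # all rows share the same x[0]
        y_new = np.array([objective(x) for x in batch])
        X, y = np.vstack([X, batch]), np.concatenate([y, y_new])

    print("Best value found:", y.max())

The sketch only conveys the constraint that a batch fixes the expensive variables and varies the cheap ones; the convergence properties proved for pc-BO(nested) do not apply to this simplified greedy construction.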

History

Event

Neural Information Processing Systems. Conference (2017 : 31st : Long Beach, California)

Volume

2017-December

Series

Advances in Neural Information Processing Systems

Pagination

3415-3424

Publisher

Neural Information Processing Systems Foundation

Location

Long Beach, California

Place of publication

[Long Beach, Calif.]

Start date

2017-12-04

End date

2017-12-09

ISSN

1049-5258

Language

eng

Publication classification

E1 Full written paper - refereed

Title of proceedings

NIPS 2017 : Proceedings of the 31st Conference on Neural Information Processing Systems
