
Electronic data

  • 21-0120

    Final published version, 2.15 MB, PDF document

    Available under license: CC BY


GIBBON: General-purpose Information-Based Bayesian Optimisation

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

GIBBON: General-purpose Information-Based Bayesian Optimisation. / Moss, Henry B.; Leslie, David S.; Gonzalez, Javier et al.
In: Journal of Machine Learning Research, Vol. 22, No. 235, 08.10.2021, p. 1-49.

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Harvard

Moss, HB, Leslie, DS, Gonzalez, J & Rayson, P 2021, 'GIBBON: General-purpose Information-Based Bayesian Optimisation', Journal of Machine Learning Research, vol. 22, no. 235, pp. 1-49. <http://jmlr.org/papers/v22/21-0120.html>

APA

Moss, H. B., Leslie, D. S., Gonzalez, J., & Rayson, P. (2021). GIBBON: General-purpose Information-Based Bayesian Optimisation. Journal of Machine Learning Research, 22(235), 1-49. http://jmlr.org/papers/v22/21-0120.html

Vancouver

Moss HB, Leslie DS, Gonzalez J, Rayson P. GIBBON: General-purpose Information-Based Bayesian Optimisation. Journal of Machine Learning Research. 2021 Oct 8;22(235):1-49.

Author

Moss, Henry B. ; Leslie, David S. ; Gonzalez, Javier et al. / GIBBON: General-purpose Information-Based Bayesian Optimisation. In: Journal of Machine Learning Research. 2021 ; Vol. 22, No. 235. pp. 1-49.

Bibtex

@article{30a4134b51ad490dbdf422f93d5f1e46,
title = "GIBBON: General-purpose Information-Based Bayesian Optimisation",
abstract = "This paper describes a general-purpose extension of max-value entropy search, a popular approach for Bayesian Optimisation (BO). A novel approximation is proposed for the information gain -- an information-theoretic quantity central to solving a range of BO problems, including noisy, multi-fidelity and batch optimisations across both continuous and highly-structured discrete spaces. Previously, these problems have been tackled separately within information-theoretic BO, each requiring a different sophisticated approximation scheme, except for batch BO, for which no computationally-lightweight information-theoretic approach has previously been proposed. GIBBON (General-purpose Information-Based Bayesian OptimisatioN) provides a single principled framework suitable for all the above, out-performing existing approaches whilst incurring substantially lower computational overheads. In addition, GIBBON does not require the problem's search space to be Euclidean and so is the first high-performance yet computationally light-weight acquisition function that supports batch BO over general highly structured input spaces like molecular search and gene design. Moreover, our principled derivation of GIBBON yields a natural interpretation of a popular batch BO heuristic based on determinantal point processes. Finally, we analyse GIBBON across a suite of synthetic benchmark tasks, a molecular search loop, and as part of a challenging batch multi-fidelity framework for problems with controllable experimental noise.",
author = "Moss, {Henry B.} and Leslie, {David S.} and Javier Gonzalez and Paul Rayson",
year = "2021",
month = oct,
day = "8",
language = "English",
volume = "22",
pages = "1--49",
journal = "Journal of Machine Learning Research",
issn = "1532-4435",
publisher = "Microtome Publishing",
number = "235",

}

RIS

TY - JOUR

T1 - GIBBON: General-purpose Information-Based Bayesian Optimisation

AU - Moss, Henry B.

AU - Leslie, David S.

AU - Gonzalez, Javier

AU - Rayson, Paul

PY - 2021/10/8

Y1 - 2021/10/8

N2 - This paper describes a general-purpose extension of max-value entropy search, a popular approach for Bayesian Optimisation (BO). A novel approximation is proposed for the information gain -- an information-theoretic quantity central to solving a range of BO problems, including noisy, multi-fidelity and batch optimisations across both continuous and highly-structured discrete spaces. Previously, these problems have been tackled separately within information-theoretic BO, each requiring a different sophisticated approximation scheme, except for batch BO, for which no computationally-lightweight information-theoretic approach has previously been proposed. GIBBON (General-purpose Information-Based Bayesian OptimisatioN) provides a single principled framework suitable for all the above, out-performing existing approaches whilst incurring substantially lower computational overheads. In addition, GIBBON does not require the problem's search space to be Euclidean and so is the first high-performance yet computationally light-weight acquisition function that supports batch BO over general highly structured input spaces like molecular search and gene design. Moreover, our principled derivation of GIBBON yields a natural interpretation of a popular batch BO heuristic based on determinantal point processes. Finally, we analyse GIBBON across a suite of synthetic benchmark tasks, a molecular search loop, and as part of a challenging batch multi-fidelity framework for problems with controllable experimental noise.

AB - This paper describes a general-purpose extension of max-value entropy search, a popular approach for Bayesian Optimisation (BO). A novel approximation is proposed for the information gain -- an information-theoretic quantity central to solving a range of BO problems, including noisy, multi-fidelity and batch optimisations across both continuous and highly-structured discrete spaces. Previously, these problems have been tackled separately within information-theoretic BO, each requiring a different sophisticated approximation scheme, except for batch BO, for which no computationally-lightweight information-theoretic approach has previously been proposed. GIBBON (General-purpose Information-Based Bayesian OptimisatioN) provides a single principled framework suitable for all the above, out-performing existing approaches whilst incurring substantially lower computational overheads. In addition, GIBBON does not require the problem's search space to be Euclidean and so is the first high-performance yet computationally light-weight acquisition function that supports batch BO over general highly structured input spaces like molecular search and gene design. Moreover, our principled derivation of GIBBON yields a natural interpretation of a popular batch BO heuristic based on determinantal point processes. Finally, we analyse GIBBON across a suite of synthetic benchmark tasks, a molecular search loop, and as part of a challenging batch multi-fidelity framework for problems with controllable experimental noise.

M3 - Journal article

VL - 22

SP - 1

EP - 49

JO - Journal of Machine Learning Research

JF - Journal of Machine Learning Research

SN - 1532-4435

IS - 235

UR - http://jmlr.org/papers/v22/21-0120.html

ER -
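
Example usage

The GIBBON acquisition function described in the abstract is available in the open-source BoTorch library, whose documentation presents qLowerBoundMaxValueEntropy as its GIBBON implementation. The sketch below shows one way to run a small batch Bayesian optimisation loop with it. It assumes a recent BoTorch/GPyTorch installation; the toy objective, search bounds, batch size, and all other hyperparameter values are illustrative assumptions, not settings from the paper.

# Minimal sketch of batch BO with GIBBON via BoTorch's qLowerBoundMaxValueEntropy.
# The toy objective, bounds and hyperparameters are illustrative assumptions only.
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from gpytorch.mlls import ExactMarginalLogLikelihood
from botorch.acquisition.max_value_entropy_search import qLowerBoundMaxValueEntropy
from botorch.optim import optimize_acqf

torch.set_default_dtype(torch.double)  # BoTorch models prefer double precision


def objective(x: torch.Tensor) -> torch.Tensor:
    """Toy 2-d black-box objective to maximise (stand-in for a real experiment)."""
    return -(x ** 2).sum(dim=-1, keepdim=True)


bounds = torch.tensor([[-1.0, -1.0], [1.0, 1.0]])  # 2 x d search box

# Initial design: a handful of uniformly random evaluations.
train_x = bounds[0] + (bounds[1] - bounds[0]) * torch.rand(5, 2)
train_y = objective(train_x)

for _ in range(10):  # outer BO loop
    # Fit a GP surrogate to all data observed so far.
    model = SingleTaskGP(train_x, train_y)
    fit_gpytorch_mll(ExactMarginalLogLikelihood(model.likelihood, model))

    # GIBBON samples the objective's maximum value over a discrete candidate set.
    candidate_set = bounds[0] + (bounds[1] - bounds[0]) * torch.rand(1000, 2)
    acqf = qLowerBoundMaxValueEntropy(model, candidate_set=candidate_set)

    # Build a batch of q = 3 points by greedy sequential maximisation of the
    # acquisition function (the computationally lightweight batch setting GIBBON targets).
    batch, _ = optimize_acqf(
        acqf, bounds=bounds, q=3, num_restarts=5, raw_samples=128,
        sequential=True,
    )

    # Evaluate the black box at the new batch and append the observations.
    train_x = torch.cat([train_x, batch])
    train_y = torch.cat([train_y, objective(batch)])

print("best value found:", train_y.max().item())

The greedy sequential batch construction above mirrors the abstract's emphasis on low overhead: each of the q points is chosen by maximising the acquisition with previously selected points held as pending, rather than by a joint q-dimensional optimisation.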