
BOSH: Bayesian Optimization by Sampling Hierarchically

Research output: Contribution to conference - Without ISBN/ISSN Conference paper

Published

Standard

BOSH: Bayesian Optimization by Sampling Hierarchically. / Moss, Henry B.; Leslie, David S.; Rayson, Paul.
2020. Paper presented at Workshop on Real World Experiment Design and Active Learning at ICML 2020.

Harvard

Moss, HB, Leslie, DS & Rayson, P 2020, 'BOSH: Bayesian Optimization by Sampling Hierarchically', Paper presented at Workshop on Real World Experiment Design and Active Learning at ICML 2020, 13/07/20 - 18/07/20. <https://arxiv.org/abs/2007.00939>

APA

Moss, H. B., Leslie, D. S., & Rayson, P. (2020). BOSH: Bayesian Optimization by Sampling Hierarchically. Paper presented at Workshop on Real World Experiment Design and Active Learning at ICML 2020. https://arxiv.org/abs/2007.00939

Vancouver

Moss HB, Leslie DS, Rayson P. BOSH: Bayesian Optimization by Sampling Hierarchically. 2020. Paper presented at Workshop on Real World Experiment Design and Active Learning at ICML 2020.

Author

Moss, Henry B.; Leslie, David S.; Rayson, Paul. / BOSH: Bayesian Optimization by Sampling Hierarchically. Paper presented at Workshop on Real World Experiment Design and Active Learning at ICML 2020. 8 p.

Bibtex

@conference{590f6d5646e848b39432f27e8492aa41,
title = "BOSH: Bayesian Optimization by Sampling Hierarchically",
abstract = "Deployments of Bayesian Optimization (BO) for functions with stochastic evaluations, such as parameter tuning via cross validation and simulation optimization, typically optimize an average of a fixed set of noisy realizations of the objective function. However, disregarding the true objective function in this manner finds a high-precision optimum of the wrong function. To solve this problem, we propose Bayesian Optimization by Sampling Hierarchically (BOSH), a novel BO routine pairing a hierarchical Gaussian process with an information-theoretic framework to generate a growing pool of realizations as the optimization progresses. We demonstrate that BOSH provides more efficient and higher-precision optimization than standard BO across synthetic benchmarks, simulation optimization, reinforcement learning and hyper-parameter tuning tasks.",
keywords = "cs.LG, stat.ML",
author = "Moss, {Henry B.} and Leslie, {David S.} and Paul Rayson",
year = "2020",
month = jul,
day = "18",
language = "English",
note = "Workshop on Real World Experiment Design and Active Learning at ICML 2020 ; Conference date: 13-07-2020 Through 18-07-2020",
url = "https://realworldml.github.io/"
}

RIS

TY - CONF

T1 - BOSH: Bayesian Optimization by Sampling Hierarchically

T2 - Workshop on Real World Experiment Design and Active Learning at ICML 2020

AU - Moss, Henry B.

AU - Leslie, David S.

AU - Rayson, Paul

PY - 2020/7/18

Y1 - 2020/7/18

N2 - Deployments of Bayesian Optimization (BO) for functions with stochastic evaluations, such as parameter tuning via cross validation and simulation optimization, typically optimize an average of a fixed set of noisy realizations of the objective function. However, disregarding the true objective function in this manner finds a high-precision optimum of the wrong function. To solve this problem, we propose Bayesian Optimization by Sampling Hierarchically (BOSH), a novel BO routine pairing a hierarchical Gaussian process with an information-theoretic framework to generate a growing pool of realizations as the optimization progresses. We demonstrate that BOSH provides more efficient and higher-precision optimization than standard BO across synthetic benchmarks, simulation optimization, reinforcement learning and hyper-parameter tuning tasks.

AB - Deployments of Bayesian Optimization (BO) for functions with stochastic evaluations, such as parameter tuning via cross validation and simulation optimization, typically optimize an average of a fixed set of noisy realizations of the objective function. However, disregarding the true objective function in this manner finds a high-precision optimum of the wrong function. To solve this problem, we propose Bayesian Optimization by Sampling Hierarchically (BOSH), a novel BO routine pairing a hierarchical Gaussian process with an information-theoretic framework to generate a growing pool of realizations as the optimization progresses. We demonstrate that BOSH provides more efficient and higher-precision optimization than standard BO across synthetic benchmarks, simulation optimization, reinforcement learning and hyper-parameter tuning tasks.

KW - cs.LG

KW - stat.ML

M3 - Conference paper

Y2 - 13 July 2020 through 18 July 2020

ER -
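The abstract contrasts two ways of handling a stochastic objective: standard BO optimizes an average over a fixed set of noisy realizations (and so converges to a high-precision optimum of the wrong function), whereas BOSH grows the pool of realizations as the optimization progresses. A minimal Python sketch of that contrast, using a toy quadratic objective, grid search standing in for the acquisition step, and a simple schedule standing in for the paper's hierarchical Gaussian process and information-theoretic criterion (the objective, seeds, and schedule are all illustrative assumptions, not the authors' experiments):

```python
import random

def noisy_objective(x, seed):
    """One stochastic realization of the true objective f(x) = x**2
    (a toy stand-in; the paper's benchmarks are not reproduced here)."""
    rng = random.Random(seed)
    return x ** 2 + rng.gauss(0.0, 0.1)

# Standard approach: average over a FIXED set of seeds. This gives a
# high-precision estimate of the seed-averaged function, which is not
# the true objective.
FIXED_SEEDS = [0, 1, 2]

def fixed_pool_estimate(x):
    return sum(noisy_objective(x, s) for s in FIXED_SEEDS) / len(FIXED_SEEDS)

# BOSH-style idea (sketch only): let the pool of realizations GROW, so
# the averaged estimate converges toward the true objective over time.
def growing_pool_estimate(x, pool):
    return sum(noisy_objective(x, s) for s in pool) / len(pool)

# Toy optimization loop: grid search replaces the BO acquisition step,
# and a fixed one-seed-per-step schedule replaces BOSH's
# information-theoretic decision of when to add a realization.
pool = [0]
candidates = [i / 10 for i in range(-10, 11)]
for step in range(1, 6):
    best = min(candidates, key=lambda x: growing_pool_estimate(x, pool))
    pool.append(step)  # grow the pool of realizations

print(f"best x ≈ {best}, pool size = {len(pool)}")
```

With the growing pool, the incumbent `best` is chosen against an increasingly faithful estimate of the true objective; in the fixed-pool variant, extra evaluations only sharpen the optimum of the same three-seed average.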