Research output: Thesis › Doctoral Thesis
TY - BOOK
T1 - General-purpose Information-theoretical Bayesian Optimisation
T2 - A thesis by acronyms
AU - Moss, Henry
PY - 2021
Y1 - 2021
N2 - Bayesian optimisation (BO) is an increasingly popular strategy for optimising functions with substantial query costs. By sequentially focusing evaluation resources into promising areas of the search space, BO is able to find reasonable solutions within heavily restricted evaluation budgets. Consequently, BO has become the de facto approach for fine-tuning the hyper-parameters of machine learning models and has had numerous successful applications in industry and across the experimental sciences. This thesis seeks to increase the scope of information-theoretic BO, a popular class of search strategies that regularly achieves state-of-the-art optimisation. Unfortunately, current information-theoretic BO routines require sophisticated approximation schemes that incur substantial computational overheads and are, therefore, applicable only to optimisation problems defined over low-dimensional, Euclidean search spaces. This thesis proposes information-theoretic approximations that extend the Max-value Entropy Search of Wang and Jegelka (2017) to a much wider class of optimisation tasks, including noisy, batch and multi-fidelity optimisation across both Euclidean and highly structured discrete spaces. To comprehensively test our proposed search strategies, we construct novel frameworks for performing BO over the highly structured string spaces that arise in synthetic gene design and molecular search problems, as well as for objective functions with controllable observation noise. Finally, we demonstrate the real-world applicability of BO as part of a sophisticated machine learning pipeline for fine-tuning multi-speaker text-to-speech models.
AB - Bayesian optimisation (BO) is an increasingly popular strategy for optimising functions with substantial query costs. By sequentially focusing evaluation resources into promising areas of the search space, BO is able to find reasonable solutions within heavily restricted evaluation budgets. Consequently, BO has become the de facto approach for fine-tuning the hyper-parameters of machine learning models and has had numerous successful applications in industry and across the experimental sciences. This thesis seeks to increase the scope of information-theoretic BO, a popular class of search strategies that regularly achieves state-of-the-art optimisation. Unfortunately, current information-theoretic BO routines require sophisticated approximation schemes that incur substantial computational overheads and are, therefore, applicable only to optimisation problems defined over low-dimensional, Euclidean search spaces. This thesis proposes information-theoretic approximations that extend the Max-value Entropy Search of Wang and Jegelka (2017) to a much wider class of optimisation tasks, including noisy, batch and multi-fidelity optimisation across both Euclidean and highly structured discrete spaces. To comprehensively test our proposed search strategies, we construct novel frameworks for performing BO over the highly structured string spaces that arise in synthetic gene design and molecular search problems, as well as for objective functions with controllable observation noise. Finally, we demonstrate the real-world applicability of BO as part of a sophisticated machine learning pipeline for fine-tuning multi-speaker text-to-speech models.
U2 - 10.17635/lancaster/thesis/1303
DO - 10.17635/lancaster/thesis/1303
M3 - Doctoral Thesis
PB - Lancaster University
ER -