Learning Rate Free Bayesian Inference in Constrained Domains

Research output: Working paper › Preprint

Published

Standard

Learning Rate Free Bayesian Inference in Constrained Domains. / Sharrock, Louis; Mackey, Lester; Nemeth, Christopher.
2023.

Research output: Working paper › Preprint

Bibtex

@techreport{8ed9b88908a74e7482aaa41d82b1feba,
title = "Learning Rate Free Bayesian Inference in Constrained Domains",
abstract = "We introduce a suite of new particle-based algorithms for sampling on constrained domains which are entirely learning rate free. Our approach leverages coin betting ideas from convex optimisation, and the viewpoint of constrained sampling as a mirrored optimisation problem on the space of probability measures. Based on this viewpoint, we also introduce a unifying framework for several existing constrained sampling algorithms, including mirrored Langevin dynamics and mirrored Stein variational gradient descent. We demonstrate the performance of our algorithms on a range of numerical examples, including sampling from targets on the simplex, sampling with fairness constraints, and constrained sampling problems in post-selection inference. Our results indicate that our algorithms achieve competitive performance with existing constrained sampling methods, without the need to tune any hyperparameters.",
keywords = "stat.ML, cs.LG, stat.ME",
author = "Louis Sharrock and Lester Mackey and Christopher Nemeth",
year = "2023",
month = may,
day = "24",
language = "English",
type = "WorkingPaper",

}
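
The "learning rate free" ingredient described in the abstract comes from coin betting in convex optimisation. As a hedged illustration of that ingredient only (not the paper's particle-based samplers, which operate on the space of probability measures), the following minimal Python sketch implements the classic coin-betting update of Orabona and Pál for minimising a convex function with no step size; the function names, the KT betting fraction, and the toy quadratic objective are illustrative choices, not code from the paper.

import numpy as np

def coin_betting_minimise(grad, x0, n_iters=2000, init_wealth=1.0):
    """Parameter-free convex minimisation via coin betting (KT bettor).

    Each negative (sub)gradient is treated as the outcome of a coin flip;
    the iterate is the bettor's cumulative stake, so no learning rate appears.
    """
    x0 = np.asarray(x0, dtype=float)
    wealth = init_wealth                 # bettor's initial capital
    outcome_sum = np.zeros_like(x0)      # running sum of outcomes c_i = -g_i
    bet = np.zeros_like(x0)              # current stake w_t = x_t - x0
    for t in range(1, n_iters + 1):
        x = x0 + bet                     # play the current bet
        c = -np.asarray(grad(x), dtype=float)
        wealth += float(c @ bet)         # settle the bet just played
        outcome_sum += c
        bet = (outcome_sum / (t + 1)) * wealth   # KT betting fraction times wealth
    return x0 + bet

# Toy usage: minimise 0.5 * ||x - b||^2, whose gradient is x - b, with nothing to tune.
b = np.array([1.0, -2.0, 0.5])
x_hat = coin_betting_minimise(lambda x: x - b, x0=np.zeros(3))
print(x_hat)   # approaches b, with no step size to choose

The iterate is simply the bettor's cumulative stake, so the only free choice is the initial wealth, which plays a much weaker role than a learning rate.
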

RIS

TY - UNPB

T1 - Learning Rate Free Bayesian Inference in Constrained Domains

AU - Sharrock, Louis

AU - Mackey, Lester

AU - Nemeth, Christopher

PY - 2023/5/24

Y1 - 2023/5/24

N2 - We introduce a suite of new particle-based algorithms for sampling on constrained domains which are entirely learning rate free. Our approach leverages coin betting ideas from convex optimisation, and the viewpoint of constrained sampling as a mirrored optimisation problem on the space of probability measures. Based on this viewpoint, we also introduce a unifying framework for several existing constrained sampling algorithms, including mirrored Langevin dynamics and mirrored Stein variational gradient descent. We demonstrate the performance of our algorithms on a range of numerical examples, including sampling from targets on the simplex, sampling with fairness constraints, and constrained sampling problems in post-selection inference. Our results indicate that our algorithms achieve competitive performance with existing constrained sampling methods, without the need to tune any hyperparameters.

AB - We introduce a suite of new particle-based algorithms for sampling on constrained domains which are entirely learning rate free. Our approach leverages coin betting ideas from convex optimisation, and the viewpoint of constrained sampling as a mirrored optimisation problem on the space of probability measures. Based on this viewpoint, we also introduce a unifying framework for several existing constrained sampling algorithms, including mirrored Langevin dynamics and mirrored Stein variational gradient descent. We demonstrate the performance of our algorithms on a range of numerical examples, including sampling from targets on the simplex, sampling with fairness constraints, and constrained sampling problems in post-selection inference. Our results indicate that our algorithms achieve competitive performance with existing constrained sampling methods, without the need to tune any hyperparameters.

KW - stat.ML

KW - cs.LG

KW - stat.ME

M3 - Preprint

BT - Learning Rate Free Bayesian Inference in Constrained Domains

ER -
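
The abstract also cites mirrored Langevin dynamics as one of the existing constrained samplers covered by the unifying framework. Below is a hedged, minimal sketch of that baseline for a Dirichlet target on the probability simplex, using the entropic mirror map; unlike the paper's coin-betting samplers it still requires a step size, and all names and parameter values are illustrative assumptions rather than code from the paper.

import numpy as np

def mirrored_langevin_dirichlet(alpha, n_particles=2000, n_steps=2000,
                                step_size=1e-2, seed=0):
    """Sketch of mirrored Langevin dynamics for a Dirichlet(alpha) target on the simplex.

    The constraint is handled by the entropic mirror map: unadjusted Langevin
    runs in the unconstrained dual space, and dual iterates are mapped back
    to the simplex through the inverse mirror map (a softmax).
    """
    rng = np.random.default_rng(seed)
    alpha = np.asarray(alpha, dtype=float)
    d = alpha.size - 1                        # free coordinates; the last is implied
    y = rng.normal(size=(n_particles, d))     # dual-space particles

    def to_simplex(y):
        # Inverse mirror map: softmax over the logits (y_1, ..., y_d, 0).
        logits = np.concatenate([y, np.zeros((y.shape[0], 1))], axis=1)
        logits -= logits.max(axis=1, keepdims=True)
        p = np.exp(logits)
        return p / p.sum(axis=1, keepdims=True)

    for _ in range(n_steps):
        x = to_simplex(y)[:, :d]
        # Dual potential W(y) = -sum_i alpha_i * log x_i(y); its gradient is:
        grad_W = alpha.sum() * x - alpha[:d]
        y = y - step_size * grad_W + np.sqrt(2.0 * step_size) * rng.normal(size=y.shape)

    return to_simplex(y)

# Usage: the sample mean should be near alpha / alpha.sum() = [0.2, 0.3, 0.5].
samples = mirrored_langevin_dirichlet(alpha=[2.0, 3.0, 5.0])
print(samples.mean(axis=0))

Under the entropic mirror map, the pushforward of a Dirichlet(alpha) target has negative log-density W(y) = -sum_i alpha_i log x_i(y) up to a constant (the Jacobian determinant of the inverse map cancels against part of the prior), which is why the gradient step above only needs the softmax of the dual particles.
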