
Electronic data

  • 4792_learning_rate_free_bayesian_in

    Accepted author manuscript, 1.03 MB, PDF document

    Embargo ends: 1/01/40

    Available under license: CC BY: Creative Commons Attribution 4.0 International License


Learning Rate Free Bayesian Inference in Constrained Domains

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Forthcoming

Standard

Learning Rate Free Bayesian Inference in Constrained Domains. / Sharrock, Louis; Mackey, Lester; Nemeth, Christopher.
In: Advances in Neural Information Processing Systems, Vol. 37, 21.09.2023.

Harvard

Sharrock, L, Mackey, L & Nemeth, C 2023, 'Learning Rate Free Bayesian Inference in Constrained Domains', Advances in Neural Information Processing Systems, vol. 37. <https://openreview.net/pdf?id=TNAGFUcSP7>

APA

Sharrock, L., Mackey, L., & Nemeth, C. (2023). Learning Rate Free Bayesian Inference in Constrained Domains. Advances in Neural Information Processing Systems, 37. <https://openreview.net/pdf?id=TNAGFUcSP7>

Vancouver

Sharrock L, Mackey L, Nemeth C. Learning Rate Free Bayesian Inference in Constrained Domains. Advances in Neural Information Processing Systems. 2023 Sept 21;37.

Author

Sharrock, Louis ; Mackey, Lester ; Nemeth, Christopher. / Learning Rate Free Bayesian Inference in Constrained Domains. In: Advances in Neural Information Processing Systems. 2023 ; Vol. 37.

Bibtex

@article{c7744a26533a4ab6bb6d24e7201488a8,
title = "Learning Rate Free Bayesian Inference in Constrained Domains",
abstract = "We introduce a suite of new particle-based algorithms for sampling on constrained domains which are entirely learning rate free. Our approach leverages coin betting ideas from convex optimisation, and the viewpoint of constrained sampling as a mirrored optimisation problem on the space of probability measures. Based on this viewpoint, we also introduce a unifying framework for several existing constrained sampling algorithms, including mirrored Langevin dynamics and mirrored Stein variational gradient descent. We demonstrate the performance of our algorithms on a range of numerical examples, including sampling from targets on the simplex, sampling with fairness constraints, and constrained sampling problems in post-selection inference. Our results indicate that our algorithms achieve competitive performance with existing constrained sampling methods, without the need to tune any hyperparameters.",
author = "Louis Sharrock and Lester Mackey and Christopher Nemeth",
year = "2023",
month = sep,
day = "21",
language = "English",
volume = "37",
journal = "Advances in Neural Information Processing Systems",
issn = "1049-5258",
}
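
The abstract's "learning rate free" claim builds on coin betting from convex optimisation. As a loose illustrative sketch of that idea only (this is not the paper's algorithm, and the objective, function names, and default parameters below are all hypothetical choices for demonstration), a Krichevsky–Trofimov coin-betting bettor can minimise a one-dimensional function with no step-size parameter: each negative gradient acts as a coin outcome, and the iterate is a bet sized by accumulated wealth.

```python
import numpy as np

def coin_betting_minimize(grad, w0=0.0, eps=1.0, n_steps=2000, lipschitz=1.0):
    """Learning-rate-free 1-D minimisation via Krichevsky-Trofimov coin betting.

    Each clipped negative gradient is treated as a coin outcome; the iterate
    is a bet whose size is the KT fraction of accumulated wealth, so no
    step size is ever tuned.
    """
    w, wealth, coin_sum, avg = w0, eps, 0.0, 0.0
    for t in range(1, n_steps + 1):
        w = w0 + (coin_sum / t) * wealth              # KT fraction times wealth
        c = -np.clip(grad(w), -lipschitz, lipschitz)  # coin outcome c_t = -g_t
        wealth += c * (w - w0)                        # settle the bet
        coin_sum += c
        avg += (w - avg) / t                          # running average of iterates
    return avg

# Hypothetical toy objective f(w) = (w - 3)^2, minimiser at w = 3.
w_hat = coin_betting_minimize(lambda w: 2.0 * (w - 3.0))
```

The averaged iterate approaches the minimiser without any tuned step size; the only inputs are the gradient oracle and a Lipschitz bound used for clipping.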

RIS

TY - JOUR

T1 - Learning Rate Free Bayesian Inference in Constrained Domains

AU - Sharrock, Louis

AU - Mackey, Lester

AU - Nemeth, Christopher

PY - 2023/9/21

Y1 - 2023/9/21

N2 - We introduce a suite of new particle-based algorithms for sampling on constrained domains which are entirely learning rate free. Our approach leverages coin betting ideas from convex optimisation, and the viewpoint of constrained sampling as a mirrored optimisation problem on the space of probability measures. Based on this viewpoint, we also introduce a unifying framework for several existing constrained sampling algorithms, including mirrored Langevin dynamics and mirrored Stein variational gradient descent. We demonstrate the performance of our algorithms on a range of numerical examples, including sampling from targets on the simplex, sampling with fairness constraints, and constrained sampling problems in post-selection inference. Our results indicate that our algorithms achieve competitive performance with existing constrained sampling methods, without the need to tune any hyperparameters.

AB - We introduce a suite of new particle-based algorithms for sampling on constrained domains which are entirely learning rate free. Our approach leverages coin betting ideas from convex optimisation, and the viewpoint of constrained sampling as a mirrored optimisation problem on the space of probability measures. Based on this viewpoint, we also introduce a unifying framework for several existing constrained sampling algorithms, including mirrored Langevin dynamics and mirrored Stein variational gradient descent. We demonstrate the performance of our algorithms on a range of numerical examples, including sampling from targets on the simplex, sampling with fairness constraints, and constrained sampling problems in post-selection inference. Our results indicate that our algorithms achieve competitive performance with existing constrained sampling methods, without the need to tune any hyperparameters.

M3 - Journal article

VL - 37

JO - Advances in Neural Information Processing Systems

JF - Advances in Neural Information Processing Systems

SN - 1049-5258

ER -
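
The mirrored-optimisation viewpoint described in the abstract can be illustrated very loosely as follows (this is not the paper's method; the Exp(1) target, the log mirror map, and all names below are simplified choices for demonstration): to sample a target constrained to x > 0, map to the unconstrained dual variable y = log x, run unadjusted Langevin dynamics on the pushforward density, and map samples back, so every draw automatically satisfies the constraint.

```python
import numpy as np

def mirrored_langevin_exp(n_steps=20000, step=0.01, burn_in=5000, seed=0):
    """Sample Exp(1) on the constrained domain x > 0 by running unadjusted
    Langevin dynamics in the mirrored (dual) variable y = log x.

    Pushforward log-density: log p(y) = -exp(y) + y, so d/dy log p = 1 - exp(y).
    Mapping back with x = exp(y) keeps every sample strictly positive.
    """
    rng = np.random.default_rng(seed)
    y, samples = 0.0, []
    for t in range(n_steps):
        drift = 1.0 - np.exp(y)                        # gradient of log p(y)
        y += step * drift + np.sqrt(2.0 * step) * rng.standard_normal()
        if t >= burn_in:
            samples.append(np.exp(y))                  # mirror back to x > 0
    return np.array(samples)

xs = mirrored_langevin_exp()
```

Because the constraint is handled by the mirror map rather than by rejection or projection, the chain never leaves the domain; the empirical mean of the returned samples should be close to 1, the mean of Exp(1), up to the usual discretisation bias of unadjusted Langevin.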