SGMCMCJax: a lightweight JAX library for stochastic gradient Markov chain Monte Carlo algorithms

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

SGMCMCJax: a lightweight JAX library for stochastic gradient Markov chain Monte Carlo algorithms. / Coullon, Jeremie; Nemeth, Christopher.
In: Journal of Open Source Software, Vol. 7, No. 72, 4113, 18.04.2022.

Vancouver

Coullon J, Nemeth C. SGMCMCJax: a lightweight JAX library for stochastic gradient Markov chain Monte Carlo algorithms. Journal of Open Source Software. 2022 Apr 18;7(72):4113. doi: 10.21105/joss.04113

Bibtex

@article{6d8367a88f4e487fbf118e50c16c7154,
title = "SGMCMCJax: a lightweight JAX library for stochastic gradient Markov chain Monte Carlo algorithms",
abstract = "In Bayesian inference, the posterior distribution is the probability distribution over the model parameters resulting from the prior distribution and the likelihood. One can compute integrals over this distribution to obtain quantities of interest, such as the posterior mean and variance, or credible uncertainty regions. However, as these integrals are often intractable for problems of interest, they require numerical methods to approximate them. Markov chain Monte Carlo (MCMC) is currently the gold standard for approximating the integrals needed in Bayesian inference. However, as these algorithms become prohibitively expensive for large datasets, stochastic gradient MCMC (SGMCMC) (Ma et al., 2015; Nemeth & Fearnhead, 2021) is a popular approach to approximating these integrals in such cases. This class of scalable algorithms uses data subsampling techniques to approximate gradient-based sampling algorithms, and is regularly used to fit statistical models and Bayesian neural networks (BNNs). The SGMCMC literature develops new algorithms by finding novel gradient estimation techniques, designing more efficient diffusions, and finding more stable numerical discretisations of these diffusions. SGMCMCJax is a lightweight library designed to allow the user to innovate along these lines or to use one of the existing gradient-based SGMCMC algorithms already included in the library. This makes SGMCMCJax very well suited to both research purposes and practical applications.",
author = "Jeremie Coullon and Christopher Nemeth",
year = "2022",
month = apr,
day = "18",
doi = "10.21105/joss.04113",
language = "English",
volume = "7",
journal = "Journal of Open Source Software",
issn = "2475-9066",
publisher = "The Open Journal",
number = "72",
pages = "4113",
}
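The class of algorithms the abstract describes can be illustrated with stochastic gradient Langevin dynamics (SGLD), the simplest SGMCMC method: each step combines a minibatch estimate of the log-posterior gradient with injected Gaussian noise. The sketch below is a minimal, dependency-free illustration on a toy Gaussian model; it is not SGMCMCJax's API (the library builds its samplers on JAX), and all names in it are hypothetical.

```python
import math
import random

random.seed(0)

# Toy model: y_i ~ N(theta, 1) for i = 1..N, with prior theta ~ N(0, 10).
N = 1000
batch_size = 100
data = [random.gauss(2.0, 1.0) for _ in range(N)]

def grad_log_prior(theta):
    return -theta / 10.0          # d/dtheta of log N(theta; 0, 10)

def grad_log_lik(theta, y):
    return y - theta              # d/dtheta of log N(y; theta, 1)

def sgld_step(theta, dt):
    """One SGLD update: minibatch gradient estimate + Gaussian noise."""
    batch = random.sample(data, batch_size)
    # Rescale the minibatch sum by N / batch_size so the gradient
    # estimate is unbiased for the full-data gradient.
    grad = grad_log_prior(theta) + (N / batch_size) * sum(
        grad_log_lik(theta, y) for y in batch
    )
    return theta + 0.5 * dt * grad + math.sqrt(dt) * random.gauss(0.0, 1.0)

dt, n_iters, burn_in = 1e-3, 5000, 1000
theta, samples = 0.0, []
for t in range(n_iters):
    theta = sgld_step(theta, dt)
    if t >= burn_in:
        samples.append(theta)

post_mean = sum(samples) / len(samples)
print(f"posterior mean estimate: {post_mean:.3f}")
```

With a conjugate Gaussian model like this, the sample average after burn-in should sit close to the analytic posterior mean (approximately the data mean, given the weak prior), which makes the sketch easy to sanity-check.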

RIS

TY - JOUR

T1 - SGMCMCJax

T2 - a lightweight JAX library for stochastic gradient Markov chain Monte Carlo algorithms

AU - Coullon, Jeremie

AU - Nemeth, Christopher

PY - 2022/4/18

Y1 - 2022/4/18

N2 - In Bayesian inference, the posterior distribution is the probability distribution over the model parameters resulting from the prior distribution and the likelihood. One can compute integrals over this distribution to obtain quantities of interest, such as the posterior mean and variance, or credible uncertainty regions. However, as these integrals are often intractable for problems of interest, they require numerical methods to approximate them. Markov chain Monte Carlo (MCMC) is currently the gold standard for approximating the integrals needed in Bayesian inference. However, as these algorithms become prohibitively expensive for large datasets, stochastic gradient MCMC (SGMCMC) (Ma et al., 2015; Nemeth & Fearnhead, 2021) is a popular approach to approximating these integrals in such cases. This class of scalable algorithms uses data subsampling techniques to approximate gradient-based sampling algorithms, and is regularly used to fit statistical models and Bayesian neural networks (BNNs). The SGMCMC literature develops new algorithms by finding novel gradient estimation techniques, designing more efficient diffusions, and finding more stable numerical discretisations of these diffusions. SGMCMCJax is a lightweight library designed to allow the user to innovate along these lines or to use one of the existing gradient-based SGMCMC algorithms already included in the library. This makes SGMCMCJax very well suited to both research purposes and practical applications.

AB - In Bayesian inference, the posterior distribution is the probability distribution over the model parameters resulting from the prior distribution and the likelihood. One can compute integrals over this distribution to obtain quantities of interest, such as the posterior mean and variance, or credible uncertainty regions. However, as these integrals are often intractable for problems of interest, they require numerical methods to approximate them. Markov chain Monte Carlo (MCMC) is currently the gold standard for approximating the integrals needed in Bayesian inference. However, as these algorithms become prohibitively expensive for large datasets, stochastic gradient MCMC (SGMCMC) (Ma et al., 2015; Nemeth & Fearnhead, 2021) is a popular approach to approximating these integrals in such cases. This class of scalable algorithms uses data subsampling techniques to approximate gradient-based sampling algorithms, and is regularly used to fit statistical models and Bayesian neural networks (BNNs). The SGMCMC literature develops new algorithms by finding novel gradient estimation techniques, designing more efficient diffusions, and finding more stable numerical discretisations of these diffusions. SGMCMCJax is a lightweight library designed to allow the user to innovate along these lines or to use one of the existing gradient-based SGMCMC algorithms already included in the library. This makes SGMCMCJax very well suited to both research purposes and practical applications.

U2 - 10.21105/joss.04113

DO - 10.21105/joss.04113

M3 - Journal article

VL - 7

JO - Journal of Open Source Software

JF - Journal of Open Source Software

SN - 2475-9066

IS - 72

M1 - 4113

ER -