Accepted author manuscript, 647 KB, PDF document
Available under license: CC BY: Creative Commons Attribution 4.0 International License
Research output: Contribution to Journal/Magazine › Journal article › peer-review
TY - JOUR
T1 - Stochastic Gradient MCMC for Nonlinear State Space Models
AU - Aicher, Christopher
AU - Putcha, Srshti
AU - Nemeth, Christopher
AU - Fearnhead, Paul
AU - Fox, Emily B.
PY - 2023/6/8
Y1 - 2023/6/8
N2 - State space models (SSMs) provide a flexible framework for modeling complex time series via a latent stochastic process. Inference for nonlinear, non-Gaussian SSMs is often tackled with particle methods that do not scale well to long time series. The challenge is two-fold: not only do computations scale linearly with time, as in the linear case, but particle filters additionally suffer from increasing particle degeneracy with longer series. Stochastic gradient MCMC methods have been developed to scale Bayesian inference for finite-state hidden Markov models and linear SSMs using buffered stochastic gradient estimates to account for temporal dependencies. We extend these stochastic gradient estimators to nonlinear SSMs using particle methods. We present error bounds that account for both buffering error and particle error in the case of nonlinear SSMs that are log-concave in the latent process. We evaluate our proposed particle buffered stochastic gradient using stochastic gradient MCMC for inference on both long sequential synthetic and minute-resolution financial returns data, demonstrating the importance of this class of methods.
AB - State space models (SSMs) provide a flexible framework for modeling complex time series via a latent stochastic process. Inference for nonlinear, non-Gaussian SSMs is often tackled with particle methods that do not scale well to long time series. The challenge is two-fold: not only do computations scale linearly with time, as in the linear case, but particle filters additionally suffer from increasing particle degeneracy with longer series. Stochastic gradient MCMC methods have been developed to scale Bayesian inference for finite-state hidden Markov models and linear SSMs using buffered stochastic gradient estimates to account for temporal dependencies. We extend these stochastic gradient estimators to nonlinear SSMs using particle methods. We present error bounds that account for both buffering error and particle error in the case of nonlinear SSMs that are log-concave in the latent process. We evaluate our proposed particle buffered stochastic gradient using stochastic gradient MCMC for inference on both long sequential synthetic and minute-resolution financial returns data, demonstrating the importance of this class of methods.
M3 - Journal article
JO - Bayesian Analysis
JF - Bayesian Analysis
SN - 1936-0975
ER -