Efficient variational approximations for state space models
Rubén Loaiza-Maya and Didier Nibbering
Department of Econometrics and Business Statistics, Monash University
June 5, 2023
Abstract
Variational Bayes methods are a potentially scalable estimation approach for state space models.
However, existing methods are inaccurate or computationally infeasible for many state space
models. This paper proposes a variational approximation that is accurate and fast for any
model with a closed-form measurement density function and a state transition distribution
within the exponential family of distributions. We show that our method can accurately and
quickly estimate a multivariate Skellam stochastic volatility model with high-frequency tick-by-
tick discrete price changes of four stocks, and a time-varying parameter vector autoregression
with a stochastic volatility model using eight macroeconomic variables.
Keywords: State space models, Variational Bayes, Stochastic volatility, Multivariate Skellam
model, Time-varying parameter vector autoregression
JEL Classification: C11, C22, C32, C58
We would like to thank Wei Wei, Klaus Ackermann and Ashley Andrews for helpful discussions. Rubén Loaiza-Maya gratefully acknowledges support by the Australian Research Council through grant DE230100029.
Correspondence to: Department of Econometrics & Business Statistics, Monash University, Clayton VIC 3800,
Australia, e-mail: ruben.loaizamaya@monash.edu
1 Introduction
Estimation of many state space models with nonlinear and/or non-Gaussian measurement equations
is computationally challenging (Gribisch and Hartkopf, 2022; Chan, 2022; Cross et al., 2021). The
likelihood function of these models involves a high-dimensional integral with respect to the state
variables that cannot be solved analytically, which renders maximum likelihood estimation
infeasible. As an alternative, exact Bayesian estimation methods allow for the computation
of the posterior distribution of the model parameters. These methods either use particle filtering
(Chopin et al., 2020), or sample from the augmented posterior of the model parameters and the states
using analytical filtering (Carter and Kohn, 1994). Both approaches can become computationally
costly, especially with high-dimensional state vectors or when dependence between the states and
the parameters is strong (Quiroz et al., 2022).
Variational Bayes (VB) methods can provide a scalable alternative to exact Bayesian methods.
Instead of sampling exactly from the posterior, VB calibrates an approximation to the posterior
via the minimization of a divergence function. However, off-the-shelf variational methods for state
space models, such as mean-field variational approximations, are known to be poor (Wang and
Titterington, 2004). Gaussian VB methods as proposed by Tan and Nott (2018) and Quiroz et al.
(2022) use a variational family for the states that conditions on the model parameters and not on
the data. The inaccuracy of these existing methods stems from the poor quality of the approximation to
the conditional posterior distribution of the states (Frazier et al., 2022). More accurate VB methods
are computationally infeasible for many state space models. For instance, Tran et al. (2017) exactly
integrate out the states in the variational approximation using particle filtering. The method of
Loaiza-Maya et al. (2022) is designed for the specific class of state space models where generation
from the conditional posterior of the states is computationally feasible.
This paper proposes a novel VB method that is accurate and fast, and can be applied to a
wide range of state space models for which estimation with existing methods is either inaccurate or
computationally infeasible. Our method uses a variational approximation to the states that directly
conditions on the observed data, and as such produces an accurate approximation to the exact
posterior distribution. The approach is faster than existing VB methods for state space models
due to the computationally efficient calibration steps it entails. The implementation only requires a
measurement equation with a closed-form density representation, and a state transition distribution
that belongs to the exponential family of distributions. This allows for a wide range of state space
models, including ones with nonlinear measurement equations, certain types of nonlinear transition
equations, and high-dimensional state vectors.
Our approximation to the states is the importance density proposed by Richard and Zhang
(2007) in the context of efficient importance sampling. Hence, we refer to our method as Efficient
VB. Scharth and Kohn (2016) employ this importance distribution within a particle Markov chain
Monte Carlo (PMCMC) sampler to reduce the variance of the estimate of the likelihood function.
The use of this importance density inside PMCMC does not result in substantial computational
gains, as it must be recalibrated at each iteration. Since VB poses an optimization problem, we can
use stochastic gradient ascent (SGA) instead of a sampling algorithm. Our SGA algorithm requires
draws from the approximation to the states, which are used to construct an estimate of the gradient
of the objective function with respect to the parameters. Since the importance density is easy to
generate from, and it does not have to be recalibrated at each SGA step, the optimization routine
is fast and hence scalable to state space models with high-dimensional state vectors and a large
number of observations.
Numerical experiments show that the proposed Efficient VB method provides accurate posterior
densities, while it only takes a fraction of the computational cost of MCMC. The experiments employ
the stochastic volatility model as the true data generating process. Since the exact posterior can
be computed by MCMC methods, the accuracy of our method can be assessed for this model. We
find that Efficient VB produces variational approximations to the states that are close to the exact
posteriors, which result in accurate variational approximations to the parameters of the model.
Efficient VB is faster than all benchmark methods across all sample sizes under consideration.
To illustrate the contributions of our method, we apply it in two empirical applications. The
first application fits a multivariate Skellam stochastic volatility model with high-frequency tick-by-
tick discrete price changes of four stocks. With the recent availability of high-frequency trading
data, the modelling of tick-by-tick price changes has become increasingly popular (Shephard and
Yang, 2017; Koopman et al., 2018; Catania et al., 2022). The model we are estimating is in the
spirit of the univariate Skellam stochastic volatility model of Koopman et al. (2017) but extended
to the multivariate setting. This state space model with a nonlinear measurement equation and
a multivariate state vector cannot be accurately estimated with existing methods in a reasonable
amount of time. Efficient VB produces posterior distributions close to the exact posteriors.
The second empirical application fits a state space model with Efficient VB that can also be
estimated with existing, computationally costly, VB methods. This application fits a time-varying
parameter vector autoregression with a stochastic volatility model to eight macroeconomic variables.
This high-dimensional time series model is proposed by Huber et al. (2021), and related models are
used by, for instance, Clark and Ravazzolo (2015) and Carriero et al. (2019). This is a state space
model with a nonlinear measurement equation and a high-dimensional state vector. In this complex
model, our approach is accurate while the computation time is a fraction of the computation time
of the benchmark methods.
The proposed VB method has the potential to produce fast and accurate estimation for a wide
range of models. Computationally challenging state space models are currently estimated by VB
methods that are limited to specific state space formulations. For instance, Chan and Yu (2022)
and Gefang et al. (2022) propose a VB method for a specific class of vector autoregression models.
Koop and Korobilis (2018) propose a VB method for a class of time-varying parameter models.
Existing variational inference methods for state space models that construct point estimates for
model parameters, instead of posterior distributions, are computationally expensive. For instance,
Naesseth et al. (2018) construct an approximation to the conditional posterior of the state using
particle filtering, which is computationally costly and hence hinders the scalability to problems
with high-dimensional state vectors. Archer et al. (2015) use neural networks to construct an
approximation to the posterior of the states. The parameters of these neural networks are calibrated
jointly with the parameters of the model, which means that a high-dimensional gradient has to be
computed in each iteration of the optimization algorithm.
The outline of the remainder of this paper is as follows. Section 2 discusses specification and
exact estimation of state space models, and Section 3 develops our VB method. Section 4 conducts
numerical experiments to evaluate its accuracy and computational costs, and Sections 5 and 6 apply
our method to real data. Section 7 concludes.
2 State space models
Let $y = (y_1^\top, \dots, y_T^\top)^\top$ be an observed time series assumed to have been generated by a state space model with measurement and state densities
$$y_t \mid (X_t = x_t) \sim p(y_t \mid x_t, \theta), \tag{1}$$
$$X_t \mid (X_{t-1} = x_{t-1}) \sim p(x_t \mid x_{t-1}, \theta), \tag{2}$$
respectively, where the prior density for $X_1$ is $p(x_1 \mid \theta)$, $\theta \in \Theta$ is a $d$-dimensional parameter vector, and $y_t$ is an $N$-dimensional observation vector with $t = 1, \dots, T$. The likelihood function for this model is given by
$$p(y \mid \theta) = \int p(y, x \mid \theta)\, dx, \tag{3}$$
where $x = (x_1^\top, \dots, x_T^\top)^\top$ and $p(y, x \mid \theta) = \prod_{t=1}^{T} p(y_t \mid x_t, \theta)\, p(x_t \mid x_{t-1}, \theta)$. Typically, the integral that
characterises the likelihood is intractable, as it does not have an analytical solution. This is the
case for state space models that assume nonlinear or non-Gaussian measurement equations. These
types of models are pervasive in econometrics and include, for instance, stochastic volatility models,
some time-varying parameter models, and state space models for discrete data. Hence, maximum
likelihood estimation is infeasible for a large class of econometric problems.
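To make this concrete, consider a univariate stochastic volatility model as an instance of (1)-(2); the particular parameterisation below is one common formulation and is stated here for illustration only:
$$y_t \mid x_t \sim N\big(0, \exp(x_t)\big), \qquad x_t \mid x_{t-1} \sim N\big(\mu + \phi(x_{t-1} - \mu), \sigma^2\big),$$
with $\theta = (\mu, \phi, \sigma)^\top$. The likelihood (3) is then a $T$-dimensional integral over $(x_1, \dots, x_T)$ with no analytical solution, because the states enter the measurement variance through the exponential function.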
Bayesian analysis is concerned with computing the posterior density $p(\theta \mid y) \propto p(y \mid \theta)\, p(\theta)$, where
p(θ) is a given choice of prior density. The intractability in the likelihood function is tackled via
two different avenues. First, for certain state space models it is feasible to use Markov chain Monte
Carlo (MCMC) to generate from the augmented density
$$p(\theta, x \mid y) \propto p(y, x \mid \theta)\, p(\theta), \tag{4}$$
where analytical filtering methods are used to obtain draws from p(x|θ,y). MCMC effectively
samples from p(θ|y), which is a marginal density of p(θ,x|y). This approach is limited to certain
classes of state space models, and the filtering techniques used can become computationally costly
for large sample sizes or high-dimensional state vectors. The second Bayesian avenue generates
samples from the posterior by replacing the likelihood function with an unbiased estimate $\hat{p}_S(y \mid \theta)$.
This unbiased estimate, evaluated via particle methods, is then used inside a Metropolis-Hastings
scheme. This approach, known as particle MCMC, trades off accuracy in the estimation of $p(\theta \mid y)$ against
computational speed via the choice of the number of particles $S$ (Andrieu et al., 2010; Doucet et al.,
2015). While it can be applied to a broad class of state space models, this approach is
computationally costly and produces highly noisy estimates when the number of particles is too low. This issue is
exacerbated when the state vector is high-dimensional and a larger number of particles is required.
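To illustrate how the estimate $\hat{p}_S(y \mid \theta)$ can be obtained, the sketch below implements a bootstrap particle filter for the illustrative stochastic volatility model introduced above, with the initial state drawn from the model's stationary distribution. It is a minimal sketch under our own assumptions; the model choice, function names and settings are ours rather than the paper's, and practical implementations typically use more refined proposal densities and resampling schemes.

```python
import numpy as np

def bootstrap_particle_filter(y, mu, phi, sigma, S=500, rng=None):
    """Bootstrap particle filter for the illustrative univariate SV model.

    Returns the log of the particle-filter estimate of p(y | theta); the
    estimate of the likelihood itself (before taking logs) is unbiased and
    is the quantity plugged into a Metropolis-Hastings step in particle MCMC.
    """
    rng = np.random.default_rng(rng)
    T = len(y)
    # Initial particles from the stationary distribution of the state process.
    x = mu + sigma / np.sqrt(1.0 - phi**2) * rng.standard_normal(S)
    log_lik = 0.0
    for t in range(T):
        if t > 0:
            # Propagate particles through the state transition (2).
            x = mu + phi * (x - mu) + sigma * rng.standard_normal(S)
        # Weight particles by the measurement density (1): y_t ~ N(0, exp(x_t)).
        log_w = -0.5 * (np.log(2 * np.pi) + x + y[t] ** 2 * np.exp(-x))
        m = log_w.max()
        w = np.exp(log_w - m)
        # Accumulate the contribution of log p(y_t | y_{1:t-1}, theta).
        log_lik += m + np.log(w.mean())
        # Multinomial resampling.
        idx = rng.choice(S, size=S, p=w / w.sum())
        x = x[idx]
    return log_lik

# Usage: simulate data from the same model and evaluate the estimator.
rng = np.random.default_rng(0)
mu, phi, sigma, T = -1.0, 0.95, 0.2, 500
x = np.empty(T)
x[0] = mu + sigma / np.sqrt(1 - phi**2) * rng.standard_normal()
for t in range(1, T):
    x[t] = mu + phi * (x[t - 1] - mu) + sigma * rng.standard_normal()
y = np.exp(x / 2) * rng.standard_normal(T)
print(bootstrap_particle_filter(y, mu, phi, sigma, S=500, rng=1))
```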
3 Variational Bayes
Variational Bayes may overcome the computational challenges in estimating state space models. The
general idea behind VB is to approximate the exact posterior $p(\theta \mid y)$ with an approximating density
$q_{\hat{\lambda}}(\theta) \in \mathcal{Q}$, where $\mathcal{Q} = \{q_\lambda(\theta) : \lambda \in \Lambda\}$ is a class of tractable approximating densities indexed
by the variational parameter $\lambda \in \Lambda$. The most popular choice for $\mathcal{Q}$ is the Gaussian distribution
class. The optimal variational parameter $\hat{\lambda}$ is then calibrated by finding the element in $\mathcal{Q}$ that
minimizes the Kullback-Leibler (KL) divergence, or any other divergence, to the exact posterior.
Implementation of VB requires evaluation of the likelihood function $p(y \mid \theta)$, which is infeasible for
most state space models. Tran et al. (2017) circumvent this issue by replacing $p(y \mid \theta)$ with the unbiased
estimate $\hat{p}_S(y \mid \theta)$. While this approach is faster than PMCMC, it remains computationally costly
due to its use of particle filtering.
VB can circumvent the computational challenges of exactly integrating out $x$ by instead constructing an approximation to the augmented posterior in (4). In this case, the approximating density is $q_{\hat{\lambda}}(\theta, x)$ and $\mathcal{Q} = \{q_\lambda(\theta, x) : \lambda \in \Lambda\}$. Then, $\hat{\lambda}$ is obtained by minimising the KL divergence from $q_\lambda(\theta, x)$ to $p(\theta, x \mid y)$, which is equivalent to maximising the evidence lower bound (ELBO) function $\mathcal{L}(\lambda) = E_{q_\lambda}\left[\log p(y, x \mid \theta) p(\theta) - \log q_\lambda(\theta, x)\right]$:
$$\hat{\lambda} = \operatorname*{argmin}_{\lambda \in \Lambda}\, \mathrm{KL}\left[q_\lambda(\theta, x)\, \|\, p(\theta, x \mid y)\right] = \operatorname*{argmax}_{\lambda \in \Lambda}\, \mathcal{L}(\lambda). \tag{5}$$
VB methods that target the augmented posterior are much faster to implement relative to methods
that approximate $p(\theta \mid y)$ directly.
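To fix ideas, the following is a minimal sketch of stochastic gradient ascent on the ELBO in (5), using a mean-field Gaussian approximation over the stacked vector $(\theta, x)$ and the reparameterization trick. It is a simplified illustration only, not the Efficient VB algorithm developed in this paper; the function names, the diagonal-covariance Gaussian family, and the settings are ours.

```python
import numpy as np

def sga_elbo(log_joint, dim, n_iter=2000, step=0.01, rng=None):
    """Maximise the ELBO by SGA for a mean-field Gaussian q_lambda over z = (theta, x).

    log_joint(z) must return log p(y, x | theta) + log p(theta) and its gradient
    with respect to z; the variational parameter is lambda = (mu, log_sd).
    """
    rng = np.random.default_rng(rng)
    mu, log_sd = np.zeros(dim), np.full(dim, -1.0)
    for _ in range(n_iter):
        # Reparameterization: z = mu + sd * eps with eps ~ N(0, I).
        eps = rng.standard_normal(dim)
        sd = np.exp(log_sd)
        z = mu + sd * eps
        logp, grad_logp = log_joint(z)
        # Unbiased ELBO gradients; the Gaussian entropy is available in closed form.
        grad_mu = grad_logp
        grad_log_sd = grad_logp * eps * sd + 1.0  # +1 from d(entropy)/d(log_sd)
        mu += step * grad_mu
        log_sd += step * grad_log_sd
    return mu, np.exp(log_sd)

# Toy usage: approximate a standard normal "posterior" over a 3-dimensional z.
toy = lambda z: (-0.5 * z @ z, -z)
mu_hat, sd_hat = sga_elbo(toy, dim=3, rng=0)
print(mu_hat.round(2), sd_hat.round(2))
```

In the approach proposed in this paper, the states are not drawn from a Gaussian factor over $x$ but from a data-conditioned approximation $q(x \mid y, \theta)$, introduced in Section 3.1 below.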
3.1 Variational approximations for state space models
This paper proposes a variational approximation of the form
$$q_\lambda(\theta, x) = q_\lambda(\theta)\, q(x \mid y, \theta). \tag{6}$$
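A joint draw from this factorised approximation is produced in two steps, as the hypothetical sketch below illustrates; both generator functions are placeholders supplied for exposition, and in the paper $q(x \mid y, \theta)$ is the efficient importance sampling density of Richard and Zhang (2007).

```python
def draw_from_factorised_approx(draw_theta, draw_states_given_theta, y, rng):
    """Draw (theta, x) from an approximation of the form (6):
    first theta from q_lambda(theta), then x from q(x | y, theta).
    Both generators are hypothetical placeholders provided by the user."""
    theta = draw_theta(rng)                      # theta ~ q_lambda(theta)
    x = draw_states_given_theta(y, theta, rng)   # x ~ q(x | y, theta)
    return theta, x
```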