
OUTLIER-INSENSITIVE KALMAN FILTERING USING NUV PRIORS
Shunit Truzman, Guy Revach, Nir Shlezinger, and Itzik Klein
ABSTRACT
The Kalman filter (KF) is a widely used algorithm for tracking the latent state of a dynamical system from noisy observations. For systems that are well described by linear Gaussian state space models, the KF minimizes the mean-squared error (MSE). In practice, however, observations are corrupted by outliers, severely impairing the KF's performance. In this work, an outlier-insensitive KF is proposed, where robustness is achieved by modeling each potential outlier as a normally distributed random variable with unknown variance (NUV). The NUV variances are estimated online, using both expectation-maximization (EM) and alternating maximization (AM). The former was previously proposed for the task of smoothing with outliers and is adapted here to filtering. While both EM and AM attain the same performance and outperform competing algorithms, the AM approach is less complex and thus requires 40% less run-time. Our empirical study demonstrates that our proposed outlier-insensitive KF outperforms previously proposed algorithms in MSE, and that for data clean of outliers it reverts to the classic KF, i.e., MSE optimality is preserved.
Index Terms—Kalman filter, outliers, AM
1. INTRODUCTION
State estimation of dynamical systems from noisy observations plays a key role in various scientific and technological fields such as radar target tracking, complex image processing, navigation, and positioning [1]. The celebrated Kalman filter (KF) [2] is an efficient recursive state estimation algorithm that is mean-squared error (MSE) optimal for dynamical systems obeying a linear Gaussian state space (SS) model. However, the quadratic form of its objective, i.e., the MSE, makes it sensitive to deviations from nominal noise. Thus, the KF is severely impaired when outliers are present in the measurements [3,4]. As sensory data is often populated with outliers, robustness to outliers is essential [5–7]. A common approach for dealing with outliers is to detect and then disregard influential observations. Such detection can be achieved using appropriate statistical diagnostics [8] on the posterior distribution, e.g., the Z-test [9] and the χ²-test [10,11]. The main drawbacks of these approaches are that they need to be carefully tuned for a required false-alarm rate, and that potentially useful outlier information is not accounted for in the estimation process. Alternatively, one can limit the effect of outliers by reweighting the covariance of the observation noise at each data sample when estimating the current state [5,6]. These techniques likewise require careful tuning of multiple hyperparameters to operate reliably. A different approach formulates the KF as a linear regression problem, detecting outliers via a sparsifying ℓ1-penalty [4,12], tackled via optimization techniques, which may be computationally complex.

S. Truzman and I. Klein are with the Hatter Dept. of Marine Technologies, University of Haifa, Haifa, Israel (e-mail: shunitruzman@gmail.com, kitzik@univ.haifa.ac.il). G. Revach is with the Institute for Signal and Information Processing (ISI), D-ITET, ETH Zürich, Switzerland (e-mail: grevach@ethz.ch). N. Shlezinger is with the School of ECE, Ben-Gurion University of the Negev, Beer Sheva, Israel (e-mail: nirshl@bgu.ac.il). S. Truzman is supported by the Maurice Hatter Foundation. We thank Hans-Andrea Loeliger for helpful discussions.
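As a concrete illustration of the detect-and-disregard approach discussed above, a χ²-test on the KF innovation can gate the update step, skipping observations whose normalized innovation squared exceeds a threshold. This is a generic sketch, not one of the cited algorithms; the function name and the false-alarm level `alpha` are illustrative.

```python
import numpy as np
from scipy.stats import chi2

def kf_update_with_gate(x, P, y, H, R, alpha=0.01):
    """One KF update step with a chi-squared innovation gate.

    Observations whose normalized innovation squared (NIS) exceeds
    the chi2 threshold are treated as outliers and discarded,
    illustrating the detect-and-disregard approach. The false-alarm
    level alpha must be tuned, which is the drawback noted above.
    """
    e = y - H @ x                          # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    nis = float(e @ np.linalg.solve(S, e)) # normalized innovation squared
    if nis > chi2.ppf(1 - alpha, df=len(y)):
        return x, P                        # declared outlier: skip update
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x + K @ e
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new
```

Note that a gated sample contributes nothing to the estimate, in contrast to the NUV-based approach developed in this paper, which retains the outlier's information with a reduced weight.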
In this work, a new approach for outlier-insensitive Kalman filtering (OIKF) is proposed that leverages ideas from sparse Bayesian learning [13]. Here, each potential outlier is modeled as a normally distributed random variable with unknown variance (NUV) [7,14,15], i.e., an additive component on top of the observation noise. Incorporating the NUV effectively yields a modified, sparsity-aware overall objective [14], which is shown to yield a robust statistical outlier-detection test with a relatively low false-alarm rate. When an outlier is reliably detected, its variance is incorporated into the overall covariance matrix of the observation noise, thereby balancing its contribution to the information fusion (i.e., update) step of the KF, so that the outlier information is still used in the state estimation process.
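The incorporation described above can be sketched in a few lines: the estimated outlier variance inflates the observation-noise covariance in the KF update, shrinking the Kalman gain so that a corrupted sample is down-weighted rather than discarded. This is a minimal illustration of the mechanism, not the paper's exact implementation; the function name and the per-entry variance vector `s2` are assumptions.

```python
import numpy as np

def nuv_update(x, P, y, H, R, s2):
    """KF update where the observation-noise covariance is inflated by
    the estimated NUV variances s2 (one per observation entry).

    A large s2 shrinks the Kalman gain toward zero, so a detected
    outlier contributes little -- but it is not discarded outright."""
    R_eff = R + np.diag(s2)                # NUV-inflated observation noise
    S = H @ P @ H.T + R_eff                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x + K @ (y - H @ x)
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new
```

With `s2 = 0` this reduces to the standard KF update, consistent with the claim that the OIKF reverts to the classic KF on outlier-free data.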
To estimate the NUV variances online, we first adapt the expectation-maximization (EM) algorithm [16–18], which was previously proposed for offline smoothing [7], to the online filtering task. EM is based on computing second-order moments, i.e., the full state-observation posterior covariance. We then present an implementation, which has not been considered to date, using a simpler alternating maximization (AM) algorithm [19–21]. Unlike EM, AM uses only first-order moments as an empirical surrogate for outlier detection, without sacrificing performance and with improved robustness. We evaluate the OIKF for tracking based on the white noise acceleration (WNA) motion model with outliers [22]. We empirically demonstrate the superiority of our proposed algorithm over previous robust variants of the KF, achieving improved performance with low-complexity operations.
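A rough sketch of how an AM-style filtering step could operate: at each time instant, alternate between a KF update with the current NUV-inflated observation covariance and re-estimating each NUV variance from the squared residual, a first-order-moment surrogate. The squared-residual update rule and the fixed iteration count below are assumptions borrowed from the general NUV literature, not the exact updates derived in this paper.

```python
import numpy as np

def oikf_am_step(x_pred, P_pred, y, H, R, n_iter=3):
    """One OIKF filtering step with AM-style NUV estimation (sketch).

    Alternates between (i) a KF update using the current NUV-inflated
    observation covariance and (ii) re-estimating each NUV variance
    from the squared residual (first-order moments only). The
    squared-residual rule is an assumed, illustrative form."""
    s2 = np.zeros(len(y))                  # NUV variances, start at zero
    for _ in range(n_iter):
        R_eff = R + np.diag(s2)            # inflate observation noise
        S = H @ P_pred @ H.T + R_eff
        K = P_pred @ H.T @ np.linalg.inv(S)
        x = x_pred + K @ (y - H @ x_pred)  # KF update given current s2
        e = y - H @ x                      # post-update residual
        s2 = e**2                          # AM variance update (assumed form)
    P = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x, P, s2
```

On clean data the residuals stay small, so the NUV variances stay near zero and the step behaves like a standard KF update; a large residual inflates its own variance and is progressively down-weighted over the iterations.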
The rest of this paper is organized as follows: Section 2 formulates the system model and the KF with outliers, Section 3 presents the OIKF and its estimation of the outlier variances, and Section 4 empirically evaluates our methods.
arXiv:2210.06083v1 [eess.SP] 12 Oct 2022