
While the general application of derivative-based sensitivity analysis can be limited by the difficulty of computing the derivatives, the derivatives w.r.t. the input distribution parameters can be evaluated more easily by differentiating inside the expectation operator (cf. Eqs. (1) to (4) in Section 2). This is possible because the individual samples of the random output do not depend directly on the input distribution parameters. As a result, the partial derivative is applied only to the joint probability density function (PDF) of the input, and this approach is called the likelihood ratio/score function (LR/SF) method [3, 4].
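To make the mechanism concrete, the following minimal sketch (not from the paper) applies the score-function identity $\partial_\mu \mathrm{E}[f(X)] = \mathrm{E}[f(X)\,\partial_\mu \log p(X;\mu)]$ to a Gaussian input; the model function and parameter values are illustrative placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    # Placeholder black-box model; any function of the random input works.
    return np.sin(x) + 0.1 * x**2

mu, sigma = 1.0, 0.5                # parameters of the input X ~ N(mu, sigma^2)
x = rng.normal(mu, sigma, 100_000)
f = model(x)

# Score function of the Gaussian mean: d/dmu log p(x; mu, sigma) = (x - mu) / sigma^2
score_mu = (x - mu) / sigma**2

# LR/SF estimator: the metric and its sensitivity come from the SAME samples.
print(f"E[f(X)]       ~ {f.mean():.4f}")
print(f"d/dmu E[f(X)] ~ {(f * score_mu).mean():.4f}")
```

Because the derivative lands on the input PDF rather than on the model, no re-simulation or model differentiation is needed, which is what makes the single-run evaluation described below possible.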
As described, the LR/SF method is merely a mathematical trick. Nevertheless, when used together with a sampling method, it is efficient, as the uncertainty metric and its sensitivity can be evaluated in a single simulation run (cf. Section 3.4). The LR/SF method has been applied to general objective functions in stochastic optimization [3] and to the failure probability in reliability engineering [5], and some distribution-free properties of the LR/SF method are discussed in [6].
The sensitivity of entropy, on the other hand, cannot be evaluated directly using the LR/SF method. Instead, sensitivity related to entropy is often analysed using the Kullback–Leibler (K-L) divergence (also known as relative entropy), by measuring the divergence between two PDFs corresponding to two different cases. This approach is studied in [7] for safety assessment, to explore the impact of input uncertainties on the risk profile, and in [8] for engineering design, before and after uncertainty reduction of the random variables of interest. A similar approach using the mutual information between the input and the output has also been studied for sensitivity analysis [9]. The mutual information can be regarded as a special form of the K-L divergence, except that it requires the joint PDF. As the K-L divergence is not a metric, alternative distance measures such as the Hellinger distance have been proposed to quantify the difference between two PDFs and the corresponding sensitivities [10].
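As a small illustration of the divergence-based approach (illustrative numbers only, not from the cited studies), the K-L divergence between two univariate Gaussian PDFs has a closed form, which can be used to compare an output distribution before and after a hypothetical uncertainty reduction:

```python
import numpy as np

def kl_gauss(mu1, s1, mu2, s2):
    # Closed-form K-L divergence KL( N(mu1, s1^2) || N(mu2, s2^2) ).
    return np.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

# Output PDF before vs. after a hypothetical reduction of input uncertainty;
# in practice both PDFs would be estimated from simulation.
print(kl_gauss(0.0, 1.0, 0.0, 0.8))   # change due to variance reduction
print(kl_gauss(0.0, 1.0, 0.2, 1.0))   # change due to a mean shift

# Swapping the arguments changes the value: the divergence is asymmetric,
# which is why it is not a metric.
print(kl_gauss(0.0, 1.0, 0.0, 0.8), kl_gauss(0.0, 0.8, 0.0, 1.0))
```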
It should be noted that, although the relative entropy is not a metric, its infinitesimal form is directly linked to the Fisher information [11], which is a metric tensor, and this link has been explored in [12] for probabilistic sensitivity analysis using the Fisher information matrix (FIM). The LR/SF method can then be used to compute the FIM efficiently for sensitivity analysis of the output entropy [12].
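A minimal sketch of that LR/SF computation (for a single Gaussian input, whose FIM is known in closed form and so can serve as a check) estimates the FIM as the second moment of the sampled score vector; the parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma = 0.0, 2.0
x = rng.normal(mu, sigma, 200_000)

# Score vector d log p(x; theta) / d theta for theta = (mu, sigma) of a Gaussian.
scores = np.stack([(x - mu) / sigma**2,
                   ((x - mu)**2 - sigma**2) / sigma**3])

# FIM = E[score score^T], estimated as a sample average over the same draws.
fim = scores @ scores.T / x.size
print(fim)   # ~ diag(1/sigma^2, 2/sigma^2) = diag(0.25, 0.5)
```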
In this paper, we propose a new sensitivity matrix $\mathbf{r}$ that unifies the sensitivities of a wide range of commonly used uncertainty metrics, from moments of the uncertain output to the entropy of the entire distribution, in a single framework. This is made possible by the LR/SF method, through which the sensitivities of different metrics to the input distribution parameters can be expressed in the same form (cf. Eq. (4)). The second moment of the sensitivity matrix, $\mathrm{E}[\mathbf{r}\mathbf{r}^T]$, arises naturally when the impact of input perturbation on the output is examined. Moreover, maximizing the perturbation of the output uncertainty metrics leads to an eigenvalue problem of the matrix $\mathrm{E}[\mathbf{r}\mathbf{r}^T]$. The eigenvalues represent the magnitudes of the sensitivities w.r.t. simultaneous variations of the input distribution parameters $\mathbf{b}$, and the relative magnitudes and directions of the variations are given by the corresponding eigenvectors. Therefore, the eigenvectors corresponding to the largest eigenvalues are the most sensitive directions to guide decision-making.
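To illustrate the eigenvalue analysis alone (the construction of $\mathbf{r}$ itself follows Eq. (4); here its entries are random placeholders), a minimal sketch, assuming $\mathbf{r}$ stores the sensitivities of $m$ metrics over $p$ distribution parameters:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical stand-in for the sensitivity matrix r: column j holds the
# normalised partial derivatives of metric j w.r.t. the p parameters b.
p, m = 3, 4
r = rng.normal(size=(p, m))

M = r @ r.T                           # the p x p second-moment matrix r r^T
eigvals, eigvecs = np.linalg.eigh(M)  # symmetric PSD, so eigh applies

# eigh returns eigenvalues in ascending order: the last column of eigvecs is
# the direction of simultaneous parameter variation with maximum sensitivity.
print("sensitivity magnitudes:  ", eigvals[::-1])
print("most sensitive direction:", eigvecs[:, -1])
```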
The sensitivity matrix $\mathbf{r}$ can be seen as a counterpart of the deterministic sensitivity matrix (the Jacobian matrix), as the elements of $\mathbf{r}$ are the normalised partial derivatives of the output uncertainty metrics w.r.t. the distribution parameters of the uncertain input (cf. Eq. (6)). The resulting eigenvectors therefore have a direct sensitivity interpretation. It should be noted that, although the sensitivity matrix $\mathbf{r}$ is formulated and estimated using the LR/SF method, the use of the second-moment matrix and its eigenvectors for sensitivity analysis additionally captures the interactions among the sensitivities of the different metrics.
In addition, the current work is motivated by a recent study [13] in which a special case of the proposed sensitivity matrix was applied successfully to the combined sensitivity analysis of multiple failure modes. We are going to show that $\mathrm{E}[\mathbf{r}\mathbf{r}^T]$ not only captures the combined perturbation effect of multiple metrics, e.g., multiple failure modes or multiple moment functions, but also includes the FIM as a special case. Applications of the FIM for sensitivity analysis can be found in many areas of science and engineering. For example, the FIM has been applied to the parametric sensitivity study of stochastic biological systems [14], to assess the most sensitive directions for climate change given a model for the present climate [15], and as one of the process-tailored sensitivity metrics for engineering design [12].
It should be noted that there are two main differences between the proposed framework and the commonly used variance-based sensitivity analysis [16]. First, variance-based approaches study how the variance of the output can be decomposed into contributions from the uncertain inputs. They rank the factors under the assumption that a factor can be fixed to its true value, i.e., a complete reduction of its uncertainty, which is rarely possible in practice [1]. In contrast, the proposed framework uses partial derivatives to examine the perturbation of the output metrics due to a variation of the input distribution parameters. As the distribution parameters are often estimated from data, this is equivalent to asking which uncertain dataset the decision-makers should focus on to change the output the most, a question particularly pertinent to data-driven applications such as digital twins [12]. Second, the output sensitivity measure from