uncertainty on the mean seismic fragility curve due to a certain lack of knowledge of the
structure of interest and of its environment (e.g. soil–structure interaction).
Since the 1980s, several techniques have been developed to estimate such curves, most
often in the sense of mean fragility curves. When little data is available, whether experi-
mental, from post-earthquake feedback or from numerical calculations, a classic approach to
circumvent estimation difficulties is to use a parametric model of the fragility curve, such as
the lognormal model historically introduced in [1] (see e.g. [5, 6, 7, 8, 9, 10]). As the validity
of parametric models is questionable, non-parametric estimation techniques have also been
developed, such as kernel smoothing [8, 9] as well as other methodologies [10, 11]. Most of
these strategies are compared in [8, 9, 12], and [8] presents their advantages and disadvan-
tages. Beyond these methods, techniques based on statistical and machine learning treatments
of the mechanical response of the structure can also be used, including: linear or generalized
linear regression [8], classification-based techniques [13, 14], polynomial chaos expansions
[15, 16] and artificial neural networks [17, 14]. Most of these techniques take advantage of the
rise of computational power to allow estimations based on numerical simulations, and they
help reduce the computational burden, which nevertheless remains high because precise
estimations require a large number of numerical simulations. Despite all these techniques,
one of the main challenges that persists is the estimation, at a lower numerical cost (i.e.
with few calls to the computer codes), of non-parametric fragility curves that take into
account both types of uncertainty.
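For concreteness, the lognormal model mentioned above expresses the mean fragility curve, as a function of a scalar intensity measure $a$ of the seismic excitation, as

\[
P_f(a) = \Phi\left(\frac{\ln a - \ln \alpha}{\beta}\right),
\]

where $\Phi$ denotes the standard normal cumulative distribution function, $\alpha$ is the median seismic capacity and $\beta$ the lognormal standard deviation (the symbols $\alpha$ and $\beta$ follow the conventional notation for this model and are not taken from the present paper).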
The objective of this work is to propose a methodology that meets these requirements in
a numerical-simulation-based framework. Since we focus on approaches based on numerical
simulations that rely on real seismic signal databases, enriched by means of a seismic signal
generator that properly captures their temporal and spectral non-stationarities [18], we as-
sume that no epistemic uncertainty affects the excitation, which thus only carries the
aleatory uncertainty of the problem. Consequently, in our setting, epistemic uncertainties
only concern the mechanical parameters of the structures of interest. The physics-based ap-
proaches developed as part of Performance-Based Earthquake Engineering (PBEE) address
this problem [19]. However, they are not suitable when detailed finite element simulations
are required to take into account all the specificities of the structures of interest, as can
be the case nowadays for seismic safety studies in the nuclear industry [5, 20, 21]. In this
paper, our approach therefore relies on surrogate models of the computer codes, also
referred to as metamodels, based on Gaussian process regression. This
framework corresponds to a data-driven approximation of the input/output relationship
of a numerical computer code, based on a set of experiments (e.g. computer model calls)
at different values of the input parameters, with a Gaussian process assumption on the
computer code output values [22]. Gaussian process regression, or kriging in the
field of geostatistics, has gained in popularity because of its predictive capabilities and its
ability to quantify the surrogate model uncertainty [23]. Gaussian process surrogates have
already been used for various applications in engineering, such as seismic risk assessment
[24, 25], thermohydraulics for safety studies of nuclear power plants [26] or hydrogeology
for radionuclide transport in groundwater [27]. In this work, we propose a methodology
to build and calibrate a Gaussian process surrogate model in order to estimate a family of
seismic fragility curves for mechanical structures, defined here as seismic fragility quantile
curves, by propagating both the surrogate model uncertainty and the epistemic uncertainties.
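As a minimal illustration of this framework, the following sketch fits a Gaussian process surrogate of a structural response as a function of a scalar intensity measure (IM), then propagates the surrogate's predictive uncertainty into a fragility estimate. It uses scikit-learn's GaussianProcessRegressor; the response model, training design and failure threshold are purely hypothetical and are not taken from the present paper.

```python
# Sketch: Gaussian process surrogate of a structural response, with the
# surrogate's predictive uncertainty propagated into a fragility curve.
# Hypothetical toy "computer code"; scikit-learn is assumed available.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

def code(im):
    # Hypothetical computer code: log structural response vs. intensity measure
    return np.log(0.1 * im) + 0.05 * rng.standard_normal(im.shape)

# Small design of experiments (each point = one "computer code call")
im_train = rng.uniform(0.5, 10.0, size=30)
y_train = code(im_train)

# Gaussian process surrogate of the code's input/output relationship
gp = GaussianProcessRegressor(
    kernel=RBF(length_scale=2.0) + WhiteKernel(noise_level=1e-2),
    normalize_y=True,
)
gp.fit(im_train[:, None], y_train)

# Fragility curve: P(response > threshold | IM = a), where the Gaussian
# predictive distribution of the surrogate supplies both mean and spread.
threshold = np.log(0.5)                 # hypothetical failure criterion
a_grid = np.linspace(0.5, 10.0, 50)
mean, std = gp.predict(a_grid[:, None], return_std=True)
fragility = norm.sf(threshold, loc=mean, scale=std)  # survival function
```

Here the surrogate's Gaussian predictive distribution directly yields the failure probability at each IM level; in the setting of this work, the epistemic uncertainties on the mechanical parameters would additionally be propagated (e.g. by sampling over those parameters), producing the family of quantile curves rather than a single curve.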
In such a context, the use of Sensitivity Analysis (SA) techniques is essential for engi-
neers. Indeed, according to [28], the goal of SA is to investigate how the uncertainty of the
model output can be apportioned to the different sources of uncertainty in the model inputs.
SA techniques are also performed according to a range of conceptual objectives, coined as
SA settings, defined in [28, 29]. These objectives are prioritizing the most influential inputs, thus a