keeping the number of non-zero entries fixed by scaling does not lead to a converging process. For convergence it is
necessary to keep the support radius fixed, meaning that eventually the matrix A fills up again and the advantage
of sparsity and well-conditioning is lost.
A remedy to this problem has been suggested in [54, 20, 9], and convergence was first proven in [22, 40, 41, 63]. Recent publications include, for example, [31, 55, 64]. The idea is based on the assumption of a sequence of increasingly dense, usually nested, data sets X1, X2, ..., and on a simple residual correction scheme. The process starts by approximating the target function on the coarsest grid X1. Next, the error of the first step is computed on X2 and then approximated using a smaller support radius of Φ, and so on. This approach not only has the advantages of sparse, well-conditioned interpolation matrices in each step and of fast, stable evaluations; it also allows us to capture different scales in the target function, if present. We refer to this approach as multiscale.
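Schematically, writing s_{X_k, δ_k}(g) for the approximation of a function g built from the data set X_k with support radius δ_k (a shorthand of ours, not notation from the cited works), the residual correction can be summarized as

```latex
f_1 = s_{X_1,\delta_1}(f), \qquad
f_k = f_{k-1} + s_{X_k,\delta_k}\bigl(f - f_{k-1}\bigr), \qquad k = 2, 3, \dots,
```

with support radii δ_1 > δ_2 > ⋯ chosen proportionally to the fill distances of the X_k, so that each level only needs to approximate the error left by the coarser levels.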
Another computationally alluring alternative to interpolation is quasi-interpolation, where the coefficients of s in (1) are simply set to the sampled values, αj = vj. The quasi-interpolation concept often additionally requires polynomial reproduction of a certain degree. It is used successfully in various areas and is well studied, particularly in spline spaces, see, e.g., [11, Chapter 12] and [7]. Quasi-interpolation on scattered data is also possible, see, for example, [6, 18, 58]. One popular multivariate technique is given by moving least-squares, see, for example, [42, 61]. However, particularly for higher approximation orders, the numerical stability depends heavily on the geometry of the data sites and is often not achievable in practice.
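As a rough illustration of the idea, and not of the specific construction analyzed later, the following sketch evaluates a one-dimensional quasi-interpolant whose coefficients are the samples themselves, using a Wendland C^2 compactly supported kernel; the function names, the one-dimensional setting, and the Shepard-type normalization used to reproduce constants are our own choices.

```python
import numpy as np

def wendland_c2(r):
    """Wendland C^2 compactly supported radial function, zero for r >= 1."""
    return np.maximum(1.0 - r, 0.0) ** 4 * (4.0 * r + 1.0)

def quasi_interpolant(x_eval, centers, values, delta):
    """Quasi-interpolant whose coefficients are the sampled values themselves.

    A Shepard-type normalization is used so that constants are reproduced;
    the construction analyzed in this paper may differ in this respect.
    """
    r = np.abs(x_eval[:, None] - centers[None, :]) / delta  # scaled 1D distances
    phi = wendland_c2(r)                                    # compactly supported weights
    return (phi @ values) / np.maximum(phi.sum(axis=1), 1e-15)
```

No linear system has to be solved, and the cost of evaluating the sum at a point is proportional to the number of centers within distance delta.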
Here, we suggest using quasi-interpolation in a residual correction scheme, that is, in a multiscale fashion. Our quasi-interpolation is based on a compactly supported radial basis function (RBF) [60]. The resulting scheme therefore provides an appealing computational technique for processing massive datasets in high dimensions. While similar approaches have mainly been studied numerically and in slightly different settings [17, 44, 8], we provide both a numerical treatment and an analytic investigation. Moreover, one of the main contributions of this paper is to carry these concepts over to manifold-valued functions.
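To make the combination concrete, the following sketch (hypothetical names, building on the quasi_interpolant above) accumulates one correction per level: each level quasi-interpolates, on a denser point set and with a smaller support radius, the residual left by the coarser levels.

```python
def multiscale_quasi_interpolation(levels, f, x_eval):
    """Multiscale quasi-interpolation by residual correction (a sketch).

    levels : list of (centers_k, delta_k) pairs, ordered coarse to fine,
             with decreasing support radii delta_k
    f      : callable standing in for the sampled target values
    """
    corrections = []  # (centers, residual values, delta) collected per level

    def evaluate(x):
        s = np.zeros_like(x, dtype=float)
        for c, v, d in corrections:
            s += quasi_interpolant(x, c, v, d)
        return s

    for centers, delta in levels:
        # residual of the coarser levels, sampled at this level's centers
        residual = f(centers) - evaluate(centers)
        corrections.append((centers, residual, delta))
    return evaluate(x_eval)

# Example: nested dyadic point sets on [0, 1] with proportionally shrinking radii.
levels = [(np.linspace(0.0, 1.0, 2**k + 1), 4.0 / 2**k) for k in range(3, 8)]
s = multiscale_quasi_interpolation(levels, np.sin, np.linspace(0.0, 1.0, 200))
```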
Manifold-valued functions are of the form F : Ω → M, where M is a Riemannian manifold. The simplicity and efficiency of quasi-interpolation, when equipped with a compactly supported Φ, make this method particularly appealing for approximating manifold-valued functions, see, e.g., [24]. The manifold expresses a global nonlinear structure together with local, constrained, high-dimensional elements. However, classical computational methods for approximation cannot cope with manifold-valued functions due to the non-linearity of the manifold, and even fundamental tasks like integration, interpolation, and regression become challenging, see, e.g., [4, 36, 66].
The problem of approximating functions with manifold values has arisen in various research areas, ranging from signal processing [5] and modern statistics [50] to applications such as brain networks and autism classification [19], and structural dynamics with its application to aerodynamic problems [2, 43], to name a few. For example, reduced-order models for simulations [3, 27] drastically decrease the computation time of simulating processes by interpolating nonlinear structures at scattered locations while retaining high levels of accuracy [52].
In this paper, we show how to use our multiscale quasi-interpolation approach for manifold-valued approximation based on the Riemannian center of mass. First, we provide a rigorous theoretical discussion, followed by a comprehensive numerical study that includes both the fine details of the implementation and an illustration of the multiscale approximation over scattered manifold data and its application. The alluring computational nature of multiscale quasi-interpolation becomes even more essential when approximating manifold-valued functions, as the dimensionality and complexity of the manifold must be taken into account. In the numerical part, we provide comparison tests between the direct quasi-interpolation approximation and the multiscale approach and show the latter's attractiveness for an application in manifold-data processing.
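For intuition only: on the unit sphere, a weighted Riemannian center of mass can be approximated by a simple fixed-point iteration based on the exponential and logarithm maps, as in the sketch below; the sphere-specific maps, function names, and crude initialization are illustrative choices of ours and not the implementation studied later. In a manifold-valued quasi-interpolant, the weights would be the (nonnegative, normalized) kernel values at the evaluation point and the points would be the samples F(x_j).

```python
def sphere_log(p, q):
    """Logarithm map on the unit sphere: tangent vector at p pointing towards q."""
    w = q - np.dot(p, q) * p                       # project q onto the tangent space at p
    nw = np.linalg.norm(w)
    theta = np.arccos(np.clip(np.dot(p, q), -1.0, 1.0))
    return np.zeros_like(p) if nw < 1e-12 else (theta / nw) * w

def sphere_exp(p, v):
    """Exponential map on the unit sphere."""
    nv = np.linalg.norm(v)
    return p if nv < 1e-12 else np.cos(nv) * p + np.sin(nv) * (v / nv)

def weighted_center_of_mass(points, weights, iters=20):
    """Weighted Riemannian center of mass on the sphere via fixed-point iteration."""
    m = points[np.argmax(weights)]                 # crude but usually adequate initial guess
    for _ in range(iters):
        # gradient step for the weighted sum of squared geodesic distances
        v = sum(w * sphere_log(m, p) for w, p in zip(weights, points)) / sum(weights)
        m = sphere_exp(m, v)
    return m
```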
The paper is organized as follows. Section 2 is dedicated to introducing essential notation, definitions, and
background. In Section 3, we prove new error bounds for quasi-interpolation of scalar-valued functions. Then,
Section 4 discusses the combination of quasi-interpolation and multiscale techniques in the scalar-valued setting.
Here, we prove a preliminary result which shows that multiscale quasi-interpolation converges at roughly the same rate as quasi-interpolation itself, provided the latter converges. In Section 5, we show how the multiscale quasi-interpolation method is constructed for manifold data and provide the algorithm together with a theoretical discussion. We conclude the manuscript with the numerical results given in Section 6. We first compare the performance of multiscale quasi-interpolation to standard quasi-interpolation. Here, the numerical examples exhibit two remarkable features of multiscale quasi-interpolation. On the one hand, they show that non-convergent quasi-interpolation can become convergent when combined with the multiscale technique. On the other hand, they show that the speed of convergence of quasi-interpolation can be improved by combining it with
the multiscale approach. There is some theoretical evidence for the first case in [21], though the situation described
there does not apply to our scattered data problems and hence needs further theoretical backing. The same is true