
Theorem 1.1 (Informal, see Theorems 4.1 and 4.16). Let $\varepsilon > 0$ and assume $n$ is sufficiently large. For any $r \in \Omega(1/\varepsilon)$, there exists an algorithm that returns a $(1 - 1/e - O(\varepsilon))$-approximation for the monotone submodular maximization problem under an $r$-cardinality constraint, with probability $1 - o(1)$ and query complexity $\mathrm{Poly}(n, 1/\varepsilon)$ to $\tilde{f}$.
For a cardinality constraint, this paper and prior works [11, 20] all achieve near-optimal approximations. However, our result applies to a larger range of cardinalities, $\Omega(1/\varepsilon)$, than the range $\Omega(\log\log n \cdot \varepsilon^{-2})$ of [11]. Moreover, we only require $\mathrm{Poly}(n, 1/\varepsilon)$ queries to the noisy value oracle, which improves on the query complexity $\Omega(n^{1/\varepsilon})$ of [20]. Our main idea for addressing this problem is to employ a local search procedure, whereas prior methods are all variants of the greedy algorithm. Intuitively, local search is more robust than greedy because marginal values are more sensitive to noise than set values: to achieve a sufficient gain in each iteration, the greedy algorithm must identify the element with the maximum marginal value, while local search only needs to estimate the values of sets. This is why we can improve the query complexity to $\mathrm{Poly}(n, 1/\varepsilon)$ for small cardinality constraints.
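To make this contrast concrete, the following minimal Python sketch (with a hypothetical noisy oracle `noisy_f` and threshold `delta`, neither taken from the paper) juxtaposes one greedy step, which must rank noisy marginal differences, against one local-search step, which only compares whole-set values.

```python
import itertools

def greedy_step(noisy_f, S, ground_set):
    """One greedy iteration under a noisy value oracle: pick the element
    whose noisy marginal f~(S + e) - f~(S) is largest.  True marginals can
    be much smaller than f(S), so even mild noise on the two queried
    values can flip this ranking."""
    base = noisy_f(S)
    best = max((e for e in ground_set if e not in S),
               key=lambda e: noisy_f(S | {e}) - base)
    return S | {best}

def local_search_step(noisy_f, S, ground_set, delta):
    """One local-search iteration under the same oracle: accept a swap
    only if the noisy value of the whole swapped set improves by a
    (1 + delta) factor.  Only set values are compared, never marginals."""
    current = noisy_f(S)
    for a, b in itertools.product(S, ground_set - S):
        candidate = (S - {a}) | {b}
        if noisy_f(candidate) > (1 + delta) * current:
            return candidate, True   # improving swap found
    return S, False                  # no improving swap: local optimum
```

The $(1+\delta)$ acceptance threshold above is the generic local-search improvement criterion; the paper's Algorithm 1 instead guides swaps with an approximate auxiliary function, as described next.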
We present a unified framework (Algorithm 1) for enhancing local search to cope with noise. One of the main differences between our framework and the non-oblivious local search proposed by [7] is that, due to the presence of noise, we use an approximation of the auxiliary function (Definition 3.2) rather than the exact one. We analyze the impact of these inaccuracies on the approximation guarantee and query complexity of the local search (Theorem 3.3). Another difference is the construction of the auxiliary functions used to guide the local search: we build the auxiliary function not on the objective function itself but on smoothing surrogate functions. A surrogate function $h$ needs to satisfy two properties: (i) $h$ should depend on an averaging set of size $\mathrm{poly}(n)$, so that $h(S)$ and its noisy analogue $\tilde{h}(S)$ are close for all sets $S$ considered by the local search; and (ii) $h$ needs to be close to the original function $f$, so that optimizing $h$ yields a near-optimal solution for $f$. However, a larger averaging set is more likely to induce a larger gap between the surrogate and the original function, which makes satisfying both properties simultaneously non-trivial. In this paper we carefully design the smoothing surrogate functions as follows. For sets of size $r \in \Omega(1/\varepsilon) \cap O(n^{1/3})$, we define the smoothing surrogate function $h$ as the expected value of $f$ when a uniformly random element of $N$ is added to the set (Definition 4.2). This surrogate is robust for relatively small cardinalities, as it averages over a set of size nearly $n$, but it becomes too concentrated when the cardinality is close to $n$. We therefore consider another smoothing surrogate function $h_H$ for sizes $r \in \Omega(n^{1/3})$, defined as the average of $f$'s values over the set combined with all subsets of a certain small set $H$ (Definition 4.17). The auxiliary functions constructed from both smoothing surrogates are shown to admit almost accurate approximations (Lemmas 4.4 and 4.19). Consequently, we can apply our unified local search framework (Algorithm 1) in both cases (Algorithms 3 and 5) and guarantee nearly tight approximate solutions (Theorems 4.1 and 4.16). The other contribution of this paper is a constant-factor approximation for maximizing monotone submodular functions with noisy oracles under general matroid constraints.
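As a rough illustration of how such surrogates could be estimated from the noisy oracle, the Python sketch below is written under our own assumptions (the oracle `noisy_f`, the sample budget, and the direct enumeration of the $2^{|H|}$ subsets are choices made here for concreteness) and is not the exact construction of Definitions 4.2 and 4.17.

```python
import random
import itertools

def surrogate_small_r(noisy_f, S, ground_set, num_samples=1000):
    """Monte-Carlo estimate of h(S) ~ E_x[f(S + x)] for a uniformly random
    element x of the ground set N.  Averaging over nearly n elements
    smooths persistent noise when |S| is small (the regime
    r in Omega(1/eps) intersected with O(n^{1/3}))."""
    pool = list(ground_set)
    total = 0.0
    for _ in range(num_samples):
        x = random.choice(pool)
        total += noisy_f(S | {x})
    return total / num_samples

def surrogate_large_r(noisy_f, S, H):
    """Estimate h_H(S) as the average of f over S combined with every
    subset of a small auxiliary set H (the regime r in Omega(n^{1/3})).
    All 2^{|H|} subsets are enumerated here, so |H| must stay small."""
    values = []
    for k in range(len(H) + 1):
        for A in itertools.combinations(H, k):
            values.append(noisy_f(S | set(A)))
    return sum(values) / len(values)
```

In both cases the key point is that each surrogate value aggregates many distinct oracle queries, so no single persistently wrong answer can dominate the estimate.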
Theorem 1.2 (Informal, see Theorems 5.1 and 5.7). Let $\varepsilon > 0$ and assume $n$ is sufficiently large. For any $r \in \Omega(\varepsilon^{-1}\log(\varepsilon^{-1}))$, there exists an algorithm that returns a $\left(\frac{1}{2}\left(1 - \frac{1}{e}\right) - O(\varepsilon)\right)$-approximation for the monotone submodular maximization problem under a matroid constraint with rank $r$, with probability $1 - o(1)$ and query complexity at most $\mathrm{Poly}(n, 1/\varepsilon)$ to $\tilde{f}$.
To the best of our knowledge, this is the first result showing that constant approximation guarantees
are obtainable under general matroid constraints in the presence of noise. To cope with noise, one
common approach for cardinality constraints is to incorporate some extra elements to gain robustness
and include these elements in the final solutions. However, for a matroid, additional elements may
undermine the independence of a set. To address this difficulty, we develop a technique for comparing
the values of independent sets in the presence of noise, which allows us to select either the local search
solutions or the additional elements for robustness and leads to an approximation ratio of $\frac{1}{2}\left(1 - \frac{1}{e}\right)$.
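One way to picture this final selection step is sketched below; `compare_noisy` is a hypothetical stand-in, implemented here by averaging noisy values over common random augmentations, and is not claimed to be the paper's actual comparison technique.

```python
import random

def compare_noisy(noisy_f, A, B, ground_set, num_samples=500):
    """Hypothetical noise-robust comparison of two independent sets:
    rather than comparing the single noisy values f~(A) and f~(B),
    average the noisy values of each set augmented by the same random
    elements, so that one persistently distorted answer cannot decide
    the comparison on its own.  Illustration only."""
    pool = list(ground_set)
    score_a = score_b = 0.0
    for _ in range(num_samples):
        x = random.choice(pool)
        score_a += noisy_f(A | {x})
        score_b += noisy_f(B | {x})
    return score_a >= score_b

def pick_final_solution(noisy_f, candidates, ground_set):
    """Select either the local-search solution or the backup elements:
    keep whichever candidate wins the pairwise comparison."""
    best = candidates[0]
    for C in candidates[1:]:
        if compare_noisy(noisy_f, C, best, ground_set):
            best = C
    return best
```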
1.2 Related work
Research has been conducted on monotone submodular maximization in the presence of noise. We say
a noisy oracle is inconsistent if it returns different answers when repeatedly queried. For inconsistent
oracles, noise often does not present a barrier to optimization, since concentration assumptions can
eliminate the noise after a sufficient number of queries [21, 13]. When identical queries always obtain the