A UNITARY TRANSFORM BASED GENERALIZED APPROXIMATE MESSAGE PASSING
Jiang Zhu‡, Xiangming Meng⋆, Xupeng Lei‡, Qinghua Guo†
‡Ocean College, Zhejiang University, Zhoushan, 316021, China
⋆Institute for Physics of Intelligence, The University of Tokyo, 7-3-1, Hongo, Tokyo 113-0033, Japan
†SECTE, University of Wollongong, NSW, 2522, Australia
Corresponding author. E-mail: meng@g.ecc.u-tokyo.ac.jp. This work is supported by NSFC under grant 61901415.
ABSTRACT
We consider the problem of recovering an unknown signal $\mathbf{x} \in \mathbb{R}^n$ from general nonlinear measurements obtained through a generalized linear model (GLM), i.e., $\mathbf{y} = f(\mathbf{A}\mathbf{x} + \mathbf{w})$, where $f(\cdot)$ is a componentwise nonlinear function. Based on the unitary transform approximate message passing (UAMP) and expectation propagation, a unitary transform based generalized approximate message passing (GUAMP) algorithm is proposed for general measurement matrices $\mathbf{A}$, in particular highly correlated matrices. Experimental results on quantized compressed sensing demonstrate that the proposed GUAMP significantly outperforms state-of-the-art GAMP and GVAMP under correlated matrices $\mathbf{A}$.

Index Terms: GLM, AMP, GAMP, message passing, quantized compressed sensing
1. INTRODUCTION
We consider the general problem of inference on generalized linear models (GLM) [1], i.e., recovering an unknown signal $\mathbf{x} \in \mathbb{R}^n$ from a noisy linear transform followed by componentwise nonlinear measurements
$$\mathbf{y} = f(\mathbf{A}\mathbf{x} + \mathbf{w}), \qquad (1)$$
where $\mathbf{A} \in \mathbb{R}^{m \times n}$ is a known linear mixing matrix, $\mathbf{w} \sim \mathcal{N}(\mathbf{w}; \mathbf{0}, \sigma^2 \mathbf{I}_m)$ is i.i.d. Gaussian noise with known variance, and $f(\cdot): \mathbb{R}^{m \times 1} \to \mathcal{Q}^{m \times 1}$ is a componentwise nonlinear link function. Denoting $\mathbf{z} \triangleq \mathbf{A}\mathbf{x} \in \mathbb{R}^m$ as the hidden linear transform output, the componentwise nonlinear function $f(\cdot)$ can be equivalently described in a probabilistic way using the fully factorized likelihood function
$$p(\mathbf{y} \,|\, \mathbf{z}) = \prod_{i=1}^{m} p(y_i \,|\, z_i), \qquad (2)$$
where $p(y_i \,|\, z_i)$ is determined by the specific nonlinear function $f(\cdot)$. The prior distribution $p(\mathbf{x})$ of $\mathbf{x}$ is also assumed to be fully factorized, $p(\mathbf{x}) = \prod_{j=1}^{n} p(x_j)$, for simplicity. GLM inference has wide applications in science and engineering such as wireless communications, signal processing, and machine learning. In the special case of an identity link function $f(\cdot)$, the nonlinear GLM in (1) reduces to the popular standard linear model (SLM)
$$\mathbf{y} = \mathbf{A}\mathbf{x} + \mathbf{w}. \qquad (3)$$
Thus, GLM is an extension of SLM from linear measurements to nonlinear measurements, which are prevalent in real-world applications such as quantized compressed sensing, pattern classification, and phase retrieval.
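For concreteness, the short sketch below simulates one instance of the GLM (1) with a 1-bit quantizer as the componentwise link $f(\cdot)$, in the spirit of the quantized compressed sensing application mentioned above; the dimensions, the Bernoulli-Gaussian prior, and the quantizer are illustrative assumptions rather than the exact experimental setup of this paper.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
m, n, sigma = 256, 512, 0.1            # illustrative sizes and noise level (assumptions)

# Fully factorized Bernoulli-Gaussian prior p(x) = prod_j p(x_j) (illustrative choice)
sparsity = 0.1
x = rng.normal(size=n) * (rng.random(n) < sparsity)

A = rng.normal(size=(m, n)) / np.sqrt(m)   # known linear mixing matrix
w = sigma * rng.normal(size=m)             # i.i.d. Gaussian noise with known variance
z = A @ x                                  # hidden linear transform output z = Ax

# Componentwise nonlinear link f(.): here a 1-bit quantizer, as in quantized CS
y = np.sign(z + w)

# One factor of the likelihood in (2) for this link: p(y_i = +1 | z_i) = Phi(z_i / sigma)
def likelihood(y_i, z_i, noise_std=sigma):
    return norm.cdf(y_i * z_i / noise_std)
```

With $f(\cdot)$ replaced by the identity, the same pipeline produces the SLM measurements of (3), $\mathbf{y} = \mathbf{A}\mathbf{x} + \mathbf{w}$.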
A variety of algorithms have been proposed for inference over GLMs and SLMs. Among them, the past decade has witnessed the advent of a distinguished family of probabilistic algorithms known as message passing algorithms. The most famous ones are the approximate message passing (AMP) algorithm [2, 3] for the SLM and generalized approximate message passing (GAMP) [4] for the GLM, which have been proved to be optimal under i.i.d. Gaussian matrices $\mathbf{A}$. However, both AMP and GAMP diverge for general $\mathbf{A}$. In [5–7], GAMP is incorporated into sparse Bayesian learning (SBL) to improve the robustness of GAMP and reduce the computational complexity of SBL. Vector AMP (VAMP) [8] (or OAMP [9], a similar algorithm to VAMP) and generalized VAMP (GVAMP) [10] have been proposed to improve the performance with general $\mathbf{A}$ for the SLM and GLM, respectively. VAMP is derived from expectation propagation (EP) [11], another powerful message passing algorithm. Remarkably, it has been demonstrated in [12, 13] that AMP, GAMP, and GVAMP can all be derived concisely as special instances of EP under different assumptions and can thus be unified within a single EP framework. Despite significant improvements in robustness, in particular for right-orthogonally invariant matrices, VAMP and GVAMP still suffer from poor convergence in some more challenging scenarios, e.g., when the measurement matrix is highly correlated [14, 15].
In this work, we focus on the AMP variant known as unitary approximate message passing (UAMP) [16], which was formerly called UTAMP [16–18]. The motivation behind UAMP is as follows: since AMP has been proven to work well when the elements of $\mathbf{A}$ in (3) are uncorrelated, one can artificially construct such a “good” and equivalent measurement model simply by performing a singular value decomposition (SVD) on $\mathbf{A} = \mathbf{U}\boldsymbol{\Sigma}\mathbf{V}^T$, where $\mathbf{U} \in \mathbb{R}^{m \times r}$, $\boldsymbol{\Sigma} \in \mathbb{R}^{r \times r}$, and $\mathbf{V} \in \mathbb{R}^{n \times r}$.
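To make this construction concrete, the sketch below forms the SVD-based equivalent model that UAMP operates on, under the standard assumption that left-multiplying (3) by $\mathbf{U}^T$ yields $\mathbf{r} = \mathbf{U}^T\mathbf{y} = \boldsymbol{\Sigma}\mathbf{V}^T\mathbf{x} + \mathbf{U}^T\mathbf{w}$, with the transformed noise still i.i.d. Gaussian of the same variance because $\mathbf{U}$ has orthonormal columns. The function and the usage example are illustrative, not the authors' implementation.

```python
import numpy as np

def unitary_transform_model(A, y):
    """Form the SVD-based equivalent model used as UAMP preprocessing (sketch).

    Given y = A x + w, compute the economy-size SVD A = U S V^T and return
    Phi = S V^T together with r = U^T y, so that r = Phi x + U^T w and the
    transformed noise U^T w keeps the same i.i.d. Gaussian statistics.
    """
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    Phi = s[:, None] * Vt        # Sigma V^T, shape (r, n)
    r = U.T @ y                  # transformed observations U^T y, shape (r,)
    return Phi, r

# Illustrative usage (hypothetical data): AMP-style iterations are then run on
# r = Phi x + noise instead of the original y = A x + w.
rng = np.random.default_rng(1)
m, n = 128, 256
A = rng.normal(size=(m, n))
x = rng.normal(size=n) * (rng.random(n) < 0.1)
y = A @ x + 0.05 * rng.normal(size=m)
Phi, r = unitary_transform_model(A, y)
```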