QuCNN : A Quantum Convolutional Neural
Network with Entanglement Based Backpropagation
Samuel Stein1, Ying Mao2, James Ang1, and Ang Li1
1Pacific Northwest National Laboratory
2Fordham University
Samuel.stein@pnnl.gov, Ang.li@pnnl.gov
Abstract—Quantum Machine Learning continues to be a highly active area of interest within Quantum Computing. Many of these approaches adapt classical machine learning techniques to the quantum setting, such as QuantumFlow. We push this trend forward and demonstrate an adaptation of the classical Convolutional Neural Network to quantum systems, namely QuCNN. QuCNN is a parameterised, multi-quantum-state based neural network layer computing similarities between each quantum filter state and each quantum data state. With QuCNN, backpropagation
can be achieved through a single-ancilla qubit quantum routine.
QuCNN is validated by applying a convolutional layer with a data
state and a filter state over a small subset of MNIST images,
comparing the backpropagated gradients, and training a filter
state against an ideal target state.
Index Terms—Quantum Computing, Quantum Machine
Learning, Convolutional Neural Networks
I. INTRODUCTION
Quantum computing is poised to provide computational
speedups that classical computing could never feasibly attain.
With continued quantum system development and applications
in domains such as Quantum Chemistry, Quantum Simulation,
and Quantum Machine Learning [1], [2], [17], [20], the
potential of Quantum Computing continues to grow. With
high-level algorithm development and low-level system design, we continue to improve the current state of the art.
127-qubit superconducting quantum processors have been released by IBM [5], and 22-qubit trapped-ion quantum processors by IonQ [25], with continued improvements in overall qubit quality. Quantum Machine Learning has been a field of continued interest within quantum computing, with hopes of adapting classical machine learning's success to the quantum setting. However, to truly demonstrate quantum advantage for QML, the data being processed should be quantum data, obtained for example through QRAM or quantum simulation [3], [8].
Quantum Machine Learning [6], [12], [22], [23] has borrowed motivation from classical machine learning in recent years, attempting to mimic classically successful techniques through quantum routines (QuClassi, QuantumFlow, and the Quantum Convolutional Network [6], [12], [23]). Classical machine learning requires little motivation, having seen widespread and ubiquitous success [7], [18]. Within deep learning, the highly successful design of the Convolutional Neural Network [19] saw an explosion in performance across multiple domains [11], [16], [24]. Convolutional neural networks perform high-level feature extraction using the inner product between filters and subsections of the data, with the filters traversing the data in a convolutional fashion. In this paper, we propose a natural adaptation of classical convolutional neural network techniques to quantum computing through the use of the SWAP test, and demonstrate entanglement-based backpropagation. We numerically demonstrate the similarity between the QuCNN implementation and classically implemented convolutional layers, and finally demonstrate the ability of parameterised unitary layers to learn pre-trained solution filters. In this work, a QuCNN layer is applied to a small training sample from MNIST, illustrating the ability to perform a convolutional operation in the quantum setting.
II. BACKGROUND
A. Quantum Computing
Quantum computing adopts classical computing techniques such as bit representation and combines them with quantum mechanical phenomena such as entanglement and superposition [15], [21]. In classical computing, data is represented as either 1 or 0, whereas in quantum computing data is represented as $|1\rangle$, $|0\rangle$, or $\alpha|0\rangle + \beta|1\rangle$, a superposition of both. The coefficients $\alpha$ and $\beta$ are amplitudes whose squared magnitudes give the probability of measuring the respective basis state. Expanding on this, multiple qubits are represented by the state $|\psi\rangle = \alpha|00\ldots0\rangle + \beta|00\ldots1\rangle + \cdots + \omega|11\ldots1\rangle$, where $2^n$ coefficients describe the quantum state. Finally, quantum computing exposes the computational potential of quantum entanglement. Quantum entanglement is most easily understood via the CNOT gate, which transforms the quantum state $\frac{1}{\sqrt{2}}(|00\rangle + |01\rangle)$ into $\frac{1}{\sqrt{2}}(|00\rangle + |11\rangle)$. In this transformation, the first bit is flipped only if the second bit is in the $|1\rangle$ state. This exposes computational potential to quantum computers and has no classical analogue.
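As a concrete illustration (not taken from the paper), the following minimal numpy statevector sketch represents the two-qubit state above and applies a CNOT to it; the qubit-ordering convention and the use of numpy are assumptions made purely for this example.

```python
import numpy as np

# Two-qubit basis ordering assumed here: |q1 q0> -> index q1*2 + q0,
# i.e. amplitudes are ordered [ |00>, |01>, |10>, |11> ].
ket00 = np.array([1, 0, 0, 0], dtype=complex)
ket01 = np.array([0, 1, 0, 0], dtype=complex)

# Equal superposition (|00> + |01>)/sqrt(2): the second written bit (q0) is in superposition.
state = (ket00 + ket01) / np.sqrt(2)

# CNOT with q0 as control and q1 as target: flips the first written bit
# only when the second written bit is |1>, mapping |01> -> |11>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0]], dtype=complex)

entangled = CNOT @ state
print(np.round(entangled, 3))  # (|00> + |11>)/sqrt(2), a maximally entangled Bell state
```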
B. Convolutional Neural Networks
Convolutional Neural Networks perform high level feature
extraction from data that exhibits spatial relations [13]. Images
are a prime example of spatially related data, where pixels provide context, and hence information, about the pixels around them. The convolutional layer is characterised by a set of filter banks, each of some tensor shape. Each filter independently moves and performs the inner product
over the data in a prescribed way. This returns a single value representing the similarity between the filter and the data at that location. These layers are optimised similarly to all other neural network layers, through backpropagation over some objective function.
III. RELATED WORK
Adapting classical machine learning techniques to quantum
systems is an active area of research, with architectures such
as Quantum Convolutional Neural Networks, QuantumFlow,
QuGAN and QuClassi.
Quantum Convolutional Neural Networks [6] (QCNN) take the classical notion of spatial data encoding and adapt it to quantum machine learning techniques. QCNN makes use of two-qubit unitaries and mid-circuit measurement to perform information down-pooling, from which decisions can be inferred. This structure is comparable to traversing a MERA network in the opposite direction.
QuClassi [23] proposes a state-based detection scheme, borrowing from the classical machine learning approach of training "weights" to represent classifier states. Each of these states represents a probability of belonging to the state's respective class, which generates output layers synonymous with the outputs of classical classification networks.
QuantumFlow [12] attempts to mimic the transformations undergone in classical neural networks, aiming to accomplish a transformation similar to the classical $y = f(x^{\top}w + b)$. This is accomplished via phase flips and accumulation via a Hadamard gate accompanied by an entanglement operation. QuantumFlow demonstrates the advantage of batch normalisation, showing notable performance improvements when normalising quantum data to reside around the XY plane, rather than clustering around either the $|1\rangle$ or $|0\rangle$ pole. Furthermore, QuantumFlow demonstrates the reduced parameter potential of quantum machine learning, illustrating a quantum advantage.
Chen 2022 [?] makes use of a quantum convolutional network to perform high-energy physics data analysis. The paper presents a framework for encoding localised classical data, followed by a fully entangled parameterised layer to perform spatial data analysis. Their numerical analysis demonstrates the promise of quantum convolutional networks.
Notably, all of these works have taken a classical machine
learning technique, and adapted it in some form to the quan-
tum setting. We aim to accomplish the same with QuCNN,
adapting the convolutional filter operation.
IV. QUCNN
In this section, we walk through the adaptation of a classical convolutional filter operation to QuCNN. We further demonstrate a quantum-implemented backpropagation algorithm allowing an almost entirely quantum routine to compute the gradient $\frac{dL}{d\theta_i}$, where $\theta_i$ is the layer weight.
A. QuCNN Layer Architecture
Classical convolutional neural networks learn a set of feature maps for local pattern recognition over a training data set. This operation is characterised by the convolution (Conv) operation. One convolution operation comprises a filter $F$ and an input $X$, and performs $\mathrm{Conv}(X, F)$. This is described by the convolution operation outlined in Equation 1:
$$y_{ij} = \sum_{k=1}^{HH} \sum_{l=1}^{WW} w_{kl}\, x_{si+k-1,\; sj+l-1} \qquad (1)$$
where $s$ denotes the stride, and $HH$ and $WW$ the filter height and width.
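For reference, a direct numpy transcription of Equation 1 might look as follows; the function name and the 0-indexed convention are assumptions of this sketch rather than code from the paper.

```python
import numpy as np

def conv2d_single_filter(x, w, s=1):
    """Direct transcription of Eq. 1 (0-indexed): y_ij = sum_k sum_l w_kl * x[s*i+k, s*j+l],
    for a single 2D filter w slid over a 2D input x with stride s."""
    HH, WW = w.shape
    H, W = x.shape
    out_h = (H - HH) // s + 1
    out_w = (W - WW) // s + 1
    y = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            patch = x[s * i: s * i + HH, s * j: s * j + WW]
            y[i, j] = np.sum(w * patch)  # inner product of the flattened filter and patch
    return y

x = np.arange(16.0).reshape(4, 4)
w = np.ones((2, 2)) / 4.0
print(conv2d_single_filter(x, w, s=2))
```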
Importantly, this sum is equivalent to the dot product between the two flattened vectors $w$ and $x$. A comparable computation is realised in quantum computing through the SWAP test algorithm, which computes the quantity outlined in Equation 2 and can estimate it to additive error $\epsilon$ with $O(\frac{1}{\epsilon^2})$ repetitions. Given sufficient samples, the SWAP test is an unbiased estimator of the inner product squared.
$$\mathrm{SWAP}(Q_0, |\psi\rangle, |\phi\rangle) = P\big(M(Q_0) = 0\big) = \frac{1}{2} + \frac{1}{2}\,|\langle\psi|\phi\rangle|^2 \qquad (2)$$
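A minimal numerical check of Equation 2 (not the paper's routine) is sketched below; rather than simulating the full controlled-SWAP circuit, it computes the ancilla outcome probability directly and illustrates shot-based estimation. The random-state helper and shot count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_state(n_qubits):
    """Random normalised statevector, for illustration only."""
    v = rng.normal(size=2 ** n_qubits) + 1j * rng.normal(size=2 ** n_qubits)
    return v / np.linalg.norm(v)

psi, phi = random_state(2), random_state(2)

# Probability of measuring the ancilla Q0 in |0> after a SWAP test (Eq. 2).
p0 = 0.5 + 0.5 * abs(np.vdot(psi, phi)) ** 2

# Shot-based estimate: sample M ancilla measurements and invert Eq. 2.
M = 10_000
shots = rng.random(M) < p0
overlap_sq_estimate = 2 * shots.mean() - 1

print("exact |<psi|phi>|^2:", abs(np.vdot(psi, phi)) ** 2)
print("estimated from", M, "shots:", overlap_sq_estimate)
```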
Given this operation, we can perform a similar convolutional operation by performing the computation outlined in Equation 3, with $i$ indexing the filters:
$$y_{ij} = \mathrm{SWAP}\big(|\Psi_i\rangle,\ |X\rangle_{si:si+k,\; sj:sj+l}\big) \qquad (3)$$
where the statevector describing $|X\rangle_{si:si+k,\; sj:sj+l}$ has the same dimensionality, and hence number of qubits, as $|\Psi_i\rangle$. The forward operation produces an output similar to that of a classical convolutional operation, whereby we compute the squared magnitude of the inner product of two state vectors instead of the inner product of two vectors.
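Assuming amplitude encoding of each data patch (the encoding is described later in this section), a statevector-level sketch of the forward operation in Equation 3 could look as follows; the 2x2 patch size, stride handling, and function names are assumptions of this illustration, not the paper's implementation.

```python
import numpy as np

def amplitude_encode(patch):
    """Flatten a patch and L2-normalise it into a statevector (non-zero patches assumed)."""
    v = patch.astype(float).flatten()
    return v / np.linalg.norm(v)

def qucnn_forward(image, filter_state, k=2, s=1):
    """Eq. 3 as a statevector simulation: y_ij = |<Psi | X_patch(i,j)>|^2."""
    H, W = image.shape
    out = np.zeros(((H - k) // s + 1, (W - k) // s + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = image[s * i: s * i + k, s * j: s * j + k]
            x_state = amplitude_encode(patch)
            out[i, j] = abs(np.vdot(filter_state, x_state)) ** 2
    return out

image = np.random.default_rng(1).random((6, 6))              # stand-in for an image crop
filt = amplitude_encode(np.array([[1.0, 0.0], [0.0, 1.0]]))  # a 2-qubit filter state
print(qucnn_forward(image, filt))
```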
In classical convolutional networks, the convolutional filter is a tensor of independently tunable weights, all of which are optimised according to some loss function. With a quantum state prepared via any ansatz, there is no way to control one amplitude's magnitude without changing another amplitude's magnitude; this is due to the normalisation requirement on quantum states. Therefore, we optimise each quantum filter state similarly to the optimisation procedure of a variational quantum algorithm such as the variational quantum eigensolver [4], [9], [10], [14]. This is visualised in Figure 1, where $n$ layers denotes the number of parameterised layers describing the quantum state. Each filter maintains its own independent set of $\theta$s, with which it attempts to learn its own feature set.
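The exact ansatz is depicted in Figure 1; as a stand-in, the sketch below prepares a filter state with alternating single-qubit RY-rotation layers and CNOT entangling layers, a common hardware-efficient pattern. The layer structure, qubit count, and helper names are assumptions made for illustration only.

```python
import numpy as np

def ry(theta):
    """Single-qubit Y-rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def apply_single(state, gate, qubit, n):
    """Apply a 1-qubit gate to `qubit` of an n-qubit statevector (qubit 0 = most significant bit)."""
    full = np.array([[1.0]], dtype=complex)
    for q in range(n):
        full = np.kron(full, gate if q == qubit else np.eye(2))
    return full @ state

def cnot(n, control, target):
    """Full CNOT matrix on n qubits, same bit ordering as apply_single."""
    dim = 2 ** n
    U = np.zeros((dim, dim), dtype=complex)
    for b in range(dim):
        bits = [(b >> (n - 1 - q)) & 1 for q in range(n)]
        if bits[control]:
            bits[target] ^= 1
        U[int("".join(map(str, bits)), 2), b] = 1
    return U

def filter_state(thetas, n_qubits=2):
    """Prepare |Psi(theta)>: alternating RY-rotation and CNOT-entanglement layers from |0...0>."""
    state = np.zeros(2 ** n_qubits, dtype=complex)
    state[0] = 1.0
    for layer in np.asarray(thetas).reshape(-1, n_qubits):  # one row of angles per layer
        for q, theta in enumerate(layer):
            state = apply_single(state, ry(theta), q, n_qubits)
        for q in range(n_qubits - 1):
            state = cnot(n_qubits, q, q + 1) @ state
    return state

print(np.round(filter_state([0.3, 1.2, 0.7, -0.4]), 3))  # 2 layers on 2 qubits
```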
The QuCNN architecture is applicable to purely quantum data, which might be accessed via QRAM or other sources. However, QuCNN can also operate on classical data, once the classical data is translated into a quantum state prior to model induction or training. Although computationally expensive, this is a pre-processing step. Within this paper, we utilize a classical-to-quantum encoding technique. Utilizing $\log_2(n)$ encoding [12], an input data point is broken up into spatially related data clusters, with a pattern defined by parameters such as stride and filter size, and translated into a group of equivalent amplitude-encoded state vectors $[\,|X\rangle_1, |X\rangle_2, |X\rangle_3, \ldots, |X\rangle_n\,]$ (a sketch of this pre-processing step is given below). Each filter
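For concreteness, the pre-processing step described above might be sketched as follows; the patching pattern, the fallback for all-zero patches, and the function name are illustrative assumptions rather than the paper's implementation.

```python
import numpy as np

def encode_patches(image, k=2, s=2):
    """Break an image into spatially related k x k clusters and amplitude-encode each:
    a k*k-pixel patch becomes a log2(k*k)-qubit statevector."""
    states = []
    H, W = image.shape
    for i in range(0, H - k + 1, s):
        for j in range(0, W - k + 1, s):
            v = image[i:i + k, j:j + k].astype(float).flatten()
            norm = np.linalg.norm(v)
            if norm == 0:                 # all-black patch: fall back to |0...0>
                state = np.zeros(k * k)
                state[0] = 1.0
            else:
                state = v / norm
            states.append(state)
    return states                          # [|X>_1, |X>_2, ..., |X>_n]

img = np.random.default_rng(2).random((4, 4))   # stand-in for a small image crop
states = encode_patches(img)
print(len(states), "patches, each a", int(np.log2(states[0].size)), "qubit state")
```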