A roadmap for edge computing enabled automated multidimensional
transmission electron microscopy
Debangshu Mukherjee,1, a) Kevin M. Roccapriore,2 Anees Al-Najjar,1 Ayana Ghosh,1, 2 Jacob D. Hinkle,1 Andrew R.
Lupini,2 Rama K. Vasudevan,2 Sergei V. Kalinin,3, 4 Olga S. Ovchinnikova,1, 5 Maxim A. Ziatdinov,1, 2 and Nageswara
S. Rao1
1) Computational Sciences & Engineering Division, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831, USA
2) Center for Nanophase Materials Sciences, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831, USA
3) Department of Materials Science & Engineering, Tickle College of Engineering, University of Tennessee, Knoxville, Tennessee 37996, USA
4) Currently at: Special Projects, Amazon Science
5) Currently at: Division of Systems Engineering, ThermoFisher Scientific, Bothell, Washington 98021, USA
(Dated: 7 October 2022)
The advent of modern, high-speed electron detectors has made the collection of multidimensional hyperspectral
transmission electron microscopy datasets, such as 4D-STEM, routine. However, many microscopists
find such experiments daunting, since the analysis, collection, long-term storage, and networking of such datasets
remain challenging. Some common issues are the large and unwieldy size of these datasets, often running
into several gigabytes, non-standardized data analysis routines, and a lack of clarity about the computing
and network resources needed to fully utilize the electron microscope. The existing computing
and networking bottlenecks introduce significant penalties at each step of these experiments, and thus
real-time, analysis-driven automated experimentation for multidimensional TEM is exceptionally challenging.
One solution is integrating microscopy with edge computing, where moderately powerful computational
hardware performs the preliminary analysis before handing off the heavier computation to HPC systems.
In this perspective, we trace the roots of computation in modern electron microscopy, demonstrate deep
learning experiments running on an edge system, and discuss the networking requirements for tying together
microscopes, edge computers, and HPC systems.
I. INTRODUCTION
Ernst Ruska built the first transmission electron microscope
(TEM) during his doctoral studies, and it celebrates
its eightieth anniversary this year.1–3 Interestingly, this
system was built and operational less than a decade
after experimental results from Davisson and Germer
proved de Broglie's hypothesis about the dual wave-particle
nature of electrons to be correct.4,5 Optical microscopes
inspired the first TEM, and since then several
new imaging modalities have been implemented, such as
electron holography,6,7 Lorentz electron microscopy,8 and
scanning TEM (STEM),9,10 to name a few. Even with the
introduction of new imaging modalities, the electron
microscope developed by Knoll and Ruska remains remarkably
similar to the machines in use today. However, in
a)Electronic mail: mukherjeed@ornl.gov
This manuscript has been authored by UT-Battelle, LLC under Contract
No. DE-AC05-00OR22725 with the U.S. Department of Energy. The
United States Government retains and the publisher, by accepting the
article for publication, acknowledges that the United States Govern-
ment retains a non-exclusive, paid-up, irrevocable, world-wide license
to publish or reproduce the published form of this manuscript, or al-
low others to do so, for United States Government purposes. The De-
partment of Energy will provide public access to these results of feder-
ally sponsored research in accordance with the DOE Public Access Plan
(http://energy.gov/downloads/doe-public-access-plan)
these past eight decades, TEMs have continued to gain
capabilities such as energy dispersive X-ray spectroscopy
(EDX),11 electron energy loss spectroscopy (EELS), annular
dark field (ADF) detectors, and aberration-corrected
optics both in the TEM and STEM modes.
These advancements have allowed the S/TEM to play
a crucial role in analyzing nanometer-scale structural
phenomena in the physical and life sciences. It has helped
tie together structure-property relationships in materials
systems as diverse as interfaces,12 superlattices,13
domain walls,14,15 grain boundaries,16 nanoparticles,17
and catalyst surfaces.18 This has led to discoveries such as
phonon modes at polar vortices and two-dimensional
electron liquids at oxide interfaces.19–21 In the physical
sciences, these applications have spanned fields as wide-ranging
as lithium-ion batteries, catalyst systems, and alloy
design to integrated electronic circuits, to name a few.22–26
Hardware advancements in electron microscopy, such as
specialized holders for cryogenic, heating, biasing, or
liquid-cell work over the past few decades, have also
enabled nanoscale studies of dynamic systems, such as
materials under mechanical strain or thermal gradients,
switching behavior in oxide ferroelectrics, or reacting
phase systems such as catalysts. So extensive have these
advancements been that microscopists have rightly pointed
out that the modern transmission electron microscope,
with its capabilities for electron imaging, electron
diffraction, spectroscopy, and operando studies, is
"A Synchrotron in a Microscope".27,28
arXiv:2210.02538v1 [cond-mat.mtrl-sci] 5 Oct 2022
Along with the massive advancements in the physical
sciences, the transmission electron microscope has been
an absolute game changer for observing biological tissue.
Biological TEM followed the development of the
original electron microscope closely, with Helmut Ruska,
coincidentally Ernst Ruska's brother, using the TEM
to image bacteria and viruses as early as 1939.29
The development of cryogenic sample processing and
microscopy, combined with ultra-fast, high-sensitivity
direct electron detectors, has opened up an entire world
of biological systems, from whole viruses to individual
proteins. As the global coronavirus pandemic continues its
onslaught worldwide, one of the most widely circulated
electron microscopy images has been the cryo-EM image
of the SARS-CoV-2 virus.
II. THE TRANSITION TO COMPUTE-ENABLED ELECTRON
MICROSCOPY
However, as Kirkland observed, advances in microscopy
and computers happened almost independently
of each other for the first few decades of electron
microscopy, with computation most commonly used
for simulating electron microscopy images.30 This situation
persists to a certain extent, and a significant portion
of electron microscopy, as practiced even today, is thus
often anecdotal and susceptible to operator bias. Electron-transparent
samples require significant manual input
for their preparation, and the regions of interest to be
imaged are still chosen by the microscopist. These
regions are often selected based on the visual intuition of a
trained microscope operator, and data are then collected
from the chosen region. While automated sample preparation
and interfacing with electron microscopes are outside
the purview of the current perspective, this state
of affairs is a direct consequence of the fact that microscopy
and its associated analysis have continuously
operated in a storage- and computation-constrained
world. Thus, it fell upon a trained microscopist to choose
which areas to image and which images to analyze in
order to be judicious with these limitations.
Several changes over the past two decades have since
made microscopy and computation much more closely coupled
with each other. First, several individual
lens parameters underpinning values such as aberration
or astigmatism have been abstracted away by the
computer-controlled operation of individual microscope
components. This abstraction became necessary with the
advent of aberration-corrected electron optics, which
iteratively minimize lens aberrations through a combination
of multiple quadrupole, hexapole, and octupole
lenses; manually correcting individual lenses thus becomes
tiresome and error-prone. Second, electron microscopy has
almost entirely moved away from photographic
film as the data acquisition medium towards electronic
CCD and CMOS-based detectors.
As a result, S/TEM alignment, operation, and data collection
have become significantly automated in recent
years. The volume of data generated in a single day of
microscopy can now often reach several terabytes.
Current-generation fast direct electron detectors
can generate tens of gigabytes of data in a minute. Similarly,
in situ experiments with modern detectors, which
often have 4k pixels along a dimension, frequently run for
several hours and generate hundreds of gigabytes of data
per hour. Currently, very few microscopy facilities have
the on-site computational capability to compress, process,
and archive such data streams from the microscopes in
real time, let alone use those data for decision-making to
drive automated experiments. Several recent publications,
notably a perspective by Spurgeon et al.,31
have raised the issue of the volume of data flooding out
of modern electron microscopes and the community's
scattered responses in dealing with it.
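To put such figures in context, the raw data rate of a pixelated detector follows directly from its frame size, bit depth, and frame rate. The short sketch below works through that arithmetic; the detector geometry, bit depth, frame rate, and scan size are illustrative assumptions, not the specifications of any particular camera.

```python
# Back-of-envelope estimate of raw 4D-STEM data rates. All numbers
# (detector size, bit depth, frame rate, scan size) are illustrative
# assumptions, not the specifications of any particular camera.

def data_rate_gb_per_min(det_px=256, bit_depth=16, fps=10_000):
    """Raw data rate, in GB per minute, of a pixelated detector."""
    bytes_per_frame = det_px * det_px * bit_depth // 8
    return bytes_per_frame * fps * 60 / 1e9

def scan_size_gb(scan_px=512, det_px=256, bit_depth=16):
    """Uncompressed size, in GB, of one scan_px x scan_px 4D-STEM scan."""
    return scan_px * scan_px * det_px * det_px * (bit_depth // 8) / 1e9

print(f"{data_rate_gb_per_min():.0f} GB/min")  # -> 79 GB/min
print(f"{scan_size_gb():.0f} GB per scan")     # -> 34 GB per scan
```

Even these conservative assumptions yield tens of gigabytes per minute, consistent with the rates quoted above and well beyond what a single workstation can archive in real time.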
Thus, there is a need to integrate on-site microscope
facilities with computing and storage systems, at
the local or remote edge, to form seamless ecosystems
wherein measurement collection and instrument-steering
operations can be automated and remotely
orchestrated. In the coming sections, we give a brief
overview of the data deluge in electron microscopy, briefly
discuss current computational efforts in the field,
and elucidate the path forward for an edge computing
infrastructure for electron microscopy in the materials
community.
A. Multidimensional Electron Microscopy Enabled by Detector Advances
Electronic detectors have been used for TEM image
acquisition since the early nineties. For a long time, such
detectors were charge-coupled devices (CCDs). However,
these detectors were indirect: the CCDs did not
record the electrons themselves. Instead, the electrons
interacted with scintillators, which converted them to
photons that were transferred to the detector through
fiber optics coupling the detectors with the scintillators.
Such a setup, however, degrades the detective quantum
efficiency (DQE) and blurs the detector point spread
function (PSF) for electron detection. Issues with
scintillators are present in X-ray photon detection too,
and thus, over the last two decades, semiconductor-based
direct detectors have been designed as a replacement,
initially for synchrotrons to detect X-ray photons and
subsequently for electron detection. Direct electron
detectors record individual electron impingement events
without any intermediate conversion to photons through
scintillators, and thus improve both the DQE and the PSF
relative to scintillator-coupled detectors. The impetus
for direct electron detectors came from the
biological cryo-EM community, where samples are
often highly susceptible to electron dose, with the
maximum allowable dose often below 10 e/Å².
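A dose budget of this size leaves very little signal per detector pixel, which is why near-unity DQE matters so much. The sketch below works through the arithmetic; the field of view and pixel count are illustrative assumptions, not values from the text.

```python
# What a 10 e/Å² dose budget means per detector pixel: for a typical
# cryo-EM field of view, the mean signal lands well below one electron
# per pixel. The field of view and pixel count are assumptions chosen
# only for illustration.

def electrons_per_pixel(dose_e_per_A2=10.0, fov_A=1000.0, n_px=4096):
    """Mean electron count per pixel for a square field of view."""
    total_electrons = dose_e_per_A2 * fov_A**2
    return total_electrons / n_px**2

print(f"{electrons_per_pixel():.2f} e/pixel")  # -> 0.60 e/pixel
```

At well under one electron per pixel, any detection inefficiency or readout noise directly erases signal, so counting detectors with DQE approaching unity are effectively a prerequisite for low-dose work.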
Along with the improvement in detection capabilities,
another focus area of research has been faster detectors.
Again, this was partly driven by cryo-EM, as samples de-
grade rapidly when exposed to electrons, and thus speed
is necessary. Modern direct electron detectors employed
for 4D-STEM experiments can currently capture over
10,000 frames per second, with the camera developed at
Berkeley Lab capable of 87,000 frames per second.32
Aer speed and sensitivity, the third focus area of elec-
tron detector research is “dynamic range”. Dynamic
range refers to the ratio between the highest and lowest
electron ux that can be detected simultaneously. Of-
ten, detectors that optimize for detection at low electron
counts all the way till detection of individual electron im-
pingement events have a lower absolute dose limit. While
a detailed discussion about detector geometries is out of
the scope for this perspective, dynamic range issues can
be solved to a large extent by using hybrid-pixel array
detectors.
33
The rst such detector used for electron mi-
croscopy was the Medipix detector, which was spun out
from the detector work at the Diamond Light Source Syn-
chrotron facility in the United Kingdom.
34
The second
such eort, also originally an outcome of synchrotron
detector work, is the Electron Microscope Pixel Array De-
tector (EMPAD), developed at Cornell University.
35,36
The
EMPAD family of detectors was specically optimized
for high dynamic range (HDR), with EMPAD2 reaching a
100,000:1 range for detection. HDR detectors have advan-
tages in both EELS and 4D-STEM experiments; since, in
both cases, the ratio between scattered and unscattered
electrons may be very high.
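As a back-of-envelope illustration of what such a dynamic-range figure implies for the detector electronics, the quoted ratio can be converted into the minimum per-pixel counter depth. The snippet below is a sketch of that single calculation.

```python
import math

# Dynamic range, as defined above, is the ratio of the largest to the
# smallest electron flux a detector can record in a single frame.
# Expressing that ratio in bits gives the minimum integer counter
# depth a pixel would need to cover it.

def required_counter_bits(dynamic_range_ratio):
    """Minimum integer counter depth for a given dynamic-range ratio."""
    return math.ceil(math.log2(dynamic_range_ratio))

# EMPAD2's quoted 100,000:1 range, expressed in bits:
print(required_counter_bits(100_000))  # -> 17
```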
4D-STEM datasets obtained from EMPAD detectors
have twice broken the resolution record in electron
microscopes, at 0.4 Å in 2018,37 and 0.25 Å in 2021,38 through
a technique known as electron ptychography, in which the
elastically scattered electron diffraction patterns (the 4D-STEM
dataset) are used to solve for the microscope lens
parameters and the transfer function of the sample being
imaged. The second result reached the physically possible
resolution limit, beyond which thermal vibrations of the
atom columns overtake lens aberrations as the primary
source of blurring.38 These ptychography results have
demonstrated that, given enough redundancy in the collected
4D-STEM data, it is possible to completely deconvolve
the electron lens transfer functions, probe decoherence,
and positioning errors from the dataset to recover the
pure material transfer function. As a result, the final
image quality is significantly better than what can be
obtained through classical aberration-corrected ADF-STEM
imaging, with the added advantage of requiring lower
electron dose rates.39
Because of these advantages, the last two decades have
seen electron microscopes worldwide retrofitted with
faster, direct electron detectors, not only for imaging
but also for EELS and 4D-STEM experiments. These
advancements have made the modern STEM truly multidimensional
and multimodal, combining imaging, diffraction,
and spectroscopy in a single instrument. Indeed,
some of the most significant recent advancements in
electron microscopy have been made possible by the advent
of high-speed direct electron detectors with DQE values
approaching unity, combined with single-electron
detection sensitivity.40,41
B. Quantitative analysis from electron microscopy datasets
The advancements in microscopy hardware have made
the extraction of quantitative material information from
electron micrographs possible. Several recent open-source
software packages have been developed by microscopists
worldwide to enable this, including STEMTool,42,43
py4DSTEM,44 Pycroscopy,45 PyXem,46 pixSTEM,47,48 and
LiberTEM,49 to name a few. Each of these packages
focuses on a specific area of TEM analysis, such as
Pycroscopy's focus on image processing or py4DSTEM's
focus on 4D-STEM data analysis. The most common area of
software development appears to be 4D-STEM, with
pixSTEM, LiberTEM, and even STEMTool focusing primarily
on it. This focus on 4D-STEM is probably driven by the
fact that such datasets are often not human-parsable, and
thus computational analysis is essential to make sense
of them. However, since the modern STEM is effectively a
highly multimodal instrument, many other large, multi-gigabyte
datasets are also routinely generated, such as
spectral maps from EELS or EDX and long-duration in situ
TEM experiments.
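One of the basic analyses such packages provide for 4D-STEM data is virtual imaging: integrating each diffraction pattern over a chosen detector region to synthesize bright-field or annular dark-field images. The sketch below shows the idea in plain NumPy on a synthetic stack; the array shapes and radii are illustrative assumptions and do not follow any particular package's API.

```python
import numpy as np

# Minimal sketch of virtual imaging from a 4D-STEM stack. The synthetic
# dataset shape and the detector radii below are assumptions made only
# for this illustration.

def virtual_image(data, r_inner, r_outer):
    """Integrate each diffraction pattern over an annular detector mask.

    data: 4D array (scan_y, scan_x, det_y, det_x).
    r_inner = 0 gives a virtual bright-field image; a nonzero inner
    radius gives a virtual annular dark-field (ADF) image.
    """
    det_y, det_x = data.shape[2:]
    yy, xx = np.mgrid[:det_y, :det_x]
    r = np.hypot(yy - det_y / 2, xx - det_x / 2)
    mask = (r >= r_inner) & (r < r_outer)
    # Boolean indexing over the trailing two axes, then sum per probe
    # position, collapses the 4D stack to a 2D virtual image.
    return data[..., mask].sum(axis=-1)

# Tiny synthetic 4D-STEM stack: 8 x 8 scan, 32 x 32 detector.
rng = np.random.default_rng(0)
stack = rng.poisson(5, size=(8, 8, 32, 32)).astype(np.float64)
bf = virtual_image(stack, r_inner=0, r_outer=6)     # bright field
adf = virtual_image(stack, r_inner=10, r_outer=15)  # annular dark field
print(bf.shape, adf.shape)  # (8, 8) (8, 8)
```

Virtual detectors of this kind are the first, cheapest reduction one can run on the edge: they shrink a multi-gigabyte scan to a kilobyte-scale image that a human or a steering algorithm can act on immediately.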
In single-particle cryo-EM, too, large datasets (several
hundred gigabytes uncompressed) are routinely
captured before alignment and particle picking. A brief
perusal of the Electron Microscopy Public Image Archive
(EMPIAR)50 will turn up hundreds of such datasets, each
of which can individually run to several terabytes. The
cryo-EM community, however, has converged on a few open-source
solutions such as Relion51 or commercial software
such as cryoSPARC52 for particle reconstruction from
images, while the materials science community is more
diverse in its software choices.
Along with software development, advances in computational
capabilities, including accessible CPUs/GPUs and
implementations of algorithms and physical models, have
led to significant developments in computational
simulations spanning a range of time and length scales.
Therefore, using experimental data, simulated data, or
both to construct Artificial Intelligence (AI) and
Machine Learning (ML) based frameworks for analyzing
microscopy datasets has become common in recent years.
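As a concrete, deliberately simple baseline for the feature-finding task these frameworks address, the sketch below locates bright atomic-column-like peaks with a smooth-then-local-maximum heuristic. Published frameworks typically rely on trained deep networks instead; every parameter here (smoothing width, window size, threshold) is an assumption chosen for the toy example.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

# Toy "find atoms in an image" baseline: smooth the image, then keep
# local maxima that exceed a relative intensity threshold. All
# parameters are illustrative assumptions.

def find_atom_columns(image, sigma=1.0, window=5, rel_threshold=0.5):
    """Return (row, col) coordinates of bright local maxima."""
    smooth = gaussian_filter(image.astype(np.float64), sigma)
    # A pixel is a peak if it equals the maximum of its local window...
    peaks = smooth == maximum_filter(smooth, size=window)
    # ...and is bright enough relative to the global maximum.
    peaks &= smooth > rel_threshold * smooth.max()
    return np.argwhere(peaks)

# Synthetic "micrograph": two Gaussian atom columns on a 32 x 32 frame.
yy, xx = np.mgrid[:32, :32]
img = (np.exp(-((yy - 8) ** 2 + (xx - 8) ** 2) / 4.0)
       + np.exp(-((yy - 20) ** 2 + (xx - 24) ** 2) / 4.0))
coords = find_atom_columns(img)
print(coords)  # two peaks, at (8, 8) and (20, 24)
```

A heuristic like this breaks down quickly on noisy, dense, or drifting data, which is precisely the gap the deep-learning-based frameworks discussed here aim to close.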
While many such studies involve the utilization of already
developed classification or regression algorithms, frameworks
to appropriately find features of interest (such as
atoms, defects) from microscope images, capturing dynamic
behavior of the systems, and finally porting them