Nuclear data activities for medium mass and heavy nuclei at Los Alamos
M. R. Mumpower1, T. M. Sprouse1, T. Kawano1, M. W. Herman1, A. E. Lovell1, G. W. Misch1, D. Neudecker1, H. Sasaki1, I. Stetcu1, and P. Talou1
1Los Alamos National Laboratory, Los Alamos, NM, 87545, USA
Abstract. Nuclear data are critical for many modern applications, from stockpile stewardship to cutting-edge
scientific research. Central to these pursuits is a robust pipeline for nuclear modeling as well as data assimilation
and dissemination. We summarize a small portion of the ongoing nuclear data efforts at Los Alamos for medium
mass to heavy nuclei. We begin with an overview of the NEXUS framework and show how one of its modules
can be used for model parameter optimization using Bayesian techniques. The mathematical framework affords
the combination of different measured data in determining model parameters and their associated correlations.
It also has the advantage of being able to quantify outliers in data. We exemplify the power of this procedure
by highlighting the recently evaluated 239Pu cross section. We further showcase the success of our tools and
pipeline by covering the insight gained from incorporating the latest nuclear modeling and data in astrophysical
simulations as part of the Fission In R-process Elements (FIRE) collaboration.
1 Introduction
Nuclear data have a profound impact on modern society
[1]. These data serve as a foundation for nuclear energy,
nuclear security, and the wide variety of research that is
carried out in nuclear astrophysics from the composition
of neutron stars to the impact of kilonova light curves [2–
5].
Despite their importance, a robust pipeline for nuclear
data assimilation and dissemination still remains on the
horizon. Currently, major evaluated databases of nuclear
reactions, decays and structure can be found in ENDF
[6] and ENSDF [7]. Inputs valuable for reaction model-
ing codes can be found in the Reference Input Parame-
ter Library (RIPL) [8] and experimental reaction data can
be found in EXFOR [9]. The Atomic Mass Evaluation
(AME) [10, 11] and NuBase [12] provide targeted infor-
mation regarding ground state and decay properties.
The disparate nature of these sources, coupled with
the difficulty of extracting information, complicates the
use of these databases in modern applications, particularly
in cutting-edge science. In this contribution, we
provide a brief overview of the nuclear data activities at
Los Alamos National Laboratory (LANL). We discuss the
NEXUS framework which represents a first step towards
providing a pipeline between nuclear data, nuclear
modeling efforts, and applications. We showcase the utility of
this platform by highlighting its use in a recent evaluation
of 239Pu cross sections as well as in the scientific Fission
In R-process Elements (FIRE) collaboration. We end with
a discussion of recent tools developed at LANL intended
to advance the use of nuclear data in astrophysical appli-
cations.
e-mail: mumpower@lanl.gov
2 NEXUS
The Los Alamos Python package, NEXUS, is a data-
agnostic framework intended to furnish access to nuclear
properties. The phrase ‘data agnostic’ means that it does
not matter how the data were generated or parsed. The
output format of the data is also irrelevant to the framework's
function. Instead, the focus of the code is on a consistent
object-oriented representation of various physical quanti-
ties and the relationships between them. This approach
is in stark contrast to the major databases which revolve
around the transmission format of data.
As an example of an object in NEXUS, in its simplest
form a reaction cross section may be reasonably repre-
sented by two arrays: the incident particle energy and the
cross section at each energy. Additional information may
be warranted, in which case the base object may be
extended. There may be uncertainties reported on each
energy point as well as uncertainties in the cross section
values. For particular applications, associated metadata may
also need to be affiliated with the attributes, for example,
providing the physical units of each of the arrays. The focal
point is on the representation of the data in code, not on
how the information will be transmitted.
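In code, such an object might look like the following minimal sketch. The class and field names here are illustrative choices, not the actual NEXUS API:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class CrossSection:
    """A minimal cross-section object: two required arrays plus optional
    uncertainties and unit metadata (illustrative, not the NEXUS API)."""
    energies: List[float]                     # incident particle energies
    values: List[float]                       # cross section at each energy
    energy_unc: Optional[List[float]] = None  # uncertainty on each energy point
    value_unc: Optional[List[float]] = None   # uncertainty on each value
    units: dict = field(default_factory=lambda: {"energy": "MeV", "value": "b"})

    def __post_init__(self):
        # the two required arrays must line up one-to-one
        if len(self.energies) != len(self.values):
            raise ValueError("energies and values must have the same length")

xs = CrossSection(energies=[0.1, 1.0, 14.0], values=[7.1, 4.2, 5.9])
```

Applications that need more, e.g. point-wise uncertainties or covariance information, would extend this base object rather than alter any transmission format.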
Transmission of data is, however, supported in
NEXUS. This revolves around the concept of marshalling,
a generalization of object serialization
(which deals only with string representations). Each
object in NEXUS has a corresponding object, a marshaller,
which handles its transmutation to various representations.
A marshaller may translate one object type into another,
or it may convert an object from its code representation
to a form that can be saved on a computer hard disk, e.g.
ASCII or JSON. The power of this approach means that
complex derived objects may be constructed and converted
to the desired output format for particular applications. A
concrete example is the use of atomic binding energies and
their appropriate parsing into suitable reaction network
codes for use in astrophysical [13] or machine learning
applications [14, 15].
arXiv:2210.12136v1 [nucl-th] 21 Oct 2022
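A marshaller can be sketched as a companion object that owns all conversion logic, so the data object itself never needs to know about output formats. The classes below are an illustrative sketch, not the actual NEXUS implementation:

```python
import json

class CrossSection:
    """Bare-bones data object; it carries no format-specific code."""
    def __init__(self, energies, values):
        self.energies = energies
        self.values = values

class CrossSectionMarshaller:
    """Handles transmutation of a CrossSection to other representations."""
    def to_json(self, xs):
        # machine-readable form suitable for saving to disk
        return json.dumps({"energies": xs.energies, "values": xs.values})

    def to_ascii(self, xs):
        # plain-text two-column table, one line per energy point
        rows = [f"{e:.4e}  {v:.4e}" for e, v in zip(xs.energies, xs.values)]
        return "\n".join(["# energy  cross_section"] + rows)

xs = CrossSection([0.1, 1.0], [7.1, 4.2])
marshaller = CrossSectionMarshaller()
as_json = marshaller.to_json(xs)
as_ascii = marshaller.to_ascii(xs)
```

Translating one object type into another follows the same pattern: a dedicated marshaller per target representation, leaving the base object untouched.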
The NEXUS framework provides a base set of objects
for nuclear properties that are commonly found in the
aforementioned nuclear databases. In addition, access to
a host of theoretical outputs from nuclear model codes
is also available. Among other data access methods, this
list includes level densities, γ-strength functions, optical
model parameters, ground state binding energies, and re-
action cross sections. Information regarding nuclear iso-
mers has recently been utilized to study the impact of long-
lived excited states in astrophysics [16, 17].
The data-agnostic NEXUS framework also
provides a set of application programming interfaces
(APIs) to nuclear data libraries, nuclear model codes and
application codes. The transition between these three areas
enabled by this platform is depicted pictorially in Figure 1.
Figure 1. The Los Alamos Python package, NEXUS, provides a
series of application programming interfaces (APIs) to efficiently
go from nuclear theory or data to applications and back again.
3 Hierarchical Approach to Nuclear Data
Evaluations
We now highlight the power of the NEXUS framework
by showcasing a result from a recent evaluation of 239Pu
which uses the Bayesian optimization module. The
evaluation is based on a hierarchical approach in which data
are deemed most important, followed by model calculations
to ‘fill in the gaps’ where empirical data are lacking. This
evaluation combines experimental cross section data from
EXFOR along with output from the Los Alamos statistical
model code, CoH [18, 19].
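The hierarchy can be sketched as a simple merge rule: where an experimental point exists it takes precedence, and the model calculation fills in everywhere else. The function and toy inputs below are hypothetical, for illustration only, not the actual evaluation code:

```python
def hierarchical_merge(energies, measurements, model_calc):
    """Prefer measured values; fall back to the model calculation
    wherever empirical data are lacking."""
    merged = []
    for e in energies:
        value = measurements.get(e)
        if value is not None:
            merged.append((e, value, "experiment"))
        else:
            merged.append((e, model_calc(e), "model"))
    return merged

# toy inputs: no measurement exists at 1.0 MeV
measurements = {0.5: 6.8, 2.0: 5.5}
model_calc = lambda e: 6.0 / (1.0 + e)   # stand-in for a CoH calculation
evaluated = hierarchical_merge([0.5, 1.0, 2.0], measurements, model_calc)
```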
In order to maintain consistency throughout each eval-
uated cross section channel, a global set of model param-
eters must be fit to available data. Model parameter fit-
ting is performed incrementally channel by channel start-
ing with the total cross section. Because most statistical
model cross sections perform very well with respect to
available data, it is possible to use a Bayesian approach
called hyperparameter optimization to optimize the model
with respect to independent datasets [20].
To optimize the calculated total cross section, σtot,
with respect to available data, a Metropolis random-walk
algorithm is used to probe the optical model parameter
space. The Soukhovitskii potential is used as the basis
for the optical model [21, 22]. The model space consists
of six parameters, including the potential depths, diffuseness
and radii. All other model parameters are held fixed
during this optimization as they do not influence the total
cross section. It was determined that holding the optical
model deformation fixed, rather than letting it vary during
the optimization, was ideal for approximating a robust
minimum. In this procedure the chi-square goodness of fit
is minimized with respect to the data, which are weighted
based on their uncertainties.
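The optimization loop can be illustrated with a toy one-parameter analogue: a Metropolis random walk that proposes Gaussian steps in parameter space and accepts them according to the change in the uncertainty-weighted chi-square. The actual fit varies six optical-model parameters against CoH calculations; everything below, including the flat toy model, is a simplified stand-in:

```python
import math
import random

def chi_square(param, energies, data, unc, model):
    """Goodness of fit, with each data point weighted by its uncertainty."""
    return sum(((model(e, param) - d) / u) ** 2
               for e, d, u in zip(energies, data, unc))

def metropolis_fit(energies, data, unc, model, p0, steps=2000, width=0.05):
    random.seed(42)                      # reproducible walk
    p = p0
    chi2 = chi_square(p, energies, data, unc, model)
    best_p, best_chi2 = p, chi2
    for _ in range(steps):
        trial = p + random.gauss(0.0, width)          # random-walk proposal
        trial_chi2 = chi_square(trial, energies, data, unc, model)
        # always accept improvements; accept worse steps probabilistically
        if trial_chi2 < chi2 or random.random() < math.exp(chi2 - trial_chi2):
            p, chi2 = trial, trial_chi2
            if chi2 < best_chi2:
                best_p, best_chi2 = p, chi2
    return best_p, best_chi2

# toy data: a flat 5 b cross section scaled by a single parameter
model = lambda e, a: 5.0 * a
energies = [1.0, 2.0, 3.0]
data = [5.2, 4.9, 5.1]
unc = [0.1, 0.1, 0.1]
start_chi2 = chi_square(1.5, energies, data, unc, model)
best_p, best_chi2 = metropolis_fit(energies, data, unc, model, p0=1.5)
```

In the full procedure the accepted samples also furnish the parameter distributions and correlations from which sensitivities are read off.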
The resultant parameter sensitivities of the fitting procedure
are shown in Figure 2. The optical model parameters
are standardized to unity to ensure that there is no
difference in the scale of individual parameters. All of the
model parameters are found to exhibit a relatively strongly
peaked distribution around the optimal value. The optimal
values are all within 10% of the baseline parameters
suggested by Soukhovitskii, with the diffuseness parameters
changing the most.
Figure 2. The determination of optical model parameters us-
ing Bayesian hyperparameter optimization. Shaded regions high-
light areas of minimal χ2.
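‘Standardized to unity’ here just means dividing each sampled parameter by its baseline value, so that quantities with very different natural scales (potential depths in MeV, radii in fm) can be compared on a single axis. A minimal sketch, with placeholder numbers rather than the actual Soukhovitskii values:

```python
# baseline parameter values (placeholders, not the published potential)
baseline = {"V0": 50.0, "r0": 1.25, "a0": 0.65}

# two parameter sets as they might come out of the Metropolis walk
samples = [
    {"V0": 49.1, "r0": 1.27, "a0": 0.62},
    {"V0": 51.3, "r0": 1.24, "a0": 0.68},
]

# divide by the baseline so every parameter is dimensionless and near 1
standardized = [{name: s[name] / baseline[name] for name in s}
                for s in samples]
```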
The cross section fit with the optimal optical model
parameters is shown in Fig. 3. The procedure performs
nearly identically to the previous evaluation of ENDF/B-VIII.0.
Slight modifications are seen relative to ENDF/B-VIII.0
below 30 keV, where the reported uncertainties of
the datasets pull down the fit to the total cross section. A
similar modification, albeit to a much smaller effect, can