Transcription

PHYSICAL REVIEW RESEARCH 4, 021001 (2022)

Perspective

Current nuclear data needs for applications

Karolina Kolos,1 Vladimir Sobes,2 Ramona Vogt,1,3 Catherine E. Romano,4 Michael S. Smith,5 Lee A. Bernstein,6,7 David A. Brown,8 Mary T. Burkey,9 Yaron Danon,10 Mohamed A. Elsawi,11,12 Bethany L. Goldblum,6,7 Lawrence H. Heilbronn,2 Susan L. Hogle,13 Jesson Hutchinson,14 Ben Loer,11 Elizabeth A. McCutchan,7 Matthew R. Mumpower,15 Ellen M. O'Brien,16 Catherine Percher,17 Patrick N. Peplowski,18 Jennifer J. Ressler,9 Nicolas Schunck,1 Nicholas W. Thompson,14 Andrew S. Voyles,6,7 William Wieselquist,19 and Michael Zerkle20

1 Nuclear and Chemical Sciences Division, Lawrence Livermore National Laboratory, Livermore, California 94550, USA
2 Department of Nuclear Engineering, University of Tennessee, Knoxville, Tennessee 37996, USA
3 Department of Physics and Astronomy, University of California, Davis, California 95516, USA
4 IB3 Global Solutions, Oak Ridge, Tennessee 37830, USA
5 Physics Division, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831, USA
6 University of California, Department of Nuclear Engineering, Berkeley, California 94720, USA
7 Nuclear Science Division, Lawrence Berkeley National Laboratory, Berkeley, California 94720, USA
8 National Nuclear Data Center, Brookhaven National Laboratory, Upton, New York 11973, USA
9 Design Physics Division, Lawrence Livermore National Laboratory, Livermore, California 94550, USA
10 Rensselaer Polytechnic Institute, Department of Mechanical, Aerospace, and Nuclear Engineering, 110 8th Street, Troy, New York 12180, USA
11 Signature Science and Technology Division, Pacific Northwest National Laboratory, Richland, Washington 99352, USA
12 Xe-Mobile Division, X-Energy, LLC, Rockville, Maryland 20852, USA
13 Radioisotope Science and Technology Division, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831, USA
14 NEN-2 Advanced Nuclear Technology, Los Alamos National Laboratory, Los Alamos, New Mexico 87545, USA
15 Theoretical Division, Los Alamos National Laboratory, Los Alamos, New Mexico 87545, USA
16 Chemistry Division, Los Alamos National Laboratory, Los Alamos, New Mexico 87545, USA
17 Nuclear Criticality Safety Division, Lawrence Livermore National Laboratory, Livermore, California 94550, USA
18 Johns Hopkins University Applied Physics Laboratory, Laurel, Maryland 20723, USA
19 Nuclear Energy and Fuel Cycle Division, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831, USA
20 Reactor Technology Department, Naval Nuclear Laboratory, West Mifflin, Pennsylvania 15122, USA

(Received 9 September 2021; published 18 April 2022)

Accurate nuclear data provide an essential foundation for advances in a wide range of fields, including nuclear energy, nuclear safety and security, safeguards, nuclear medicine, and planetary and space exploration. In these and other critical domains, outdated, imprecise, and incomplete nuclear data can hinder progress, limit precision, and compromise safety. Similar nuclear data needs are shared by many applications, thus prioritizing these needs is especially important and urgently needed. Many levels of analysis are required to prepare nuclear measurements for employment in end-user applications. Because research expertise is typically limited to one level, collaboration across organizations and international borders is essential. This perspective piece provides the latest advances in nuclear data for applications and describes an outlook for both near- and long-term progress in the field.

DOI: 10.1103/PhysRevResearch.4.021001
Published by the American Physical Society under the terms of the Creative Commons Attribution 4.0 International license. Further distribution of this work must maintain attribution to the author(s) and the published article's title, journal citation, and DOI.

I. INTRODUCTION

Nuclear data provide the empirical foundation for studies in nuclear energy, safety, security, safeguards, and many other areas of science. They also, however, enable applications that touch many aspects of everyday life. In homes, food is served that has been irradiated by 60Co and 137Cs to kill bacteria, making it safe for long-term storage and distribution. 63Ni is used in camera light sensors and long-life batteries. Smoke detectors utilizing 241Am warn of household fires. Some people have benefited from medical scans using 99Tc, while others benefit from artificial joints that have been wear-tested with 7Be.

Nuclear data enable technologies that help protect the environment. 65Zn and 54Mn are used to understand the flow of heavy metal contaminants in mining wastewater, 3H is used to study sewage and liquid waste, 137Cs helps track soil erosion and deposition, and neutron radiography is used to characterize contaminants in a variety of environmental samples.

Nuclear data also help characterize and modify materials for many uses, from 60Co used to probe the suitability and longevity of cement storage casks and reactor walls, to neutron scattering used to probe the unusual magnetic properties of quantum materials, to selenium ion implantation for semiconductor doping.

Finally, nuclear data also provide insight into understanding the world. These data are critical in Nobel prize-winning discoveries like the Higgs boson, because they determine how particles interact with the detectors. Radioactive dating with 14C provides deeper understanding of history. Heavy ion irradiation is used to understand how computer chips will survive cosmic ray bombardment, thereby enabling safe space travel. And the Voyager deep space exploration craft have been powered by 238Pu since their launch in 1977.

Nuclear data encompass a wide range of structure and reaction quantities such as scattering and reaction cross sections, generally given as a function of energy and angle; nuclear masses, level properties, and their decay modes and parameters; neutron and photon spectra from reactions; and many others. These are needed for every isotope and its reactions involving neutrons, protons, deuterons, alphas, and photons. Where measured data do not exist, model predictions are sometimes used as placeholders.

Nuclear data collections, or libraries, incorporate data from multiple sources that have been critically assessed by nuclear data evaluators, who review and combine all available data sets, determine the highest quality data, and decide upon a set of standards. These evaluations are stored in specific evaluated nuclear data files consisting of a combination of tabulated data and parameters that can be reconstructed into data sets using specially designed processing codes.

Because of the complexity and diverse skills required for evaluation work, and because of the importance of nuclear data across both basic and applied fields, numerous organizations have been formed to coordinate activities, increase communication, and launch collaborative efforts between evaluators. For reactions, these include the US National Nuclear Data Center (NNDC) [1], the International Network of Nuclear Reaction Data Centers (NRDC) [2], and the International Nuclear Data Evaluator Network [3], under the auspices of the Nuclear Data Service (NDS) of the International Atomic Energy Agency (IAEA) [4]; the Cross Section Evaluation Working Group (CSEWG) [5], bringing together the efforts of US and Canadian national laboratories; the Organization for Economic Cooperation and Development (OECD) Nuclear Energy Agency (NEA) Working Party on International Nuclear Data Evaluation Cooperation (WPEC) [6]; and the Japanese Nuclear Data Committee [7]. These organizations have, for example, helped address differences between nuclear reaction data sets in the US [Evaluated Nuclear Data File (ENDF)] [8], Japan [Japanese Evaluated Nuclear Data Library (JENDL)] [9], Europe [the Joint Evaluated Fission and Fusion File (JEFF) [10] and the TALYS-based Evaluated Nuclear Data Library (TENDL) [11]], Russia [Russian Evaluated Nuclear Data Library (BROND)] [12], and China [Chinese Evaluated Nuclear Data Library (CENDL)] [13]. Additionally, WPEC subgroups [6] and IAEA Nuclear Data Section Coordinated Research Projects [14] have worked to improve data evaluation techniques, data formats, and general and user-specific evaluated data sets.
FIG. 1. Schematic example of a linear data pipeline showing how the components of the pipeline contribute to the production of data libraries used by applications. Given the multidisciplinary nature of nuclear data, some activities may involve more than one component.

For nuclear structure, evaluation work is coordinated by the US Nuclear Data Program [15], managed by the US NNDC, the IAEA NDS [4], and the Nuclear Structure and Decay Data (NSDD) Network [16]. Cooperation between organizations leads to better, more reliable international nuclear data evaluations and standards.

Evaluated nuclear data sets are critical inputs for predictive modeling and simulations in various applied science and engineering disciplines. Nuclear power and associated fuel cycle operations, national security and nonproliferation applications, shielding studies, materials analysis, medical radioisotope production, diagnosis and radiotherapy, and space applications are only a handful of the applications that rely on accurate and precise nuclear data. In many cases, the same nuclear data sets can provide cross-cutting support to a number of different applications.

Extensive experimental campaigns to measure nuclear data were carried out from the 1950s to the 1990s. Since then, computational modeling and simulations of nuclear systems have undergone a period of rapid expansion. The computational power available for detailed modeling of physical systems has grown by several orders of magnitude. Consequently, the predictive power of many simulations, such as radiation transport codes, is effectively limited by the fidelity of the input nuclear data. The limits of this predictive power have economic, safety, and security consequences that must be addressed. For example, safeguards and homeland security applications rely on hybrid methods of radiation detection and computational solutions of the inverse radiation transport problem. If modeling of these systems is limited by nuclear data, the ability to detect smuggled nuclear materials, for example, is correspondingly limited.

A. The nuclear data pipeline

The nuclear data pipeline, shown in Fig. 1, is a term used to describe the many interconnected steps required to prepare nuclear measurement results for use in end-user applications. While this pipeline has been described in numerous ways, there are, in general, six essential steps: measurements, compilation, evaluation, processing, validation, and applications. Measurements are made both for fundamental science and for specific user-related requests. Compilation involves collecting the data from new measurements and historical literature and inserting these data and related information from measurements into both bibliographic databases (such as the Nuclear Science References (NSR) [17] and the Computer Index of Nuclear Reaction Data (CINDA) [18]) and numerical databases (including the Experimental Unevaluated Nuclear Data List (XUNDL) [19] and Experimental Nuclear Reaction Data (EXFOR) [20]).

The next step, evaluation, is critical to provide a recommended best value for all pieces of nuclear data by expertly combining new measurements with previous measurements and nuclear model predictions. Evaluated nuclear reaction data are inserted into ENDF [8] and disseminated online by a variety of tools, including Sigma [21] from the US NNDC [1] and ZVVIEW [22] from the IAEA NDS [4]. Evaluated nuclear structure and decay data are inserted into the Evaluated Nuclear Structure Data File (ENSDF) [23] and disseminated online via NuDat [24] from the NNDC.

Processing is the fourth step of the pipeline, wherein evaluated data sets are converted to the formats required by specific end-user applications. In some cases, these processed data sets are distributed to the community, such as the Nuclear Wallet Cards [25] and the Medical Internal Radiation Dose (MIRD) database [26]. In other cases, evaluated files are processed and stored on local computers and serve as input files for end-user simulations. For example, NJOY [27] and other codes (e.g., NECP-ATLAS [28]) are used to process the ENDF evaluated data file into the ACE format [29] for input to transport codes such as the Monte Carlo N-Particle transport code (MCNP) [30]. Validation, the next step in the pipeline for reaction databases, involves quantitative model comparisons [31–35] with independently measured values from benchmark-quality experiments, such as those for criticality safety [36], employing the newly processed evaluated data as input. Iterative adjustments are made to reaction evaluations on the basis of this validation process. Finally, the processed and validated nuclear data files are disseminated for use in applications. New applications or more stringent requirements for existing applications can require new data, starting the flow of the pipeline again.

The lengthy passage of data through the full pipeline, from new experimental measurements through evaluation, processing, and validation, requires expertise at each step. For nuclear structure data, all nuclides of a particular mass number are often evaluated simultaneously, because their levels are interconnected by beta decays. Such mass chain evaluations can take half a year to two years to complete, containing all properties and decays of 10^5 levels, and then require up to another year or two for critical peer review and quality assurance checks. Additionally, many nuclides are evaluated individually. The average time between evaluations, currently approximately 7 years, is limited by the available evaluation workforce. Upon completion, evaluations are entered into the ENSDF database in a process of continual updates. For nuclear reaction data, the ENDF database [8] is organized into 15 sublibraries (e.g., "Neutron" for neutron-induced reactions), each further subdivided into evaluations of individual isotopes, for which all reaction channels are evaluated simultaneously. In some cases, individual reaction channels (e.g., partial cross sections) with over 10^6 data points require months to years to complete, and a full evaluation for a nuclide in a sublibrary can take significantly longer. Some of the processing and validation steps have recently been automated [8], and a new release of the full ENDF library, including all evaluations completed since the last release, is made approximately every 5 years.
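To make the validation step more concrete, the short sketch below shows the kind of comparison made when a newly processed library is tested against criticality-safety benchmarks: calculated effective multiplication factors (k_eff) are compared with benchmark values through C/E ratios and a simple chi-square summary. The benchmark names and numbers are illustrative placeholders only, not results from any real benchmark suite or evaluation.

    import numpy as np

    # Illustrative validation summary: calculated vs. benchmark k_eff values.
    # Names and numbers below are placeholders, not real benchmark results.
    benchmarks = {
        # name: (k_calc, k_benchmark, benchmark uncertainty)
        "fast-metal-sphere-1":    (0.99912, 1.00000, 0.00110),
        "thermal-solution-7":     (1.00234, 1.00000, 0.00330),
        "intermediate-lattice-3": (0.99650, 1.00000, 0.00240),
    }

    residuals = []
    for name, (k_calc, k_bench, sigma) in benchmarks.items():
        ce = k_calc / k_bench                 # C/E (calculated over experimental) ratio
        bias_pcm = (k_calc - k_bench) * 1e5   # bias in pcm (1e-5 of k_eff)
        n_sigma = (k_calc - k_bench) / sigma  # deviation in benchmark standard deviations
        residuals.append(n_sigma)
        print(f"{name:24s}  C/E = {ce:.5f}  bias = {bias_pcm:+7.1f} pcm  ({n_sigma:+.1f} sigma)")

    # A reduced chi-square over the suite summarizes overall library performance;
    # values well above 1 would flag evaluations needing iterative adjustment.
    chi2 = float(np.mean(np.square(residuals)))
    print(f"chi^2/N = {chi2:.2f}")

In practice such comparisons span hundreds of benchmark configurations, and the per-benchmark residuals, not just the aggregate, guide which evaluations are revisited.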
B. Key topical areas in nuclear data

In 2021, six topics were selected collectively by the nuclear data producer and user communities [37] that best reflect deficiencies or opportunities relevant for current and emerging applications, with crosscutting themes that enable support of the data pipeline for multiple programs [38–41]. (See Ref. [42] for more information.) In the remainder of this section, each of the six topics is briefly introduced. The following sections discuss the highlights and outcomes of each topical discussion in more detail, with specific outlooks highlighting the most urgent nuclear data needs in each area.

1. Advanced computing for nuclear data

Computing plays a critical role in applied nuclear data, ranging from the execution of the high-fidelity physics models that form the backbone of data evaluations and of experimental analysis and interpretation, to propagating uncertainties through a complex chain of heterogeneous codes, to processing large training datasets through supervised machine learning algorithms. Resources for these activities include hardware from clusters to supercomputers, scalable algorithms, and extensive efforts in coding, applied mathematics, and domain-specific applications. This topic covers recent computing developments and highlights the challenges of adapting complex, legacy, or mission-critical codes to the latest, and next, generation of rapidly evolving architectures. Machine learning methods for emulating computationally expensive physics models, validation, and uncertainty quantification (UQ) are also discussed. Developments needed to realize the potential of quantum computing (QC) for nuclear data, far beyond the bounds of classical computing, were also presented.

2. Predictive codes for isotope production

In situations and at energies where well-characterized experimental data on cross sections or isotopic yields are unavailable, the isotope production community, as well as other users of these data, relies upon predictive codes to provide estimates of the needed data. Unfortunately, accurate modeling of even moderately high-energy reactions is notoriously difficult. The lack of an acceptable predictive capability in modern reaction codes presents a cross-cutting need for the nuclear data community, as it impacts the casual user of these codes, the data evaluation pipeline, and applications such as isotope production, neutronics, shielding, and detection. With a broad range of applications and an impact on multiple programs, this topic is of great interest. This section focuses on how to improve the predictive capabilities of these codes to benefit the breadth of the data community.

3. Expanded benchmarks and validation for nuclear data

Because much of nuclear science and engineering relies on predictive computational modeling and simulation, many areas of the community would benefit from the development of well-characterized and documented benchmarks for code validation. While critical assembly benchmarks are very useful for validating some aspects of nuclear data, a broader suite of benchmarks is needed to provide more complete validation of the nuclear data and physics important for other applications.

There are many different applications that can leverage the framework used by the criticality safety and reactor physics communities to develop the benchmarks needed to validate the nuclear data they depend on. New and historical experiments that could be turned into benchmarks to strengthen nuclear data validation in cross-cutting application areas were a major focus of this discussion.

4. Nuclear data for space applications

The space radiation environment is a complex mix of photons, electrons, protons, and heavy ions with energies ranging from several eV to several TeV per nucleon. Characterizing interactions in the environment of space is important in a number of areas critical for space research and exploration because of the secondary radiation fields these interactions create. For example, creating effective shielding for crew and electronics requires fundamental cross section data on high-energy heavy-ion interactions that produce complex secondary radiation fields. Similarly, the secondary neutrons and gamma rays produced by interactions of cosmic rays with the surfaces of planets, moons, and asteroids enable their chemical composition to be characterized through the use of nuclear spectroscopy. Converting measurements to elemental information requires knowledge of the relevant neutron inelastic and capture cross sections and gamma-decay intensities. As space agencies around the world prepare for human exploration beyond low-Earth orbit, there is renewed interest in fission power and radioisotope systems. These systems introduce an additional source of radiation that can impact instrument response and crew health. Nuclear data relevant to the performance of man-made radiation environments and their interaction with surrounding materials are necessary to understand their impacts on these missions.

5. Nuclear data for advanced reactors and security applications

Nuclear data impact the design, efficiency, and operation of advanced reactors and security applications. With new advanced reactors and microreactors being designed using different fuels, coolants, and moderators than the current fleet, there is a potential need for improved nuclear data, including new differential and integral measurements as well as new evaluations. Security applications are even more diverse, covering a large range of detectors, systems, and interactions. There is also a large overlap in the nuclear data needs of these two areas, especially for microreactors. The essential questions to address in this topical area are where refined nuclear data can increase safety, reliability, and economic viability.

6. The human pipeline for nuclear data

Researchers play a key role along the entire nuclear data pipeline, not only contributing effort to process data through the pipeline, but also improving links between pipeline components and advancing the underlying fundamental physics. However, the subcritical, aging, homogeneous nuclear data workforce must be transformed to evolve the pipeline to meet the growing international demand for nuclear data, to branch out into new application areas, to embrace new advances in big data, to transfer knowledge to the next generation, to benefit from the available diversity of thought, to attract younger researchers, and to continue to keep the ENDF and ENSDF databases as international standards of nuclear information. Some initial efforts to address these critical issues are described in Sec. VII.
II. ADVANCED COMPUTING FOR NUCLEAR DATA

Computing plays a central role in the nuclear data pipeline, from the analysis of data collected through experiments to the production of evaluated data to the use of these data in applications. The collection and analysis of experimental data strongly leverage computing for data acquisition and for executing mathematical analyses, including signal processing techniques, statistical methods, and much more. Evaluations rely on a set of theoretical models, implemented in nuclear physics codes, to simulate the structure, reactions, and decay of atomic nuclei. Nuclear data are then used by application-specific simulation codes, e.g., computer programs simulating the structure of a neutron star, the formation of elements in nucleosynthesis, critical assemblies, or reaction networks for active interrogation. Because of the inherent complexity of nuclear processes and the often multiphysics nature of nuclear data application codes, quantifying and propagating uncertainties of the data throughout the pipeline also plays an essential role in the nuclear data community. Many of the statistical methods used for UQ require significant computing resources.

Thanks to advances in computing and in the understanding of the nuclear many-body problem, nuclear theory has become ever more sophisticated, with descriptions of the structure and reactions of light nuclei [43,44], low-lying states in medium-mass nuclei [45,46], the mean-field description of heavy nuclei [47], and improved theories of nuclear fission [48,49]. A broad range of fundamental nuclear theory problems highly relevant to the nuclear data community, from neutrino physics to fission to neutron reactions, were in fact identified as priority research directions requiring the development of exascale computers [50,51]. By integrating some of these theoretical developments into the nuclear data pipeline, there is a unique opportunity to increase the fidelity of evaluations. This approach anchors the calculation of nuclear observables to the best knowledge of nuclear forces and quantum many-body methods, thereby improving the underlying physical foundations of the data. However, such a task requires a long-term vision for code development to keep pace with hardware developments, robust software maintenance plans, and personnel with cross-cutting skills in software engineering and nuclear science. Revising legacy codes to fully exploit new features of the latest hardware architectures, especially GPU-based ones, often requires expert assistance and collaboration with computer scientists.

Similar challenges are encountered in the development of popular transport codes, e.g., MCNP [52] or TRIPOLI [53], that are used to simulate many nuclear systems, including reactors, nondestructive assays, and isotope production. In contrast to nuclear physics models, the linear Boltzmann transport equation is well understood, so the primary computational challenges involve system geometry, numerical precision, or the need to calculate sensitivities to all integral quantities, all of which require substantial computational throughput.
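To give a sense of the sampling that drives this throughput, the following is a minimal, purely illustrative analog Monte Carlo sketch of monoenergetic neutrons penetrating a one-dimensional homogeneous slab with made-up cross sections. Production transport codes such as MCNP or TRIPOLI perform the same kind of random sampling, but over continuous-energy nuclear data, detailed three-dimensional geometries, and far larger numbers of particle histories.

    import numpy as np

    rng = np.random.default_rng(42)

    # Made-up macroscopic cross sections for a homogeneous slab [1/cm].
    sigma_s, sigma_a = 0.30, 0.10        # scattering and absorption
    sigma_t = sigma_s + sigma_a          # total
    thickness = 10.0                     # slab thickness [cm]
    n_histories = 100_000

    transmitted = absorbed = reflected = 0
    for _ in range(n_histories):
        x, mu = 0.0, 1.0                 # start on the left face, moving right
        while True:
            # Sample the distance to the next collision along the flight direction.
            x += mu * rng.exponential(1.0 / sigma_t)
            if x < 0.0:
                reflected += 1
                break
            if x > thickness:
                transmitted += 1
                break
            if rng.random() < sigma_a / sigma_t:   # collision is an absorption
                absorbed += 1
                break
            mu = rng.uniform(-1.0, 1.0)            # isotropic scatter: new direction cosine

    print(f"transmitted: {transmitted / n_histories:.4f}")
    print(f"absorbed:    {absorbed / n_histories:.4f}")
    print(f"reflected:   {reflected / n_histories:.4f}")

Even this toy problem needs on the order of 10^5 histories for percent-level statistics; realistic shielding or criticality problems multiply that cost through detailed geometry tracking, continuous-energy cross-section lookups, and variance-reduction bookkeeping.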

These observations also apply to the computer programs implementing the reaction network simulations relevant for stockpile stewardship or nucleosynthesis, where the simulation uncertainties primarily arise from the input nuclear physics uncertainties rather than from the underlying thermodynamic conditions. The sensitivities of criticality calculations or astrophysics simulations to nuclear data inputs are examples of grand challenge problems that require leveraging high-performance computing (HPC) techniques and resources.

In addition to nuclear theory, transport codes, and network simulations, artificial intelligence (AI) and machine learning (ML) are driving a significant expansion of the role of computing in nuclear data. AI/ML has already seen applications throughout the sciences in the areas of design, control, augmented simulations, science and math comprehension, generative models, inverse problems, multimodal learning, and decision making [54]. In the nuclear data pipeline, it has been used for knowledge extraction, automation, surrogate models, and uncertainty quantification [55], and its use is anticipated to grow exponentially for a number of reasons. First, AI/ML enables new approaches, often originating in other fields, to address longstanding problems in nuclear data. Second, new open-source software libraries are available that facilitate the use of AI/ML algorithms with both CPUs and GPUs. These Python-based software frameworks [56,57] include tools for classification, prediction, ML via deep, recursive, and/or convolutional neural nets, and natural language processing. These libraries are not, however, completely plug-and-play solutions, and collaborations with AI/ML experts and statisticians are often needed to exploit their full potential for nuclear data applications. Third, there is intense interest among (especially early career) researchers in applying AI/ML approaches to challenging data-intensive problems, providing an exceptional opportunity for AI/ML to serve as a recruiting gateway for the nuclear data field. These last two points are addressed further in Sec. VII.

Finally, simulation of quantum many-body systems, such as nuclear reactions, requires exponentially increasing classical computing resources as the number of particles increases. In theory, universal quantum computers can achieve the same exponential scaling, with the upshot that a quantum computer with thousands of qubits could simulate some nuclear reactions not possible even on future exascale classical supercomputers [58]. Moreover, because quantum computers are unitary, they are ideal for simulating quantum real-time evolution such as in nuclear interactions. Quantum supremacy, performing a calculation on a quantum computer that is impossible on a classical supercomputer, has been demonstrated, albeit on carefully selected problems that are currently largely uninteresting other than for their tractability on current quantum computing hardware [59,60]. It is thus relevant to determine the potential of QC in the particular area of nuclear data.

This section addresses the state of the art of advanced computing in three primary focus areas and the associated opportunities for the nuclear data community. Section II A provides an overview of current and emerging HPC technologies in the context of nuclear data needs and applications. Section II B addresses the ways in which AI/ML may be applied to advance capabilities at all stages of the nuclear data pipeline. Section II C explores the opportunities and limitations of quantum computing for addressing nuclear data problems.
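Before turning to those focus areas, the sketch below illustrates, in deliberately miniature form, the surrogate-model and uncertainty-quantification roles of AI/ML mentioned above: a Gaussian-process emulator is fit to a handful of runs of a stand-in "expensive" model, and an assumed input uncertainty is then propagated through the cheap emulator by Monte Carlo sampling. The toy model, the input distribution, and the kernel choice are all illustrative assumptions, not recommendations for production evaluations.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel

    rng = np.random.default_rng(0)

    # Stand-in for an expensive physics model: one observable as a function of
    # a single model parameter theta (purely illustrative, not a nuclear model).
    def expensive_model(theta):
        return np.sin(3.0 * theta) + 0.5 * theta**2

    # A small set of "expensive" training evaluations.
    theta_train = np.linspace(0.0, 2.0, 8).reshape(-1, 1)
    y_train = expensive_model(theta_train).ravel()

    # Fit a Gaussian-process surrogate (emulator) to the training runs.
    gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=0.5),
                                  normalize_y=True)
    gp.fit(theta_train, y_train)

    # Propagate an assumed input uncertainty through the cheap surrogate.
    theta_samples = rng.normal(loc=1.0, scale=0.1, size=(5000, 1))
    y_samples = gp.predict(theta_samples)
    print(f"surrogate-propagated observable: mean = {y_samples.mean():.3f}, "
          f"std = {y_samples.std():.3f}")

The appeal of this pattern is that the many thousands of samples needed for converged uncertainty estimates are drawn from the emulator rather than from the expensive physics code itself.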
A. High-fidelity modeling and simulation with high-performance computing

With the increasing sophistication of modeling and simulation approaches and the expanding number and size of available datasets, capabilities to address nuclear data needs and applications are increasingly reliant upon powerful HPC tools for efficient execution. HPC methods may be applied to advance computational nuclear structure and reactions by increasing the performance of existing nuclear physics codes and by enabling more elaborate theoretical modeling, including previously inaccessible complex multiphysics calculations [61]. In fundamental nuclear theory research, novel methods to perform ab initio calculations of nuclei, such as the coupled-cluster [62] or in-medium similarity renormalization group [63] approaches, have only become possible thanks to progress in HPC. New insights into the structure of neutron stars [64] or the formation of heavy elements in the universe [65–67] rely critically on complex simulations of nuclear properties on supercomputers [68,69]. Multidisciplinary collaborations involving applied mathematicians, computer scientists, and domain scientists are often key to enabling such progress [70]. The Scientific Discovery through Advanced Computing (SciDAC) program [71] and the Fission In R-process Elements (FIRE) topical collaboration in nuclear theory [72] are examples of how to organize and support such multidisciplinary collaborations.

HPC can also play important roles in the verification of methods and codes and in the validation of commonly used approximations, by testing against more fundamental and predictive theories. Examples include ab initio calculations of thermonuclear reactions that can test the correctness of more phenomenological R-matrix fits [73], explanations of β-decay rate quenching with microscopic methods [74], or the quantum-mechanical simulation of quantities that are essential for simulating the deexcitation of fission fragments [75]. By pr

University of Michigan recently developed Fission Sphere (FS-3), an array of forty organic stilbene detectors operated in time-coincidence [156,157]. The FS-3 is used to measure