DE-NE0008277_PROTEUS final technical report 2018
DOE Office of Scientific and Technical Information (OSTI.GOV)
Enqvist, Andreas
This project details the re-evaluation of gas-cooled fast reactor (GCFR) core design experiments performed in the 1970s at the PROTEUS reactor and the creation of a series of International Reactor Physics Experiment Evaluation Project (IRPhEP) benchmarks. Currently there are no GCFR experiments available in the International Handbook of Evaluated Reactor Physics Benchmark Experiments (IRPhEP Handbook). These experiments are excellent candidates for reanalysis and the development of multiple benchmarks because they provide high-quality integral nuclear data relevant to the validation and refinement of thorium, neptunium, uranium, plutonium, iron, and graphite cross sections. It would be cost prohibitive to reproduce such a comprehensive suite of experimental data to support any future GCFR endeavors.
GROWTH OF THE INTERNATIONAL CRITICALITY SAFETY AND REACTOR PHYSICS EXPERIMENT EVALUATION PROJECTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. Blair Briggs; John D. Bess; Jim Gulliford
2011-09-01
Since the International Conference on Nuclear Criticality Safety (ICNC) 2007, the International Criticality Safety Benchmark Evaluation Project (ICSBEP) and the International Reactor Physics Experiment Evaluation Project (IRPhEP) have continued to expand their efforts and broaden their scope. Eighteen countries participated in the ICSBEP in 2007; now there are 20, with recent contributions from Sweden and Argentina. The IRPhEP has also expanded, from eight contributing countries in 2007 to 16 in 2011. Since ICNC 2007, the contents of the 'International Handbook of Evaluated Criticality Safety Benchmark Experiments' (1) have increased from 442 evaluations (38,000 pages), containing benchmark specifications for 3,955 critical or subcritical configurations, to 516 evaluations (nearly 55,000 pages), containing benchmark specifications for 4,405 critical or subcritical configurations, in the 2010 Edition of the ICSBEP Handbook. The contents of the Handbook have also increased from 21 to 24 criticality-alarm-placement/shielding configurations with multiple dose points for each, and from 20 to 200 configurations categorized as fundamental physics measurements relevant to criticality safety applications. Approximately 25 new evaluations and 150 additional configurations are expected to be added to the 2011 edition of the Handbook. Since ICNC 2007, the contents of the 'International Handbook of Evaluated Reactor Physics Benchmark Experiments' (2) have increased from 16 experimental series performed at 12 reactor facilities to 53 experimental series performed at 30 reactor facilities in the 2011 edition of the Handbook. Considerable effort has also been made to improve the functionality of the searchable database, DICE (Database for the International Criticality Benchmark Evaluation Project), and to verify the accuracy of the data contained therein. DICE will be discussed in separate papers at ICNC 2011.
The status of the ICSBEP and the IRPhEP will be discussed in the full paper, selected benchmarks that have been added to the ICSBEP Handbook will be highlighted, and a preview of the new benchmarks that will appear in the September 2011 edition of the Handbook will be provided. Accomplishments of the IRPhEP will also be highlighted and the future of both projects will be discussed. REFERENCES (1) International Handbook of Evaluated Criticality Safety Benchmark Experiments, NEA/NSC/DOC(95)03/I-IX, Organisation for Economic Co-operation and Development-Nuclear Energy Agency (OECD-NEA), September 2010 Edition, ISBN 978-92-64-99140-8. (2) International Handbook of Evaluated Reactor Physics Benchmark Experiments, NEA/NSC/DOC(2006)1, Organisation for Economic Co-operation and Development-Nuclear Energy Agency (OECD-NEA), March 2011 Edition, ISBN 978-92-64-99141-5.
Bess, John D.; Fujimoto, Nozomu
2014-10-09
Benchmark models were developed to evaluate six cold-critical and two warm-critical, zero-power measurements of the HTTR. Additional measurements of a fully-loaded subcritical configuration, core excess reactivity, shutdown margins, six isothermal temperature coefficients, and axial reaction-rate distributions were also evaluated as acceptable benchmark experiments. Insufficient information is publicly available to develop finely-detailed models of the HTTR, as much of the design information is still proprietary. However, the uncertainties in the benchmark models are judged to be of sufficient magnitude to encompass any biases and bias uncertainties incurred through the simplification process used to develop the benchmark models. Dominant uncertainties in the experimental keff for all core configurations come from uncertainties in the impurity content of the various graphite blocks that comprise the HTTR. Monte Carlo calculations of keff are between approximately 0.9% and 2.7% greater than the benchmark values. Reevaluation of the HTTR models as additional information becomes available could improve the quality of this benchmark and possibly reduce the computational biases. High-quality characterization of graphite impurities would significantly improve the quality of the HTTR benchmark assessment. Simulations of the other reactor physics measurements are in good agreement with the benchmark experiment values. The complete benchmark evaluation details are available in the 2014 edition of the International Handbook of Evaluated Reactor Physics Benchmark Experiments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bobrov, A. A.; Boyarinov, V. F.; Glushkov, A. E.
2012-07-01
Results of critical experiments performed at five configurations of the ASTRA facility, which models high-temperature helium-cooled graphite-moderated reactors, are presented. Experiments measuring the spatial distribution of the ²³⁵U fission reaction rate, performed at four of these five configurations, are presented in greater detail. Analysis of the available information showed that all criticality experiments at these five configurations are acceptable for use as critical benchmark experiments, and that all measurements of the spatial distribution of the ²³⁵U fission reaction rate are acceptable for use as physical benchmark experiments. (authors)
Benchmarking the Physical Therapist Academic Environment to Understand the Student Experience.
Shields, Richard K; Dudley-Javoroski, Shauna; Sass, Kelly J; Becker, Marcie
2018-04-19
Identifying excellence in physical therapist academic environments is complicated by the lack of nationally available benchmarking data. The objective of this study was to compare a physical therapist academic environment to another health care profession (medicine) academic environment using the Association of American Medical Colleges Graduation Questionnaire (GQ) survey. The study used a longitudinal benchmarking design. Between 2009 and 2017, the GQ was administered to graduates of a physical therapist education program (Department of Physical Therapy and Rehabilitation Science, Carver College of Medicine, The University of Iowa [PTRS]). Their ratings of the educational environment were compared to nationwide data for a peer health care profession (medicine) educational environment. Benchmarking to the GQ capitalizes on a large, psychometrically validated database of academic domains that may be broadly applicable to health care education. The GQ captures critical information about the student experience (eg, faculty professionalism, burnout, student mistreatment) that can be used to characterize the educational environment. This study hypothesized that the ratings provided by 9 consecutive cohorts of PTRS students (n = 316) would reveal educational environment differences from academic medical education. PTRS students reported significantly higher ratings of the educational emotional climate and student-faculty interactions than medical students. PTRS and medical students did not differ on ratings of empathy and tolerance for ambiguity. PTRS students reported significantly lower ratings of burnout than medical students. PTRS students descriptively reported observing greater faculty professionalism and experiencing less mistreatment than medical students. The generalizability of these findings to other physical therapist education environments has not been established.
Selected elements of the GQ survey revealed differences in the educational environments experienced by physical therapist students and medical students. All physical therapist academic programs should adopt a universal method to benchmark the educational environment to understand the student experience.
Benchmark Evaluation of HTR-PROTEUS Pebble Bed Experimental Program
Bess, John D.; Montierth, Leland; Köberl, Oliver; ...
2014-10-09
Benchmark models were developed to evaluate 11 critical core configurations of the HTR-PROTEUS pebble bed experimental program. Various additional reactor physics measurements were performed as part of this program; currently only a total of 37 absorber rod worth measurements have been evaluated as acceptable benchmark experiments for Cores 4, 9, and 10. Dominant uncertainties in the experimental keff for all core configurations come from uncertainties in the ²³⁵U enrichment of the fuel, impurities in the moderator pebbles, and the density and impurity content of the radial reflector. Calculations of keff with MCNP5 and ENDF/B-VII.0 neutron nuclear data are greater than the benchmark values but within 1% and also within the 3σ uncertainty, except for Core 4, which is the only randomly packed pebble configuration. Repeated calculations of keff with MCNP6.1 and ENDF/B-VII.1 are lower than the benchmark values and within 1% (~3σ), except for Cores 5 and 9, which calculate lower than the benchmark eigenvalues within 4σ. The primary difference between the two nuclear data libraries is the adjustment of the absorption cross section of graphite. Simulations of the absorber rod worth measurements are within 3σ of the benchmark experiment values. The complete benchmark evaluation details are available in the 2014 edition of the International Handbook of Evaluated Reactor Physics Benchmark Experiments.
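Abstracts such as this one quote code-to-experiment agreement in units of the benchmark's experimental uncertainty. A minimal sketch of that check, using illustrative numbers rather than the evaluated HTR-PROTEUS data:

```python
def sigma_deviation(k_calc, k_bench, sigma_bench):
    """Deviation of a calculated k_eff from the benchmark value,
    expressed in units of the benchmark 1-sigma uncertainty: (C - E)/sigma."""
    return (k_calc - k_bench) / sigma_bench

# Illustrative values only: a calculation 0.8% above a benchmark k_eff
# whose experimental uncertainty is 0.3% (1 sigma).
dev = sigma_deviation(1.0080, 1.0000, 0.0030)
print(f"deviation = {dev:.2f} sigma")  # about 2.7 sigma, inside a 3-sigma band
```

A result is typically reported as consistent with the benchmark when this deviation falls within the quoted multiple (here 3σ) of the combined uncertainty.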
Summary of ORSphere critical and reactor physics measurements
NASA Astrophysics Data System (ADS)
Marshall, Margaret A.; Bess, John D.
2017-09-01
In the early 1970s Dr. John T. Mihalczo (team leader), J.J. Lynn, and J.R. Taylor performed experiments at the Oak Ridge Critical Experiments Facility (ORCEF) with highly enriched uranium (HEU) metal (called Oak Ridge Alloy or ORALLOY) to recreate GODIVA I results with greater accuracy than those performed at Los Alamos National Laboratory in the 1950s. The purpose of the Oak Ridge ORALLOY Sphere (ORSphere) experiments was to estimate the unreflected and unmoderated critical mass of an idealized sphere of uranium metal corrected to a density, purity, and enrichment such that it could be compared with the GODIVA I experiments. This critical configuration has been evaluated. Preliminary results were presented at ND2013. Since then, the evaluation was finalized and judged to be an acceptable benchmark experiment for the International Criticality Safety Benchmark Experiment Project (ICSBEP). Additionally, reactor physics measurements were performed to determine surface button worths, central void worth, delayed neutron fraction, prompt neutron decay constant, fission density, and neutron importance. These measurements have been evaluated, found to be acceptable experiments, and discussed in full detail in the International Handbook of Evaluated Reactor Physics Benchmark Experiments. The purpose of this paper is to summarize all of the evaluated critical and reactor physics measurements.
New Reactor Physics Benchmark Data in the March 2012 Edition of the IRPhEP Handbook
DOE Office of Scientific and Technical Information (OSTI.GOV)
John D. Bess; J. Blair Briggs; Jim Gulliford
2012-11-01
The International Reactor Physics Experiment Evaluation Project (IRPhEP) was established to preserve integral reactor physics experimental data, including separate or special effects data, for nuclear energy and technology applications. Numerous experiments performed worldwide represent a large investment of infrastructure, expertise, and cost, and are valuable resources of data for present and future research. These valuable assets provide the basis for the recording, development, and validation of methods. If the experimental data are lost, the high cost to repeat many of these measurements may be prohibitive. The purpose of the IRPhEP is to provide an extensively peer-reviewed set of reactor physics-related integral data that can be used by reactor designers and safety analysts to validate the analytical tools used to design next-generation reactors and establish the safety basis for operation of these reactors. Contributors from around the world collaborate in the evaluation and review of selected benchmark experiments for inclusion in the International Handbook of Evaluated Reactor Physics Benchmark Experiments (IRPhEP Handbook) [1]. Several new evaluations have been prepared for inclusion in the March 2012 edition of the IRPhEP Handbook.
DOE Office of Scientific and Technical Information (OSTI.GOV)
John D. Bess; J. Blair Briggs; Jim Gulliford
2014-10-01
The International Reactor Physics Experiment Evaluation Project (IRPhEP) is a widely recognized, world-class program. The work of the IRPhEP is documented in the International Handbook of Evaluated Reactor Physics Benchmark Experiments (IRPhEP Handbook). Integral data from the IRPhEP Handbook are used by reactor safety and design, nuclear data, criticality safety, and analytical methods development specialists worldwide to perform necessary validations of their calculational techniques. The IRPhEP Handbook is among the most frequently quoted references in the nuclear industry and is expected to remain a valuable resource for decades to come.
Summary of ORSphere Critical and Reactor Physics Measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marshall, Margaret A.; Bess, John D.
In the early 1970s Dr. John T. Mihalczo (team leader), J. J. Lynn, and J. R. Taylor performed experiments at the Oak Ridge Critical Experiments Facility (ORCEF) with highly enriched uranium (HEU) metal (called Oak Ridge Alloy or ORALLOY) to recreate GODIVA I results with greater accuracy than those performed at Los Alamos National Laboratory in the 1950s. The purpose of the Oak Ridge ORALLOY Sphere (ORSphere) experiments was to estimate the unreflected and unmoderated critical mass of an idealized sphere of uranium metal corrected to a density, purity, and enrichment such that it could be compared with the GODIVA I experiments. This critical configuration has been evaluated. Preliminary results were presented at ND2013. Since then, the evaluation was finalized and judged to be an acceptable benchmark experiment for the International Criticality Safety Benchmark Experiment Project (ICSBEP). Additionally, reactor physics measurements were performed to determine surface button worths, central void worth, delayed neutron fraction, prompt neutron decay constant, fission density, and neutron importance. These measurements have been evaluated, found to be acceptable experiments, and discussed in full detail in the International Handbook of Evaluated Reactor Physics Benchmark Experiments. The purpose of this paper is to summarize all the critical and reactor physics measurement evaluations and, when possible, to compare them to the GODIVA experiment results.
INTEGRAL BENCHMARK DATA FOR NUCLEAR DATA TESTING THROUGH THE ICSBEP AND THE NEWLY ORGANIZED IRPHEP
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. Blair Briggs; Lori Scott; Yolanda Rugama
The status of the International Criticality Safety Benchmark Evaluation Project (ICSBEP) was last reported at the International Conference on Nuclear Data for Science and Technology, ND-2004, in Santa Fe, New Mexico. Since that time, the number and type of integral benchmarks have increased significantly. Included in the ICSBEP Handbook are criticality-alarm/shielding and fundamental physics benchmarks in addition to the traditional critical/subcritical benchmark data. Since ND-2004, a reactor physics counterpart to the ICSBEP, the International Reactor Physics Experiment Evaluation Project (IRPhEP), was initiated. The IRPhEP is patterned after the ICSBEP but focuses on other integral measurements, such as buckling, spectral characteristics, reactivity effects, reactivity coefficients, kinetics measurements, reaction-rate and power distributions, nuclide compositions, and other miscellaneous-type measurements, in addition to the critical configuration. The status of these two projects is discussed and selected benchmarks are highlighted in this paper.
Providing Nuclear Criticality Safety Analysis Education through Benchmark Experiment Evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
John D. Bess; J. Blair Briggs; David W. Nigg
2009-11-01
One of the challenges that today's new workforce of nuclear criticality safety engineers face is the opportunity to provide assessment of nuclear systems and establish safety guidelines without having received significant experience or hands-on training prior to graduation. Participation in the International Criticality Safety Benchmark Evaluation Project (ICSBEP) and/or the International Reactor Physics Experiment Evaluation Project (IRPhEP) provides students and young professionals the opportunity to gain experience and enhance critical engineering skills.
Marshall, Margaret A.
2014-11-04
In the early 1970s Dr. John T. Mihalczo (team leader), J.J. Lynn, and J.R. Taylor performed experiments at the Oak Ridge Critical Experiments Facility (ORCEF) with highly enriched uranium (HEU) metal (called Oak Ridge Alloy or ORALLOY) in an effort to recreate GODIVA I results with greater accuracy than those performed at Los Alamos National Laboratory in the 1950s. The purpose of the Oak Ridge ORALLOY Sphere (ORSphere) experiments was to estimate the unreflected and unmoderated critical mass of an idealized sphere of uranium metal corrected to a density, purity, and enrichment such that it could be compared with the GODIVA I experiments. Additionally, various material reactivity worths, the surface material worth coefficient, the delayed neutron fraction, the prompt neutron decay constant, relative fission density, and relative neutron importance were all measured. The critical assembly, material reactivity worths, the surface material worth coefficient, and the delayed neutron fraction were all evaluated as benchmark experiment measurements. The reactor physics measurements are the focus of this paper, although for clarity the critical assembly benchmark specifications are briefly discussed.
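Two of the quantities listed above are directly linked: at delayed critical, the measured prompt neutron decay constant and the effective delayed neutron fraction together determine the prompt-neutron generation time. A sketch of that relation, using illustrative values typical of a bare HEU metal sphere rather than the evaluated ORSphere results:

```python
# At delayed critical, the prompt-neutron decay constant alpha_DC satisfies
#   alpha_DC = beta_eff / Lambda
# where beta_eff is the effective delayed-neutron fraction and Lambda is the
# prompt-neutron generation time. The numbers below are illustrative only.
beta_eff = 0.0065   # effective delayed-neutron fraction (illustrative)
alpha_dc = 1.1e6    # prompt-neutron decay constant at delayed critical, 1/s
gen_time = beta_eff / alpha_dc
print(f"prompt-neutron generation time ~ {gen_time * 1e9:.1f} ns")
```

For a fast, unreflected metal assembly this works out to a generation time of a few nanoseconds, which is why such systems are attractive for kinetics benchmark measurements.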
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bess, John D.; Sterbentz, James W.; Snoj, Luka
PROTEUS is a zero-power research reactor based on a cylindrical graphite annulus with a central cylindrical cavity. The graphite annulus remains basically the same for all experimental programs, but the contents of the central cavity are changed according to the type of reactor being investigated. Through most of its service history, PROTEUS has represented light-water reactors, but from 1992 to 1996 PROTEUS was configured as a pebble-bed reactor (PBR) critical facility and designated as HTR-PROTEUS. The nomenclature was used to indicate that this series consisted of High Temperature Reactor experiments performed in the PROTEUS assembly. During this period, seventeen critical configurations were assembled and various reactor physics experiments were conducted. These experiments included measurements of criticality, differential and integral control rod and safety rod worths, kinetics, reaction rates, water ingress effects, and small sample reactivity effects (Ref. 3). HTR-PROTEUS was constructed, and the experimental program was conducted, for the purpose of providing experimental benchmark data for assessment of reactor physics computer codes. Considerable effort was devoted to benchmark calculations as a part of the HTR-PROTEUS program. References 1 and 2 provide detailed data for use in constructing models for codes to be assessed. Reference 3 is a comprehensive summary of the HTR-PROTEUS experiments and the associated benchmark program. This document draws freely from these references. Only Cores 9 and 10 are evaluated in this benchmark report due to similarities in their construction. The other core configurations of the HTR-PROTEUS program are evaluated in their respective reports as outlined in Section 1.0. Cores 9 and 10 were evaluated and determined to be acceptable benchmark experiments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bess, John D.
2014-03-01
PROTEUS is a zero-power research reactor based on a cylindrical graphite annulus with a central cylindrical cavity. The graphite annulus remains basically the same for all experimental programs, but the contents of the central cavity are changed according to the type of reactor being investigated. Through most of its service history, PROTEUS has represented light-water reactors, but from 1992 to 1996 PROTEUS was configured as a pebble-bed reactor (PBR) critical facility and designated as HTR-PROTEUS. The nomenclature was used to indicate that this series consisted of High Temperature Reactor experiments performed in the PROTEUS assembly. During this period, seventeen critical configurations were assembled and various reactor physics experiments were conducted. These experiments included measurements of criticality, differential and integral control rod and safety rod worths, kinetics, reaction rates, water ingress effects, and small sample reactivity effects (Ref. 3). HTR-PROTEUS was constructed, and the experimental program was conducted, for the purpose of providing experimental benchmark data for assessment of reactor physics computer codes. Considerable effort was devoted to benchmark calculations as a part of the HTR-PROTEUS program. References 1 and 2 provide detailed data for use in constructing models for codes to be assessed. Reference 3 is a comprehensive summary of the HTR-PROTEUS experiments and the associated benchmark program. This document draws freely from these references. Only Cores 9 and 10 are evaluated in this benchmark report due to similarities in their construction. The other core configurations of the HTR-PROTEUS program are evaluated in their respective reports as outlined in Section 1.0. Cores 9 and 10 were evaluated and determined to be acceptable benchmark experiments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
John D. Bess
2013-03-01
PROTEUS is a zero-power research reactor based on a cylindrical graphite annulus with a central cylindrical cavity. The graphite annulus remains basically the same for all experimental programs, but the contents of the central cavity are changed according to the type of reactor being investigated. Through most of its service history, PROTEUS has represented light-water reactors, but from 1992 to 1996 PROTEUS was configured as a pebble-bed reactor (PBR) critical facility and designated as HTR-PROTEUS. The nomenclature was used to indicate that this series consisted of High Temperature Reactor experiments performed in the PROTEUS assembly. During this period, seventeen critical configurations were assembled and various reactor physics experiments were conducted. These experiments included measurements of criticality, differential and integral control rod and safety rod worths, kinetics, reaction rates, water ingress effects, and small sample reactivity effects (Ref. 3). HTR-PROTEUS was constructed, and the experimental program was conducted, for the purpose of providing experimental benchmark data for assessment of reactor physics computer codes. Considerable effort was devoted to benchmark calculations as a part of the HTR-PROTEUS program. References 1 and 2 provide detailed data for use in constructing models for codes to be assessed. Reference 3 is a comprehensive summary of the HTR-PROTEUS experiments and the associated benchmark program. This document draws freely from these references. Only Cores 9 and 10 are evaluated in this benchmark report due to similarities in their construction. The other core configurations of the HTR-PROTEUS program are evaluated in their respective reports as outlined in Section 1.0. Cores 9 and 10 were evaluated and determined to be acceptable benchmark experiments.
Benchmark Evaluation of the HTR-PROTEUS Absorber Rod Worths (Core 4)
DOE Office of Scientific and Technical Information (OSTI.GOV)
John D. Bess; Leland M. Montierth
2014-06-01
PROTEUS was a zero-power research reactor at the Paul Scherrer Institute (PSI) in Switzerland. The critical assembly was constructed from a large graphite annulus surrounding a central cylindrical cavity. Various experimental programs were investigated in PROTEUS; during the years 1992 through 1996, it was configured as a pebble-bed reactor and designated HTR-PROTEUS. Various critical configurations were assembled, each accompanied by an assortment of reactor physics experiments including differential and integral absorber rod measurements, kinetics, reaction rate distributions, water ingress effects, and small sample reactivity effects [1]. Four benchmark reports were previously prepared and included in the March 2013 edition of the International Handbook of Evaluated Reactor Physics Benchmark Experiments (IRPhEP Handbook) [2], evaluating eleven critical configurations. A summary of that effort was previously provided [3], and an analysis of absorber rod worth measurements for Cores 9 and 10 was performed prior to this analysis and included in PROTEUS-GCR-EXP-004 [4]. In the current benchmark effort, absorber rod worths measured for Core Configuration 4, the only core with a randomly-packed pebble loading, have been evaluated for inclusion as a revision to the HTR-PROTEUS benchmark report PROTEUS-GCR-EXP-002.
Evaluation of Neutron Radiography Reactor LEU-Core Start-Up Measurements
Bess, John D.; Maddock, Thomas L.; Smolinski, Andrew T.; ...
2014-11-04
Benchmark models were developed to evaluate the cold-critical start-up measurements performed during the fresh core reload of the Neutron Radiography (NRAD) reactor with Low Enriched Uranium (LEU) fuel. Experiments include criticality, control-rod worth measurements, shutdown margin, and excess reactivity for four core loadings with 56, 60, 62, and 64 fuel elements. The worth of four graphite reflector block assemblies and an empty dry tube used for experiment irradiations were also measured and evaluated for the 60-fuel-element core configuration. Dominant uncertainties in the experimental keff come from uncertainties in the manganese content and impurities in the stainless steel fuel cladding, as well as the ²³⁶U and erbium poison content in the fuel matrix. Calculations with MCNP5 and ENDF/B-VII.0 neutron nuclear data are approximately 1.4% (9σ) greater than the benchmark model eigenvalues, which is commonly seen in Monte Carlo simulations of other TRIGA reactors. Simulations of the worth measurements are within the 2σ uncertainty for most of the benchmark experiment worth values. The complete benchmark evaluation details are available in the 2014 edition of the International Handbook of Evaluated Reactor Physics Benchmark Experiments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bess, John D.; Briggs, J. Blair; Ivanova, Tatiana
2017-02-01
In the past several decades, numerous experiments have been performed worldwide to support reactor operations, measurements, design, and nuclear safety. Those experiments represent an extensive international investment in infrastructure, expertise, and cost, and they constitute significantly valuable resources of data supporting past, current, and future research activities. Those valuable assets provide the basis for the recording, development, and validation of our nuclear methods and integral nuclear data [1]. The loss of these experimental data, which has occurred all too often in recent years, is tragic. The high cost to repeat many of these measurements can be prohibitive, if not impossible, to surmount. Two international projects were developed, and are under the direction of the Organisation for Economic Co-operation and Development Nuclear Energy Agency (OECD NEA), to address the challenges of not just data preservation, but evaluation of the data to determine its merit for modern and future use. The International Criticality Safety Benchmark Evaluation Project (ICSBEP) was established to identify and verify comprehensive critical benchmark data sets; evaluate the data, including quantification of biases and uncertainties; compile the data and calculations in a standardized format; and formally document the effort into a single source of verified benchmark data [2]. Similarly, the International Reactor Physics Experiment Evaluation Project (IRPhEP) was established to preserve integral reactor physics experimental data, including separate or special effects data for nuclear energy and technology applications [3]. Annually, contributors from around the world continue to collaborate in the evaluation and review of select benchmark experiments for preservation and dissemination.
The extensively peer-reviewed integral benchmark data can then be used by nuclear design and safety analysts to validate the analytical tools, methods, and data needed for next-generation reactor design, safety analysis requirements, and all other front- and back-end activities of the nuclear fuel cycle where quality neutronics calculations are paramount.
Integral Full Core Multi-Physics PWR Benchmark with Measured Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forget, Benoit; Smith, Kord; Kumar, Shikhar
In recent years, the importance of modeling and simulation has been highlighted extensively in the DOE research portfolio, with concrete examples in nuclear engineering such as the CASL and NEAMS programs. These research efforts, and similar efforts worldwide, aim at the development of high-fidelity multi-physics analysis tools for the simulation of current and next-generation nuclear power reactors. Like all analysis tools, verification and validation are essential to guarantee proper functioning of the software and methods employed. The current approach relies mainly on the validation of single-physics phenomena (e.g., critical experiments, flow loops), and there is a lack of the relevant multi-physics benchmark measurements necessary to validate the high-fidelity methods being developed today. This work introduces a new multi-cycle full-core Pressurized Water Reactor (PWR) depletion benchmark based on two operational cycles of a commercial nuclear power plant, providing a detailed description of fuel assemblies, burnable absorbers, in-core fission detectors, and core loading and re-loading patterns. This benchmark enables analysts to develop extremely detailed reactor core models that can be used for testing and validation of coupled neutron transport, thermal-hydraulics, and fuel isotopic depletion. The benchmark also provides measured reactor data for Hot Zero Power (HZP) physics tests, boron letdown curves, and three-dimensional in-core flux maps from 58 instrumented assemblies. The benchmark description is now available online and has been used by many groups; however, much work remains to be done on the quantification of uncertainties and modeling sensitivities. This work aims to address these deficiencies and make this benchmark a true non-proprietary international benchmark for the validation of high-fidelity tools.
This report details the BEAVRS uncertainty quantification for the first two cycles of operation and serves as the final report of the project.
Methodology and issues of integral experiments selection for nuclear data validation
NASA Astrophysics Data System (ADS)
Tatiana, Ivanova; Ivanov, Evgeny; Hill, Ian
2017-09-01
Nuclear data validation involves a large suite of Integral Experiments (IEs) for criticality, reactor physics, and dosimetry applications [1]. Often benchmarks are taken from international handbooks [2, 3]. Depending on the application, IEs have different degrees of usefulness in validation, and the use of a single benchmark is usually not advised; indeed, it may lead to erroneous interpretations and results [1]. This work aims at quantifying the importance of the benchmarks used in application-dependent cross-section validation. The approach is based on the well-known Generalized Linear Least Squares Method (GLLSM), extended to establish biases and uncertainties for given cross sections within a given energy interval. The statistical treatment results in a vector of weighting factors for the integral benchmarks; these factors characterize the value added by each benchmark to nuclear data validation for the given application. The methodology is illustrated with one example: selecting benchmarks for 239Pu cross-section validation. The studies were performed in the framework of Subgroup 39 (Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files), established at the Working Party on International Nuclear Data Evaluation Co-operation (WPEC) of the Nuclear Science Committee of the OECD Nuclear Energy Agency (NEA).
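For the special case of a single adjusted cross-section parameter, a GLLS update of the kind described above reduces to a precision-weighted combination in which each benchmark's weighting factor is its share of the posterior precision. A minimal sketch under that simplification; all sensitivities, discrepancies, and variances below are hypothetical:

```python
# One-parameter GLLS sketch: a single relative cross-section adjustment x,
# informed by benchmarks with sensitivities s_i (dk/k per dx/x) and
# E - C discrepancies d_i with variances v_i. All numbers are hypothetical.
prior_var = 0.05**2                 # prior relative variance of the cross section
s = [0.8, 0.3, 0.05]                # benchmark sensitivities to the parameter
d = [0.004, 0.002, 0.001]           # E - C discrepancies in k
v = [0.002**2, 0.003**2, 0.005**2]  # benchmark variances (1-sigma squared)

# Precision-weighted GLLS update for a scalar parameter.
precision = 1.0 / prior_var + sum(si * si / vi for si, vi in zip(s, v))
x_post = sum(si * di / vi for si, di, vi in zip(s, d, v)) / precision

# Weighting factor of each benchmark = its share of the posterior precision.
weights = [(si * si / vi) / precision for si, vi in zip(s, v)]
print(x_post, [round(w, 4) for w in weights])
```

The most sensitive, least uncertain benchmark dominates the weights, which is the behavior the weighting-factor vector is meant to capture.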
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lell, R. M.; Schaefer, R. W.; McKnight, R. D.
Over a period of 30 years, more than a hundred Zero Power Reactor (ZPR) critical assemblies were constructed at Argonne National Laboratory. The ZPR facilities, ZPR-3, ZPR-6, ZPR-9 and ZPPR, were all fast critical assembly facilities. The ZPR critical assemblies were constructed to support fast reactor development, but data from some of these assemblies are also well suited to form the basis for criticality safety benchmarks. Of the three classes of ZPR assemblies, engineering mockups, engineering benchmarks and physics benchmarks, the last group tends to be most useful for criticality safety. Because physics benchmarks were designed to test fast reactor physics data and methods, they were as simple as possible in geometry and composition. The principal fissile species was 235U or 239Pu. Fuel enrichments ranged from 9% to 95%. Often there were only one or two main core diluent materials, such as aluminum, graphite, iron, sodium or stainless steel. The cores were reflected (and insulated from room-return effects) by one or two layers of materials such as depleted uranium, lead or stainless steel. Despite their more complex nature, a small number of assemblies from the other two classes would make useful criticality safety benchmarks because they have features related to criticality safety issues, such as reflection by soil-like material. The term 'benchmark' in a ZPR program connotes a particularly simple loading aimed at gaining basic reactor physics insight, as opposed to studying a reactor design. In fact, the ZPR-6/7 Benchmark Assembly (Reference 1) had a very simple core unit cell assembled from plates of depleted uranium, sodium, iron oxide, U3O8, and plutonium. The ZPR-6/7 core cell-average composition is typical of the interior region of liquid-metal fast breeder reactors (LMFBRs) of the era.
It was one part of the Demonstration Reactor Benchmark Program, which provided integral experiments characterizing the important features of demonstration-size LMFBRs. As a benchmark, ZPR-6/7 was devoid of many 'real' reactor features, such as simulated control rods and multiple enrichment zones, in its reference form. Those kinds of features were investigated experimentally in variants of the reference ZPR-6/7 or in other critical assemblies of the Demonstration Reactor Benchmark Program.
Educating Next Generation Nuclear Criticality Safety Engineers at the Idaho National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. D. Bess; J. B. Briggs; A. S. Garcia
2011-09-01
One of the challenges in educating our next generation of nuclear safety engineers is the limited opportunity to receive significant experience or hands-on training prior to graduation. Such training is generally restricted to on-the-job training before this new engineering workforce can adequately assess nuclear systems and establish safety guidelines. Participation in the International Criticality Safety Benchmark Evaluation Project (ICSBEP) and the International Reactor Physics Experiment Evaluation Project (IRPhEP) can provide students and young professionals the opportunity to gain experience and enhance critical engineering skills. The ICSBEP and IRPhEP publish annual handbooks that contain evaluations of experiments along with summarized experimental data and peer-reviewed benchmark specifications to support the validation of neutronics codes, nuclear cross-section data, and reactor designs. Participation in the benchmark process not only benefits those who use these handbooks within the international community, but provides the individual with opportunities for professional development, networking with an international community of experts, and valuable experience for future employment. Traditionally, students have participated in benchmarking activities via internships at national laboratories, universities, or companies involved with the ICSBEP and IRPhEP programs. Additional programs have been developed to facilitate the nuclear education of students while participating in the benchmark projects. These include coordination with the Center for Space Nuclear Research (CSNR) Next Degree Program, collaboration with the Department of Energy Idaho Operations Office to train nuclear and criticality safety engineers, and student evaluations serving as the basis for Master's theses in nuclear engineering.
Benchmarking atomic physics models for magnetically confined fusion plasma physics experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
May, M.J.; Finkenthal, M.; Soukhanovskii, V.
In present magnetically confined fusion devices, high- and intermediate-Z impurities are either puffed into the plasma for divertor radiative cooling experiments or are sputtered from the high-Z plasma-facing armor. The beneficial cooling of the edge, as well as the detrimental radiative losses from the core, caused by these impurities can be properly understood only if the atomic physics used in the modeling of the cooling curves is very accurate. To this end, a comprehensive experimental and theoretical analysis of some relevant impurities is undertaken. Gases (Ne, Ar, Kr, and Xe) are puffed, and nongases are introduced through laser ablation, into the FTU tokamak plasma. The charge state distributions and total density of these impurities are determined from spatial scans of several photometrically calibrated vacuum ultraviolet and x-ray spectrographs (3-1600 Å), the Multiple Ionization State Transport (MIST) code, and a collisional radiative model. The radiative power losses are measured with bolometry, and the emissivity profiles are measured by a visible bremsstrahlung array. The ionization balance, excitation physics, and radiative cooling curves are computed with the Hebrew University Lawrence Livermore Atomic Code (HULLAC) and are benchmarked by these experiments. (Supported by U.S. DOE Grant No. DE-FG02-86ER53214 at JHU and Contract No. W-7405-ENG-48 at LLNL.) © 1999 American Institute of Physics.
Gadolinia depletion analysis by CASMO-4
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kobayashi, Y.; Saji, E.; Toba, A.
1993-01-01
CASMO-4 is the most recent version of the lattice physics code CASMO introduced by Studsvik. The principal aspects of the CASMO-4 model that differ from previous CASMO versions are: (1) a heterogeneous model for two-dimensional transport theory calculations; and (2) a microregion depletion model for burnable absorbers, such as gadolinia. Of these, the first has previously been benchmarked against measured data from critical experiments and Monte Carlo calculations, verifying its high degree of accuracy. To proceed with CASMO-4 benchmarking, it is desirable to benchmark the microregion depletion model, which enables CASMO-4 to calculate gadolinium depletion directly, without the need for precalculated MICBURN cross-section data. This paper presents the benchmarking results for the microregion depletion model in CASMO-4 using measured data from depleted gadolinia rods.
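The need for a microregion treatment comes from self-shielding: gadolinium absorbs thermal neutrons so strongly that it burns out from the pellet surface inward, so a single-region model misrepresents the absorber worth over burnup. A toy onion-ring illustration of that behavior (this is not the CASMO-4 model; all constants are hypothetical):

```python
# Toy microregion ("onion-ring") gadolinium depletion: strong absorption in the
# outer rings shields the inner rings, so Gd burns out from the pellet surface
# inward. All constants are hypothetical illustration values.
rings = 5
n_gd = [1.0] * rings        # relative Gd number density per ring, outer -> inner
sigma_phi = 0.5             # absorption rate per unit incident flux (1/step)
steps = 40

for _ in range(steps):
    attenuation = 1.0       # fraction of flux surviving to the current ring
    for j in range(rings):
        rate = sigma_phi * attenuation
        attenuation *= max(0.0, 1.0 - 0.8 * n_gd[j])  # crude self-shielding
        n_gd[j] *= max(0.0, 1.0 - rate)               # deplete this ring

print([round(n, 3) for n in n_gd])  # outer rings deplete first
```

The outer ring sees the full flux and vanishes first; inner rings only begin to deplete once the rings outside them have burned out, which is the spatial progression the microregion model resolves.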
Contributions to Integral Nuclear Data in ICSBEP and IRPhEP since ND 2013
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bess, John D.; Briggs, J. Blair; Gulliford, Jim
2016-09-01
The status of the International Criticality Safety Benchmark Evaluation Project (ICSBEP) and the International Reactor Physics Experiment Evaluation Project (IRPhEP) was last discussed directly with the international nuclear data community at ND2013. Since ND2013, the integral benchmark data available for nuclear data testing have continued to increase. The status of the international benchmark efforts and the latest contributions to integral nuclear data testing are discussed, and select benchmark configurations added to the ICSBEP and IRPhEP Handbooks since ND2013 are highlighted. The 2015 edition of the ICSBEP Handbook now contains 567 evaluations with benchmark specifications for 4,874 critical, near-critical, or subcritical configurations; 31 criticality alarm placement/shielding configurations with multiple dose points apiece; and 207 configurations categorized as fundamental physics measurements relevant to criticality safety applications. The 2015 edition of the IRPhEP Handbook contains data from 143 different experimental series performed at 50 different nuclear facilities. Currently, 139 of the 143 evaluations are published as approved benchmarks, with the remaining four published in draft format only. Measurements found in the IRPhEP Handbook include criticality, buckling and extrapolation length, spectral characteristics, reactivity effects, reactivity coefficients, kinetics, reaction-rate distributions, power distributions, isotopic compositions, and other miscellaneous measurements for various types of reactor systems. Annual technical review meetings for both projects were held in April 2016; additional approved benchmark evaluations will be included in the 2016 editions of these handbooks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bess, J. D.; Briggs, J. B.; Gulliford, J.
Overview of Experiments to Study the Physics of Fast Reactors Represented in the International Handbooks of Critical and Reactor Experiments. John D. Bess, Idaho National Laboratory; Jim Gulliford and Tatiana Ivanova, Nuclear Energy Agency of the Organisation for Economic Co-operation and Development; E. V. Rozhikhin, M. Yu. Semenov, and A. M. Tsibulya, Institute of Physics and Power Engineering. Studies of fast reactor physics have traditionally used the experiments presented in the handbook of the Cross Section Evaluation Working Group (CSEWG) (ENDF-202), issued by Brookhaven National Laboratory in 1974. That handbook presents simplified, homogeneous models of the experiments together with the relevant experimental data, as amended. The Nuclear Energy Agency of the Organisation for Economic Co-operation and Development coordinates the activities of two international projects for the collection, evaluation, and documentation of experimental data: the International Criticality Safety Benchmark Evaluation Project (since 1994) and the International Reactor Physics Experiment Evaluation Project (since 2005). These projects produce international handbooks of critical (ICSBEP Handbook) and reactor (IRPhEP Handbook) experiments, which are updated every year. The handbooks present detailed models of the experiments with minimal amendments; such models are of particular interest for modern calculation codes. The handbooks contain a large number of experiments suitable for the study of fast reactor physics. Many of these experiments were performed at specialized critical facilities such as BFS (Russia), ZPR and ZPPR (USA), and ZEBRA (UK), and at the experimental reactors JOYO (Japan) and FFTF (USA). Other experiments, such as compact metal assemblies, are also of interest for fast reactor physics; they were carried out on general-purpose critical facilities at Russian institutes (VNIITF and VNIIEF) and US laboratories (LANL, LLNL, and others).
Also worth mentioning are critical experiments with fast reactor fuel rods in water, which are of interest for the justification of nuclear safety during transportation and storage of fresh and spent fuel. These reports provide a detailed review of the experiments, designate their areas of application, and include results of calculations with modern nuclear data libraries in comparison with the evaluated experimental data.
What can one learn from experiments about the elusive transition state?
Chang, Iksoo; Cieplak, Marek; Banavar, Jayanth R.; Maritan, Amos
2004-01-01
We present the results of an exact analysis of a model energy landscape of a protein to clarify the idea of the transition state and the physical meaning of the φ values determined in protein engineering experiments. We benchmark our findings against various theoretical approaches proposed in the literature for the identification and characterization of the transition state. PMID:15295118
The Paucity Problem: Where Have All the Space Reactor Experiments Gone?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bess, John D.; Marshall, Margaret A.
2016-10-01
The Handbooks of the International Criticality Safety Benchmark Evaluation Project (ICSBEP) and the International Reactor Physics Experiment Evaluation Project (IRPhEP) together contain a plethora of documented and evaluated experiments essential to the validation of nuclear data, neutronics codes, and models of various nuclear systems. Unfortunately, only a minute selection of handbook data (twelve evaluations) covers experimental facilities and mockups designed specifically for space nuclear research. There is a paucity problem: the multitude of space nuclear experimental activities performed in the past several decades have yet to be recovered and made available in such detail that the international community could benefit from these valuable historical research efforts. Those experiments represent extensive investments in infrastructure, expertise, and cost, and constitute significantly valuable resources of data supporting past, present, and future research activities. The ICSBEP and IRPhEP were established to identify and verify comprehensive sets of benchmark data; evaluate the data, including quantification of biases and uncertainties; compile the data and calculations in a standardized format; and formally document the effort into a single source of verified benchmark data. See the full abstract in the attached document.
Dietterich, Hannah; Lev, Einat; Chen, Jiangzhi; Richardson, Jacob A.; Cashman, Katharine V.
2017-01-01
Numerical simulations of lava flow emplacement are valuable for assessing lava flow hazards, forecasting active flows, designing flow mitigation measures, interpreting past eruptions, and understanding the controls on lava flow behavior. Existing lava flow models vary in simplifying assumptions, physics, dimensionality, and the degree to which they have been validated against analytical solutions, experiments, and natural observations. In order to assess existing models and guide the development of new codes, we conduct a benchmarking study of computational fluid dynamics (CFD) models for lava flow emplacement, including VolcFlow, OpenFOAM, FLOW-3D, COMSOL, and MOLASSES. We model viscous, cooling, and solidifying flows over horizontal planes, sloping surfaces, and into topographic obstacles. We compare model results to physical observations made during well-controlled analogue and molten basalt experiments, and to analytical theory when available. Overall, the models accurately simulate viscous flow with some variability in flow thickness where flows intersect obstacles. OpenFOAM, COMSOL, and FLOW-3D can each reproduce experimental measurements of cooling viscous flows, and OpenFOAM and FLOW-3D simulations with temperature-dependent rheology match results from molten basalt experiments. We assess the goodness-of-fit of the simulation results and the computational cost. Our results guide the selection of numerical simulation codes for different applications, including inferring emplacement conditions of past lava flows, modeling the temporal evolution of ongoing flows during eruption, and probabilistic assessment of lava flow hazard prior to eruption. Finally, we outline potential experiments and desired key observational data from future flows that would extend existing benchmarking data sets.
Orsphere: Physics Measurements for Bare, HEU(93.2)-Metal Sphere
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marshall, Margaret A.; Bess, John D.; Briggs, J. Blair
In the early 1970s Dr. John T. Mihalczo (team leader), J. J. Lynn, and J. R. Taylor performed experiments at the Oak Ridge Critical Experiments Facility (ORCEF) with highly enriched uranium (HEU) metal (called Oak Ridge Alloy, or ORALLOY) in an attempt to recreate the GODIVA I results with greater accuracy than achieved at Los Alamos National Laboratory in the 1950s (HEU-MET-FAST-001). The purpose of the Oak Ridge ORALLOY Sphere (ORSphere) experiments was to estimate the unreflected and unmoderated critical mass of an idealized sphere of uranium metal, corrected to a density, purity, and enrichment such that it could be compared with the GODIVA I experiments. "The very accurate description of this sphere, as assembled, establishes it as an ideal benchmark for calculational methods and cross-section data files" (Reference 1). While performing the ORSphere experiments, care was taken to accurately document component dimensions (±0.0001 in.), masses (±0.01 g), and material data. The experiment was also set up to minimize the amount of structural material in the vicinity of the sphere. Two correlated spheres were evaluated and judged acceptable as criticality benchmark experiments; this evaluation is given in HEU-MET-FAST-100. The second, smaller sphere was used for additional reactor physics measurements. Worth measurements (References 1 through 4), the delayed neutron fraction (References 3 through 5), and the surface material worth coefficient (References 1 and 2) were all measured and judged acceptable as benchmark data. The prompt neutron decay (Reference 6), relative fission density (Reference 7), and relative neutron importance (Reference 7) were measured but are not evaluated. Information for the evaluation was compiled from References 1 through 7, the experimental logbooks (References 8 and 9), additional drawings and notes provided by the experimenter, and communication with the lead experimenter, John T. Mihalczo.
GEN-IV Benchmarking of Triso Fuel Performance Models under accident conditions modeling input data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collin, Blaise Paul
This document presents the benchmark plan for the calculation of particle fuel performance in safety testing experiments that are representative of operational accident transients. The benchmark is dedicated to the modeling of fission product release under accident conditions by fuel performance codes from around the world, and the subsequent comparison to post-irradiation examination (PIE) data from the modeled heating tests. The accident condition benchmark is divided into three parts:
• the modeling of a simplified benchmark problem to assess potential numerical calculation issues at low fission product release;
• the modeling of the AGR-1 and HFR-EU1bis safety testing experiments; and
• the comparison of the AGR-1 and HFR-EU1bis modeling results with PIE data.
The simplified benchmark case, hereafter named NCC (Numerical Calculation Case), is derived from "Case 5" of the International Atomic Energy Agency (IAEA) Coordinated Research Program (CRP) on coated particle fuel technology [IAEA 2012]. It is included so participants can evaluate their codes at low fission product release. "Case 5" of the IAEA CRP-6 showed large code-to-code discrepancies in the release of fission products, which were attributed to "effects of the numerical calculation method rather than the physical model" [IAEA 2012]. The NCC is therefore intended to check whether these numerical effects persist. The first two steps imply the involvement of the benchmark participants in a modeling effort following the guidelines and recommendations provided by this document. The third step involves the collection of the modeling results by Idaho National Laboratory (INL) and the comparison of these results with the available PIE data. The objective of this document is to provide all input data necessary to model the benchmark cases, and to give methodology guidelines and recommendations in order to make all results suitable for comparison with each other.
The participants should read this document thoroughly to make sure all the data needed for their calculations are provided. Missing data will be added in a revision of the document if necessary. 09/2016: Tables 6 and 8 updated; AGR-2 input data added.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collin, Blaise P.
2014-09-01
This document presents the benchmark plan for the calculation of particle fuel performance in safety testing experiments that are representative of operational accident transients. The benchmark is dedicated to the modeling of fission product release under accident conditions by fuel performance codes from around the world, and the subsequent comparison to post-irradiation examination (PIE) data from the modeled heating tests. The accident condition benchmark is divided into three parts: the modeling of a simplified benchmark problem to assess potential numerical calculation issues at low fission product release; the modeling of the AGR-1 and HFR-EU1bis safety testing experiments; and the comparison of the AGR-1 and HFR-EU1bis modeling results with PIE data. The simplified benchmark case, hereafter named NCC (Numerical Calculation Case), is derived from "Case 5" of the International Atomic Energy Agency (IAEA) Coordinated Research Program (CRP) on coated particle fuel technology [IAEA 2012]. It is included so participants can evaluate their codes at low fission product release. "Case 5" of the IAEA CRP-6 showed large code-to-code discrepancies in the release of fission products, which were attributed to "effects of the numerical calculation method rather than the physical model" [IAEA 2012]. The NCC is therefore intended to check whether these numerical effects persist. The first two steps imply the involvement of the benchmark participants in a modeling effort following the guidelines and recommendations provided by this document. The third step involves the collection of the modeling results by Idaho National Laboratory (INL) and the comparison of these results with the available PIE data. The objective of this document is to provide all input data necessary to model the benchmark cases, and to give methodology guidelines and recommendations in order to make all results suitable for comparison with each other.
The participants should read this document thoroughly to make sure all the data needed for their calculations are provided. Missing data will be added in a revision of the document if necessary.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lell, R. M.; McKnight, R. D.; Tsiboulia, A.
2010-09-30
Over a period of 30 years, more than a hundred Zero Power Reactor (ZPR) critical assemblies were constructed at Argonne National Laboratory. The ZPR facilities, ZPR-3, ZPR-6, ZPR-9 and ZPPR, were all fast critical assembly facilities. The ZPR critical assemblies were constructed to support fast reactor development, but data from some of these assemblies are also well suited for nuclear data validation and to form the basis for criticality safety benchmarks. A number of the Argonne ZPR/ZPPR critical assemblies have been evaluated as ICSBEP and IRPhEP benchmarks. Of the three classes of ZPR assemblies, engineering mockups, engineering benchmarks and physics benchmarks, the last group tends to be most useful for criticality safety. Because physics benchmarks were designed to test fast reactor physics data and methods, they were as simple as possible in geometry and composition. The principal fissile species was 235U or 239Pu. Fuel enrichments ranged from 9% to 95%. Often there were only one or two main core diluent materials, such as aluminum, graphite, iron, sodium or stainless steel. The cores were reflected (and insulated from room-return effects) by one or two layers of materials such as depleted uranium, lead or stainless steel. Despite their more complex nature, a small number of assemblies from the other two classes would make useful criticality safety benchmarks because they have features related to criticality safety issues, such as reflection by soil-like material. ZPR-3 Assembly 11 (ZPR-3/11) was designed as a fast reactor physics benchmark experiment with an average core 235U enrichment of approximately 12 at.% and a depleted uranium reflector. Approximately 79.7% of the total fissions in this assembly occur above 100 keV, approximately 20.3% occur below 100 keV, and essentially none occur below 0.625 eV; thus the classification as a 'fast' assembly. This assembly is Fast Reactor Benchmark No.
8 in the Cross Section Evaluation Working Group (CSEWG) Benchmark Specifications and has historically been used as a data validation benchmark assembly. Loading of ZPR-3 Assembly 11 began in early January 1958, and the Assembly 11 program ended in late January 1958. The core consisted of highly enriched uranium (HEU) plates and depleted uranium plates loaded into stainless steel drawers, which were inserted into the central square stainless steel tubes of a 31 x 31 matrix on a split-table machine. The core unit cell consisted of two columns of 0.125-in.-wide (3.175 mm) HEU plates, six columns of 0.125-in.-wide (3.175 mm) depleted uranium plates, and one column of 1.0-in.-wide (25.4 mm) depleted uranium plates. The length of each column was 10 in. (254.0 mm) in each half of the core. The axial blanket consisted of 12 in. (304.8 mm) of depleted uranium behind the core. The thickness of the depleted uranium radial blanket was approximately 14 in. (355.6 mm), and the length of the radial blanket in each half of the matrix was 22 in. (558.8 mm). The assembly geometry approximated a right circular cylinder as closely as the square matrix tubes allowed. According to the logbook and loading records for ZPR-3/11, the reference critical configuration was loading 10, which was critical on January 21, 1958. Subsequent loadings were very similar but less clean for criticality because of modifications made to accommodate reactor physics measurements other than criticality. Accordingly, ZPR-3/11 loading 10 was selected as the only configuration for this benchmark. As documented below, it was determined to be acceptable as a criticality safety benchmark experiment. A very accurate transformation to a simplified model is needed to make any ZPR assembly a practical criticality safety benchmark: there is simply too much geometric detail in an exact (as-built) model of a ZPR assembly, even a clean core such as ZPR-3/11 loading 10.
The transformation must reduce the detail to a practical level without masking any of the important features of the critical experiment, and it must do so without increasing the total uncertainty far beyond that of the original experiment. Such a transformation is described in Section 3. It was obtained using a pair of continuous-energy Monte Carlo calculations. First, the critical configuration was modeled in full detail: every plate, drawer, matrix tube, and air gap was modeled explicitly. Then the region-wise compositions and volumes from the detailed as-built model were used to construct a homogeneous, two-dimensional (RZ) model of ZPR-3/11 that conserved the mass of each nuclide and the volume of each region. This simple cylindrical model is the criticality safety benchmark model. The difference in the calculated k_eff values between the as-built three-dimensional model and the homogeneous two-dimensional benchmark model was used to adjust the measured excess reactivity of ZPR-3/11 loading 10 to obtain the k_eff for the benchmark model.
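The final adjustment described above amounts to transferring the calculated difference between the simplified and as-built models onto the measured eigenvalue. A minimal arithmetic sketch with hypothetical values (not those of the ZPR-3/11 evaluation):

```python
# Transfer of the model-simplification bias onto the measured eigenvalue.
# All values are hypothetical, not from the ZPR-3/11 evaluation.
k_measured = 1.0009     # as-built k_eff inferred from measured excess reactivity
k_3d_detailed = 1.0005  # Monte Carlo, plate-by-plate as-built model
k_2d_homog = 0.9998     # Monte Carlo, homogenized RZ benchmark model

# Benchmark-model k_eff = measured value corrected by the calculated
# difference between the simplified and the as-built models.
k_benchmark = k_measured + (k_2d_homog - k_3d_detailed)
print(round(k_benchmark, 4))
```

Because only the calculated difference between the two models enters, much of the Monte Carlo and nuclear data bias cancels, which is what keeps the added uncertainty small.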
Potential of mean force for electrical conductivity of dense plasmas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Starrett, C. E.
2017-09-28
The electrical conductivity in dense plasmas can be calculated with the relaxation-time approximation provided that the interaction potential between the scattering electron and the ion is known. To date there has been considerable uncertainty as to the best way to define this interaction potential so that it correctly includes the effects of ionic structure, screening by electrons and partial ionization. The current approximations lead to significantly different results with varying levels of agreement when compared to benchmark calculations and experiments. Here, we present a new way to define this potential, drawing on ideas from classical fluid theory to define a potential of mean force. This new potential results in significantly improved agreement with experiments and benchmark calculations, and includes all the aforementioned physics self-consistently.
Potential of mean force for electrical conductivity of dense plasmas
NASA Astrophysics Data System (ADS)
Starrett, C. E.
2017-12-01
The electrical conductivity in dense plasmas can be calculated with the relaxation-time approximation provided that the interaction potential between the scattering electron and the ion is known. To date there has been considerable uncertainty as to the best way to define this interaction potential so that it correctly includes the effects of ionic structure, screening by electrons and partial ionization. Current approximations lead to significantly different results with varying levels of agreement when compared to benchmark calculations and experiments. We present a new way to define this potential, drawing on ideas from classical fluid theory to define a potential of mean force. This new potential results in significantly improved agreement with experiments and benchmark calculations, and includes all the aforementioned physics self-consistently.
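In its simplest limit, the relaxation-time approximation invoked in the abstract above reduces to the Drude formula sigma = n_e e^2 tau / m_e. A toy evaluation with illustrative parameters (not values from the paper) shows the order of magnitude involved:

```python
# Drude-limit conductivity in the relaxation-time approximation (SI units).
# Electron density and relaxation time below are illustrative, not from the paper.
e = 1.602176634e-19    # elementary charge, C
me = 9.1093837015e-31  # electron mass, kg
n_e = 1.0e29           # free electron density, m^-3 (dense-plasma-like)
tau = 1.0e-16          # relaxation time, s (illustrative)

sigma = n_e * e**2 * tau / me  # conductivity in S/m
print(f"{sigma:.3e} S/m")
```

The physics content of the paper lies entirely in how the scattering potential, and hence tau, is defined; the formula itself is just the final bookkeeping step.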
Thought Experiment to Examine Benchmark Performance for Fusion Nuclear Data
NASA Astrophysics Data System (ADS)
Murata, Isao; Ohta, Masayuki; Kusaka, Sachie; Sato, Fuminobu; Miyamaru, Hiroyuki
2017-09-01
There are many benchmark experiments carried out so far with DT neutrons, especially aiming at fusion reactor development. These integral experiments seemed broadly to validate the nuclear data below 14 MeV. However, no precise studies exist now. The author's group thus started to examine how well benchmark experiments with DT neutrons can play a benchmarking role for energies below 14 MeV. Recently, as a next phase, to generalize the above discussion, the energy range was expanded to the entire region. In this study, thought experiments with finer energy bins have thus been conducted to discuss how to generally estimate the performance of benchmark experiments. As a result of thought experiments with a point detector, a discrepancy appearing in the benchmark analysis is "equally" due not only to the contribution conveyed directly to the detector, but also to the indirect contribution of the neutrons (named (A)) that produce the directly contributing neutrons, the indirect contribution of the neutrons (B) that produce the neutrons (A), and so on. From this concept, it would become clear from a sensitivity analysis in advance how well, and for which energies, nuclear data could be benchmarked with a given benchmark experiment.
Bostelmann, Friederike; Hammer, Hans R.; Ortensi, Javier; ...
2015-12-30
Within the framework of the IAEA Coordinated Research Project on HTGR Uncertainty Analysis in Modeling, criticality calculations of the Very High Temperature Critical Assembly experiment were performed as the validation reference to the prismatic MHTGR-350 lattice calculations. Criticality measurements performed at several temperature points at this Japanese graphite-moderated facility were recently included in the International Handbook of Evaluated Reactor Physics Benchmark Experiments, and represent one of the few data sets available for the validation of HTGR lattice physics. This work compares VHTRC criticality simulations utilizing the Monte Carlo codes Serpent and SCALE/KENO-VI. Reasonable agreement was found between Serpent and KENO-VI, but only the use of the latest ENDF cross section library release, namely the ENDF/B-VII.1 library, led to an improved match with the measured data. Furthermore, the fourth beta release of SCALE 6.2/KENO-VI showed significant improvements over the current SCALE 6.1.2 version when compared to the experimental values and Serpent.
Benchmarking the ATLAS software through the Kit Validation engine
NASA Astrophysics Data System (ADS)
De Salvo, Alessandro; Brasolin, Franco
2010-04-01
The measurement of the experiment software performance is a very important metric in order to choose the most effective resources to be used and to discover the bottlenecks of the code implementation. In this work we present the benchmark techniques used to measure the ATLAS software performance through the ATLAS offline testing engine Kit Validation and the online portal Global Kit Validation. The performance measurements, the data collection, the online analysis and display of the results will be presented. The results of the measurement on different platforms and architectures will be shown, giving a full report on the CPU power and memory consumption of the Monte Carlo generation, simulation, digitization and reconstruction of the most CPU-intensive channels. The impact of the multi-core computing on the ATLAS software performance will also be presented, comparing the behavior of different architectures when increasing the number of concurrent processes. The benchmark techniques described in this paper have been used in the HEPiX group since the beginning of 2008 to help defining the performance metrics for the High Energy Physics applications, based on the real experiment software.
Aerothermal modeling program, phase 1
NASA Technical Reports Server (NTRS)
Sturgess, G. J.
1983-01-01
The physical modeling embodied in the computational fluid dynamics codes is discussed. The objectives were to identify shortcomings in the models and to provide a program plan to improve the quantitative accuracy. The physical models studied were for: turbulent mass and momentum transport, heat release, liquid fuel spray, and gaseous radiation. The approach adopted was to test the models against appropriate benchmark-quality test cases from experiments in the literature for the constituent flows that together make up the combustor real flow.
EBR-II Reactor Physics Benchmark Evaluation Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, Chad L.; Lum, Edward S; Stewart, Ryan
This report provides a reactor physics benchmark evaluation with associated uncertainty quantification for the critical configuration of the April 1986 Experimental Breeder Reactor II Run 138B core configuration.
CMS Physics Technical Design Report, Volume II: Physics Performance
NASA Astrophysics Data System (ADS)
CMS Collaboration
2007-06-01
CMS is a general purpose experiment, designed to study the physics of pp collisions at 14 TeV at the Large Hadron Collider (LHC). It currently involves more than 2000 physicists from more than 150 institutes and 37 countries. The LHC will provide extraordinary opportunities for particle physics based on its unprecedented collision energy and luminosity when it begins operation in 2007. The principal aim of this report is to present the strategy of CMS to explore the rich physics programme offered by the LHC. This volume demonstrates the physics capability of the CMS experiment. The prime goals of CMS are to explore physics at the TeV scale and to study the mechanism of electroweak symmetry breaking, through the discovery of the Higgs particle or otherwise. To carry out this task, CMS must be prepared to search for new particles, such as the Higgs boson or supersymmetric partners of the Standard Model particles, from the start-up of the LHC, since new physics at the TeV scale may manifest itself with modest data samples of the order of a few fb⁻¹ or less. The analysis tools that have been developed are applied to study in great detail, and with all the methodology of performing an analysis on CMS data, specific benchmark processes upon which to gauge the performance of CMS. These processes cover several Higgs boson decay channels, the production and decay of new particles such as Z′ and supersymmetric particles, B_s production and processes in heavy ion collisions. The simulation of these benchmark processes includes subtle effects such as possible detector miscalibration and misalignment. Besides these benchmark processes, the physics reach of CMS is studied for a large number of signatures arising in the Standard Model and also in theories beyond the Standard Model for integrated luminosities ranging from 1 fb⁻¹ to 30 fb⁻¹.
The Standard Model processes include QCD, B-physics, diffraction, detailed studies of the top quark properties, and electroweak physics topics such as the W and Z⁰ boson properties. The production and decay of the Higgs particle is studied for many observable decays, and the precision with which the Higgs boson properties can be derived is determined. About ten different supersymmetry benchmark points are analysed using full simulation. The CMS discovery reach is evaluated in the SUSY parameter space covering a large variety of decay signatures. Furthermore, the discovery reach for a plethora of alternative models for new physics is explored, notably extra dimensions, new vector boson high mass states, little Higgs models, technicolour and others. Methods to discriminate between models have been investigated. This report is organized as follows. Chapter 1, the Introduction, describes the context of this document. Chapters 2-6 describe examples of full analyses, with photons, electrons, muons, jets, missing E_T, B-mesons and τ's, and for quarkonia in heavy ion collisions. Chapters 7-15 describe the physics reach for Standard Model processes, Higgs discovery and searches for new physics beyond the Standard Model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marshall, Margaret A.
In the early 1970s Dr. John T. Mihalczo (team leader), J.J. Lynn, and J.R. Taylor performed experiments at the Oak Ridge Critical Experiments Facility (ORCEF) with highly enriched uranium (HEU) metal (called Oak Ridge Alloy or ORALLOY) in an effort to recreate GODIVA I results with greater accuracy than those performed at Los Alamos National Laboratory in the 1950s. The purpose of the Oak Ridge ORALLOY Sphere (ORSphere) experiments was to estimate the unreflected and unmoderated critical mass of an idealized sphere of uranium metal corrected to a density, purity, and enrichment such that it could be compared with the GODIVA I experiments. Additionally, various material reactivity worths, the surface material worth coefficient, the delayed neutron fraction, the prompt neutron decay constant, relative fission density, and relative neutron importance were all measured. The critical assembly, material reactivity worths, the surface material worth coefficient, and the delayed neutron fraction were all evaluated as benchmark experiment measurements. The reactor physics measurements are the focus of this paper; although for clarity the critical assembly benchmark specifications are briefly discussed.
Benchmarking Attosecond Physics with Atomic Hydrogen
2015-05-25
theoretical simulations are available in this regime. We provided accurate reference data on the photoionization yield and the CEP-dependent...this difficulty. This experiment claimed to show that, contrary to current understanding, the photoionization of an atomic electron is not an... photoion yield and transferable intensity calibration. The dependence of photoionization probability on laser intensity is one of the most
Xia, Yuan; Deshpande, Sameer; Bonates, Tiberius
2016-11-01
Social marketing managers promote desired behaviors to an audience by making them tangible in the form of environmental opportunities to enhance benefits and reduce barriers. This study proposed "benchmarks," modified from those found in the past literature, that would match important concepts of the social marketing framework and the inclusion of which would ensure behavior change effectiveness. In addition, we analyzed behavior change interventions on a "social marketing continuum" to assess whether the number of benchmarks and the role of specific benchmarks influence the effectiveness of physical activity promotion efforts. A systematic review of social marketing interventions available in academic studies published between 1997 and 2013 revealed 173 conditions in 92 interventions. Findings based on χ², Mallows' Cp, and Logical Analysis of Data tests revealed that the presence of more benchmarks in interventions increased the likelihood of success in promoting physical activity. The presence of more than 3 benchmarks improved the success of the interventions; specifically, all interventions were successful when more than 7.5 benchmarks were present. Further, primary formative research, core product, actual product, augmented product, promotion, and behavioral competition all had a significant influence on the effectiveness of interventions. Social marketing is an effective approach in promoting physical activity among adults when a substantial number of benchmarks are used and when managers understand the audience, make the desired behavior tangible, and promote the desired behavior persuasively.
The MCUCN simulation code for ultracold neutron physics
NASA Astrophysics Data System (ADS)
Zsigmond, G.
2018-02-01
Ultracold neutrons (UCN) have very low kinetic energies (0-300 neV) and can therefore be stored in specific material or magnetic confinements for many hundreds of seconds. This makes them a very useful tool in probing fundamental symmetries of nature (for instance charge-parity violation by neutron electric dipole moment experiments) and contributing important parameters for Big Bang nucleosynthesis (neutron lifetime measurements). Improved precision experiments are in construction at new and planned UCN sources around the world. MC simulations play an important role in the optimization of such systems with a large number of parameters, but also in the estimation of systematic effects, in benchmarking of analysis codes, or as part of the analysis. The MCUCN code written at PSI has been extensively used for the optimization of the UCN source optics and in the optimization and analysis of (test) experiments within the nEDM project based at PSI. In this paper we present the main features of MCUCN and interesting benchmark and application examples.
A Seafloor Benchmark for 3-dimensional Geodesy
NASA Astrophysics Data System (ADS)
Chadwell, C. D.; Webb, S. C.; Nooner, S. L.
2014-12-01
We have developed an inexpensive, permanent seafloor benchmark to increase the longevity of seafloor geodetic measurements. The benchmark provides a physical tie to the sea floor lasting for decades (perhaps longer) on which geodetic sensors can be repeatedly placed and removed with millimeter resolution. Global coordinates estimated with seafloor geodetic techniques will remain attached to the benchmark allowing for the interchange of sensors as they fail or become obsolete, or for the sensors to be removed and used elsewhere, all the while maintaining a coherent series of positions referenced to the benchmark. The benchmark has been designed to free fall from the sea surface with transponders attached. The transponder can be recalled via an acoustic command sent from the surface to release from the benchmark and freely float to the sea surface for recovery. The duration of the sensor attachment to the benchmark will last from a few days to a few years depending on the specific needs of the experiment. The recovered sensors are then available to be reused at other locations, or again at the same site in the future. Three pins on the sensor frame mate precisely and unambiguously with three grooves on the benchmark. To reoccupy a benchmark a Remotely Operated Vehicle (ROV) uses its manipulator arm to place the sensor pins into the benchmark grooves. In June 2014 we deployed four benchmarks offshore central Oregon. We used the ROV Jason to successfully demonstrate the removal and replacement of packages onto the benchmark. We will show the benchmark design and its operational capabilities. Presently models of megathrust slip within the Cascadia Subduction Zone (CSZ) are mostly constrained by the sub-aerial GPS vectors from the Plate Boundary Observatory, a part of Earthscope. 
More long-lived seafloor geodetic measures are needed to better understand the earthquake and tsunami risk associated with a large rupture of the thrust fault within the Cascadia subduction zone. Using a ROV to place and remove sensors on the benchmarks will significantly reduce the number of sensors required by the community to monitor offshore strain in subduction zones.
ARL Physics Web Pages: An Evaluation by Established, Transitional and Emerging Benchmarks.
ERIC Educational Resources Information Center
Duffy, Jane C.
2002-01-01
Provides an overview of characteristics among Association of Research Libraries (ARL) physics Web pages. Examines current academic Web literature and from that develops six benchmarks to measure physics Web pages: ease of navigation; logic of presentation; representation of all forms of information; engagement of the discipline; interactivity of…
Simulation studies for the PANDA experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kopf, B.
2005-10-26
One main component of the planned Facility for Antiproton and Ion Research (FAIR) is the High Energy Storage Ring (HESR) at GSI, Darmstadt, which will provide cooled antiprotons with momenta between 1.5 and 15 GeV/c. The PANDA experiment will investigate antiproton annihilations with internal hydrogen and nuclear targets. Due to the planned extensive physics program, a multipurpose detector with nearly complete solid angle coverage, proper particle identification over a large momentum range, and high resolution calorimetry for neutral particles is required. For the optimization of the detector design, simulation studies of several benchmark channels are in progress which cover the most relevant physics topics. Some important simulation results are discussed here.
Reconstruction of p̄p events in PANDA
NASA Astrophysics Data System (ADS)
Spataro, S.
2012-08-01
The PANDA experiment will study antiproton-proton and antiproton-nucleus collisions in the HESR complex of the FAIR facility, in a beam momentum range from 2 GeV/c up to 15 GeV/c. In preparation for the experiment, a software framework based on ROOT (PandaRoot) is being developed for the simulation, reconstruction and analysis of physics events, running also on a GRID infrastructure. Detailed geometry descriptions and different realistic reconstruction algorithms are implemented, currently used for the realization of the Technical Design Reports. The contribution will report on the reconstruction capabilities of the PANDA spectrometer, focusing mainly on the performance of the tracking system and the results of the analysis of physics benchmark channels.
OECD-NEA Expert Group on Multi-Physics Experimental Data, Benchmarks and Validation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valentine, Timothy; Rohatgi, Upendra S.
High-fidelity, multi-physics modeling and simulation (M&S) tools are being developed and utilized for a variety of applications in nuclear science and technology and show great promise in their abilities to reproduce observed phenomena for many applications. Even with the increasing fidelity and sophistication of coupled multi-physics M&S tools, the underpinning models and data still need to be validated against experiments that may require a more complex array of validation data because of the great breadth of the time, energy and spatial domains of the physical phenomena that are being simulated. The Expert Group on Multi-Physics Experimental Data, Benchmarks and Validation (MPEBV) of the Nuclear Energy Agency (NEA) of the Organization for Economic Cooperation and Development (OECD) was formed to address the challenges with the validation of such tools. The work of the MPEBV expert group is shared among three task forces to fulfill its mandate and specific exercises are being developed to demonstrate validation principles for common industrial challenges. This paper describes the overall mission of the group, the specific objectives of the task forces, the linkages among the task forces, and the development of a validation exercise that focuses on a specific reactor challenge problem.
Benchmarking variable-density flow in saturated and unsaturated porous media
NASA Astrophysics Data System (ADS)
Guevara Morel, Carlos Roberto; Cremer, Clemens; Graf, Thomas
2015-04-01
In natural environments, fluid density and viscosity can be affected by spatial and temporal variations of solute concentration and/or temperature. These variations can occur, for example, due to salt water intrusion in coastal aquifers, leachate infiltration from waste disposal sites and upconing of saline water from deep aquifers. As a consequence, potentially unstable situations may exist in which a dense fluid overlies a less dense fluid. This situation can produce instabilities that manifest as dense plume fingers that move vertically downwards, counterbalanced by vertical upwards flow of the less dense fluid. The resulting free convection increases solute transport rates over large distances and times relative to constant-density flow. Therefore, the understanding of free convection is relevant for the protection of freshwater aquifer systems. The results from a laboratory experiment of saturated and unsaturated variable-density flow and solute transport (Simmons et al., Transp. Porous Media, 2002) are used as the physical basis to define a mathematical benchmark. The HydroGeoSphere code coupled with PEST is used to estimate the optimal parameter set capable of reproducing the physical model. A grid convergence analysis (in space and time) is also undertaken in order to obtain adequate spatial and temporal discretizations. The new mathematical benchmark is useful for model comparison and testing of variable-density variably saturated flow in porous media.
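The grid convergence analysis mentioned in the abstract above can be illustrated generically: with a scalar result computed on three systematically refined grids, Richardson extrapolation yields the observed order of convergence and an estimate of the grid-independent value. The numbers below are illustrative, not HydroGeoSphere output.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed convergence order p from three solutions on grids refined by factor r."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

# Illustrative scalar results (e.g., a plume sinking depth) on grids h, h/2, h/4:
p = observed_order(f_coarse=10.8, f_medium=10.2, f_fine=10.05, r=2)

# Richardson-extrapolated estimate of the grid-independent value:
f_exact = 10.05 + (10.05 - 10.2) / (2**p - 1)
print(round(p, 3), round(f_exact, 3))
```

When the computed p is close to the formal order of the discretization, the three grids are in the asymptotic range and the extrapolated value is a defensible convergence target; the same idea applies to temporal refinement.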
How to Use Benchmark and Cross-section Studies to Improve Data Libraries and Models
NASA Astrophysics Data System (ADS)
Wagner, V.; Suchopár, M.; Vrzalová, J.; Chudoba, P.; Svoboda, O.; Tichý, P.; Krása, A.; Majerle, M.; Kugler, A.; Adam, J.; Baldin, A.; Furman, W.; Kadykov, M.; Solnyshkin, A.; Tsoupko-Sitnikov, S.; Tyutyunikov, S.; Vladimirovna, N.; Závorka, L.
2016-06-01
Improvements of the Monte Carlo transport codes and cross-section libraries are very important steps towards usage of accelerator-driven transmutation systems. We have conducted many benchmark experiments with different set-ups consisting of lead, natural uranium and moderator irradiated by relativistic protons and deuterons within the framework of the collaboration “Energy and Transmutation of Radioactive Waste”. Unfortunately, the knowledge of the total or partial cross-sections of important reactions is insufficient. For this reason we have started extensive studies of different reaction cross-sections. We measure cross-sections of important neutron reactions by means of the quasi-monoenergetic neutron sources based on the cyclotrons at the Nuclear Physics Institute in Řež and at The Svedberg Laboratory in Uppsala. Measurements of partial cross-sections of relativistic deuteron reactions were the second direction of our studies. The new results obtained during recent years will be shown. Possible use of these data for improvement of libraries, models and benchmark studies will be discussed.
Engineering department physical plant staffing requirements.
Cole, C
1997-05-01
There is a considerable effort in the health care arena to establish credible engineering manpower yardsticks that are universally applicable as a benchmark. This document presents one facility's own benchmark criteria, which can be used to help develop either internal or competitive benchmarking comparisons.
NASA Astrophysics Data System (ADS)
Koscheev, Vladimir; Manturov, Gennady; Pronyaev, Vladimir; Rozhikhin, Evgeny; Semenov, Mikhail; Tsibulya, Anatoly
2017-09-01
Several k∞ experiments were performed on the KBR critical facility at the Institute of Physics and Power Engineering (IPPE), Obninsk, Russia during the 1970s and 80s to study the neutron absorption properties of Cr, Mn, Fe, Ni, Zr, and Mo. Calculations of these benchmarks with almost any modern evaluated nuclear data library show poor agreement with the experiments. Neutron capture cross sections of the odd isotopes of Cr, Mn, Fe, and Ni in the ROSFOND-2010 library have been reevaluated and another evaluation of the Zr nuclear data has been adopted. Use of the modified nuclear data for Cr, Mn, Fe, Ni, and Zr leads to significant improvement of the C/E ratio for the KBR assemblies. A significant improvement in agreement between calculated and evaluated values for benchmarks with Fe reflectors was also observed. C/E results obtained with the modified ROSFOND library for complex benchmark models that are highly sensitive to the cross sections of structural materials are no worse than results obtained with other major evaluated data libraries. A possible further improvement by decreasing the capture cross sections of Zr and Mo at energies above 1 keV is indicated.
The Ongoing Impact of the U.S. Fast Reactor Integral Experiments Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
John D. Bess; Michael A. Pope; Harold F. McFarlane
2012-11-01
The creation of a large database of integral fast reactor physics experiments advanced nuclear science and technology in ways that were unachievable by less capital-intensive and operationally challenging approaches. They enabled the compilation of integral physics benchmark data, validated (or not) analytical methods, and provided assurance for future reactor designs. The integral experiments performed at Argonne National Laboratory (ANL) represent decades of research performed to support fast reactor design and our understanding of neutronics behavior and reactor physics measurements. Experiments began in 1955 with the Zero Power Reactor No. 3 (ZPR-3) and terminated with the Zero Power Physics Reactor (ZPPR, originally the Zero Power Plutonium Reactor) in 1990 at the former ANL-West site in Idaho, which is now part of the Idaho National Laboratory (INL). Two additional critical assemblies, ZPR-6 and ZPR-9, operated at the ANL-East site in Illinois. A total of 128 fast reactor assemblies were constructed with these facilities [1]. The infrastructure and measurement capabilities are too expensive to be replicated in the modern era, making the integral database invaluable as the world pushes ahead with development of liquid metal cooled reactors.
Liao, Peilin; Carter, Emily A
2011-09-07
Quantitative characterization of low-lying excited electronic states in materials is critical for the development of solar energy conversion materials. The many-body Green's function method known as the GW approximation (GWA) directly probes states corresponding to photoemission and inverse photoemission experiments, thereby determining the associated band structure. Several versions of the GW approximation with different levels of self-consistency exist in the field. While the GWA based on density functional theory (DFT) works well for conventional semiconductors, less is known about its reliability for strongly correlated semiconducting materials. Here we present a systematic study of the GWA using hematite (α-Fe₂O₃) as the benchmark material. We analyze its performance in terms of the calculated photoemission/inverse photoemission band gaps, densities of states, and dielectric functions. Overall, a non-self-consistent G₀W₀ using input from DFT+U theory produces physical observables in best agreement with experiments.
Seismo-acoustic ray model benchmarking against experimental tank data.
Camargo Rodríguez, Orlando; Collis, Jon M; Simpson, Harry J; Ey, Emanuel; Schneiderwind, Joseph; Felisberto, Paulo
2012-08-01
Acoustic predictions of the recently developed TRACEO ray model, which accounts for bottom shear properties, are benchmarked against tank experimental data from the EPEE-1 and EPEE-2 (Elastic Parabolic Equation Experiment) experiments. Both experiments are representative of signal propagation in a Pekeris-like shallow-water waveguide over a non-flat isotropic elastic bottom, where significant interaction of the signal with the bottom can be expected. The benchmarks show, in particular, that the ray model can be as accurate as a parabolic approximation model benchmarked in similar conditions. The results of the benchmarking are important, on the one hand, as a preliminary experimental validation of the model and, on the other, demonstrate the reliability of the ray approach for seismo-acoustic applications.
NASA Astrophysics Data System (ADS)
Carroll, Brandon; Finneran, Ian; Blake, Geoffrey
2014-06-01
We present the design and construction of a simple and low-cost waveguide chirped pulse Fourier transform microwave (CP-FTMW) spectrometer suitable for gas-phase rotational spectroscopy experiments in undergraduate physical chemistry labs as well as graduate level research. The spectrometer operates with modest bandwidth, using phase-locked loop (PLL) microwave sources and a direct digital synthesis (DDS) chirp source, making it affordable for undergraduate labs. The performance of the instrument is benchmarked by acquiring the pure rotational spectrum of the J = 1-0 transition of OCS and its isotopologues from 11-12.5 GHz.
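The J = 1-0 line targeted above can be predicted from the rigid-rotor relation nu = 2B(J+1). Using the well-known literature value of the ground-state rotational constant of the main OCS isotopologue (B0 of about 6081.49 MHz), the transition falls near 12.16 GHz, inside the stated 11-12.5 GHz window:

```python
# Rigid-rotor prediction of the lowest rotational transition of OCS.
# B0 for the main OCS isotopologue is the standard literature value (~6081.49 MHz).
B0_MHz = 6081.49

def transition_MHz(J_lower, B_MHz):
    """Frequency of the (J+1 <- J) rotational transition of a linear molecule."""
    return 2 * B_MHz * (J_lower + 1)

nu = transition_MHz(0, B0_MHz)  # J = 1 <- 0
print(nu / 1000, "GHz")         # ~12.16 GHz, within the 11-12.5 GHz band
```

Heavier isotopologues have slightly smaller B0 and therefore slightly lower transition frequencies, which is why a modest-bandwidth chirp spanning 11-12.5 GHz captures the whole family.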
Dynamic Positioning at Sea Using the Global Positioning System.
1987-06-01
the Global Positioning System (GPS) acquired in Phase II of the Seafloor Benchmark Experiment on R/V Point Sur in August 1986. GPS position...The Seafloor Benchmark Experiment, a project of the Hydrographic Sciences Group of the Oceanography Department at the Naval Postgraduate School (NPS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marshall, Margaret A.
In the early 1970s Dr. John T. Mihalczo (team leader), J.J. Lynn, and J.R. Taylor performed experiments at the Oak Ridge Critical Experiments Facility (ORCEF) with highly enriched uranium (HEU) metal (called Oak Ridge Alloy or ORALLOY) in an attempt to recreate GODIVA I results with greater accuracy than those performed at Los Alamos National Laboratory in the 1950s (HEU-MET-FAST-001). The purpose of the Oak Ridge ORALLOY Sphere (ORSphere) experiments was to estimate the unreflected and unmoderated critical mass of an idealized sphere of uranium metal corrected to a density, purity, and enrichment such that it could be compared with the GODIVA I experiments. “The very accurate description of this sphere, as assembled, establishes it as an ideal benchmark for calculational methods and cross-section data files” (Reference 1). While performing the ORSphere experiments care was taken to accurately document component dimensions (±0.0001 inches), masses (±0.01 g), and material data. The experiment was also set up to minimize the amount of structural material in the sphere proximity. Two correlated spheres were evaluated and judged to be acceptable as criticality benchmark experiments. This evaluation is given in HEU-MET-FAST-100. The second, smaller sphere was used for additional reactor physics measurements. Worth measurements (References 1, 2, 3, and 4), the delayed neutron fraction (References 3, 4, and 5), and the surface material worth coefficient (References 1 and 2) were all measured and judged to be acceptable as benchmark data. The prompt neutron decay constant (Reference 6), relative fission density (Reference 7), and relative neutron importance (Reference 7) were measured, but are not evaluated. Information for the evaluation was compiled from References 1 through 7, the experimental logbooks (References 8 and 9), additional drawings and notes provided by the experimenter, and communication with the lead experimenter, John T. Mihalczo.
Benchmark Evaluation of Dounreay Prototype Fast Reactor Minor Actinide Depletion Measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hess, J. D.; Gauld, I. C.; Gulliford, J.
2017-01-01
Historic measurements of actinide samples in the Dounreay Prototype Fast Reactor (PFR) are of interest for modern nuclear data and simulation validation. Samples of various higher-actinide isotopes were irradiated for 492 effective full-power days and radiochemically assayed at Oak Ridge National Laboratory (ORNL) and the Japan Atomic Energy Research Institute (JAERI). Limited data were available regarding the PFR irradiation; a six-group neutron spectrum was available with some power history data to support a burnup depletion analysis validation study. Under the guidance of the Organisation for Economic Co-Operation and Development Nuclear Energy Agency (OECD NEA), the International Reactor Physics Experiment Evaluation Project (IRPhEP) and Spent Fuel Isotopic Composition (SFCOMPO) Project are collaborating to recover all data pertaining to these measurements, including collaboration with the United Kingdom to obtain pertinent reactor physics design and operational history data. These activities will produce internationally peer-reviewed benchmark data to support validation of minor actinide cross-section data and modern neutronic simulation of fast reactors, with accompanying fuel cycle activities such as transportation, recycling, storage, and criticality safety.
Benchmark problems for numerical implementations of phase field models
Jokisaari, A. M.; Voorhees, P. W.; Guyer, J. E.; ...
2016-10-01
Here, we present the first set of benchmark problems for phase field models that are being developed by the Center for Hierarchical Materials Design (CHiMaD) and the National Institute of Standards and Technology (NIST). While many scientific research areas use a limited set of well-established software, the growing phase field community continues to develop a wide variety of codes and lacks benchmark problems to consistently evaluate the numerical performance of new implementations. Phase field modeling has become significantly more popular as computational power has increased and is now becoming mainstream, driving the need for benchmark problems to validate and verify new implementations. We follow the example set by the micromagnetics community to develop an evolving set of benchmark problems that test the usability, computational resources, numerical capabilities and physical scope of phase field simulation codes. In this paper, we propose two benchmark problems that cover the physics of solute diffusion and growth and coarsening of a second phase via a simple spinodal decomposition model and a more complex Ostwald ripening model. We demonstrate the utility of benchmark problems by comparing the results of simulations performed with two different adaptive time stepping techniques, and we discuss the needs of future benchmark problems. The development of benchmark problems will enable the results of quantitative phase field models to be confidently incorporated into integrated computational materials science and engineering (ICME), an important goal of the Materials Genome Initiative.
NASA Astrophysics Data System (ADS)
Pierazzo, E.; Artemieva, N.; Asphaug, E.; Baldwin, E. C.; Cazamias, J.; Coker, R.; Collins, G. S.; Crawford, D. A.; Davison, T.; Elbeshausen, D.; Holsapple, K. A.; Housen, K. R.; Korycansky, D. G.; Wünnemann, K.
2008-12-01
Over the last few decades, rapid improvement of computer capabilities has allowed impact cratering to be modeled with increasing complexity and realism, and has paved the way for a new era of numerical modeling of the impact process, including full, three-dimensional (3D) simulations. When properly benchmarked and validated against observation, computer models offer a powerful tool for understanding the mechanics of impact crater formation. This work presents results from the first phase of a project to benchmark and validate shock codes. A variety of 2D and 3D codes were used in this study, from commercial products like AUTODYN, to codes developed within the scientific community like SOVA, SPH, ZEUS-MP, iSALE, and codes developed at U.S. National Laboratories like CTH, SAGE/RAGE, and ALE3D. Benchmark calculations of shock wave propagation in aluminum-on-aluminum impacts were performed to examine the agreement between codes for simple idealized problems. The benchmark simulations show that variability in code results is to be expected due to differences in the underlying solution algorithm of each code, artificial stability parameters, spatial and temporal resolution, and material models. Overall, the inter-code variability in peak shock pressure as a function of distance is around 10 to 20%. In general, if the impactor is resolved by at least 20 cells across its radius, the underestimation of peak shock pressure due to spatial resolution is less than 10%. In addition to the benchmark tests, three validation tests were performed to examine the ability of the codes to reproduce the time evolution of crater radius and depth observed in vertical laboratory impacts in water and two well-characterized aluminum alloys. Results from these calculations are in good agreement with experiments. There appears to be a general tendency of shock physics codes to underestimate the radius of the forming crater. 
Overall, the discrepancy between the model and experiment results is between 10 and 20%, similar to the inter-code variability.
QED Tests and Search for New Physics in Molecular Hydrogen
NASA Astrophysics Data System (ADS)
Salumbides, E. J.; Niu, M. L.; Dickenson, G. D.; Eikema, K. S. E.; Komasa, J.; Pachucki, K.; Ubachs, W.
2013-06-01
The hydrogen molecule has been the benchmark system for quantum chemistry, and may provide a test ground for new physics. We present our high-resolution spectroscopic studies on the X ^1Σ^+_g electronic ground state rotational series and fundamental vibrational tones in molecular hydrogen. In combination with recent accurate ab initio calculations, we demonstrate systematic tests of quantum electrodynamical (QED) effects in molecules. Moreover, the precise comparison between theory and experiment can provide stringent constraints on possible new interactions that extend beyond the Standard Model. E. J. Salumbides, G. D. Dickenson, T. I. Ivanov and W. Ubachs, Phys. Rev. Lett. 107, 043005 (2011).
Benchmark Computation and Finite Element Performance Evaluation for a Rhombic Plate Bending Problem
1987-09-01
Physical Science and Technology, University of Maryland, College Park, MD 20742, USA; Dip. di Matematica, Università di Pavia, 27100 Pavia, Italy.
Results from the NOvA Experiment
NASA Astrophysics Data System (ADS)
Smith, Erica
The NOvA experiment is a long-baseline accelerator-based neutrino oscillation experiment. It uses the upgraded NuMI beam from Fermilab to measure electron-neutrino appearance and muon-neutrino disappearance between the Near Detector, located at Fermilab, and the Far Detector, located at Ash River, Minnesota. The NuMI beam has recently reached and surpassed the 700 kW power benchmark. NOvA's primary physics goals include precision measurements of oscillation parameters, such as θ23 and the atmospheric mass-squared splitting, along with probes of the mass hierarchy and of the CP-violating phase. This talk will present the latest NOvA results, based on a neutrino beam exposure equivalent to 6.05 × 10²⁰ protons-on-target.
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeHart, Mark D.; Mausolff, Zander; Weems, Zach
2016-08-01
One goal of the MAMMOTH M&S project is to validate the analysis capabilities within MAMMOTH. Historical data have shown limited value for validation of full three-dimensional (3D) multi-physics methods. Initial analysis considered the TREAT startup minimum critical core and one of the startup transient tests. At present, validation is focusing on measurements taken during the M8CAL test calibration series. These exercises will be valuable in a preliminary assessment of the ability of MAMMOTH to perform coupled multi-physics calculations; calculations performed to date are being used to validate the neutron transport solver Rattlesnake and the fuels performance code BISON. Other validation projects outside of TREAT are available for single-physics benchmarking. Because the transient solution capability of Rattlesnake is one of the key attributes that makes it unique for TREAT transient simulations, validating the transient solution of Rattlesnake against other time-dependent kinetics benchmarks has considerable value. The Nuclear Energy Agency (NEA) of the Organization for Economic Cooperation and Development (OECD) has recently developed a computational benchmark for transient simulations. This benchmark considered both two-dimensional (2D) and 3D configurations for a total of 26 different transients, all negative reactivity insertions, typically returning to the critical state after some time.
Uncertainty Quantification Techniques of SCALE/TSUNAMI
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T; Mueller, Don
2011-01-01
The Standardized Computer Analysis for Licensing Evaluation (SCALE) code system developed at Oak Ridge National Laboratory (ORNL) includes Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI). The TSUNAMI code suite can quantify the predicted change in system responses, such as k{sub eff}, reactivity differences, or ratios of fluxes or reaction rates, due to changes in the energy-dependent, nuclide-reaction-specific cross-section data. Where uncertainties in the neutron cross-section data are available, the sensitivity of the system to the cross-section data can be applied to propagate the uncertainties in the cross-section data to an uncertainty in the system response. Uncertainty quantification is useful for identifying potential sources of computational biases and highlighting parameters important to code validation. Traditional validation techniques often examine one or more average physical parameters to characterize a system and identify applicable benchmark experiments. However, with TSUNAMI, correlation coefficients are developed by propagating the uncertainties in neutron cross-section data to uncertainties in the computed responses for experiments and safety applications through sensitivity coefficients. The bias in the experiments, as a function of their correlation coefficient with the intended application, is extrapolated to predict the bias and bias uncertainty in the application through trending analysis or generalized linear least squares techniques, often referred to as 'data adjustment.' Even with advanced tools to identify benchmark experiments, analysts occasionally find that the application models include some feature or material for which adequately similar benchmark experiments do not exist to support validation. For example, a criticality safety analyst may want to take credit for the presence of fission products in spent nuclear fuel.
In such cases, analysts sometimes rely on 'expert judgment' to select an additional administrative margin to account for gaps in the validation data or to conclude that the impact on the calculated bias and bias uncertainty is negligible. As a result of advances in computer programs and the evolution of cross-section covariance data, analysts can use the sensitivity and uncertainty analysis tools in the TSUNAMI codes to estimate the potential impact on the application-specific bias and bias uncertainty resulting from nuclides not represented in available benchmark experiments. This paper presents the application of methods described in a companion paper.
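The sensitivity-based propagation this abstract describes reduces to the standard "sandwich rule": the relative variance of a response such as k-eff is S^T C S, where S is the vector of sensitivity coefficients and C the relative covariance matrix of the cross-section data. A minimal sketch with made-up three-group numbers (S and C here are hypothetical, not actual TSUNAMI output):

```python
import numpy as np

# Hypothetical 3-group example: relative sensitivity of k-eff to one
# nuclide-reaction pair, and the relative covariance of that cross section.
S = np.array([0.10, 0.25, 0.05])          # dk/k per dsigma/sigma, by energy group
C = np.array([[4.0, 1.0, 0.0],
              [1.0, 9.0, 2.0],
              [0.0, 2.0, 1.0]]) * 1e-4    # relative covariance matrix

# "Sandwich rule": relative variance of k-eff = S^T C S
var_k = S @ C @ S
unc_k = np.sqrt(var_k)                    # relative standard deviation of k-eff
print(f"k-eff uncertainty: {unc_k:.4%}")
```

The same quadratic form, evaluated for an application and for candidate benchmarks, is what drives the correlation-coefficient trending the abstract mentions.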
Arthur, Jennifer; Bahran, Rian; Hutchinson, Jesson; ...
2018-06-14
Historically, radiation transport codes have treated fission emissions as uncorrelated. In reality, the particles emitted by both spontaneous and induced fissions are correlated in time, energy, angle, and multiplicity. This work validates the performance of various current Monte Carlo codes that take into account the underlying correlated physics of fission neutrons, specifically neutron multiplicity distributions. The performance of 4 Monte Carlo codes - MCNP®6.2, MCNP®6.2/FREYA, MCNP®6.2/CGMF, and PoliMi - was assessed using neutron multiplicity benchmark experiments. In addition, MCNP®6.2 simulations were run using JEFF-3.2 and JENDL-4.0, rather than ENDF/B-VII.1, data for 239Pu and 240Pu. The sensitive benchmark parameters that in this work represent the performance of each correlated fission multiplicity Monte Carlo code include the singles rate, the doubles rate, leakage multiplication, and Feynman histograms. Although it is difficult to determine which radiation transport code shows the best overall performance in simulating subcritical neutron multiplication inference benchmark measurements, it is clear that correlations exist between the underlying nuclear data utilized by (or generated by) the various codes, and the correlated neutron observables of interest. This could prove useful in nuclear data validation and evaluation applications, in which a particular moment of the neutron multiplicity distribution is of more interest than the other moments. It is also quite clear that, because transport is handled by MCNP®6.2 in 3 of the 4 codes, with the 4th code (PoliMi) being based on an older version of MCNP®, the differences in correlated neutron observables of interest are most likely due to the treatment of fission event generation in each of the different codes, as opposed to the radiation transport.
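One of the benchmark observables named above, the Feynman histogram, is summarized by the Feynman-Y statistic: the excess of the variance-to-mean ratio of counts in fixed time gates over the Poisson value of 1, which vanishes for uncorrelated emissions. A minimal sketch with hypothetical gate-count data (not taken from the benchmark measurements):

```python
import numpy as np

# Hypothetical neutron counts recorded in equal-width time gates.
counts = np.array([0, 2, 1, 3, 0, 1, 4, 2, 1, 0, 2, 3, 1, 0, 2, 5])

mean = counts.mean()
var = counts.var()          # population variance of the gate counts
# Feynman-Y: excess of the variance-to-mean ratio over the Poisson value 1.
# Y > 0 signals correlated (clustered) neutron emission.
Y = var / mean - 1.0
print(f"singles mean = {mean:.3f}, Feynman-Y = {Y:.3f}")
```

In practice this is computed for a sweep of gate widths, giving the Feynman histogram's characteristic saturation curve.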
ERIC Educational Resources Information Center
Ossiannilsson, E.; Landgren, L.
2012-01-01
Between 2008 and 2010, Lund University took part in three international benchmarking projects, "E-xcellence+," the "eLearning Benchmarking Exercise 2009," and the "First Dual-Mode Distance Learning Benchmarking Club." A comparison of these models revealed a rather high level of correspondence. From this finding and…
Benchmarking study of the MCNP code against cold critical experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sitaraman, S.
1991-01-01
The purpose of this study was to benchmark the widely used Monte Carlo code MCNP against a set of cold critical experiments with a view to using the code as a means of independently verifying the performance of faster but less accurate Monte Carlo and deterministic codes. The experiments simulated consisted of both fast and thermal criticals as well as fuel in a variety of chemical forms. A standard set of benchmark cold critical experiments was modeled. These included the two fast experiments, GODIVA and JEZEBEL, the TRX metallic uranium thermal experiments, the Babcock and Wilcox oxide and mixed oxide experiments, and the Oak Ridge National Laboratory (ORNL) and Pacific Northwest Laboratory (PNL) nitrate solution experiments. The principal case studied was a small critical experiment that was performed with boiling water reactor bundles.
Mo-Si-B Alloys and Diboride Systems for High Enthalpy Environments: Design and Evaluation
2016-01-15
candidate material species production over a range of test gas enthalpies and pressures for UWM and ISU samples. Year 3: 3.1 Begin FTIR...emission measurements on CO2-laser heated samples at SRI. 3.2 Continue experiments to optimize Si-, B-, and C-species LIF detection schemes in hot gas ...material tests to identify data that can be used to benchmark development of physics-based models of gas -surface interactions. • Employ the
Flame-Vortex Interactions in Microgravity to Improve Models of Turbulent Combustion
NASA Technical Reports Server (NTRS)
Driscoll, James F.
1999-01-01
A unique flame-vortex interaction experiment is being operated in microgravity in order to obtain fundamental data to assess the Theory of Flame Stretch which will be used to improve models of turbulent combustion. The experiment provides visual images of the physical process by which an individual eddy in a turbulent flow increases the flame surface area, changes the local flame propagation speed, and can extinguish the reaction. The high quality microgravity images provide benchmark data that are free from buoyancy effects. Results are used to assess Direct Numerical Simulations of Dr. K. Kailasanath at NRL, which were run for the same conditions.
Beauchamp, Kyle A; Behr, Julie M; Rustenburg, Ariën S; Bayly, Christopher I; Kroenlein, Kenneth; Chodera, John D
2015-10-08
Atomistic molecular simulations are a powerful way to make quantitative predictions, but the accuracy of these predictions depends entirely on the quality of the force field employed. Although experimental measurements of fundamental physical properties offer a straightforward approach for evaluating force field quality, the bulk of this information has been tied up in formats that are not machine-readable. Compiling benchmark data sets of physical properties from non-machine-readable sources requires substantial human effort and is prone to the accumulation of human errors, hindering the development of reproducible benchmarks of force-field accuracy. Here, we examine the feasibility of benchmarking atomistic force fields against the NIST ThermoML data archive of physicochemical measurements, which aggregates thousands of experimental measurements in a portable, machine-readable, self-annotating IUPAC-standard format. As a proof of concept, we present a detailed benchmark of the generalized Amber small-molecule force field (GAFF) using the AM1-BCC charge model against experimental measurements (specifically, bulk liquid densities and static dielectric constants at ambient pressure) automatically extracted from the archive and discuss the extent of data available for use in larger scale (or continuously performed) benchmarks. The results of even this limited initial benchmark highlight a general problem with fixed-charge force fields in the representation of low-dielectric environments, such as those seen in binding cavities or biological membranes.
Benchmarking of HEU Metal Annuli Critical Assemblies with Internally Reflected Graphite Cylinder
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xiaobo, Liu; Bess, John D.; Marshall, Margaret A.
Three critical-assembly configurations, performed in 1963 at the Oak Ridge Critical Experiment Facility and each assembled from HEU metal annuli of a different diameter combination (15-9 in., 15-7 in., and 13-7 in.) with an internally reflected graphite cylinder, are evaluated and benchmarked. The experimental uncertainties (0.00055 for each configuration) and the biases of the detailed benchmark models (-0.00179, -0.00189, and -0.00114, respectively) were determined, and experimental benchmark keff results were obtained for both the detailed and simplified models. The calculated results for both models, using MCNP6-1.0 and ENDF/B-VII.1, agree with the benchmark experimental results to within 0.2%. These are acceptable benchmark experiments for inclusion in the ICSBEP Handbook.
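The benchmark-model keff values implied by these numbers can be reconstructed arithmetically, assuming the usual ICSBEP convention that the benchmark keff is the (critical) experimental value plus the model bias:

```python
# Benchmark-model k-eff from a critical experiment: the measured system is
# critical (k = 1.0); the bias accounts for simplifications in the benchmark
# model. The bias and uncertainty values are the ones quoted in the abstract.
k_experimental = 1.0
biases = {"15-9 in.": -0.00179, "15-7 in.": -0.00189, "13-7 in.": -0.00114}
uncertainty = 0.00055  # one-sigma experimental uncertainty, same for all three

for config, bias in biases.items():
    k_benchmark = k_experimental + bias
    print(f"{config}: k-eff = {k_benchmark:.5f} +/- {uncertainty:.5f}")
```

A calculated result within 0.2% of these values, as the abstract reports for MCNP6-1.0 with ENDF/B-VII.1, is then within roughly 0.002 in keff.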
High-Strength Composite Fabric Tested at Structural Benchmark Test Facility
NASA Technical Reports Server (NTRS)
Krause, David L.
2002-01-01
Large sheets of ultrahigh strength fabric were put to the test at NASA Glenn Research Center's Structural Benchmark Test Facility. The material was stretched like a snare drum head until the last ounce of strength was reached, when it burst with a cacophonous release of tension. Along the way, the 3-ft square samples were also pulled, warped, tweaked, pinched, and yanked to predict the material's physical reactions to the many loads that it will experience during its proposed use. The material tested was a unique multi-ply composite fabric, reinforced with fibers that had a tensile strength eight times that of common carbon steel. The fiber plies were oriented at 0° and 90° to provide great membrane stiffness, as well as at 45° to provide an unusually high resistance to shear distortion. The fabric's heritage is in astronaut space suits and other NASA programs.
Integrating CFD, CAA, and Experiments Towards Benchmark Datasets for Airframe Noise Problems
NASA Technical Reports Server (NTRS)
Choudhari, Meelan M.; Yamamoto, Kazuomi
2012-01-01
Airframe noise corresponds to the acoustic radiation due to turbulent flow in the vicinity of airframe components such as high-lift devices and landing gears. The combination of geometric complexity, high Reynolds number turbulence, multiple regions of separation, and a strong coupling with adjacent physical components makes the problem of airframe noise highly challenging. Since 2010, the American Institute of Aeronautics and Astronautics has organized an ongoing series of workshops devoted to Benchmark Problems for Airframe Noise Computations (BANC). The BANC workshops are aimed at enabling a systematic progress in the understanding and high-fidelity predictions of airframe noise via collaborative investigations that integrate state of the art computational fluid dynamics, computational aeroacoustics, and in depth, holistic, and multifacility measurements targeting a selected set of canonical yet realistic configurations. This paper provides a brief summary of the BANC effort, including its technical objectives, strategy, and selective outcomes thus far.
Symmetrical and overloaded effect of diffusion in information filtering
NASA Astrophysics Data System (ADS)
Zhu, Xuzhen; Tian, Hui; Chen, Guilin; Cai, Shimin
2017-10-01
In physical dynamics, mass diffusion theory has been applied to design effective information filtering models on bipartite networks. In previous works, researchers assumed that objects' similarities are determined by a single-directional mass diffusion from the collected object to the uncollected one, while inadvertently ignoring the adverse influence of diffusion overload. This to some extent veils the essence of diffusion in physical dynamics and hurts recommendation accuracy and diversity. After careful investigation, we argue that symmetrical diffusion effectively discloses the essence of mass diffusion and that high diffusion overload should be penalized. Accordingly, in this paper, we propose a symmetrical and overload-penalized diffusion based model (SOPD), which shows excellent performance in extensive experiments on the benchmark datasets MovieLens and Netflix.
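For reference, the classic single-directional mass diffusion baseline that this abstract argues against can be sketched in a few lines: resource flows from each collected object to its users and back to all objects, each hop divided by the degree of the sending node. The adjacency matrix below is a made-up toy example, not the MovieLens or Netflix data:

```python
import numpy as np

# Hypothetical user-object adjacency matrix (3 users x 4 objects);
# A[u, o] = 1 means user u has collected object o.
A = np.array([[1, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 1, 1]], dtype=float)

k_user = A.sum(axis=1)     # user degrees
k_obj = A.sum(axis=0)      # object degrees

# Classic mass diffusion: resource flows object -> users -> objects,
# each step divided by the degree of the sending node, so
# W[i, j] = sum_u A[u, i] * A[u, j] / (k_user[u] * k_obj[j]).
W = (A / k_user[:, None]).T @ (A / k_obj[None, :])
scores = W @ A[0]          # recommendation scores from user 0's profile
print(np.round(scores, 3))
```

The symmetrical, overload-penalized variant proposed by the paper modifies how these two hops are normalized; the sketch above only shows the baseline being improved upon.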
Nuclear Data Needs for Generation IV Nuclear Energy Systems
NASA Astrophysics Data System (ADS)
Rullhusen, Peter
2006-04-01
Nuclear data needs for generation IV systems. Future of nuclear energy and the role of nuclear data / P. Finck. Nuclear data needs for generation IV nuclear energy systems-summary of U.S. workshop / T. A. Taiwo, H. S. Khalil. Nuclear data needs for the assessment of gen. IV systems / G. Rimpault. Nuclear data needs for generation IV-lessons from benchmarks / S. C. van der Marck, A. Hogenbirk, M. C. Duijvestijn. Core design issues of the supercritical water fast reactor / M. Mori ... [et al.]. GFR core neutronics studies at CEA / J. C. Bosq ... [et al.]. Comparative study on different phonon frequency spectra of graphite in GCR / Young-Sik Cho ... [et al.]. Innovative fuel types for minor actinides transmutation / D. Haas, A. Fernandez, J. Somers. The importance of nuclear data in modeling and designing generation IV fast reactors / K. D. Weaver. The GIF and Mexico-"everything is possible" / C. Arrenondo Sánchez -- Benchmarks, sensitivity calculations, uncertainties. Sensitivity of advanced reactor and fuel cycle performance parameters to nuclear data uncertainties / G. Aliberti ... [et al.]. Sensitivity and uncertainty study for thermal molten salt reactors / A. Biduad ... [et al.]. Integral reactor physics benchmarks- The International Criticality Safety Benchmark Evaluation Project (ICSBEP) and the International Reactor Physics Experiment Evaluation Project (IRPhEP) / J. B. Briggs, D. W. Nigg, E. Sartori. Computer model of an error propagation through micro-campaign of fast neutron gas cooled nuclear reactor / E. Ivanov. Combining differential and integral experiments on [symbol] for reducing uncertainties in nuclear data applications / T. Kawano ... [et al.]. Sensitivity of activation cross sections of the Hafnium, Tantalum and Tungsten stable isotopes to nuclear reaction mechanisms / V. Avrigeanu ... [et al.]. Generating covariance data with nuclear models / A. J. Koning. Sensitivity of Candu-SCWR reactors physics calculations to nuclear data files / K. S. 
Kozier, G. R. Dyck. The lead cooled fast reactor benchmark BREST-300: analysis with sensitivity method / V. Smirnov ... [et al.]. Sensitivity analysis of neutron cross-sections considered for design and safety studies of LFR and SFR generation IV systems / K. Tucek, J. Carlsson, H. Wider -- Experiments. INL capabilities for nuclear data measurements using the Argonne intense pulsed neutron source facility / J. D. Cole ... [et al.]. Cross-section measurements in the fast neutron energy range / A. Plompen. Recent measurements of neutron capture cross sections for minor actinides by a JNC and Kyoto University Group / H. Harada ... [et al.]. Determination of minor actinides fission cross sections by means of transfer reactions / M. Aiche ... [et al.] -- Evaluated data libraries. Nuclear data services from the NEA / H. Henriksson, Y. Rugama. Nuclear databases for energy applications: an IAEA perspective / R. Capote Noy, A. L. Nichols, A. Trkov. Nuclear data evaluation for generation IV / G. Noguère ... [et al.]. Improved evaluations of neutron-induced reactions on americium isotopes / P. Talou ... [et al.]. Using improved ENDF-based nuclear data for candu reactor calculations / J. Prodea. A comparative study on the graphite-moderated reactors using different evaluated nuclear data / Do Heon Kim ... [et al.].
Diagnostic and procedural imaging curricula in physical therapist professional degree programs.
Boissonnault, William G; White, Douglas M; Carney, Sara; Malin, Brittany; Smith, Wayne
2014-08-01
Descriptive survey. To describe the status of diagnostic and procedural imaging curricula within United States physical therapist professional degree programs. As patient direct access to physical therapy services increases, the ability to refer patients directly for diagnostic imaging could promote more efficient delivery of care. Appropriate patient referral is contingent on physical therapists having the requisite knowledge base and skills. While evidence describing imaging competence of physical therapists with advanced training in military institutions exists, evidence is lacking for other physical therapists, including new graduates of physical therapist professional degree programs. Faculty members teaching imaging at 206 United States physical therapist professional degree programs recognized by the Commission on Accreditation in Physical Therapy Education were recruited via e-mail correspondence. An e-mail attachment included the survey on which faculty reported imaging curricula and faculty qualifications, attitudes, and experiences. Faculty from 155 (75.2%) programs responded to the survey, with imaging being included in the curriculum of 152 programs. Content was integrated by required standalone courses or clinical science track courses, and/or through elective courses. The average reported estimate of imaging contact hours was 24.4 hours (range, 2-75 hours). Emphasis was on the musculoskeletal system, including 76.3% of the required standalone course content. Student competence was assessed in 147 (96.7%) programs, primarily by written (66.7%) and practical (19.7%) examinations. Faculty rated student competence on a scale of 1 (not competent) to 5 (competent), with ratings ranging from a high of 4.0 (identifying normal anatomy on plain-film radiography) to a low of 1.9 (identifying common tissue pathological processes/injuries on ultrasound). 
While a majority of programs reported including imaging curricula, variability was noted in all curricular aspects. These results may serve as a benchmark for faculty to assess existing curricula, allow for further development of imaging curricula, and provide a benchmark for the profession regarding current level of training for recent graduates of entry-level physical therapist professional degree programs.
Facility Benchmarking Trends in Tertiary Education - An Australian Case Study.
ERIC Educational Resources Information Center
Fisher, Kenn
2001-01-01
Presents how Australia's facility managers are responding to the growing impact of tertiary education participation and the increase in educational facility usage. Topics cover strategic asset management and the benchmarking of education physical assets and postsecondary institutions. (GR)
Benchmarking Multilayer-HySEA model for landslide generated tsunami. HTHMP validation process.
NASA Astrophysics Data System (ADS)
Macias, J.; Escalante, C.; Castro, M. J.
2017-12-01
Landslide tsunami hazard may be dominant along significant parts of the coastline around the world, in particular in the USA, as compared to hazards from other tsunamigenic sources. This fact motivated the NTHMP to benchmark models for landslide-generated tsunamis, following the same methodology already used for standard tsunami models when the source is seismic. To perform the above-mentioned validation process, a set of seven candidate benchmarks was proposed. These benchmarks are based on a subset of available laboratory data sets for solid-slide and deformable-slide experiments, including both submarine and subaerial slides; a benchmark based on a historic field event (Valdez, AK, 1964) closes the list. The Multilayer-HySEA model, including non-hydrostatic effects, has been used to perform all the benchmark problems dealing with laboratory experiments proposed at the workshop organized by NTHMP at Texas A&M University - Galveston on January 9-11, 2017. The aim of this presentation is to show some of the latest numerical results obtained with the Multilayer-HySEA (non-hydrostatic) model in the framework of this validation effort. Acknowledgements: This research has been partially supported by the Spanish Government research project SIMURISK (MTM2015-70490-C02-01-R) and the University of Malaga, Campus de Excelencia Internacional Andalucía Tech. The GPU computations were performed at the Unit of Numerical Methods (University of Malaga).
Scale/TSUNAMI Sensitivity Data for ICSBEP Evaluations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T; Reed, Davis Allan; Lefebvre, Robert A
2011-01-01
The Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) software developed at Oak Ridge National Laboratory (ORNL) as part of the Scale code system provides unique methods for code validation, gap analysis, and experiment design. For TSUNAMI analysis, sensitivity data are generated for each application and each existing or proposed experiment used in the assessment. The validation of diverse sets of applications requires potentially thousands of data files to be maintained and organized by the user, and a growing number of these files are available through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE) distributed through the International Criticality Safety Benchmark Evaluation Program (ICSBEP). To facilitate the use of the IHECSBE benchmarks in rigorous TSUNAMI validation and gap analysis techniques, ORNL generated Scale/TSUNAMI sensitivity data files (SDFs) for several hundred benchmarks for distribution with the IHECSBE. For the 2010 edition of the IHECSBE, the sensitivity data were generated using 238-group cross-section data based on ENDF/B-VII.0 for 494 benchmark experiments. Additionally, ORNL has developed a quality assurance procedure to guide the generation of Scale inputs and sensitivity data, as well as a graphical user interface to facilitate the use of sensitivity data in identifying experiments and applying them in validation studies.
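The quantity an SDF tabulates per nuclide, reaction, and energy group can be illustrated with a minimal sketch (not Scale/TSUNAMI itself, and with hypothetical numbers): a first-order sensitivity coefficient relating the relative change in k-effective to a relative cross-section perturbation.

```python
# Illustrative sketch (not Scale/TSUNAMI): a first-order sensitivity
# coefficient S = (dk/k) / (ds/s) estimated by direct perturbation,
# the kind of quantity a sensitivity data file (SDF) stores per
# nuclide/reaction/energy group.

def sensitivity(k_nominal, k_perturbed, rel_perturbation):
    """Relative change in k-effective per relative change in a cross section."""
    dk_over_k = (k_perturbed - k_nominal) / k_nominal
    return dk_over_k / rel_perturbation

# Hypothetical numbers: a +1% perturbation of a capture cross section
# lowers k-eff from 1.0000 to 0.9980.
S = sensitivity(1.0000, 0.9980, 0.01)
print(f"S = {S:.3f}")  # negative: more capture lowers k-eff
```

In practice TSUNAMI computes such coefficients with adjoint-based perturbation theory rather than brute-force reruns; the direct-perturbation form above is only the defining ratio.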
A novel comparison of Møller and Compton electron-beam polarimeters
Magee, J. A.; Narayan, A.; Jones, D.; ...
2017-01-19
We have performed a novel comparison between electron-beam polarimeters based on Møller and Compton scattering. A sequence of electron-beam polarization measurements was performed at low beam currents (< 5 μA) during the Q-weak experiment in Hall C at Jefferson Lab. These low-current measurements were bracketed by the regular high-current (180 μA) operation of the Compton polarimeter. All measurements were found to be consistent within experimental uncertainties of 1% or less, demonstrating that electron polarization does not depend significantly on the beam current. This result lends confidence to the common practice of applying Møller measurements made at low beam currents to physics experiments performed at higher beam currents. Here, the agreement between two polarimetry techniques based on independent physical processes sets an important benchmark for future precision asymmetry measurements that require sub-1% precision in polarimetry.
The art and science of using routine outcome measurement in mental health benchmarking.
McKay, Roderick; Coombs, Tim; Duerden, David
2014-02-01
To report and critique the application of routine outcome measurement data when benchmarking Australian mental health services. The experience of the authors as participants and facilitators of benchmarking activities is augmented by a review of the literature regarding mental health benchmarking in Australia. Although the published literature is limited, in practice, routine outcome measures, in particular the Health of the National Outcomes Scales (HoNOS) family of measures, are used in a variety of benchmarking activities. Use in exploring similarities and differences in consumers between services and the outcomes of care are illustrated. This requires the rigour of science in data management and interpretation, supplemented by the art that comes from clinical experience, a desire to reflect on clinical practice and the flexibility to use incomplete data to explore clinical practice. Routine outcome measurement data can be used in a variety of ways to support mental health benchmarking. With the increasing sophistication of information development in mental health, the opportunity to become involved in benchmarking will continue to increase. The techniques used during benchmarking and the insights gathered may prove useful to support reflection on practice by psychiatrists and other senior mental health clinicians.
Evolvable mathematical models: A new artificial Intelligence paradigm
NASA Astrophysics Data System (ADS)
Grouchy, Paul
We develop a novel Artificial Intelligence paradigm to autonomously generate artificial agents as mathematical models of behaviour. Agent/environment inputs are mapped to agent outputs via equation trees which are evolved in a manner similar to Symbolic Regression in Genetic Programming. Equations are comprised of only the four basic mathematical operators, addition, subtraction, multiplication and division, as well as input and output variables and constants. From these operations, equations can be constructed that approximate any analytic function. These Evolvable Mathematical Models (EMMs) are tested and compared to their Artificial Neural Network (ANN) counterparts on two benchmarking tasks: the double-pole balancing without velocity information benchmark and the challenging discrete Double-T Maze experiments with homing. The results from these experiments show that EMMs are capable of solving tasks typically solved by ANNs, and that they have the ability to produce agents that demonstrate learning behaviours. To further explore the capabilities of EMMs, as well as to investigate the evolutionary origins of communication, we develop NoiseWorld, an Artificial Life simulation in which interagent communication emerges and evolves from initially noncommunicating EMM-based agents. Agents develop the capability to transmit their x and y position information over a one-dimensional channel via a complex, dialogue-based communication scheme. These evolved communication schemes are analyzed and their evolutionary trajectories examined, yielding significant insight into the emergence and subsequent evolution of cooperative communication. Evolved agents from NoiseWorld are successfully transferred onto physical robots, demonstrating the transferability of EMM-based AIs from simulation into physical reality.
PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality
NASA Astrophysics Data System (ADS)
Hammond, G. E.; Frederick, J. M.
2016-12-01
In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification ensures whether the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively-parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), which describes four essential elements in high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information. 
Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to assess PFLOTRAN's performance in the code verification procedure. References Oberkampf, W. L., and T. G. Trucano (2007), Verification and Validation Benchmarks, SAND2007-0853, 67 pgs., Sandia National Laboratories, Albuquerque, NM.
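The verification pattern described above, comparing a numerical solution against a known closed-form analytical solution under an explicit accuracy criterion, can be sketched in a few lines. This is an illustrative stand-in, not part of PFLOTRAN's actual QA suite: a 1D heat-conduction benchmark with a hypothetical error tolerance.

```python
import math

# Illustrative code-verification test (not PFLOTRAN's suite): an explicit
# finite-difference solve of the 1D heat equation u_t = a * u_xx is
# compared against the closed-form solution
#   u(x, t) = sin(pi x) * exp(-pi^2 a t)   on [0, 1], u(0) = u(1) = 0.

def verify_heat_equation(nx=51, alpha=1.0, t_end=0.05):
    dx = 1.0 / (nx - 1)
    dt = 0.4 * dx * dx / alpha            # within the explicit stability limit
    steps = int(t_end / dt)
    u = [math.sin(math.pi * i * dx) for i in range(nx)]
    for _ in range(steps):
        un = u[:]
        for i in range(1, nx - 1):
            u[i] = un[i] + alpha * dt / dx**2 * (un[i+1] - 2*un[i] + un[i-1])
    t = steps * dt
    exact = [math.sin(math.pi * i * dx) * math.exp(-math.pi**2 * alpha * t)
             for i in range(nx)]
    # discrete L2 norm of the error: the pass/fail quantity of the test
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, exact)) * dx)

error = verify_heat_equation()
print(f"L2 error = {error:.2e}")  # a QA harness would assert error < tolerance
```

A suite of such tests, each documenting the physics exercised, the analytical solution, and the acceptance criterion, mirrors the four benchmark elements of Oberkampf and Trucano cited above.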
'Wasteaware' benchmark indicators for integrated sustainable waste management in cities.
Wilson, David C; Rodic, Ljiljana; Cowing, Michael J; Velis, Costas A; Whiteman, Andrew D; Scheinberg, Anne; Vilches, Recaredo; Masterson, Darragh; Stretz, Joachim; Oelz, Barbara
2015-01-01
This paper addresses a major problem in international solid waste management, which is twofold: a lack of data, and a lack of consistent data to allow comparison between cities. The paper presents an indicator set for integrated sustainable waste management (ISWM) in cities both North and South, to allow benchmarking of a city's performance, comparing cities and monitoring developments over time. It builds on pioneering work for UN-Habitat's solid waste management in the World's cities. The comprehensive analytical framework of a city's solid waste management system is divided into two overlapping 'triangles' - one comprising the three physical components, i.e. collection, recycling, and disposal, and the other comprising three governance aspects, i.e. inclusivity; financial sustainability; and sound institutions and proactive policies. The indicator set includes essential quantitative indicators as well as qualitative composite indicators. This updated and revised 'Wasteaware' set of ISWM benchmark indicators is the cumulative result of testing various prototypes in more than 50 cities around the world. This experience confirms the utility of indicators in allowing comprehensive performance measurement and comparison of both 'hard' physical components and 'soft' governance aspects; and in prioritising 'next steps' in developing a city's solid waste management system, by identifying both local strengths that can be built on and weak points to be addressed. The Wasteaware ISWM indicators are applicable to a broad range of cities with very different levels of income and solid waste management practices. Their wide application as a standard methodology will help to fill the historical data gap. Copyright © 2014 Elsevier Ltd. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-06
... and facilitate the use of documentation in future evaluations and benchmarking. Extraordinary.... Benchmarking Other Agencies' Experiences A Federal agency cannot rely on another agency's categorical exclusion... was established. Federal agencies can also substantiate categorical exclusions by benchmarking, or...
NASA Astrophysics Data System (ADS)
Murata, Isao; Ohta, Masayuki; Miyamaru, Hiroyuki; Kondo, Keitaro; Yoshida, Shigeo; Iida, Toshiyuki; Ochiai, Kentaro; Konno, Chikara
2011-10-01
Nuclear data are indispensable for the development of fusion reactor candidate materials. However, benchmarking of the nuclear data in the MeV energy region is not yet adequate. In the present study, benchmark performance in the MeV energy region was investigated theoretically for experiments using a 14 MeV neutron source. We carried out a systematic analysis for light to heavy materials. As a result, the benchmark performance for the neutron spectrum was confirmed to be acceptable, while for gamma-rays it was not sufficiently accurate. Consequently, a spectrum shifter has to be applied; beryllium had the best performance as a shifter. Moreover, we carried out a preliminary examination of whether it is really acceptable that only the spectrum before the last collision is considered in the benchmark performance analysis. It was pointed out that not only the last collision but also earlier collisions should be considered equally in the benchmark performance analysis.
Reflections of Educators in Pursuit of Inclusive Science Classrooms
NASA Astrophysics Data System (ADS)
Kirch, Susan A.; Bargerhuff, Mary Ellen; Cowan, Heidi; Wheatly, Michele
2007-08-01
General education science teachers are meeting increasingly diverse classrooms of students that include students with disabilities. A one-week, summer, residential workshop was offered to interested science and special educators who worked through lab experiments one-on-one with students with physical or sensory disabilities (grades 7-12). To determine how effective this professional development workshop was at raising disability awareness and providing teacher training in inclusive science teaching practices, a combination of survey and reflective journal entries was used to monitor participants’ experience. Here we discuss the findings from this benchmark study and discuss how others might adapt this professional development model for use by schools interested in moving toward inclusive practices.
Beyond core count: a look at new mainstream computing platforms for HEP workloads
NASA Astrophysics Data System (ADS)
Szostek, P.; Nowak, A.; Bitzes, G.; Valsan, L.; Jarp, S.; Dotti, A.
2014-06-01
As Moore's Law continues to deliver more and more transistors, the mainstream processor industry is preparing to expand its investments in areas other than simple core count. These new interests include deep integration of on-chip components, advanced vector units, memory, cache and interconnect technologies. We examine these moving trends with parallelized and vectorized High Energy Physics workloads in mind. In particular, we report on practical experience resulting from experiments with scalable HEP benchmarks on the Intel "Ivy Bridge-EP" and "Haswell" processor families. In addition, we examine the benefits of the new "Haswell" microarchitecture and its impact on multiple facets of HEP software. Finally, we report on the power efficiency of new systems.
Modification and benchmarking of MCNP for low-energy tungsten spectra.
Mercier, J R; Kopp, D T; McDavid, W D; Dove, S B; Lancaster, J L; Tucker, D M
2000-12-01
The MCNP Monte Carlo radiation transport code was modified for diagnostic medical physics applications. In particular, the modified code was thoroughly benchmarked for the production of polychromatic tungsten x-ray spectra in the 30-150 kV range. Validating the modified code for coupled electron-photon transport with benchmark spectra was supplemented with independent electron-only and photon-only transport benchmarks. Major revisions to the code included the proper treatment of characteristic K x-ray production and scoring, new impact ionization cross sections, and new bremsstrahlung cross sections. Minor revisions included updated photon cross sections, electron-electron bremsstrahlung production, and K x-ray yield. The modified MCNP code is benchmarked to electron backscatter factors, x-ray spectra production, and primary and scatter photon transport.
Operation and performance of the LHCb calorimeters
NASA Astrophysics Data System (ADS)
Chefdeville, M.
2018-03-01
The LHCb calorimeters play a key role in the hardware trigger of the experiment. They also serve the measurement of radiative heavy-flavor decays and the identification of electrons. Located twelve meters from the interaction region, they are composed of a plane of scintillating tiles, a preshower detector, and electromagnetic and hadronic sampling calorimeters using scintillators as active elements. In these proceedings, technical and operational aspects of these detectors are described. Emphasis is then put on calorimeter reconstruction and calibration. Finally, performance for benchmark physics modes is briefly reported.
Ellis, Judith
2006-07-01
The aim of this article is to review published descriptions of benchmarking activity and synthesize benchmarking principles, to encourage the acceptance and use of Essence of Care as a new benchmarking approach to continuous quality improvement, and to promote its acceptance as an integral and effective part of benchmarking activity in health services. Essence of Care was launched by the Department of Health in England in 2001 to provide a benchmarking tool kit to support continuous improvement in the quality of fundamental aspects of health care, for example, privacy and dignity, nutrition and hygiene. The tool kit is now being used effectively by some frontline staff. However, use is inconsistent, with the value of the tool kit, or the support that clinical practice benchmarking requires to be effective, not always recognized or provided by National Health Service managers, who are absorbed with the use of quantitative benchmarking approaches and the measurability of comparative performance data. This review of published benchmarking literature was obtained through an ever-narrowing search strategy, commencing from benchmarking within the quality improvement literature, through benchmarking activity in health services, and including not only published examples of benchmarking approaches and models but also web-based benchmarking data. This supported identification of how benchmarking approaches have developed and been used while remaining true to the basic benchmarking principles of continuous improvement through comparison and sharing (Camp 1989). Descriptions of models and exemplars of quantitative, and specifically performance, benchmarking activity in industry abound (Camp 1998), with far fewer examples of more qualitative and process benchmarking approaches in use in the public services and then applied to the health service (Bullivant 1998).
The literature is also in the main descriptive in its support of the effectiveness of benchmarking activity and although this does not seem to have restricted its popularity in quantitative activity, reticence about the value of the more qualitative approaches, for example Essence of Care, needs to be overcome in order to improve the quality of patient care and experiences. The perceived immeasurability and subjectivity of Essence of Care and clinical practice benchmarks means that these benchmarking approaches are not always accepted or supported by health service organizations as valid benchmarking activity. In conclusion, Essence of Care benchmarking is a sophisticated clinical practice benchmarking approach which needs to be accepted as an integral part of health service benchmarking activity to support improvement in the quality of patient care and experiences.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marshall, Margaret A.; Bess, John D.
2015-02-01
The critical configurations of the small, compact critical assembly (SCCA) experiments performed at the Oak Ridge Critical Experiments Facility (ORCEF) in 1962-1965 have been evaluated as acceptable benchmark experiments for inclusion in the International Handbook of Evaluated Criticality Safety Benchmark Experiments. The initial intent of these experiments was to support the design of the Medium Power Reactor Experiment (MPRE) program, whose purpose was to study “power plants for the production of electrical power in space vehicles.” The third configuration in this series of experiments was a beryllium-reflected assembly of stainless-steel-clad, highly enriched uranium (HEU)-O2 fuel, a mockup of a potassium-cooled space power reactor. Reactivity measurements, cadmium ratio spectral measurements, and fission rate measurements were performed through the core and top reflector. Fuel effect worth measurements and neutron moderating and absorbing material worths were also measured in the assembly fuel region. The cadmium ratios, fission rates, and worth measurements were evaluated for inclusion in the International Handbook of Evaluated Criticality Safety Benchmark Experiments. The fuel tube effect and neutron moderating and absorbing material worth measurements are the focus of this paper. Additionally, a measurement of the worth of potassium filling the core region was performed but has not yet been evaluated. Pellets of 93.15 wt.% enriched uranium dioxide (UO2) were stacked in 30.48 cm tall stainless steel fuel tubes (0.3 cm tall end caps). Each fuel tube held 26 pellets with a total mass of 295.8 g UO2 per tube. 253 tubes were arranged in a 1.506-cm triangular lattice. An additional 7-tube-cluster critical configuration was also measured but not used for any physics measurements. The core was surrounded on all sides by a beryllium reflector. The fuel effect worths were measured by removing fuel tubes at various radii.
An accident scenario was also simulated by moving twenty fuel rods outward from the periphery of the core so that they were touching the core tank. The change in the system reactivity when the fuel tube(s) were removed or moved, compared with the base configuration, was the worth of the fuel tubes or of the accident scenario. The worth of neutron absorbing and moderating materials was measured by inserting material rods into the core at regular intervals or placing lids at the top of the core tank. Stainless steel 347, tungsten, niobium, polyethylene, graphite, boron carbide, aluminum, and cadmium rod and/or lid worths were all measured. The change in the system reactivity when a material was inserted into the core is the worth of the material.
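The worth definition used above, the reactivity change between the base and modified configurations, reduces to a one-line formula once k-effective is known for both states. A minimal sketch with hypothetical k-eff values (not the evaluators' actual data reduction):

```python
# Illustrative sketch: material or fuel-tube worth as the reactivity
# difference between a modified and a base configuration, with
# reactivity rho = (k - 1)/k. Values are hypothetical.

def reactivity_pcm(k_eff):
    """Reactivity in pcm (1 pcm = 1e-5 dk/k)."""
    return (k_eff - 1.0) / k_eff * 1.0e5

def worth_pcm(k_base, k_modified):
    """Worth of a change = reactivity(modified) - reactivity(base)."""
    return reactivity_pcm(k_modified) - reactivity_pcm(k_base)

# Hypothetical: inserting an absorber rod drives the assembly subcritical.
print(f"worth = {worth_pcm(1.00000, 0.99800):.1f} pcm")
```

In the experiments themselves, small reactivity changes were inferred from control adjustments and period measurements rather than from two independent k-eff solutions; the formula is the common reduction used when comparing against calculated benchmarks.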
Experimental physics characteristics of a heavy-metal-reflected fast-spectrum critical assembly
NASA Technical Reports Server (NTRS)
Heneveld, W. H.; Paschall, R. K.; Springer, T. H.; Swanson, V. A.; Thiele, A. W.; Tuttle, R. J.
1971-01-01
A zero-power critical assembly was designed, constructed, and operated for the purpose of conducting a series of benchmark experiments dealing with the physics characteristics of a UN-fueled, Li-7 cooled, Mo-reflected, drum-controlled compact fast reactor for use with a space-power electric conversion system. The experimental program consisted basically of measuring the differential neutron spectra and the changes in critical mass that accompanied the stepwise addition of (Li-7)3N, Hf, Ta, and W to a basic core fueled with U metal in a pin-type Ta honeycomb structure. In addition, experimental results were obtained on power distributions, control characteristics, neutron lifetime, and reactivity worths of numerous absorber, structural, and scattering materials.
Renner, Franziska
2016-09-01
Monte Carlo simulations are regarded as the most accurate method of solving complex problems in the field of dosimetry and radiation transport. In (external) radiation therapy they are increasingly used for the calculation of dose distributions during treatment planning. In comparison to other algorithms for the calculation of dose distributions, Monte Carlo methods have the capability of improving the accuracy of dose calculations - especially under complex circumstances (e.g. consideration of inhomogeneities). However, there is a lack of knowledge of how accurate the results of Monte Carlo calculations are on an absolute basis. A practical verification of the calculations can be performed by direct comparison with the results of a benchmark experiment. This work presents such a benchmark experiment and compares its results (with detailed consideration of measurement uncertainty) with the results of Monte Carlo calculations using the well-established Monte Carlo code EGSnrc. The experiment was designed to have parallels to external beam radiation therapy with respect to the type and energy of the radiation, the materials used and the kind of dose measurement. Because the properties of the beam have to be well known in order to compare the results of the experiment and the simulation on an absolute basis, the benchmark experiment was performed using the research electron accelerator of the Physikalisch-Technische Bundesanstalt (PTB), whose beam was accurately characterized in advance. The benchmark experiment and the corresponding Monte Carlo simulations were carried out for two different types of ionization chambers and the results were compared. Considering the uncertainty, which is about 0.7 % for the experimental values and about 1.0 % for the Monte Carlo simulation, the results of the simulation and the experiment coincide. Copyright © 2015. Published by Elsevier GmbH.
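The coincidence claim above rests on a standard consistency criterion: two values agree if their difference lies within the expanded combined uncertainty. A minimal sketch with hypothetical numbers (not PTB's actual data or analysis):

```python
import math

# Illustrative consistency check (hypothetical values): two dose results
# are taken as consistent if |a - b| <= k * u_c, where u_c combines the
# two standard uncertainties in quadrature and k is the coverage factor.

def consistent(val_a, u_a_rel, val_b, u_b_rel, coverage=2.0):
    u_combined = math.hypot(val_a * u_a_rel, val_b * u_b_rel)
    return abs(val_a - val_b) <= coverage * u_combined

# 0.7 % experimental and 1.0 % simulation relative uncertainty,
# 1 % discrepancy between the two dose values:
print(consistent(1.000, 0.007, 1.010, 0.010))
```

With the ~0.7 % and ~1.0 % uncertainties quoted in the abstract, discrepancies up to roughly 2.4 % would still pass at k = 2; the actual acceptance criterion used in the study may differ.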
Object-Oriented Implementation of the NAS Parallel Benchmarks using Charm++
NASA Technical Reports Server (NTRS)
Krishnan, Sanjeev; Bhandarkar, Milind; Kale, Laxmikant V.
1996-01-01
This report describes experiences with implementing the NAS Computational Fluid Dynamics benchmarks using a parallel object-oriented language, Charm++. Our main objective in implementing the NAS CFD kernel benchmarks was to develop a code that could be used to easily experiment with different domain decomposition strategies and dynamic load balancing. We also wished to leverage the object-orientation provided by the Charm++ parallel object-oriented language, to develop reusable abstractions that would simplify the process of developing parallel applications. We first describe the Charm++ parallel programming model and the parallel object array abstraction, then go into detail about each of the Scalar Pentadiagonal (SP) and Lower/Upper Triangular (LU) benchmarks, along with performance results. Finally we conclude with an evaluation of the methodology used.
NASA Astrophysics Data System (ADS)
Jonas, S.; Murtagh, W. J.; Clarke, S. W.
2017-12-01
The National Space Weather Action Plan identifies approximately 100 distinct activities across six strategic goals. Many of these activities depend on the identification of a series of benchmarks that describe the physical characteristics of space weather events on or near Earth. My talk will provide an overview of Goal 1 (Establish Benchmarks for Space-Weather Events) of the National Space Weather Action Plan which will provide an introduction to the panel presentations and discussions.
Deterministic Modeling of the High Temperature Test Reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ortensi, J.; Cogliati, J. J.; Pope, M. A.
2010-06-01
Idaho National Laboratory (INL) is tasked with the development of reactor physics analysis capability for the Next Generation Nuclear Plant (NGNP) project. In order to examine INL’s current prismatic reactor deterministic analysis tools, the project is conducting a benchmark exercise based on modeling the High Temperature Test Reactor (HTTR). This exercise entails the development of a model for the initial criticality, a 19-column thin annular core, and the fully loaded core critical condition with 30 columns. Special emphasis is devoted to the annular core modeling, which shares more characteristics with the NGNP base design. The DRAGON code is used in this study because it offers significant ease and versatility in modeling prismatic designs. Despite some geometric limitations, the code performs quite well compared to other lattice physics codes. DRAGON can generate transport solutions via collision probability (CP), method of characteristics (MOC), and discrete ordinates (Sn) methods. A fine-group cross-section library based on the SHEM 281 energy structure is used in the DRAGON calculations. HEXPEDITE is the hexagonal-z full-core solver used in this study and is based on the Green’s function solution of the transverse-integrated equations. In addition, two Monte Carlo (MC) based codes, MCNP5 and PSG2/SERPENT, provide benchmarking capability for the DRAGON and nodal diffusion solver codes. The results from this study show a consistent bias of 2-3% in the core multiplication factor. This systematic error has also been observed in other HTTR benchmark efforts and is well documented in the literature. The ENDF/B-VII graphite and U-235 cross sections appear to be the main source of the error. The isothermal temperature coefficients calculated with the fully loaded core configuration agree well with other benchmark participants’ results but are 40% higher than the experimental values.
This discrepancy with the measurement stems from the fact that during the experiments the control rods were adjusted to maintain criticality, whereas in the model, the rod positions were fixed. In addition, this work includes a brief study of a cross section generation approach that seeks to decouple the domain in order to account for neighbor effects. This spectral interpenetration is a dominant effect in annular HTR physics. This analysis methodology should be further explored in order to reduce the error that is systematically propagated in the traditional generation of cross sections.« less
Benchmarking in Czech Higher Education: The Case of Schools of Economics
ERIC Educational Resources Information Center
Placek, Michal; Ochrana, František; Pucek, Milan
2015-01-01
This article describes the use of benchmarking in universities in the Czech Republic and academics' experiences with it. It is based on research conducted among academics from economics schools in Czech public and private universities. The results identified several issues regarding the utilisation and understanding of benchmarking in the Czech…
Status of BOUT fluid turbulence code: improvements and verification
NASA Astrophysics Data System (ADS)
Umansky, M. V.; Lodestro, L. L.; Xu, X. Q.
2006-10-01
BOUT is an electromagnetic fluid turbulence code for tokamak edge plasma [1]. BOUT performs time integration of reduced Braginskii plasma fluid equations, using spatial discretization in realistic geometry and employing a standard ODE integration package PVODE. BOUT has been applied to several tokamak experiments and in some cases calculated spectra of turbulent fluctuations compared favorably to experimental data. On the other hand, the desire to understand better the code results and to gain more confidence in it motivated investing effort in rigorous verification of BOUT. Parallel to the testing the code underwent substantial modification, mainly to improve its readability and tractability of physical terms, with some algorithmic improvements as well. In the verification process, a series of linear and nonlinear test problems was applied to BOUT, targeting different subgroups of physical terms. The tests include reproducing basic electrostatic and electromagnetic plasma modes in simplified geometry, axisymmetric benchmarks against the 2D edge code UEDGE in real divertor geometry, and neutral fluid benchmarks against the hydrodynamic code LCPFCT. After completion of the testing, the new version of the code is being applied to actual tokamak edge turbulence problems, and the results will be presented. [1] X. Q. Xu et al., Contr. Plas. Phys., 36,158 (1998). *Work performed for USDOE by Univ. Calif. LLNL under contract W-7405-ENG-48.
Geant4 Computing Performance Benchmarking and Monitoring
Dotti, Andrea; Elvira, V. Daniel; Folger, Gunter; ...
2015-12-23
Performance evaluation and analysis of large scale computing applications is essential for optimal use of resources. As detector simulation is one of the most compute-intensive tasks and Geant4 is the simulation toolkit most widely used in contemporary high energy physics (HEP) experiments, it is important to monitor Geant4 through its development cycle for changes in computing performance and to identify problems and opportunities for code improvements. All Geant4 development and public releases are being profiled with a set of applications that utilize different input event samples, physics parameters, and detector configurations. Results from multiple benchmarking runs are compared to previous public and development reference releases to monitor CPU and memory usage. Observed changes are evaluated and correlated with code modifications. Besides the full summary of call stack and memory footprint, a detailed call graph analysis is available to Geant4 developers for further analysis. The set of software tools used in the performance evaluation procedure, in both sequential and multi-threaded modes, includes FAST, IgProf and Open|Speedshop. In conclusion, the scalability of the CPU time and memory performance in multi-threaded applications is evaluated by measuring event throughput and memory gain as a function of the number of threads for selected event samples.
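The multi-threaded scalability measurement described above reduces to computing event throughput, speedup, and parallel efficiency from wall-clock timings. A minimal sketch, using made-up timing numbers rather than actual Geant4 profiling data:

```python
# Hypothetical wall-clock times (s) to simulate a fixed 1000-event sample
# at increasing thread counts; the numbers are illustrative only.
timings = {1: 1000.0, 2: 520.0, 4: 270.0, 8: 150.0}
events = 1000

for threads, seconds in sorted(timings.items()):
    throughput = events / seconds          # events per second
    speedup = timings[1] / seconds         # relative to the sequential run
    efficiency = speedup / threads         # ideal scaling -> 1.0
    print(f"{threads:2d} threads: {throughput:6.2f} evt/s, "
          f"speedup {speedup:4.2f}, efficiency {efficiency:4.2f}")
```

In a real campaign the same numbers would be tracked per release, so a drop in efficiency between two reference releases can be correlated with specific code changes.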
The Schultz MIDI Benchmarking Toolbox for MIDI interfaces, percussion pads, and sound cards.
Schultz, Benjamin G
2018-04-17
The Musical Instrument Digital Interface (MIDI) was readily adopted for auditory sensorimotor synchronization experiments. These experiments typically use MIDI percussion pads to collect responses, a MIDI-USB converter (or MIDI-PCI interface) to record responses on a PC and manipulate feedback, and an external MIDI sound module to generate auditory feedback. Previous studies have suggested that auditory feedback latencies can be introduced by these devices. The Schultz MIDI Benchmarking Toolbox (SMIDIBT) is an open-source, Arduino-based package designed to measure the point-to-point latencies incurred by several devices used in the generation of response-triggered auditory feedback. Experiment 1 showed that MIDI messages are sent and received within 1 ms (on average) in the absence of any external MIDI device. Latencies decreased when the baud rate increased above the MIDI protocol default (31,250 bps). Experiment 2 benchmarked the latencies introduced by different MIDI-USB and MIDI-PCI interfaces. MIDI-PCI was superior to MIDI-USB, primarily because MIDI-USB is subject to USB polling. Experiment 3 tested three MIDI percussion pads. Both the audio and MIDI message latencies were significantly greater than 1 ms for all devices, and there were significant differences between percussion pads and instrument patches. Experiment 4 benchmarked four MIDI sound modules. Audio latencies were significantly greater than 1 ms, and there were significant differences between sound modules and instrument patches. These experiments suggest that millisecond accuracy might not be achievable with MIDI devices. The SMIDIBT can be used to benchmark a range of MIDI devices, thus allowing researchers to make informed decisions when choosing testing materials and to arrive at an acceptable latency at their discretion.
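The sub-millisecond message times and the baud-rate dependence reported in Experiment 1 follow directly from MIDI's serial framing. A small sketch (independent of the SMIDIBT itself) computes the wire time of a message:

```python
def midi_message_time_ms(n_bytes, baud_rate, bits_per_byte=10):
    """Serial transmission time of a MIDI message in milliseconds.

    MIDI uses 8-N-1 framing: 1 start bit + 8 data bits + 1 stop bit,
    i.e. 10 bits on the wire per message byte.
    """
    return 1000.0 * n_bytes * bits_per_byte / baud_rate

# A 3-byte Note On message at the MIDI protocol default of 31,250 bps:
default_ms = midi_message_time_ms(3, 31_250)
# The same message at a common faster serial rate (illustrative):
fast_ms = midi_message_time_ms(3, 115_200)
```

At the default 31,250 bps a 3-byte Note On occupies 0.96 ms on the wire, consistent with the "within 1 ms" figure above, and raising the baud rate shortens this proportionally; device processing and USB polling add further latency on top of this floor.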
Accurate quantum chemical calculations
NASA Technical Reports Server (NTRS)
Bauschlicher, Charles W., Jr.; Langhoff, Stephen R.; Taylor, Peter R.
1989-01-01
An important goal of quantum chemical calculations is to provide an understanding of chemical bonding and molecular electronic structure. A second goal, the prediction of energy differences to chemical accuracy, has been much harder to attain. First, the computational resources required to achieve such accuracy are very large, and second, it is not straightforward to demonstrate that an apparently accurate result, in terms of agreement with experiment, does not result from a cancellation of errors. Recent advances in electronic structure methodology, coupled with the power of vector supercomputers, have made it possible to solve a number of electronic structure problems exactly using the full configuration interaction (FCI) method within a subspace of the complete Hilbert space. These exact results can be used to benchmark approximate techniques that are applicable to a wider range of chemical and physical problems. The methodology of many-electron quantum chemistry is reviewed. Methods are considered in detail for performing FCI calculations. The application of FCI methods to several three-electron problems in molecular physics is discussed. A number of benchmark applications of FCI wave functions are described. Atomic basis sets and the development of improved methods for handling very large basis sets are discussed: these are then applied to a number of chemical and spectroscopic problems; to transition metals; and to problems involving potential energy surfaces. Although the experiences described give considerable grounds for optimism about the general ability to perform accurate calculations, there are several problems that have proved less tractable, at least with current computer resources, and these and possible solutions are discussed.
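The role of FCI as an exact reference within a finite basis can be mimicked with a toy model: diagonalizing the full matrix gives the "exact" energy in that model space, while diagonalizing a leading sub-block plays the part of a truncated CI treatment. This sketch is illustrative only; a random symmetric matrix stands in for a real Hamiltonian.

```python
import numpy as np

rng = np.random.default_rng(0)

# A model "Hamiltonian": a random real symmetric matrix standing in for
# the full-CI matrix in some finite basis (illustrative only).
n = 40
a = rng.normal(size=(n, n))
h = (a + a.T) / 2.0

# "Exact" ground-state energy in this model space: the lowest eigenvalue.
e_fci = np.linalg.eigvalsh(h)[0]

# A truncated-CI analogue: diagonalize only the leading k x k block.
k = 10
e_truncated = np.linalg.eigvalsh(h[:k, :k])[0]
```

By the variational principle (Cauchy eigenvalue interlacing, in matrix terms), restricting to a subspace can only raise the computed ground-state energy, which is exactly how FCI results bound and benchmark approximate methods.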
Kirkwood, R. K.; Michel, P.; London, R.; ...
2011-05-26
To optimize the coupling to indirect drive targets in the National Ignition Campaign (NIC) at the National Ignition Facility, a model of stimulated scattering produced by multiple laser beams is used. The model has shown that scatter of the 351 nm beams can be significantly enhanced over single beam predictions in ignition relevant targets by the interaction of the multiple crossing beams with a millimeter scale length, 2.5 keV, 0.02-0.05 × critical density, plasma. The model uses a suite of simulation capabilities and its key aspects are benchmarked with experiments at smaller laser facilities. The model has also influenced the design of the initial targets used for NIC by showing that both the stimulated Brillouin scattering (SBS) and stimulated Raman scattering (SRS) can be reduced by the reduction of the plasma density in the beam intersection volume that is caused by an increase in the diameter of the laser entrance hole (LEH). In this model, a linear wave response leads to a small gain exponent produced by each crossing quad of beams (<~1 per quad) which amplifies the scattering that originates in the target interior where the individual beams are separated and crosses many or all other beams near the LEH as it exits the target. As a result, all 23 crossing quads of beams produce a total gain exponent of several or greater for seeds of light with wavelengths in the range that is expected for scattering from the interior (480 to 580 nm for SRS). This means that in the absence of wave saturation, the overall multi-beam scatter will be significantly larger than the expectations for single beams. The potential for non-linear saturation of the Langmuir waves amplifying SRS light is also analyzed with a two dimensional, vectorized, particle in cell code (2D VPIC) that is benchmarked by amplification experiments in a plasma with normalized parameters similar to ignition targets.
The physics of cumulative scattering by multiple crossing beams that simultaneously amplify the same SBS light wave is further demonstrated in experiments that benchmark the linear models for the ion waves amplifying SBS. Here, the expectation from this model and its experimental benchmarks is shown to be consistent with observations of stimulated Raman scatter in the first series of energetic experiments with ignition targets, confirming the importance of the multi-beam scattering model for optimizing coupling.
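In the linear regime described above, the gain exponents from individual crossing quads add for a common seed, so the unsaturated amplification is the exponential of the summed gain. A back-of-the-envelope sketch with made-up per-quad gains (the 0.3 value is illustrative, consistent only with the stated "<~1 per quad" bound):

```python
import math

# Illustrative per-quad gain exponents; 23 crossing quads as in the text.
quad_gains = [0.3] * 23

total_gain = sum(quad_gains)            # gain exponents add for a common seed
amplification = math.exp(total_gain)    # linear (unsaturated) amplification
```

With 23 quads at a gain of ~0.3 each, the total exponent is ~6.9, i.e. an amplification of roughly e^7: many individually weak interactions compound into multi-beam scatter far exceeding single-beam expectations, which is the paper's central point.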
NASA Astrophysics Data System (ADS)
Schmidt-Bocking, Horst
2008-05-01
The correlated many-particle dynamics in Coulombic systems, which is one of the unsolved fundamental problems in AMO physics, can now be experimentally approached with so far unprecedented completeness and precision. The recent development of the COLTRIMS technique (COLd Target Recoil Ion Momentum Spectroscopy) provides a coincident multi-fragment imaging technique for eV and sub-eV fragment detection. In its completeness it is as powerful as the bubble chamber in high-energy physics. In recent benchmark experiments, quasi-snapshots (with durations as short as an attosecond) of the correlated dynamics between electrons and nuclei have been made for atomic and molecular objects. This new imaging technique has opened a powerful observation window into the hidden world of many-particle dynamics. Recent multiple-ionization studies will be presented and the observation of correlated electron pairs will be discussed.
Women Physicists Speak: The 2001 International Study of Women in Physics
NASA Astrophysics Data System (ADS)
Ivie, Rachel; Czujko, Roman; Stowe, Katie
2002-09-01
The Working Group on Women in Physics of the International Union of Pure and Applied Physics (IUPAP) subcontracted with the Statistical Research Center of the American Institute of Physics (AIP) to conduct an international study on women in physics. This study had two parts. First, we conducted a benchmarking study to identify reliable sources and collect data on the representation of women in physics in as many IUPAP member countries as possible. Second, we conducted an international survey of individual women physicists. The survey addressed issues related to both education and employment. On the education side, we asked about experiences and critical incidents from secondary school through the highest degree earned. On the employment side, we asked about how the respondents' careers had evolved and their self-assessment of how well their careers had progressed. In addition, the questionnaire also addressed issues that cut across education and employment, such as the impact of marriage and children, the factors that contributed the most toward the success they had achieved to date, and suggestions for what could be done to improve the situation of women physicists.
High-energy neutron depth-dose distribution experiment.
Ferenci, M S; Hertel, N E
2003-01-01
A unique set of high-energy neutron depth-dose benchmark experiments was performed at the Los Alamos Neutron Science Center/Weapons Neutron Research (LANSCE/WNR) complex. The experiments consisted of filtered neutron beams with energies up to 800 MeV impinging on a 30 × 30 × 30 cm³ liquid, tissue-equivalent phantom. The absorbed dose was measured in the phantom at various depths with tissue-equivalent ion chambers. The experiments are intended to serve as benchmarks for testing high-energy radiation transport codes for the international radiation protection community.
ERIC Educational Resources Information Center
Sampson, K. A.; Johnston, L.; Comer, K.; Brogt, E.
2016-01-01
Summative and benchmarking surveys to measure the postgraduate student research experience are well reported in the literature. While useful, we argue that local instruments that provide formative resources with an academic development focus are also required. If higher education institutions are to move beyond the identification of issues and…
Bertzbach, F; Franz, T; Möller, K
2012-01-01
This paper presents the performance improvements achieved in benchmarking projects in the German wastewater industry over the last 15 years. A large number of changes in operational practice, as well as the annual savings achieved, can be shown, induced in particular by benchmarking at the process level. Investigating this question yields some general findings on how to include performance improvement in a benchmarking project and how to communicate its results. We therefore elaborate on the concept of benchmarking at both the utility and the process level, a distinction that remains necessary for integrating performance improvement into our benchmarking approach. To achieve performance improvement via benchmarking, it should be made clear that this outcome depends, on the one hand, on a well-conducted benchmarking programme and, on the other, on the individual situation within each participating utility.
NASA Astrophysics Data System (ADS)
Yu, Shi Jing; Fajeau, Emma; Liu, Lin Qiao; Jones, David J.; Madison, Kirk W.
2018-02-01
In this work, we address the advantages, limitations, and technical subtleties of employing field programmable gate array (FPGA)-based digital servos for high-bandwidth feedback control of lasers in atomic, molecular, and optical physics experiments. Specifically, we provide the results of benchmark performance tests in experimental setups including noise, bandwidth, and dynamic range for two digital servos built with low and mid-range priced FPGA development platforms. The digital servo results are compared to results obtained from a commercially available state-of-the-art analog servo using the same plant for control (intensity stabilization). The digital servos have feedback bandwidths of 2.5 MHz, limited by the total signal latency, and we demonstrate improvements beyond the transfer function offered by the analog servo including a three-pole filter and a two-pole filter with phase compensation to suppress resonances. We also discuss limitations of our FPGA-servo implementation and general considerations when designing and using digital servos.
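The two-pole filter sections mentioned above are, in digital form, biquad difference equations. Below is a minimal sketch of a two-pole low-pass biquad using the widely known RBJ audio-EQ-cookbook coefficients; it is not the authors' actual FPGA filter, and the corner frequency and sample rate are illustrative.

```python
import math

def lowpass_biquad(f0, fs, q=0.707):
    """Two-pole low-pass coefficients (RBJ audio-EQ-cookbook form)."""
    w0 = 2.0 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2.0 * q)
    b = [(1 - math.cos(w0)) / 2, 1 - math.cos(w0), (1 - math.cos(w0)) / 2]
    a = [1 + alpha, -2 * math.cos(w0), 1 - alpha]
    return [bi / a[0] for bi in b], [ai / a[0] for ai in a]

def filter_samples(b, a, x):
    """Direct-form I difference equation, zero initial state."""
    y = []
    x_hist = [0.0, 0.0]   # x[n-1], x[n-2]
    y_hist = [0.0, 0.0]   # y[n-1], y[n-2]
    for xn in x:
        yn = (b[0] * xn + b[1] * x_hist[0] + b[2] * x_hist[1]
              - a[1] * y_hist[0] - a[2] * y_hist[1])
        x_hist = [xn, x_hist[0]]
        y_hist = [yn, y_hist[0]]
        y.append(yn)
    return y

# A 100 kHz corner at a 10 MHz sample rate; DC should pass with unit gain.
b, a = lowpass_biquad(100e3, 10e6)
step = filter_samples(b, a, [1.0] * 2000)
```

On an FPGA the same recurrence is implemented in fixed-point pipelined arithmetic, and the total signal latency of that pipeline (plus converters) is what ultimately caps the achievable feedback bandwidth.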
Investigation of wing crack formation with a combined phase-field and experimental approach
NASA Astrophysics Data System (ADS)
Lee, Sanghyun; Reber, Jacqueline E.; Hayman, Nicholas W.; Wheeler, Mary F.
2016-08-01
Fractures that propagate off of weak slip planes are known as wing cracks and often play important roles in both tectonic deformation and fluid flow across reservoir seals. Previous numerical models have produced the basic kinematics of wing crack openings but generally have not been able to capture fracture geometries seen in nature. Here we present both a phase-field modeling approach and a physical experiment using gelatin for a wing crack formation. By treating the fracture surfaces as diffusive zones instead of as discontinuities, the phase-field model does not require consideration of unpredictable rock properties or stress inhomogeneities around crack tips. It is shown by benchmarking the models with physical experiments that the numerical assumptions in the phase-field approach do not affect the final model predictions of wing crack nucleation and growth. With this study, we demonstrate that it is feasible to implement the formation of wing cracks in large scale phase-field reservoir models.
BENCHMARK DOSE TECHNICAL GUIDANCE DOCUMENT ...
The purpose of this document is to provide guidance for the Agency on the application of the benchmark dose approach in determining the point of departure (POD) for health effects data, whether a linear or nonlinear low dose extrapolation is used. The guidance includes discussion on computation of benchmark doses and benchmark concentrations (BMDs and BMCs) and their lower confidence limits, data requirements, dose-response analysis, and reporting requirements. This guidance is based on today's knowledge and understanding, and on experience gained in using this approach.
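Mechanically, a benchmark dose calculation fits a dose-response model and solves for the dose at which the extra risk over background equals the benchmark response (BMR). The model choice and parameter values in this sketch are hypothetical, and the lower confidence limit (BMDL) step is omitted:

```python
import math

def one_hit(dose, a, b):
    """Quantal one-hit dose-response model (an illustrative choice)."""
    return 1.0 - math.exp(-(a + b * dose))

def extra_risk(dose, a, b):
    """Extra risk over background: (P(d) - P(0)) / (1 - P(0))."""
    p0 = one_hit(0.0, a, b)
    return (one_hit(dose, a, b) - p0) / (1.0 - p0)

def benchmark_dose(a, b, bmr=0.10, hi=1e6):
    """Solve extra_risk(BMD) = bmr by bisection (extra risk is monotonic)."""
    lo_d, hi_d = 0.0, hi
    for _ in range(200):
        mid = 0.5 * (lo_d + hi_d)
        if extra_risk(mid, a, b) < bmr:
            lo_d = mid
        else:
            hi_d = mid
    return 0.5 * (lo_d + hi_d)

# Hypothetical fitted parameters; for the one-hit model the extra risk
# reduces to 1 - exp(-b*d), so BMD = -ln(1 - BMR) / b analytically.
bmd = benchmark_dose(a=0.05, b=0.02)
```

In practice the POD is the BMDL, the statistical lower bound on this dose, obtained from the fit's confidence limits rather than the point estimate alone.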
Lystrup, Robert; West, Gordon F; Ward, Matthew; Hall, Jennifer; Stephens, Mark
2015-01-01
Military medical professionals play a central role in preventing and treating obesity among America's warriors through training, medical care, and their personal example. Unfortunately, medical students in both undergraduate and graduate settings often experience declines in physical fitness. Pedometry has been demonstrated as one means of promoting fitness, with 10,000 steps/day generally accepted as a key benchmark. With this in mind, we used pedometry as an incentive during the preclinical years to encourage students to adopt a more active lifestyle. Findings suggest that participants who consistently reported meeting 10,000 steps/day maintained or improved their aerobic fitness. Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marck, Steven C. van der, E-mail: vandermarck@nrg.eu
Recent releases of three major world nuclear reaction data libraries, ENDF/B-VII.1, JENDL-4.0, and JEFF-3.1.1, have been tested extensively using benchmark calculations. The calculations were performed with the latest release of the continuous energy Monte Carlo neutronics code MCNP, i.e. MCNP6. Three types of benchmarks were used, viz. criticality safety benchmarks, (fusion) shielding benchmarks, and reference systems for which the effective delayed neutron fraction is reported. For criticality safety, more than 2000 benchmarks from the International Handbook of Criticality Safety Benchmark Experiments were used. Benchmarks from all categories were used, ranging from low-enriched uranium, compound fuel, thermal spectrum ones (LEU-COMP-THERM), to mixed uranium-plutonium, metallic fuel, fast spectrum ones (MIX-MET-FAST). For fusion shielding many benchmarks were based on IAEA specifications for the Oktavian experiments (for Al, Co, Cr, Cu, LiF, Mn, Mo, Si, Ti, W, Zr), the Fusion Neutronics Source in Japan (for Be, C, N, O, Fe, Pb), and Pulsed Sphere experiments at Lawrence Livermore National Laboratory (for ⁶Li, ⁷Li, Be, C, N, O, Mg, Al, Ti, Fe, Pb, D₂O, H₂O, concrete, polyethylene and Teflon). The new functionality in MCNP6 to calculate the effective delayed neutron fraction was tested by comparison with more than thirty measurements in widely varying systems. Among these were measurements in the Tank Critical Assembly (TCA, Japan) and IPEN/MB-01 (Brazil), both with a thermal spectrum, two cores in Masurca (France) and three cores in the Fast Critical Assembly (FCA, Japan), all with fast spectra. The performance of the three libraries, in combination with MCNP6, is shown to be good. The results for the LEU-COMP-THERM category are on average very close to the benchmark value. Also for most other categories the results are satisfactory.
Deviations from the benchmark values do occur in certain benchmark series, or in isolated cases within benchmark series. Such instances can often be related to nuclear data for specific non-fissile elements, such as C, Fe, or Gd. Indications are that the intermediate and mixed spectrum cases are less well described. The results for the shielding benchmarks are generally good, with very similar results for the three libraries in the majority of cases. Nevertheless there are, in certain cases, strong deviations between calculated and benchmark values, such as for Co and Mg. Also, the results show discrepancies at certain energies or angles for, e.g., C, N, O, Mo, and W. The functionality of MCNP6 to calculate the effective delayed neutron fraction yields very good results for all three libraries.
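Library validations of this kind are usually reported as C/E (calculated-over-experiment) ratios, with deviations quoted in pcm (10⁻⁵ in k-eff). A minimal sketch of that bookkeeping, with invented numbers rather than results from the paper:

```python
# Hypothetical k-eff results for a few benchmark cases:
# name -> (benchmark value, calculated value). Values are illustrative only.
cases = {
    "leu-comp-therm-001": (1.0000, 0.9987),
    "leu-comp-therm-002": (1.0002, 1.0011),
    "mix-met-fast-001":   (0.9998, 1.0023),
}

for name, (benchmark, calculated) in cases.items():
    c_over_e = calculated / benchmark
    deviation_pcm = 1e5 * (calculated - benchmark)   # 1 pcm = 1e-5 in k-eff
    print(f"{name}: C/E = {c_over_e:.5f} ({deviation_pcm:+.0f} pcm)")
```

Averaging C/E within a category (e.g. LEU-COMP-THERM versus MIX-MET-FAST) is what supports statements like "on average very close to the benchmark value" for one category while flagging systematic biases in another.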
Aerothermal modeling program. Phase 2, element B: Flow interaction experiment
NASA Technical Reports Server (NTRS)
Nikjooy, M.; Mongia, H. C.; Murthy, S. N. B.; Sullivan, J. P.
1987-01-01
NASA has instituted an extensive effort to improve the design process and data base for the hot section components of gas turbine engines. The purpose of element B is to establish a benchmark quality data set that consists of measurements of the interaction of circular jets with swirling flow. Such flows are typical of those that occur in the primary zone of modern annular combustion liners. Extensive computations of the swirling flows are to be compared with the measurements for the purpose of assessing the accuracy of current physical models used to predict such flows.
‘Wasteaware’ benchmark indicators for integrated sustainable waste management in cities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, David C., E-mail: waste@davidcwilson.com; Rodic, Ljiljana; Cowing, Michael J.
Highlights:
• Solid waste management (SWM) is a key utility service, but data is often lacking.
• Measuring its SWM performance helps a city establish priorities for action.
• The Wasteaware benchmark indicators measure both technical and governance aspects.
• They have been developed over 5 years and tested in more than 50 cities on 6 continents.
• They enable consistent comparison between cities and countries and monitoring of progress.
Abstract: This paper addresses a major problem in international solid waste management, which is twofold: a lack of data, and a lack of consistent data to allow comparison between cities. The paper presents an indicator set for integrated sustainable waste management (ISWM) in cities both North and South, to allow benchmarking of a city's performance, comparing cities and monitoring developments over time. It builds on pioneering work for UN-Habitat's solid waste management in the World's cities. The comprehensive analytical framework of a city's solid waste management system is divided into two overlapping 'triangles' – one comprising the three physical components, i.e. collection, recycling, and disposal, and the other comprising three governance aspects, i.e. inclusivity; financial sustainability; and sound institutions and proactive policies. The indicator set includes essential quantitative indicators as well as qualitative composite indicators. This updated and revised 'Wasteaware' set of ISWM benchmark indicators is the cumulative result of testing various prototypes in more than 50 cities around the world. This experience confirms the utility of indicators in allowing comprehensive performance measurement and comparison of both 'hard' physical components and 'soft' governance aspects; and in prioritising 'next steps' in developing a city's solid waste management system, by identifying both local strengths that can be built on and weak points to be addressed.
The Wasteaware ISWM indicators are applicable to a broad range of cities with very different levels of income and solid waste management practices. Their wide application as a standard methodology will help to fill the historical data gap.
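A qualitative composite indicator of the kind described can be sketched as aggregating per-criterion sub-scores into a percentage and banding the result. The scoring scale and band labels below are hypothetical, not the published Wasteaware rubric:

```python
def composite_score(sub_scores, max_per_criterion=20):
    """Aggregate per-criterion sub-scores into a percentage and a band.

    The 0-20 sub-score scale and the band cut-offs are assumptions for
    illustration, not the actual Wasteaware methodology.
    """
    percent = 100.0 * sum(sub_scores) / (max_per_criterion * len(sub_scores))
    bands = [(81, "HIGH"), (61, "MEDIUM/HIGH"), (41, "MEDIUM"),
             (21, "LOW/MEDIUM"), (0, "LOW")]
    for threshold, label in bands:
        if percent >= threshold:
            return percent, label

# Five hypothetical sub-scores for one governance aspect of one city:
score, band = composite_score([15, 12, 18, 9, 14])
```

Reporting a band rather than a raw number is what makes such composites comparable across cities with very different data quality.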
2-D Circulation Control Airfoil Benchmark Experiments Intended for CFD Code Validation
NASA Technical Reports Server (NTRS)
Englar, Robert J.; Jones, Gregory S.; Allan, Brian G.; Lin, John C.
2009-01-01
A current NASA Research Announcement (NRA) project being conducted by Georgia Tech Research Institute (GTRI) personnel and NASA collaborators includes the development of Circulation Control (CC) blown airfoils to improve subsonic aircraft high-lift and cruise performance. The emphasis of this program is the development of CC active flow control concepts for high-lift augmentation, drag control, and cruise efficiency. A collaboration in this project includes work by NASA research engineers, whereas CFD validation and flow physics experimental research are part of NASA's systematic approach to developing design and optimization tools for CC applications to fixed-wing aircraft. The design space for CESTOL-type aircraft is focusing on geometries that depend on advanced flow control technologies that include Circulation Control aerodynamics. The ability to consistently predict advanced aircraft performance requires improvements in design tools to include these advanced concepts. Validation of these tools will be based on experimental methods applied to complex flows that go beyond conventional aircraft modeling techniques. This paper focuses on recent and ongoing benchmark high-lift experiments and CFD efforts intended to provide 2-D CFD validation data sets related to NASA's Cruise Efficient Short Take Off and Landing (CESTOL) study. Both the experimental data and related CFD predictions are discussed.
The Isothermal Dendritic Growth Experiment Archive
NASA Astrophysics Data System (ADS)
Koss, Matthew
2009-03-01
The growth of dendrites is governed by the interplay between two simple and familiar processes: the irreversible diffusion of energy, and the reversible work done in the formation of new surface area. To advance our understanding of these processes, NASA sponsored a project that flew on the Space Shuttle Columbia in 1994, 1996, and 1997 to record and analyze benchmark data in an apparent-microgravity "laboratory." In this laboratory, energy transfer by gravity-driven convection was essentially eliminated, and one could test independently, for the first time, both components of dendritic growth theory. The analysis of these data shows that although the diffusion of energy can be properly accounted for, the results from interfacial physics appear to be in disagreement, and alternate models should receive increased attention. Unfortunately, currently and for the foreseeable future, there is no access or financial support to develop and conduct additional experiments of this type. However, the benchmark data of 35mm photonegatives, video, and all supporting instrument data are now available at the IDGE Archive at the College of the Holy Cross. These data may still have considerable relevance to researchers working specifically with dendritic growth, and more generally to those working in the synthesis, growth and processing of materials, multiscale computational modeling, pattern formation, and systems far from equilibrium.
A chemical EOR benchmark study of different reservoir simulators
NASA Astrophysics Data System (ADS)
Goudarzi, Ali; Delshad, Mojdeh; Sepehrnoori, Kamy
2016-09-01
Interest in chemical EOR processes has intensified in recent years due to the advancements in chemical formulations and injection techniques. Injecting polymer (P), surfactant/polymer (SP), and alkaline/surfactant/polymer (ASP) solutions are techniques for improving sweep and displacement efficiencies with the aim of improving oil production in both secondary and tertiary floods. There has been great interest in chemical flooding recently for different challenging situations. These include high temperature reservoirs, formations with extreme salinity and hardness, naturally fractured carbonates, and sandstone reservoirs with heavy and viscous crude oils. More oil reservoirs are reaching maturity, where secondary polymer floods and tertiary surfactant methods have become increasingly important. This significance has added to the industry's interest in using reservoir simulators as tools for reservoir evaluation and management to minimize costs and increase the process efficiency. Reservoir simulators with special features are needed to represent the coupled chemical and physical processes present in chemical EOR. The simulators need to be first validated against well-controlled lab and pilot scale experiments to reliably predict the full field implementations. The available data from the laboratory scale include 1) phase behavior and rheological data; and 2) results of secondary and tertiary coreflood experiments for P, SP, and ASP floods under reservoir conditions, i.e. chemical retentions, pressure drop, and oil recovery. Data collected from corefloods are used as benchmark tests comparing numerical reservoir simulators with chemical EOR modeling capabilities, such as STARS of CMG, ECLIPSE-100 of Schlumberger, and REVEAL of Petroleum Experts. The research UTCHEM simulator from The University of Texas at Austin is also included, since it has been the benchmark for chemical flooding simulation for over 25 years.
The results of this benchmark comparison will be utilized to improve chemical design for field-scale studies using commercial simulators. The benchmark tests illustrate the potential of commercial simulators for chemical flooding projects and provide a comprehensive table of strengths and limitations of each simulator for a given chemical EOR process. Mechanistic simulations of chemical EOR processes will provide predictive capability and can aid in optimization of the field injection projects. The objective of this paper is not to compare the computational efficiency and solution algorithms; it only focuses on the process modeling comparison.
Luecking, C T; Hennink-Kaminski, H; Ihekweazu, C; Vaughn, A; Mazzucca, S; Ward, D S
2017-12-01
Social marketing is a promising planning approach for influencing voluntary lifestyle behaviours, but its application to nutrition and physical activity interventions in the early care and education setting remains unknown. PubMed, ISI Web of Science, PsycInfo and the Cumulative Index of Nursing and Allied Health were systematically searched to identify interventions targeting nutrition and/or physical activity behaviours of children enrolled in early care centres between 1994 and 2016. Content analysis methods were used to capture information reflecting eight social marketing benchmark criteria. The review included 135 articles representing 77 interventions. Two interventions incorporated all eight benchmark criteria, but the majority included fewer than four. Each intervention included behaviour and methods mix criteria, and more than half identified audience segments. Only one-third of interventions incorporated customer orientation, theory, exchange and insight. Only six interventions addressed competing behaviours. We did not find statistical significance for the effectiveness of interventions on child-level diet, physical activity or anthropometric outcomes based on the number of benchmark criteria used. This review highlights opportunities to apply social marketing to obesity prevention interventions in early care centres. Social marketing could be an important strategy for early childhood obesity prevention efforts, and future research investigations into its effects are warranted. © 2017 World Obesity Federation.
Benchmarking gate-based quantum computers
NASA Astrophysics Data System (ADS)
Michielsen, Kristel; Nocon, Madita; Willsch, Dennis; Jin, Fengping; Lippert, Thomas; De Raedt, Hans
2017-11-01
With the advent of public access to small gate-based quantum processors, it becomes necessary to develop a benchmarking methodology such that independent researchers can validate the operation of these processors. We explore the usefulness of a number of simple quantum circuits as benchmarks for gate-based quantum computing devices and show that circuits performing identity operations are very simple, scalable and sensitive to gate errors and are therefore very well suited for this task. We illustrate the procedure by presenting benchmark results for the IBM Quantum Experience, a cloud-based platform for gate-based quantum computing.
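The identity-circuit idea can be illustrated with a direct state-vector simulation: a sequence of rotations that should compose to the identity returns the qubit to |0⟩ exactly, while a small coherent over-rotation on every gate degrades the return probability with depth. This sketch uses plain NumPy rather than any particular quantum SDK, and the error model (a fixed over-rotation) is an illustrative assumption:

```python
import numpy as np

def rx(theta):
    """Single-qubit rotation about the X axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -1j * s], [-1j * s, c]])

def identity_circuit_fidelity(depth, over_rotation):
    """Probability of reading |0> after `depth` X-rotation pairs that
    should compose to the identity, each gate over-rotated slightly."""
    state = np.array([1.0 + 0j, 0.0])
    for _ in range(depth):
        state = rx(np.pi / 2 + over_rotation) @ state
        state = rx(-np.pi / 2 + over_rotation) @ state
    return abs(state[0]) ** 2

perfect = identity_circuit_fidelity(depth=20, over_rotation=0.0)
noisy = identity_circuit_fidelity(depth=20, over_rotation=0.01)
```

Because rotations about a common axis add, each pair contributes a residual rotation of twice the over-rotation, so the deviation from unit fidelity grows systematically with depth; this sensitivity to small gate errors is precisely what makes identity circuits useful benchmarks.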
Aguilar Cordero, M J; Sánchez López, A M; Rodríguez Blanque, R; Noack Segovia, J P; Pozo Cano, M D; López-Contreras, G; Mur Villar, N
2014-10-01
Regular physical activity is known to be very beneficial to health. While it is important at all stages of life, during pregnancy doubts may arise about the suitability of physical exercise, as well as the type of activity, its frequency, intensity and duration. To analyse major studies on the influence of physical activity on maternal and foetal parameters. Systematic review of physical activity programmes for pregnant women and the results achieved, during pregnancy, childbirth and postpartum. 45 items were identified through an automated database search in PubMed, Scopus and Google Scholar, carried out from October 2013 to March 2014. In selecting the items, the criteria applied included the usefulness and relevance of the subject matter and the credibility or experience of the research study authors. The internal and external validity of each of the articles reviewed was taken into account. The results of the review highlight the importance of physical activity during pregnancy, and show that the information currently available can serve as an initial benchmark for further investigation into the impact of regular physical exercise, in an aquatic environment, on maternal-foetal health. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
NASA Astrophysics Data System (ADS)
Li, Zhi-Guo; Chen, Qi-Feng; Gu, Yun-Jun; Zheng, Jun; Chen, Xiang-Rong
2016-10-01
The accurate hydrodynamic description of an event or system that addresses the equations of state, phase transitions, dissociations, ionizations, and compressions, determines how materials respond to a wide range of physical environments. To understand dense matter behavior in extreme conditions requires the continual development of diagnostic methods for accurate measurements of the physical parameters. Here, we present a comprehensive diagnostic technique that comprises optical pyrometry, velocity interferometry, and time-resolved spectroscopy. This technique was applied to shock compression experiments of dense gaseous deuterium-helium mixtures driven via a two-stage light gas gun. The advantage of this approach lies in providing measurements of multiple physical parameters in a single experiment, such as light radiation histories, particle velocity profiles, and time-resolved spectra, which enables simultaneous measurements of shock velocity, particle velocity, pressure, density, and temperature and expands understanding of dense high pressure shock situations. The combination of multiple diagnostics also allows different experimental observables to be measured and cross-checked. Additionally, it implements an accurate measurement of the principal Hugoniots of deuterium-helium mixtures, which provides a benchmark for the impedance matching measurement technique.
Grid-Enabled High Energy Physics Research using a Beowulf Cluster
NASA Astrophysics Data System (ADS)
Mahmood, Akhtar
2005-04-01
At Edinboro University of Pennsylvania, we have built an 8-node, 25 Gflops Beowulf cluster with 2.5 TB of disk storage space to carry out grid-enabled, data-intensive high energy physics research for the ATLAS experiment via Grid3. We will describe how we built and configured our cluster, which we have named the Sphinx Beowulf Cluster, and will present the results of our cluster benchmark studies and the run-time plots of several parallel application codes. Once fully functional, the cluster will be part of Grid3 [www.ivdgl.org/grid3]. The current ATLAS simulation grid application models the entire physical process, from the proton-proton collisions and the detector's response to the collision debris, through the complete reconstruction of the event from analyses of these responses. The end result is a detailed set of data that simulates the real physical collision event inside a particle detector. The Grid is the new IT infrastructure for 21st-century science: a new computing paradigm poised to transform the practice of large-scale, data-intensive research in science and engineering. The Grid will allow scientists worldwide to view and analyze the huge amounts of data flowing from large-scale experiments in high energy physics, and is expected to bring together geographically and organizationally dispersed computational resources, such as CPUs, storage systems, communication systems, and data sources.
The PAC-MAN model: Benchmark case for linear acoustics in computational physics
NASA Astrophysics Data System (ADS)
Ziegelwanger, Harald; Reiter, Paul
2017-10-01
Benchmark cases in the field of computational physics, on the one hand, have to contain a certain complexity to test numerical edge cases and, on the other hand, require the existence of an analytical solution, because an analytical solution allows the exact quantification of the accuracy of a numerical simulation method. This dilemma causes a need for analytical sound field formulations of complex acoustic problems. A well known example of such a benchmark case for harmonic linear acoustics is the "Cat's Eye model", which describes the three-dimensional sound field radiated from a sphere with a missing octant analytically. In this paper, a benchmark case for two-dimensional (2D) harmonic linear acoustic problems, viz., the "PAC-MAN model", is proposed. The PAC-MAN model describes the radiated and scattered sound field around an infinitely long cylinder with a cut-out sector of variable angular width. While the analytical calculation of the 2D sound field allows different angular cut-out widths and arbitrarily positioned line sources, the computational cost associated with the solution of this problem is similar to a 1D problem because of a modal formulation of the sound field in the PAC-MAN model.
The MCNP6 Analytic Criticality Benchmark Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
2016-06-16
Analytical benchmarks provide an invaluable tool for verifying computer codes used to simulate neutron transport. Several collections of analytical benchmark problems [1-4] are used routinely in the verification of production Monte Carlo codes such as MCNP® [5,6]. Verification of a computer code is a necessary prerequisite to the more complex validation process. The verification process confirms that a code performs its intended functions correctly. The validation process involves determining the absolute accuracy of code results vs. nature. In typical validations, results are computed for a set of benchmark experiments using a particular methodology (code, cross-section data with uncertainties, and modeling) and compared to the measured results from the set of benchmark experiments. The validation process determines bias, bias uncertainty, and possibly additional margins. Verification is generally performed by the code developers, while validation is generally performed by code users for a particular application space. The VERIFICATION_KEFF suite of criticality problems [1,2] was originally a set of 75 criticality problems found in the literature for which exact analytical solutions are available. Even though the spatial and energy detail is necessarily limited in analytical benchmarks, typically to a few regions or energy groups, the exact solutions obtained can be used to verify that the basic algorithms, mathematics, and methods used in complex production codes perform correctly. The present work has focused on revisiting this benchmark suite. A thorough review of the problems resulted in discarding some of them as not suitable for MCNP benchmarking. Many of the remaining problems were reformulated to permit execution in either multigroup mode or in the normal continuous-energy mode for MCNP. Execution of the benchmarks in continuous-energy mode provides a significant advance to MCNP verification methods.
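The verification idea, comparing a code's computed eigenvalue against an exact analytical answer, can be illustrated with the simplest benchmark of this kind: a bare homogeneous slab in one-group diffusion theory. The cross-section values below are made up for the sketch; they are not data from the MCNP6 suite:

```python
import math

# Illustrative one-group constants (assumptions for this sketch only).
nu_sigma_f = 0.17  # nu * Sigma_f, fission production [1/cm]
sigma_a = 0.15     # absorption cross section [1/cm]
D = 0.9            # diffusion coefficient [cm]

def k_eff_slab(width_cm):
    """Exact one-group diffusion k_eff of a bare slab:
    k = nu*Sigma_f / (Sigma_a + D*B^2), with geometric buckling
    B = pi / width (extrapolated width)."""
    B = math.pi / width_cm
    return nu_sigma_f / (sigma_a + D * B**2)

def critical_width():
    """Width at which k_eff = 1, from B^2 = (nu*Sigma_f - Sigma_a) / D."""
    return math.pi / math.sqrt((nu_sigma_f - sigma_a) / D)

# A transport-code result for this problem can be verified against the
# exact answer: at the critical width, the eigenvalue must equal 1.
assert abs(k_eff_slab(critical_width()) - 1.0) < 1e-12
```

The spatial and energy detail is trivial, exactly as the abstract notes, yet any error in a production code's basic eigenvalue machinery would show up as a deviation from the closed-form answer.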
Combining Phase Identification and Statistic Modeling for Automated Parallel Benchmark Generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jin, Ye; Ma, Xiaosong; Liu, Qing Gary
2015-01-01
Parallel application benchmarks are indispensable for evaluating/optimizing HPC software and hardware. However, it is very challenging and costly to obtain high-fidelity benchmarks reflecting the scale and complexity of state-of-the-art parallel applications. Hand-extracted synthetic benchmarks are time- and labor-intensive to create. Real applications themselves, while offering the most accurate performance evaluation, are expensive to compile, port, and reconfigure, and are often plainly inaccessible due to security or ownership concerns. This work contributes APPRIME, a novel tool for trace-based automatic parallel benchmark generation. Taking as input standard communication-I/O traces of an application's execution, it couples accurate automatic phase identification with statistical regeneration of event parameters to create compact, portable, and to some degree reconfigurable parallel application benchmarks. Experiments with four NAS Parallel Benchmarks (NPB) and three real scientific simulation codes confirm the fidelity of APPRIME benchmarks: they retain the original applications' performance characteristics, in particular the relative performance across platforms.
New features and improved uncertainty analysis in the NEA nuclear data sensitivity tool (NDaST)
NASA Astrophysics Data System (ADS)
Dyrda, J.; Soppera, N.; Hill, I.; Bossant, M.; Gulliford, J.
2017-09-01
Following the release and initial testing period of the NEA's Nuclear Data Sensitivity Tool (NDaST) [1], new features have been designed and implemented in order to expand its uncertainty analysis capabilities. The aim is to provide a free online tool for integral benchmark testing that is both efficient and comprehensive, meeting the needs of the nuclear data and benchmark testing communities. New features include access to P1 sensitivities for the neutron scattering angular distribution [2] and constrained chi sensitivities for the prompt fission neutron energy sampling. Both of these are compatible with covariance data accessed via the JANIS nuclear data software, enabling propagation of the resultant uncertainties in keff to a large series of integral experiment benchmarks. These capabilities are available using a number of different covariance libraries, e.g., ENDF/B, JEFF, JENDL and TENDL, allowing comparison across the broad range of results that can be obtained. The IRPhE database of reactor physics measurements is now also accessible within the tool, in addition to the criticality benchmarks from ICSBEP. Other improvements include the ability to determine and visualise the energy dependence of a given calculated result in order to better identify specific regions of importance or high uncertainty contribution. Sorting and statistical analysis of the selected benchmark suite are now also provided. Examples of the plots generated by the software are included to illustrate such capabilities. Finally, a number of analytical expressions, for example Maxwellian and Watt fission spectra, will be included. This will allow the analyst to determine the impact of varying such distributions within the data evaluation, either through adjustment of parameters within the expressions, or by comparison to a more general probability distribution fitted to measured data. The impact of such changes is verified through calculations which are compared to a 'direct' measurement found by adjustment of the original ENDF format file.
Ultracool dwarf benchmarks with Gaia primaries
NASA Astrophysics Data System (ADS)
Marocco, F.; Pinfield, D. J.; Cook, N. J.; Zapatero Osorio, M. R.; Montes, D.; Caballero, J. A.; Gálvez-Ortiz, M. C.; Gromadzki, M.; Jones, H. R. A.; Kurtev, R.; Smart, R. L.; Zhang, Z.; Cabrera Lavers, A. L.; García Álvarez, D.; Qi, Z. X.; Rickard, M. J.; Dover, L.
2017-10-01
We explore the potential of Gaia for the field of benchmark ultracool/brown dwarf companions, and present the results of an initial search for metal-rich/metal-poor systems. A simulated population of resolved ultracool dwarf companions to Gaia primary stars is generated and assessed. Of the order of ˜24 000 companions should be identifiable outside of the Galactic plane (|b| > 10 deg) with large-scale ground- and space-based surveys including late M, L, T and Y types. Our simulated companion parameter space covers 0.02 ≤ M/M⊙ ≤ 0.1, 0.1 ≤ age/Gyr ≤ 14 and -2.5 ≤ [Fe/H] ≤ 0.5, with systems required to have a false alarm probability < 10^-4, based on projected separation and expected constraints on common distance, common proper motion and/or common radial velocity. Within this bulk population, we identify smaller target subsets of rarer systems whose collective properties still span the full parameter space of the population, as well as systems containing primary stars that are good age calibrators. Our simulation analysis leads to a series of recommendations for candidate selection and observational follow-up that could identify ˜500 diverse Gaia benchmarks. As a test of the veracity of our methodology and simulations, our initial search uses UKIRT Infrared Deep Sky Survey and Sloan Digital Sky Survey to select secondaries, with the parameters of primaries taken from Tycho-2, Radial Velocity Experiment, Large sky Area Multi-Object fibre Spectroscopic Telescope and Tycho-Gaia Astrometric Solution. We identify and follow up 13 new benchmarks. These include M8-L2 companions, with metallicity constraints ranging in quality, but robust in the range -0.39 ≤ [Fe/H] ≤ +0.36, and with projected physical separation in the range 0.6 < s/kau < 76. Going forward, Gaia offers a very high yield of benchmark systems, from which diverse subsamples may be able to calibrate a range of foundational ultracool/sub-stellar theory and observation.
Experimental power density distribution benchmark in the TRIGA Mark II reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Snoj, L.; Stancar, Z.; Radulovic, V.
2012-07-01
In order to improve the power calibration process and to benchmark the existing computational model of the TRIGA Mark II reactor at the Jožef Stefan Institute (JSI), a bilateral project was started as part of the agreement between the French Commissariat à l'énergie atomique et aux énergies alternatives (CEA) and the Ministry of Higher Education, Science and Technology of Slovenia. One of the objectives of the project was to analyze and improve the power calibration process of the JSI TRIGA reactor (procedural improvement and uncertainty reduction) by using absolutely calibrated CEA fission chambers (FCs). This is one of the few available power density distribution benchmarks for testing not only the fission rate distribution but also the absolute values of the fission rates. Our preliminary calculations indicate that the total experimental uncertainty of the measured reaction rate is sufficiently low that the experiments could be considered as benchmark experiments. (authors)
Fast Neutron Spectrum Potassium Worth for Space Power Reactor Design Validation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bess, John D.; Marshall, Margaret A.; Briggs, J. Blair
2015-03-01
A variety of critical experiments were constructed of enriched uranium metal (Oralloy) during the 1960s and 1970s at the Oak Ridge Critical Experiments Facility (ORCEF) in support of criticality safety operations at the Y-12 Plant. The purposes of these experiments included the evaluation of storage, casting, and handling limits for the Y-12 Plant and providing data for verification of calculation methods and cross sections for nuclear criticality safety applications. These included solid cylinders of various diameters, annuli of various inner and outer diameters, two and three interacting cylinders of various diameters, and graphite- and polyethylene-reflected cylinders and annuli. Of the hundreds of delayed critical experiments, one consisted of uranium metal annuli surrounding a potassium-filled stainless steel can. The outer diameter of the annuli was approximately 13 inches (33.02 cm), with an inner diameter of 7 inches (17.78 cm). The diameter of the stainless steel can was 7 inches (17.78 cm). The critical height of the configurations was approximately 5.6 inches (14.224 cm). The uranium annulus consisted of multiple stacked rings, each with a radial thickness of 1 inch (2.54 cm) and varying heights. A companion measurement was performed using empty stainless steel cans; the primary purpose of these experiments was to test the fast neutron cross sections of potassium, as it was a candidate coolant in some early space power reactor designs. The experimental measurements were performed on July 11, 1963, by J. T. Mihalczo and M. S. Wyatt (Ref. 1), with additional information in the corresponding logbook. Unreflected and unmoderated experiments with the same set of highly enriched uranium metal parts were performed at the Oak Ridge Critical Experiments Facility in the 1960s and are evaluated in the International Handbook of Evaluated Criticality Safety Benchmark Experiments (ICSBEP Handbook) with the identifier HEU-MET-FAST-051. Thin graphite-reflected (2 inches or less) experiments, also using the same set of highly enriched uranium metal parts, are evaluated in HEU-MET-FAST-071. Polyethylene-reflected configurations are evaluated in HEU-MET-FAST-076. A stack of highly enriched metal discs with a thick beryllium top reflector is evaluated in HEU-MET-FAST-069, and two additional highly enriched uranium annuli with beryllium cores are evaluated in HEU-MET-FAST-059. Both detailed and simplified model specifications are provided in this evaluation. Both of these fast-neutron-spectrum assemblies were determined to be acceptable benchmark experiments. The calculated eigenvalues for both the detailed and the simple benchmark models are within ~0.26% of the benchmark values for Configuration 1 (calculations performed using MCNP6 with ENDF/B-VII.1 neutron cross section data), but under-calculate the benchmark values by ~7σ because the uncertainty in the benchmark is very small: ~0.0004 (1σ); for Configuration 2, the under-calculation is ~0.31% and ~8σ. Comparison of detailed and simple model calculations for the potassium worth measurement and potassium mass coefficient yields results approximately 70-80% lower (~6σ to 10σ) than the benchmark values for the various nuclear data libraries utilized. Both the potassium worth and mass coefficient are also deemed to be acceptable benchmark experiment measurements.
NASA Technical Reports Server (NTRS)
Stefanescu, D. M.; Catalina, A. V.; Juretzko, Frank R.; Sen, Subhayu; Curreri, P. A.
2003-01-01
The objectives of the work on Particle Engulfment and Pushing by Solidifying Interfaces (PEP) include: 1) to obtain a fundamental understanding of the physics of particle pushing and engulfment, 2) to develop mathematical models to describe the phenomenon, and 3) to perform critical experiments in the microgravity environment of space to provide benchmark data for model validation. Successful completion of this project will yield vital information relevant to a diverse range of terrestrial applications. As PEP is a long-term research effort, this report focuses on advances in the theoretical treatment of the solid/liquid interface interaction with an approaching particle, experimental validation of some aspects of the developed models, and the design of future experiments to be performed on board the International Space Station.
A Monte Carlo software for the 1-dimensional simulation of IBIC experiments
NASA Astrophysics Data System (ADS)
Forneris, J.; Jakšić, M.; Pastuović, Ž.; Vittone, E.
2014-08-01
The ion beam induced charge (IBIC) microscopy is a valuable tool for the analysis of the electronic properties of semiconductors. In this work, a recently developed Monte Carlo approach for the simulation of IBIC experiments is presented, along with a self-standing software package equipped with a graphical user interface. The method is based on the probabilistic interpretation of the excess charge carrier continuity equations, and it offers the end user full control not only of the physical properties ruling the induced charge formation mechanism (i.e., mobility, lifetime, electrostatics, device geometry), but also of the relevant experimental conditions (ionization profiles, beam dispersion, electronic noise) affecting the measurement of the IBIC pulses. Moreover, the software implements a novel model for the quantitative evaluation of the radiation damage effects on the charge collection efficiency degradation of ion-beam-irradiated devices. The reliability of the model implementation is then validated against a benchmark IBIC experiment.
Adiabatic model and design of a translating field reversed configuration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Intrator, T. P.; Siemon, R. E.; Sieck, P. E.
We apply an adiabatic evolution model to predict the behavior of a field reversed configuration (FRC) during decompression and translation, as well as during boundary compression. Semi-empirical scaling laws, which were developed and benchmarked primarily for collisionless FRCs, are expected to remain valid even in the collisional regime of the FRX-L experiment. We use this approach to outline the design implications for FRX-L, the high-density translated-FRC experiment at Los Alamos National Laboratory. A conical theta coil is used to accelerate the FRC to the largest practical velocity so it can enter a mirror-bounded compression region, where it must be a suitable target for a magnetized target fusion (MTF) implosion. FRX-L provides the physics basis for the integrated MTF plasma compression experiment at the Shiva-Star pulsed power facility at the Kirtland Air Force Research Laboratory, where the FRC will be compressed inside a flux-conserving cylindrical shell.
Tablet Technology to Monitor Physical Education IEP Goals and Benchmarks
ERIC Educational Resources Information Center
Lavay, Barry; Sakai, Joyce; Ortiz, Cris; Roth, Kristi
2015-01-01
The Individual with Disabilities Education Act (IDEA) mandates that all children who are eligible for special education services receive an individualized education program (IEP). Adapted physical education (APE) professionals who teach physical education to children with disabilities are challenged with how to best collect and monitor student…
Experimental and theoretical studies of nanofluid thermal conductivity enhancement: a review
2011-01-01
Nanofluids, i.e., well-dispersed (metallic) nanoparticles at low volume fractions in liquids, may enhance the mixture's thermal conductivity, knf, over the base-fluid values. Thus, they are potentially useful for advanced cooling of micro-systems. Focusing mainly on dilute suspensions of well-dispersed spherical nanoparticles in water or ethylene glycol, recent experimental observations, associated measurement techniques, and new theories as well as useful correlations have been reviewed. It is evident that key questions still linger concerning the best nanoparticle-and-liquid pairing and conditioning, reliable measurements of achievable knf values, and easy-to-use, physically sound computer models which fully describe the particle dynamics and heat transfer of nanofluids. At present, experimental data and measurement methods are lacking consistency. In fact, debates on whether the anomalous enhancement is real or not endure, as well as discussions on what are repeatable correlations between knf and temperature, nanoparticle size/shape, and aggregation state. Clearly, benchmark experiments are needed, using the same nanofluids subject to different measurement methods. Such outcomes would validate new, minimally intrusive techniques and verify the reproducibility of experimental results. Dynamic knf models, assuming non-interacting metallic nano-spheres, postulate an enhancement above the classical Maxwell theory and thereby provide potentially additional physical insight. Clearly, it will be necessary to consider not only one possible mechanism but combine several mechanisms and compare predictive results to new benchmark experimental data sets. PMID:21711739
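The classical Maxwell theory referenced here as the baseline for judging "anomalous" enhancement can be sketched directly; the particle and base-fluid conductivities below are illustrative values, not measured data from the review:

```python
def maxwell_knf(k_f, k_p, phi):
    """Classical Maxwell (Maxwell-Garnett) effective thermal conductivity
    of a dilute suspension of well-dispersed, non-interacting spheres at
    volume fraction phi in a base fluid of conductivity k_f."""
    num = k_p + 2 * k_f + 2 * phi * (k_p - k_f)
    den = k_p + 2 * k_f - phi * (k_p - k_f)
    return k_f * num / den

# Example: 1 vol% of oxide-like particles (k_p ~ 20 W/m-K, assumed) in
# water (k_f ~ 0.6 W/m-K).  Maxwell predicts only a modest enhancement,
# which is why larger measured values were labeled "anomalous".
k = maxwell_knf(0.6, 20.0, 0.01)
enhancement = (k / 0.6 - 1) * 100  # percent over the base fluid
assert 2.5 < enhancement < 3.5     # roughly 3% at 1 vol%
```

Any measured knf well above this curve, at the same volume fraction, is what the debated "anomalous enhancement" refers to.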
Revisiting Yasinsky and Henry's benchmark using modern nodal codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feltus, M.A.; Becker, M.W.
1995-12-31
The numerical experiments analyzed by Yasinsky and Henry are quite trivial by comparison with today's standards because they used the finite difference code WIGLE for their benchmark. Also, this problem is a simple slab (one-dimensional) case with no feedback mechanisms. This research attempts to obtain STAR (Ref. 2) and NEM (Ref. 3) code results in order to produce a more modern kinetics benchmark with results comparable to WIGLE's.
Develop applications based on android: Teacher Engagement Control of Health (TECH)
NASA Astrophysics Data System (ADS)
Sasmoko; Manalu, S. R.; Widhoyoko, S. A.; Indrianti, Y.; Suparto
2018-03-01
The physical and psychological condition of teachers is very important because it helps determine the realization of a positive, productive school climate in which they can practice their profession optimally. This research extends earlier work on the design of the ITEI application, which profiles teacher engagement in Indonesia; to optimize teachers' condition, an application is needed that can monitor their health both physically and psychologically. The research method used is the neuroresearch method combined with the development of the IT system design for TECH, which includes the server design, the database, and the Android TECH application interface. The study yielded 1) mental health benchmarks, 2) physical health benchmarks, and 3) the design of the Android application for Teacher Engagement Control of Health (TECH).
Deterministic Modeling of the High Temperature Test Reactor with DRAGON-HEXPEDITE
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. Ortensi; M.A. Pope; R.M. Ferrer
2010-10-01
The Idaho National Laboratory (INL) is tasked with the development of reactor physics analysis capability for the Next Generation Nuclear Plant (NGNP) project. In order to examine the INL's current prismatic reactor analysis tools, the project is conducting a benchmark exercise based on modeling the High Temperature Test Reactor (HTTR). This exercise entails the development of a model for the initial criticality, a 19-fuel-column thin annular core, and the fully loaded core critical condition with 30 fuel columns. Special emphasis is devoted to physical phenomena and artifacts in the HTTR that are similar to phenomena and artifacts in the NGNP base design. The DRAGON code is used in this study since it offers significant ease and versatility in modeling prismatic designs. DRAGON can generate transport solutions via Collision Probability (CP), Method of Characteristics (MOC) and Discrete Ordinates (Sn) methods. A fine-group cross-section library based on the SHEM 281 energy structure is used in the DRAGON calculations. The results from this study show reasonable agreement in the calculation of the core multiplication factor with the MC methods, but a consistent bias of 2-3% with the experimental values is obtained. This systematic error has also been observed in other HTTR benchmark efforts and is well documented in the literature. The ENDF/B-VII graphite and U-235 cross sections appear to be the main source of the error. The isothermal temperature coefficients calculated with the fully loaded core configuration agree well with other benchmark participants but are 40% higher than the experimental values. This discrepancy with the measurement partially stems from the fact that during the experiments the control rods were adjusted to maintain criticality, whereas in the model the rod positions were fixed. In addition, this work includes a brief study of a cross-section generation approach that seeks to decouple the domain in order to account for neighbor effects. This spectral interpenetration is a dominant effect in annular HTR physics. This analysis methodology should be further explored in order to reduce the error that is systematically propagated in the traditional generation of cross sections.
A spectral, quasi-cylindrical and dispersion-free Particle-In-Cell algorithm
Lehe, Remi; Kirchen, Manuel; Andriyash, Igor A.; ...
2016-02-17
We propose a spectral Particle-In-Cell (PIC) algorithm that is based on the combination of a Hankel transform and a Fourier transform. For physical problems that have close-to-cylindrical symmetry, this algorithm can be much faster than full 3D PIC algorithms. In addition, unlike standard finite-difference PIC codes, the proposed algorithm is free of spurious numerical dispersion in vacuum. This algorithm is benchmarked in several situations that are of interest for laser-plasma interactions. These benchmarks show that it avoids a number of numerical artifacts that would otherwise affect the physics in a standard PIC algorithm, including the zero-order numerical Cherenkov effect.
Validation of the WIMSD4M cross-section generation code with benchmark results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deen, J.R.; Woodruff, W.L.; Leal, L.E.
1995-01-01
The WIMSD4 code has been adopted for cross-section generation in support of the Reduced Enrichment Research and Test Reactor (RERTR) program at Argonne National Laboratory (ANL). Subsequently, the code has undergone several updates, and significant improvements have been achieved. The capability of generating group-collapsed micro- or macroscopic cross sections from the ENDF/B-V library and the more recent evaluation, ENDF/B-VI, in the ISOTXS format makes the modified version of the WIMSD4 code, WIMSD4M, very attractive, not only for the RERTR program, but also for the reactor physics community. The intent of the present paper is to validate the WIMSD4M cross-section libraries for reactor modeling of fresh-water-moderated cores. The results of calculations performed with multigroup cross-section data generated with the WIMSD4M code are compared against experimental results. These results correspond to calculations carried out with thermal reactor benchmarks of the Oak Ridge National Laboratory (ORNL) unreflected HEU critical spheres, the TRX LEU critical experiments, and calculations of a modified Los Alamos HEU D2O-moderated benchmark critical system. The benchmark calculations were performed with the discrete-ordinates transport code TWODANT, using WIMSD4M cross-section data. Transport calculations using the XSDRNPM module of the SCALE code system are also included. In addition to transport calculations, diffusion calculations with the DIF3D code were also carried out, since the DIF3D code is used in the RERTR program for reactor analysis and design. For completeness, Monte Carlo results of calculations performed with the VIM and MCNP codes are also presented.
Validation of the WIMSD4M cross-section generation code with benchmark results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leal, L.C.; Deen, J.R.; Woodruff, W.L.
1995-02-01
The WIMSD4 code has been adopted for cross-section generation in support of the Reduced Enrichment for Research and Test Reactors (RERTR) program at Argonne National Laboratory (ANL). Subsequently, the code has undergone several updates, and significant improvements have been achieved. The capability of generating group-collapsed micro- or macroscopic cross sections from the ENDF/B-V library and the more recent evaluation, ENDF/B-VI, in the ISOTXS format makes the modified version of the WIMSD4 code, WIMSD4M, very attractive, not only for the RERTR program, but also for the reactor physics community. The intent of the present paper is to validate the procedure to generate cross-section libraries for reactor analyses and calculations utilizing the WIMSD4M code. To do so, the results of calculations performed with group cross-section data generated with the WIMSD4M code are compared against experimental results. These results correspond to calculations carried out with thermal reactor benchmarks of the Oak Ridge National Laboratory (ORNL) unreflected critical spheres, the TRX critical experiments, and calculations of a modified Los Alamos highly enriched, heavy-water-moderated benchmark critical system. The benchmark calculations were performed with the discrete-ordinates transport code TWODANT, using WIMSD4M cross-section data. Transport calculations using the XSDRNPM module of the SCALE code system are also included. In addition to transport calculations, diffusion calculations with the DIF3D code were also carried out, since the DIF3D code is used in the RERTR program for reactor analysis and design. For completeness, Monte Carlo results of calculations performed with the VIM and MCNP codes are also presented.
Use of integral experiments in support to the validation of JEFF-3.2 nuclear data evaluation
NASA Astrophysics Data System (ADS)
Leclaire, Nicolas; Cochet, Bertrand; Jinaphanh, Alexis; Haeck, Wim
2017-09-01
For many years now, IRSN has developed its own continuous-energy Monte Carlo capability, which allows testing various nuclear data libraries. To that end, a validation database of 1136 experiments was built from cases used for the validation of the APOLLO2-MORET 5 multigroup route of the CRISTAL V2.0 package. In this paper, the keff values obtained for more than 200 benchmarks using the JEFF-3.1.1 and JEFF-3.2 libraries are compared to the benchmark keff values, and the main discrepancies are analyzed with respect to the neutron spectrum. Special attention is paid to benchmarks for which the results differ significantly between the two JEFF-3 versions.
Validating vignette and conjoint survey experiments against real-world behavior
Hainmueller, Jens; Hangartner, Dominik; Yamamoto, Teppei
2015-01-01
Survey experiments, like vignette and conjoint analyses, are widely used in the social sciences to elicit stated preferences and study how humans make multidimensional choices. However, there is a paucity of research on the external validity of these methods that examines whether the determinants that explain hypothetical choices made by survey respondents match the determinants that explain what subjects actually do when making similar choices in real-world situations. This study compares results from conjoint and vignette analyses on which immigrant attributes generate support for naturalization with closely corresponding behavioral data from a natural experiment in Switzerland, where some municipalities used referendums to decide on the citizenship applications of foreign residents. Using a representative sample from the same population and the official descriptions of applicant characteristics that voters received before each referendum as a behavioral benchmark, we find that the effects of the applicant attributes estimated from the survey experiments perform remarkably well in recovering the effects of the same attributes in the behavioral benchmark. We also find important differences in the relative performances of the different designs. Overall, the paired conjoint design, where respondents evaluate two immigrants side by side, comes closest to the behavioral benchmark; on average, its estimates are within 2 percentage points of the effects in the behavioral benchmark. PMID:25646415
Quantifying Information Gain from Dynamic Downscaling Experiments
NASA Astrophysics Data System (ADS)
Tian, Y.; Peters-Lidard, C. D.
2015-12-01
Dynamic climate downscaling experiments are designed to produce information at higher spatial and temporal resolutions. Such additional information is generated from the low-resolution initial and boundary conditions via the predictive power of the physical laws. However, errors and uncertainties in the initial and boundary conditions can be propagated and even amplified in the downscaled simulations. Additionally, the limit of predictability in nonlinear dynamical systems will also dampen the information gain, even if the initial and boundary conditions were error-free. Thus, it is critical to quantitatively define and measure the amount of information increase from dynamic downscaling experiments, to better understand and appreciate their potential and limitations. We present a scheme to objectively measure the information gain from such experiments. The scheme is based on information theory, and we argue that if a downscaling experiment is to exhibit value, it has to produce more information than what can be simply inferred from information sources already available. These information sources include the initial and boundary conditions, the coarse-resolution model in which the higher-resolution models are embedded, and the same set of physical laws. These existing information sources define an "information threshold" as a function of the spatial and temporal resolution, and this threshold serves as a benchmark to quantify the information gain from the downscaling experiments, or any other approaches. For a downscaling experiment to show any value, the information has to be above this threshold. A recent NASA-supported downscaling experiment is used as an example to illustrate the application of this scheme.
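The abstract's information-theoretic framing can be illustrated with a toy entropy comparison. This is only a crude proxy under stated assumptions (equal-width histogram binning, Shannon entropy in bits); the data values and binning are invented for illustration and are not the metric used in the NASA experiment.

```python
from math import log2

# Hedged sketch: estimate the Shannon entropy of a coarse field and a
# downscaled field from value histograms, and report the difference as a
# crude "information gain" proxy. Data and binning are illustrative only.

def shannon_entropy(values, bins=4, lo=0.0, hi=1.0):
    """Entropy (bits) of a histogram over [lo, hi) with equal-width bins."""
    counts = [0] * bins
    width = (hi - lo) / bins
    for v in values:
        idx = min(int((v - lo) / width), bins - 1)  # clamp top edge
        counts[idx] += 1
    n = len(values)
    return -sum((c / n) * log2(c / n) for c in counts if c > 0)

coarse = [0.1, 0.1, 0.6, 0.6]       # low variability: occupies 2 bins
downscaled = [0.1, 0.3, 0.6, 0.9]   # spans all 4 bins
print(shannon_entropy(downscaled) - shannon_entropy(coarse))  # 1.0
```

A real implementation of the paper's scheme would compare against the full "information threshold" from all existing sources, not just the coarse field's entropy.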
DOE Office of Scientific and Technical Information (OSTI.GOV)
Will, M.E.
1994-01-01
This report presents a standard method for deriving benchmarks for the purpose of "contaminant screening," performed by comparing measured ambient concentrations of chemicals to benchmark concentrations. The work was performed under Work Breakdown Structure 1.4.12.2.3.04.07.02 (Activity Data Sheet 8304). In addition, this report presents sets of data concerning the effects of chemicals in soil on invertebrates and soil microbial processes, benchmarks for chemicals potentially associated with United States Department of Energy sites, and literature describing the experiments from which data were drawn for benchmark derivation.
International land Model Benchmarking (ILAMB) Package v002.00
Collier, Nathaniel [Oak Ridge National Laboratory; Hoffman, Forrest M. [Oak Ridge National Laboratory; Mu, Mingquan [University of California, Irvine; Randerson, James T. [University of California, Irvine; Riley, William J. [Lawrence Berkeley National Laboratory
2016-05-09
As a contribution to the International Land Model Benchmarking (ILAMB) Project, we are providing new analysis approaches, benchmarking tools, and science leadership. The goal of ILAMB is to assess and improve the performance of land models through international cooperation and to inform the design of new measurement campaigns and field studies to reduce uncertainties associated with key biogeochemical processes and feedbacks. ILAMB is expected to be a primary analysis tool for CMIP6 and future model-data intercomparison experiments. This team has developed initial prototype benchmarking systems for ILAMB, which will be improved and extended to include ocean model metrics and diagnostics.
International land Model Benchmarking (ILAMB) Package v001.00
Mu, Mingquan [University of California, Irvine; Randerson, James T. [University of California, Irvine; Riley, William J. [Lawrence Berkeley National Laboratory; Hoffman, Forrest M. [Oak Ridge National Laboratory
2016-05-02
As a contribution to the International Land Model Benchmarking (ILAMB) Project, we are providing new analysis approaches, benchmarking tools, and science leadership. The goal of ILAMB is to assess and improve the performance of land models through international cooperation and to inform the design of new measurement campaigns and field studies to reduce uncertainties associated with key biogeochemical processes and feedbacks. ILAMB is expected to be a primary analysis tool for CMIP6 and future model-data intercomparison experiments. This team has developed initial prototype benchmarking systems for ILAMB, which will be improved and extended to include ocean model metrics and diagnostics.
Performance of Landslide-HySEA tsunami model for NTHMP benchmarking validation process
NASA Astrophysics Data System (ADS)
Macias, Jorge
2017-04-01
In its FY2009 Strategic Plan, the NTHMP required that all numerical tsunami inundation models be verified as accurate and consistent through a model benchmarking process. This was completed in 2011, but only for seismic tsunami sources and in a limited manner for idealized solid underwater landslides. Recent work by various NTHMP states, however, has shown that landslide tsunami hazard may be dominant along significant parts of the US coastline, as compared to hazards from other tsunamigenic sources. To perform the above-mentioned validation process, a set of candidate benchmarks was proposed. These benchmarks are based on a subset of available laboratory data sets for solid slide experiments and deformable slide experiments, and include both submarine and subaerial slides. A benchmark based on a historic field event (Valdez, AK, 1964) closes the list of proposed benchmarks. The Landslide-HySEA model participated in the workshop organized at Texas A&M University - Galveston on January 9-11, 2017. The aim of this presentation is to show some of the numerical results obtained with Landslide-HySEA in the framework of this benchmarking validation/verification effort. Acknowledgements. This research has been partially supported by the Junta de Andalucía research project TESELA (P11-RNM7069), the Spanish Government research project SIMURISK (MTM2015-70490-C02-01-R) and Universidad de Málaga, Campus de Excelencia Internacional Andalucía Tech. The GPU computations were performed at the Unit of Numerical Methods (University of Malaga).
Highly Enriched Uranium Metal Cylinders Surrounded by Various Reflector Materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernard Jones; J. Blair Briggs; Leland Monteirth
A series of experiments was performed at Los Alamos Scientific Laboratory in 1958 to determine critical masses of cylinders of Oralloy (Oy) reflected by a number of materials. The experiments were all performed on the Comet Universal Critical Assembly Machine, and consisted of discs of highly enriched uranium (93.3 wt.% 235U) reflected by half-inch and one-inch-thick cylindrical shells of various reflector materials. The experiments were performed by members of Group N-2, particularly K. W. Gallup, G. E. Hansen, H. C. Paxton, and R. H. White. This experiment was intended to ascertain critical masses for criticality safety purposes, as well as to compare neutron transport cross sections to those obtained from danger coefficient measurements with the Topsy Oralloy-Tuballoy reflected and Godiva unreflected critical assemblies. The reflector materials examined in this series of experiments are as follows: magnesium, titanium, aluminum, graphite, mild steel, nickel, copper, cobalt, molybdenum, natural uranium, tungsten, beryllium, aluminum oxide, molybdenum carbide, and polythene (polyethylene). Also included are two special configurations of composite beryllium and iron reflectors. Analyses were performed in which the uncertainty associated with six different parameters was evaluated; namely, extrapolation to the uranium critical mass, uranium density, 235U enrichment, reflector density, reflector thickness, and reflector impurities. In addition to the idealizations made by the experimenters (removal of the platen and diaphragm), two simplifications were also made to the benchmark models that resulted in a small bias and additional uncertainty. First of all, since impurities in core and reflector materials are only estimated, they are not included in the benchmark models. Secondly, the room, support structure, and other possible surrounding equipment were not included in the model.
Bias values that result from these two simplifications were determined, and the associated uncertainty in the bias values was included in the overall uncertainty in benchmark keff values. Bias values were very small, ranging from 0.0004 Δk low to 0.0007 Δk low. Overall uncertainties range from ±0.0018 to ±0.0030. Major contributors to the overall uncertainty include uncertainty in the extrapolation to the uranium critical mass and the uranium density. Results are summarized in Figure 1 (Experimental, Benchmark-Model, and MCNP/KENO Calculated Results). The 32 configurations described and evaluated under ICSBEP Identifier HEU-MET-FAST-084 are judged to be acceptable for use as criticality safety benchmark experiments and should be valuable integral benchmarks for nuclear data testing of the various reflector materials. Details of the benchmark models, uncertainty analyses, and final results are given in this paper.
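Evaluations of this kind typically combine independent one-sigma parameter uncertainties in quadrature to obtain the overall benchmark keff uncertainty. The sketch below shows that standard combination only; the component names and magnitudes are illustrative placeholders, not the values from HEU-MET-FAST-084.

```python
from math import sqrt

# Hedged sketch: combine independent one-sigma uncertainty contributions
# (e.g., critical-mass extrapolation, density, enrichment, reflector
# parameters) in quadrature. Component values are illustrative only.

def combined_uncertainty(components):
    """Root-sum-square of independent uncertainty contributions."""
    return sqrt(sum(u * u for u in components.values()))

components = {
    "mass_extrapolation": 0.0015,   # hypothetical Δk contributions
    "uranium_density":    0.0010,
    "enrichment":         0.0005,
    "reflector_density":  0.0008,
}
print(round(combined_uncertainty(components), 4))  # 0.002
```

Quadrature addition assumes the contributions are uncorrelated; correlated terms would need a covariance treatment instead.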
Transition Studies on a Swept-Wing Model
NASA Technical Reports Server (NTRS)
Saric, William S.
1996-01-01
The present investigation contributes to the understanding of boundary-layer stability and transition by providing detailed measurements of carefully-produced stationary crossflow vortices. It is clear that a successful prediction of transition in swept-wing flows must include an understanding of the detailed physics involved. Receptivity and nonlinear effects must not be ignored. Linear stability theory correctly predicts the expected wavelengths and mode shapes for stationary crossflow, but fails to predict the growth rates, even for low amplitudes. As new computational and analytical methods are developed to deal with three-dimensional boundary layers, the data provided by this experiment will serve as a useful benchmark for comparison.
Modeling of material erosion and redeposition for dedicated DiMES experiments on DIII-D
NASA Astrophysics Data System (ADS)
Ding, R.; Abrams, T.; Chrobak, C. P.; Guo, H. Y.; Snyder, P. B.; Chan, V. S.; Rudakov, D. L.; Stangeby, P. C.; Elder, J. D.; Tskhakaya, D.; Wampler, W. R.; Kirschner, A.; McLean, A. G.
2015-11-01
Erosion and redeposition of plasma facing materials is a key issue for high-power, long pulse tokamak operation. A series of experiments has been carried out on DIII-D in which well-characterized samples of different materials were exposed to divertor plasma using DiMES. Such experiments provide a good benchmark for PMI codes, such as ERO. It was found that the erosion and redeposition are strongly determined by the impurity content in the plasma and sheath properties near the surface. The principal experimental results (net erosion rate and profile, net/gross erosion ratio) are reproduced by ERO simulations to within the uncertainties, indicating that the controlling physics has likely been identified. New techniques suggested by modeling such as external biasing and local gas injection for suppressing material erosion are planned to be tested in DiMES/DIII-D experiments. Work supported by US DOE DE-FC02-04ER54698, DE-AC52-07NA27344, DE-AC04-94AL85000, DE-AC52-07NA27344.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Will, M.E.; Suter, G.W. II
1994-09-01
One of the initial stages in ecological risk assessment for hazardous waste sites is screening contaminants to determine which of them are worthy of further consideration as contaminants of potential concern. This process is termed contaminant screening. It is performed by comparing measured ambient concentrations of chemicals to benchmark concentrations. Currently, no standard benchmark concentrations exist for assessing contaminants in soil with respect to their toxicity to plants. This report presents a standard method for deriving benchmarks for this purpose (phytotoxicity benchmarks), a set of data concerning effects of chemicals in soil or soil solution on plants, and a set of phytotoxicity benchmarks for 38 chemicals potentially associated with United States Department of Energy (DOE) sites. In addition, background information on the phytotoxicity and occurrence of the chemicals in soils is presented, and literature describing the experiments from which data were drawn for benchmark derivation is reviewed. Chemicals that are found in soil at concentrations exceeding both the phytotoxicity benchmark and the background concentration for the soil type should be considered contaminants of potential concern.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suter, G.W. II
1993-01-01
One of the initial stages in ecological risk assessment for hazardous waste sites is screening contaminants to determine which of them are worthy of further consideration as contaminants of potential concern. This process is termed contaminant screening. It is performed by comparing measured ambient concentrations of chemicals to benchmark concentrations. Currently, no standard benchmark concentrations exist for assessing contaminants in soil with respect to their toxicity to plants. This report presents a standard method for deriving benchmarks for this purpose (phytotoxicity benchmarks), a set of data concerning effects of chemicals in soil or soil solution on plants, and a set of phytotoxicity benchmarks for 38 chemicals potentially associated with United States Department of Energy (DOE) sites. In addition, background information on the phytotoxicity and occurrence of the chemicals in soils is presented, and literature describing the experiments from which data were drawn for benchmark derivation is reviewed. Chemicals that are found in soil at concentrations exceeding both the phytotoxicity benchmark and the background concentration for the soil type should be considered contaminants of potential concern.
NASA Technical Reports Server (NTRS)
Krause, David L.; Brewer, Ethan J.; Pawlik, Ralph
2013-01-01
This report provides test methodology details and qualitative results for the first structural benchmark creep test of an Advanced Stirling Convertor (ASC) heater head of ASC-E2 design heritage. The test article was recovered from a flight-like Microcast MarM-247 heater head specimen previously used in helium permeability testing. The test article was utilized for benchmark creep test rig preparation, wall thickness and diametral laser scan hardware metrological developments, and induction heater custom coil experiments. In addition, a benchmark creep test was performed, terminated after one week when through-thickness cracks propagated at thermocouple weld locations. Following this, it was used to develop a unique temperature measurement methodology using contact thermocouples, thereby enabling future benchmark testing to be performed without the use of conventional welded thermocouples, proven problematic for the alloy. This report includes an overview of heater head structural benchmark creep testing, the origin of this particular test article, test configuration developments accomplished using the test article, creep predictions for its benchmark creep test, qualitative structural benchmark creep test results, and a short summary.
A generic framework for individual-based modelling and physical-biological interaction
2018-01-01
The increased availability of high-resolution ocean data globally has enabled more detailed analyses of physical-biological interactions and their consequences to the ecosystem. We present IBMlib, which is a versatile, portable and computationally effective framework for conducting Lagrangian simulations in the marine environment. The purpose of the framework is to handle complex individual-level biological models of organisms, combined with realistic 3D oceanographic models of physics and biogeochemistry describing the environment of the organisms, without assumptions about spatial or temporal scales. The open-source framework features a minimal robust interface to facilitate the coupling between individual-level biological models and oceanographic models, and we provide application examples including forward/backward simulations, habitat connectivity calculations, assessing ocean conditions, comparison of physical circulation models, model ensemble runs and, recently, posterior Eulerian simulations using the IBMlib framework. We present the code design ideas behind the longevity of the code, our implementation experiences, as well as code performance benchmarking. The framework may contribute substantially to progress in representing, understanding, predicting and eventually managing marine ecosystems. PMID:29351280
A comparison of five benchmarks
NASA Technical Reports Server (NTRS)
Huss, Janice E.; Pennline, James A.
1987-01-01
Five benchmark programs were obtained and run on the NASA Lewis CRAY X-MP/24. A comparison was made between the programs' codes and between the methods for calculating performance figures. Several multitasking jobs were run to gain experience in how parallel performance is measured.
Physics of neutral gas jet interaction with magnetized plasmas
NASA Astrophysics Data System (ADS)
Wang, Zhanhui; Xu, Xueqiao; Diamond, Patrick; Xu, Min; Duan, Xuru; Yu, Deliang; Zhou, Yulin; Shi, Yongfu; Nie, Lin; Ke, Rui; Zhong, Wulv; Shi, Zhongbing; Sun, Aiping; Li, Jiquan; Yao, Lianghua
2017-10-01
It is critical to understand the physics and transport dynamics during the plasma fuelling process. Plasma and neutral interactions involve the transfer of charge, momentum, and energy in ion-neutral and electron-neutral collisions. Thus, a seven-field fluid model of neutral gas jet injection (NGJI) is obtained, which couples plasma density, heat, and momentum transport equations together with neutral density and momentum transport equations of both molecules and atoms. Transport dynamics of plasma and neutrals are simulated for a complete range of discharge times, including steady state before NGJI, transport during NGJI, and relaxation after NGJI. With the trans-neut module of the BOUT++ code, the simulations of mean profile variations and fueling depths during fueling have been benchmarked against other codes and also validated with HL-2A experimental results. Both the fast component (FC) and slow component (SC) of NGJI are simulated and validated with the HL-2A experimental measurements. The plasma blocking effect on the FC penetration is also simulated and validated against the experiment. This work is supported by the National Natural Science Foundation of China under Grant No. 11575055.
Study of hypervelocity projectile impact on thick metal plates
Roy, Shawoon K.; Trabia, Mohamed; O’Toole, Brendan; ...
2016-01-01
Hypervelocity impacts generate extreme pressure and shock waves in impacted targets that undergo severe localized deformation within a few microseconds. These impact experiments pose unique challenges in terms of obtaining accurate measurements. Similarly, simulating these experiments is not straightforward. This paper proposed an approach to experimentally measure the velocity of the back surface of an A36 steel plate impacted by a projectile. All experiments used a combination of a two-stage light-gas gun and the photonic Doppler velocimetry (PDV) technique. The experimental data were used to benchmark and verify computational studies. Two different finite-element methods were used to simulate the experiments: Lagrangian-based smooth particle hydrodynamics (SPH) and Eulerian-based hydrocode. Both codes used the Johnson-Cook material model and the Mie-Grüneisen equation of state. Experiments and simulations were compared based on the physical damage area and the back surface velocity. Finally, the results of this study showed that the proposed simulation approaches could be used to reduce the need for expensive experiments.
A new numerical benchmark of a freshwater lens
NASA Astrophysics Data System (ADS)
Stoeckl, L.; Walther, M.; Graf, T.
2016-04-01
A numerical benchmark for 2-D variable-density flow and solute transport in a freshwater lens is presented. The benchmark is based on results of laboratory experiments conducted by Stoeckl and Houben (2012) using a sand tank on the meter scale. This benchmark describes the formation and degradation of a freshwater lens over time as it can be found under real-world islands. An error analysis gave the appropriate spatial and temporal discretization of 1 mm and 8.64 s, respectively. The calibrated parameter set was obtained using the parameter estimation tool PEST. Comparing density-coupled and density-uncoupled results showed that the freshwater-saltwater interface position is strongly dependent on density differences. A benchmark that adequately represents saltwater intrusion and that includes realistic features of coastal aquifers or freshwater lenses was lacking. This new benchmark was thus developed and is demonstrated to be suitable to test variable-density groundwater models applied to saltwater intrusion investigations.
Development and application of freshwater sediment-toxicity benchmarks for currently used pesticides
Nowell, Lisa H.; Norman, Julia E.; Ingersoll, Christopher G.; Moran, Patrick W.
2016-01-01
Sediment-toxicity benchmarks are needed to interpret the biological significance of currently used pesticides detected in whole sediments. Two types of freshwater sediment benchmarks for pesticides were developed using spiked-sediment bioassay (SSB) data from the literature. These benchmarks can be used to interpret sediment-toxicity data or to assess the potential toxicity of pesticides in whole sediment. The Likely Effect Benchmark (LEB) defines a pesticide concentration in whole sediment above which there is a high probability of adverse effects on benthic invertebrates, and the Threshold Effect Benchmark (TEB) defines a concentration below which adverse effects are unlikely. For compounds without available SSBs, benchmarks were estimated using equilibrium partitioning (EqP). When a sediment sample contains a pesticide mixture, benchmark quotients can be summed for all detected pesticides to produce an indicator of potential toxicity for that mixture. Benchmarks were developed for 48 pesticide compounds using SSB data and 81 compounds using the EqP approach. In an example application, data for pesticides measured in sediment from 197 streams across the United States were evaluated using these benchmarks, and compared to measured toxicity from whole-sediment toxicity tests conducted with the amphipod Hyalella azteca (28-d exposures) and the midge Chironomus dilutus (10-d exposures). Amphipod survival, weight, and biomass were significantly and inversely related to summed benchmark quotients, whereas midge survival, weight, and biomass showed no relationship to benchmarks. Samples with LEB exceedances were rare (n = 3), but all were toxic to amphipods (i.e., significantly different from control). Significant toxicity to amphipods was observed for 72% of samples exceeding one or more TEBs, compared to 18% of samples below all TEBs. 
Factors affecting toxicity below TEBs may include the presence of contaminants other than pesticides, physical/chemical characteristics of sediment, and uncertainty in TEB values. Additional evaluations of benchmarks in relation to sediment chemistry and toxicity are ongoing.
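The summed benchmark-quotient indicator described above divides each detected pesticide's concentration by its benchmark and sums the quotients across the mixture. The sketch below shows that arithmetic only; the pesticide names, concentrations, and TEB values are hypothetical placeholders, not values from the study.

```python
# Hedged sketch of the summed benchmark-quotient indicator: for each
# detected pesticide, divide its measured whole-sediment concentration by
# its benchmark (e.g., the TEB) and sum the quotients for the mixture.
# Concentrations and benchmark values below are illustrative only.

def summed_benchmark_quotient(concentrations, benchmarks):
    """Sum of conc/benchmark over pesticides detected in one sample."""
    return sum(conc / benchmarks[name] for name, conc in concentrations.items())

sample = {"pesticide-a": 4.0, "pesticide-b": 1.5}   # ug/kg, hypothetical
tebs = {"pesticide-a": 8.0, "pesticide-b": 3.0}     # ug/kg, hypothetical
sbq = summed_benchmark_quotient(sample, tebs)
print(sbq)        # 1.0
print(sbq > 1.0)  # exceedance flag for this mixture: False
```

Summing quotients assumes roughly additive toxicity across the mixture, which is one modeling choice among several.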
Characterization of addressability by simultaneous randomized benchmarking.
Gambetta, Jay M; Córcoles, A D; Merkel, S T; Johnson, B R; Smolin, John A; Chow, Jerry M; Ryan, Colm A; Rigetti, Chad; Poletto, S; Ohki, Thomas A; Ketchen, Mark B; Steffen, M
2012-12-14
The control and handling of errors arising from cross talk and unwanted interactions in multiqubit systems is an important issue in quantum information processing architectures. We introduce a benchmarking protocol that provides information about the amount of addressability present in the system and implement it on coupled superconducting qubits. The protocol consists of randomized benchmarking experiments run both individually and simultaneously on pairs of qubits. A relevant figure of merit for the addressability is then related to the differences in the measured average gate fidelities in the two experiments. We present results from two similar samples with differing cross talk and unwanted qubit-qubit interactions. The results agree with predictions based on simple models of the classical cross talk and Stark shifts.
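The addressability figure of merit described above is related to the difference between average gate fidelities measured in the individual and simultaneous randomized-benchmarking runs. The sketch below shows one simple form of that comparison; the fidelity numbers are illustrative placeholders, not the paper's measured values.

```python
# Hedged sketch: compare average gate error from individual vs simultaneous
# randomized benchmarking; a larger error when the neighbor qubit is also
# driven suggests cross talk. Fidelity values are illustrative only.

def addressability_metric(f_individual, f_simultaneous):
    """Increase in average gate error when the neighbor qubit is driven."""
    return (1 - f_simultaneous) - (1 - f_individual)

f_alone = 0.998     # hypothetical fidelity, qubit benchmarked alone
f_together = 0.995  # hypothetical fidelity during simultaneous RB
print(round(addressability_metric(f_alone, f_together), 4))  # 0.003
```

A value near zero indicates the qubit's gates are insensitive to activity on its neighbor, i.e., good addressability.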
Parallelization of NAS Benchmarks for Shared Memory Multiprocessors
NASA Technical Reports Server (NTRS)
Waheed, Abdul; Yan, Jerry C.; Saini, Subhash (Technical Monitor)
1998-01-01
This paper presents our experiences of parallelizing the sequential implementation of NAS benchmarks using compiler directives on SGI Origin2000 distributed shared memory (DSM) system. Porting existing applications to new high performance parallel and distributed computing platforms is a challenging task. Ideally, a user develops a sequential version of the application, leaving the task of porting to new generations of high performance computing systems to parallelization tools and compilers. Due to the simplicity of programming shared-memory multiprocessors, compiler developers have provided various facilities to allow the users to exploit parallelism. Native compilers on SGI Origin2000 support multiprocessing directives to allow users to exploit loop-level parallelism in their programs. Additionally, supporting tools can accomplish this process automatically and present the results of parallelization to the users. We experimented with these compiler directives and supporting tools by parallelizing sequential implementation of NAS benchmarks. Results reported in this paper indicate that with minimal effort, the performance gain is comparable with the hand-parallelized, carefully optimized, message-passing implementations of the same benchmarks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thrower, A.W.; Patric, J.; Keister, M.
2008-07-01
The purpose of the Office of Civilian Radioactive Waste Management's (OCRWM) Logistics Benchmarking Project is to identify established government and industry practices for the safe transportation of hazardous materials which can serve as a yardstick for design and operation of OCRWM's national transportation system for shipping spent nuclear fuel and high-level radioactive waste to the proposed repository at Yucca Mountain, Nevada. The project will present logistics and transportation practices and develop implementation recommendations for adaptation by the national transportation system. This paper will describe the process used to perform the initial benchmarking study, highlight interim findings, and explain how these findings are being implemented. It will also provide an overview of the next phase of benchmarking studies. The benchmarking effort will remain a high-priority activity throughout the planning and operational phases of the transportation system. The initial phase of the project focused on government transportation programs to identify those practices which are most clearly applicable to OCRWM. These Federal programs have decades of safe transportation experience, strive for excellence in operations, and implement effective stakeholder involvement, all of which parallel OCRWM's transportation mission and vision. The initial benchmarking project focused on four business processes that are critical to OCRWM's mission success, and can be incorporated into OCRWM planning and preparation in the near term. The processes examined were: transportation business model, contract management/out-sourcing, stakeholder relations, and contingency planning. More recently, OCRWM examined logistics operations of AREVA NC's Business Unit Logistics in France. The next phase of benchmarking will focus on integrated domestic and international commercial radioactive logistic operations.
The prospective companies represent large scale shippers and have vast experience in safely and efficiently shipping spent nuclear fuel and other radioactive materials. Additional business processes may be examined in this phase. The findings of these benchmarking efforts will help determine the organizational structure and requirements of the national transportation system. (authors)
Transfer Ionization Studies for Proton on He - New Insight into the World of Correlation
NASA Astrophysics Data System (ADS)
Schmidt-Böcking, Horst
2005-04-01
Correlated many-particle dynamics in Coulombic systems, which is one of the unsolved fundamental problems in AMO physics, can now be experimentally approached with so far unprecedented completeness and precision. The recent development of the COLTRIMS technique (COLd Target Recoil Ion Momentum Spectroscopy) provides a coincident multi-fragment imaging technique for eV and sub-eV fragment detection. In its completeness it is as powerful as the bubble chamber in high energy physics. In recent benchmark experiments, quasi-snapshots (duration as short as an attosecond) of the correlated dynamics between electrons and nuclei have been made for atomic and molecular objects. This new imaging technique has opened a powerful observation window into the hidden world of many-particle dynamics. Recent transfer ionization studies will be presented and the direct observation of correlated electron pairs will be discussed.
Benchmarking Geant4 for simulating galactic cosmic ray interactions within planetary bodies
Mesick, K. E.; Feldman, W. C.; Coupland, D. D. S.; ...
2018-06-20
Galactic cosmic rays undergo complex nuclear interactions with nuclei within planetary bodies that have little to no atmosphere. Radiation transport simulations are a key tool used in understanding the neutron and gamma-ray albedo coming from these interactions and tracing these signals back to the geochemical composition of the target. In this paper, we study the validity of the code Geant4 for simulating such interactions by comparing simulation results to data from the Apollo 17 Lunar Neutron Probe Experiment. Different assumptions regarding the physics are explored to demonstrate how these impact the Geant4 simulation results. In general, all of the Geant4 results over-predict the data; however, certain physics lists perform better than others. Finally, we show that results from the radiation transport code MCNP6 are similar to those obtained using Geant4.
Experimental physics characteristics of a heavy-metal-reflected fast-spectrum critical assembly
NASA Technical Reports Server (NTRS)
Heneveld, W. H.; Paschall, R. K.; Springer, T. H.; Swanson, V. A.; Thiele, A. W.; Tuttle, R. J.
1972-01-01
A zero-power critical assembly was designed, constructed, and operated for the purpose of conducting a series of benchmark experiments dealing with the physics characteristics of a UN-fueled, Li-cooled, Mo-reflected, drum-controlled compact fast reactor for use with a space-power electric conversion system. The range of the previous experimental investigations has been expanded to include the reactivity effects of: (1) surrounding the reactor with 15.24 cm (6 in.) of polyethylene, (2) reducing the heights of a portion of the upper and lower axial reflectors by factors of 2 and 4, (3) adding 45 kg of W to the core uniformly in two steps, (4) adding 9.54 kg of Ta to the core uniformly, and (5) inserting 2.3 kg of polyethylene into the core proper and determining the effect of a Ta addition on the polyethylene worth.
Tavakoli, Mohammad Bagher; Reiazi, Reza; Mohammadi, Mohammad Mehdi; Jabbari, Keyvan
2015-01-01
After the idea of antiproton cancer treatment was proposed in 1984, many experiments were launched to investigate different aspects of the physical and radiobiological properties of the antiproton, which arise from its annihilation reactions. One of these experiments was performed at the European Organization for Nuclear Research (CERN) using the Antiproton Decelerator. The ultimate goal of this experiment was to assess the dosimetric and radiobiological properties of antiproton beams in order to estimate the suitability of antiprotons for radiotherapy. One difficulty was the long-term unavailability of the antiproton beam at CERN, so verification of Monte Carlo codes for simulating antiproton depth dose could be useful. Among available simulation codes, Geant4 provides acceptable flexibility and extensibility, which has progressively led to the development of novel Geant4 applications in research domains, especially modeling the biological effects of ionizing radiation at the sub-cellular scale. In this study, the depth dose corresponding to the CERN antiproton beam energy was calculated with Geant4 using all the standard physics lists currently available and benchmarked for other use cases. Overall, none of the standard physics lists was able to reproduce the antiproton percentage depth dose. Although some models gave promising results, the Bragg peak level remained the point of concern in our study. It is concluded that the Bertini model with high-precision neutron tracking (QGSP_BERT_HP) matches the experimental data best, though it is also the slowest model among the physics lists for simulating events.
Del Guerra, Alberto; Bardies, Manuel; Belcari, Nicola; Caruana, Carmel J; Christofides, Stelios; Erba, Paola; Gori, Cesare; Lassmann, Michael; Lonsdale, Markus Nowak; Sattler, Bernhard; Waddington, Wendy
2013-03-01
To provide a guideline curriculum covering theoretical and practical aspects of education and training for Medical Physicists in Nuclear Medicine within Europe. National training programmes of Medical Physics, Radiation Physics and Nuclear Medicine physics from a range of European countries and from North America were reviewed and elements of best practice identified. An independent panel of experts was used to achieve consensus regarding the content of the curriculum. Guidelines have been developed for the specialist theoretical knowledge and practical experience required to practice as a Medical Physicist in Nuclear Medicine in Europe. It is assumed that the precondition for the beginning of the training is a good initial degree in Medical Physics at master level (or equivalent). The Learning Outcomes are categorised using the Knowledge, Skill and Competence approach along the lines recommended by the European Qualifications Framework. The minimum level expected in each topic in the theoretical knowledge and practical experience sections is intended to bring trainees up to the requirements expected of a Medical Physicist entering the field of Nuclear Medicine. This new joint EANM/EFOMP European guideline curriculum is a further step to harmonise specialist training of Medical Physicists in Nuclear Medicine within Europe. It provides a common framework for national Medical Physics societies to develop or benchmark their own curricula. The responsibility for the implementation and accreditation of these standards and guidelines resides within national training and regulatory bodies. Copyright © 2012 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Mitchell, L
1996-01-01
The processes of benchmarking, benchmark data comparative analysis, and study of best practices are distinctly different. The study of best practices is explained with an example based on the Arthur Andersen & Co. 1992 "Study of Best Practices in Ambulatory Surgery". The results of a national best practices study in ambulatory surgery were used to provide our quality improvement team with the goal of improving the turnaround time between surgical cases. The team used a seven-step quality improvement problem-solving process to improve the surgical turnaround time. The national benchmark for turnaround times between surgical cases in 1992 was 13.5 minutes. The initial turnaround time at St. Joseph's Medical Center was 19.9 minutes. After the team implemented solutions, the time was reduced to an average of 16.3 minutes, an 18% improvement. Cost-benefit analysis showed a potential enhanced revenue of approximately $300,000, or a potential savings of $10,119. Applying quality improvement principles to benchmarking, benchmarks, or best practices can improve process performance. Understanding which form of benchmarking the institution wishes to embark on will help focus a team and use appropriate resources. Communicating with professional organizations that have experience in benchmarking will save time and money and help achieve the desired results.
Pandya, Tara M.; Johnson, Seth R.; Evans, Thomas M.; ...
2015-12-21
This paper discusses the implementation, capabilities, and validation of Shift, a massively parallel Monte Carlo radiation transport package developed and maintained at Oak Ridge National Laboratory. It has been developed to scale well from laptops to small computing clusters to advanced supercomputers. Special features of Shift include hybrid capabilities for variance reduction such as CADIS and FW-CADIS, and advanced parallel decomposition and tally methods optimized for scalability on supercomputing architectures. Shift has been validated and verified against various reactor physics benchmarks and compares well to other state-of-the-art Monte Carlo radiation transport codes such as MCNP5, CE KENO-VI, and OpenMC. Some specific benchmarks used for verification and validation include the CASL VERA criticality test suite and several Westinghouse AP1000® problems. These benchmark and scaling studies show promising results.
Nations that develop water quality benchmark values have relied primarily on standard data and methods. However, experience with chemicals such as Se, ammonia, and tributyltin has shown that standard methods do not adequately address some taxa, modes of exposure and effects. Deve...
Benchmarking Academic Libraries: An Australian Case Study.
ERIC Educational Resources Information Center
Robertson, Margaret; Trahn, Isabella
1997-01-01
Discusses experiences and outcomes of benchmarking at the Queensland University of Technology (Australia) library that compared acquisitions, cataloging, document delivery, and research support services with those of the University of New South Wales. Highlights include results as a catalyst for change, and the use of common output and performance…
ERIC Educational Resources Information Center
Leppisaari, Irja; Vainio, Leena; Herrington, Jan; Im, Yeonwook
2011-01-01
More and more, social technologies and virtual work methods are facilitating new ways of crossing boundaries in professional development and international collaborations. This paper examines the peer development of higher education teachers through the experiences of the IVBM project (International Virtual Benchmarking, 2009-2010). The…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Will, M.E.; Suter, G.W. II
1994-09-01
One of the initial stages in ecological risk assessments for hazardous waste sites is the screening of contaminants to determine which of them are worthy of further consideration as "contaminants of potential concern." This process is termed "contaminant screening." It is performed by comparing measured ambient concentrations of chemicals to benchmark concentrations. Currently, no standard benchmark concentrations exist for assessing contaminants in soil with respect to their toxicity to soil- and litter-dwelling invertebrates, including earthworms, other micro- and macroinvertebrates, or heterotrophic bacteria and fungi. This report presents a standard method for deriving benchmarks for this purpose, sets of data concerning effects of chemicals in soil on invertebrates and soil microbial processes, and benchmarks for chemicals potentially associated with United States Department of Energy sites. In addition, it summarizes the literature describing the experiments from which data were drawn for benchmark derivation. Chemicals that are found in soil at concentrations exceeding both the benchmarks and the background concentration for the soil type should be considered contaminants of potential concern.
Test One to Test Many: A Unified Approach to Quantum Benchmarks
NASA Astrophysics Data System (ADS)
Bai, Ge; Chiribella, Giulio
2018-04-01
Quantum benchmarks are routinely used to validate the experimental demonstration of quantum information protocols. Many relevant protocols, however, involve an infinite set of input states, of which only a finite subset can be used to test the quality of the implementation. This is a problem, because the benchmark for the finitely many states used in the test can be higher than the original benchmark calculated for infinitely many states. This situation arises in the teleportation and storage of coherent states, for which the benchmark of 50% fidelity is commonly used in experiments, although finite sets of coherent states normally lead to higher benchmarks. Here, we show that the average fidelity over all coherent states can be indirectly probed with a single setup, requiring only two-mode squeezing, a 50-50 beam splitter, and homodyne detection. Our setup enables a rigorous experimental validation of quantum teleportation, storage, amplification, attenuation, and purification of noisy coherent states. More generally, we prove that every quantum benchmark can be tested by preparing a single entangled state and measuring a single observable.
Benchmark gamma-ray skyshine experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nason, R.R.; Shultis, J.K.; Faw, R.E.
1982-01-01
A benchmark gamma-ray skyshine experiment is described in which 60Co sources were either collimated into an upward 150-deg conical beam or shielded vertically by two different thicknesses of concrete. A NaI(Tl) spectrometer and a high-pressure ion chamber were used to measure, respectively, the energy spectrum and the 4π exposure rate of the air-reflected gamma photons up to 700 m from the source. Analyses of the data and comparison to DOT discrete ordinates calculations are presented.
Material Activation Benchmark Experiments at the NuMI Hadron Absorber Hall in Fermilab
NASA Astrophysics Data System (ADS)
Matsumura, H.; Matsuda, N.; Kasugai, Y.; Toyoda, A.; Yashima, H.; Sekimoto, S.; Iwase, H.; Oishi, K.; Sakamoto, Y.; Nakashima, H.; Leveling, A.; Boehnlein, D.; Lauten, G.; Mokhov, N.; Vaziri, K.
2014-06-01
In our previous study, double and mirror-symmetric activation peaks found for Al and Au arranged spatially on the back of the hadron absorber of the NuMI beamline in Fermilab were considerably higher than those expected purely from muon-induced reactions. From material activation benchmark experiments, we conclude that this activation is due to hadrons with energy greater than 3 GeV that had passed downstream through small gaps in the hadron absorber.
Issues in Benchmarking Human Reliability Analysis Methods: A Literature Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald L. Boring; Stacey M. L. Hendrickson; John A. Forester
There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessments (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study comparing and evaluating HRA methods in assessing operator performance in simulator experiments is currently underway. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.
XWeB: The XML Warehouse Benchmark
NASA Astrophysics Data System (ADS)
Mahboubi, Hadj; Darmont, Jérôme
With the emergence of XML as a standard for representing business data, new decision support applications are being developed. These XML data warehouses aim at supporting On-Line Analytical Processing (OLAP) operations that manipulate irregular XML data. To ensure feasibility of these new tools, important performance issues must be addressed. Performance is customarily assessed with the help of benchmarks. However, decision support benchmarks do not currently support XML features. In this paper, we introduce the XML Warehouse Benchmark (XWeB), which aims at filling this gap. XWeB derives from the relational decision support benchmark TPC-H. It is mainly composed of a test data warehouse that is based on a unified reference model for XML warehouses and that features XML-specific structures, together with its associated XQuery decision support workload. XWeB's usage is illustrated by experiments on several XML database management systems.
Statistical Analysis of NAS Parallel Benchmarks and LINPACK Results
NASA Technical Reports Server (NTRS)
Meuer, Hans-Werner; Simon, Horst D.; Strohmeier, Erich; Lasinski, T. A. (Technical Monitor)
1994-01-01
In the last three years extensive performance data have been reported for parallel machines based both on the NAS Parallel Benchmarks and on LINPACK. In this study we have used the reported benchmark results and performed a number of statistical experiments using factor, cluster, and regression analyses. In addition to the performance results of LINPACK and the eight NAS parallel benchmarks, we have also included the peak performance of the machine and the LINPACK n and n_1/2 values. Some of the results and observations can be summarized as follows: 1) All benchmarks are strongly correlated with peak performance. 2) LINPACK and EP each have a unique signature. 3) The remaining NPB can be grouped into three groups as follows: (CG and IS), (LU and SP), and (MG, FT, and BT). Hence three (or four with EP) benchmarks are sufficient to characterize the overall NPB performance. Our poster presentation will follow a standard poster format and will present the data of our statistical analysis in detail.
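The correlation-based grouping described in this abstract can be sketched in a few lines. All machine and benchmark figures below are invented for illustration (they are not the paper's data), and a plain correlation matrix stands in for the factor, cluster, and regression toolchain the study actually used.

```python
# Sketch: correlate per-machine benchmark results with peak performance,
# then check how strongly two benchmarks' result vectors track each other.
# Every number here is hypothetical.
import numpy as np

# Rows: machines; columns: [peak, LINPACK, CG, IS, LU, SP]
# (arbitrary Gflop/s values, invented for illustration).
data = np.array([
    [10.0,  6.1,  1.9,  1.8,  4.0,  3.9],
    [20.0, 13.0,  3.8,  3.9,  8.2,  8.0],
    [40.0, 25.0,  8.1,  7.8, 16.5, 16.0],
    [80.0, 52.0, 15.9, 16.2, 33.0, 32.5],
])

corr = np.corrcoef(data, rowvar=False)  # 6x6 correlation matrix
peak_corr = corr[0, 1:]                 # each benchmark vs. peak performance

# Benchmarks whose result vectors correlate most strongly with each other
# (here CG with IS) would land in the same cluster, echoing finding (3).
cg_is = np.corrcoef(data[:, 2], data[:, 3])[0, 1]
```

With near-proportional columns like these, every benchmark correlates strongly with peak, mirroring finding (1); on real data the cluster step would use a proper clustering algorithm rather than eyeballing the matrix.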
RERTR-12 Post-irradiation Examination Summary Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rice, Francine; Williams, Walter; Robinson, Adam
2015-02-01
The following report contains the results and conclusions for the post-irradiation examinations performed on RERTR-12 Insertion 2 experiment plates. These exams include eddy-current testing to measure oxide growth; neutron radiography for evaluating the condition of the fuel prior to sectioning and determination of fuel relocation and geometry changes; gamma scanning to provide relative measurements for burnup and indication of fuel- and fission-product relocation; profilometry to measure dimensional changes of the fuel plate; analytical chemistry to benchmark the physics burnup calculations; metallography to examine the microstructural changes in the fuel, interlayer, and cladding; and microhardness testing to determine the material-property changes of the fuel and cladding.
The Surge Capacity for People in Emergencies (SCOPE) study in Australasian hospitals.
Traub, Matthias; Bradt, David A; Joseph, Anthony P
2007-04-16
To measure physical assets in Australasian hospitals required for the management of mass casualties as a result of terrorism or natural disasters. A cross-sectional survey of Australian and New Zealand hospitals. All emergency department directors of Australasian College for Emergency Medicine (ACEM)-accredited hospitals, as well as private and non-ACEM accredited emergency departments staffed by ACEM Fellows in metropolitan Sydney. Numbers of operating theatres, intensive care unit (ICU) beds and x-ray machines; state of preparedness using benchmarks defined by the Centers for Disease Control and Prevention in the United States. We found that 61%-82% of critically injured patients would not have immediate access to operative care, 34%-70% would have delayed access to an ICU bed, and 42% of the less critically injured would have delayed access to x-ray facilities. Our study demonstrates that physical assets in Australasian public hospitals do not meet US hospital preparedness benchmarks for mass casualty incidents. We recommend national agreement on disaster preparedness benchmarks and periodic publication of hospital performance indicators to enhance disaster preparedness.
NASA Astrophysics Data System (ADS)
Wilhelm, Jennifer Anne
This case study examined what student content understanding could occur in an inner city Industrial Electronics classroom located at Tree High School where project-based instruction, enhanced with technology, was implemented for the first time. Students participated in a project implementation unit involving sound waves and trigonometric reasoning. The unit was designed to foster common content learning (via benchmark lessons) by all students in the class, and to help students gain a deeper conceptual understanding of a sub-set of the larger content unit (via group project research). The objective goal of the implementation design unit was to have students gain conceptual understanding of sound waves, such as what actually waves in a wave, how waves interfere with one another, and what affects the speed of a wave. This design unit also intended for students to develop trigonometric reasoning associated with sinusoidal curves and superposition of sinusoidal waves. Project criteria within this design included implementation features, such as the need for the student to have a driving research question and focus, the need for benchmark lessons to help foster and scaffold content knowledge and understanding, and the need for project milestones to complete throughout the implementation unit to allow students the time for feedback and revision. The Industrial Electronics class at Tree High School consisted of nine students who met daily during double class periods giving 100 minutes of class time per day. The class teacher had been teaching for 18 years (mathematics, physics, and computer science). He had a background in engineering and experience teaching at the college level. Benchmark activities during implementation were used to scaffold fundamental ideas and terminology needed to investigate characteristics of sound and waves. 
Students participating in benchmark activities analyzed motion and musical waveforms using probeware, and explored wave phenomena using waves simulation software. Benchmark activities were also used to bridge the ideas of triangle trigonometric ratios to the graphs of sinusoidal curves, which could lead to understanding the concepts of frequency, period, amplitude, and wavelength. (Abstract shortened by UMI.)
Neutrino Factory Targets and the MICE Beam
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walaron, Kenneth Andrew
2007-01-01
The future of particle physics in the next 30 years must include detailed study of neutrinos. The first proof of physics beyond the Standard Model of particle physics is evident in results from recent neutrino experiments which imply that neutrinos have mass and flavour mixing. The Neutrino Factory is the leading contender to measure precisely the neutrino mixing parameters to probe beyond-Standard-Model physics. Significantly, one must look to measure the mixing angle θ13 and investigate the possibility of leptonic CP violation. If found, this may provide a key insight into the origins of the matter/antimatter asymmetry seen in the universe, through the mechanism of leptogenesis. The Neutrino Factory will be a large international multi-billion-dollar experiment combining novel new accelerator and long-baseline detector technology. Arguably the most important and costly features of this facility are the proton driver and cooling channel. This thesis will present simulation work focused on determining the optimal proton driver energy to maximise pion production, and also simulation of the transport of this pion flux through some candidate transport lattices. Benchmarking of pion cross-sections calculated by MARS and GEANT4 codes to measured data from the HARP experiment is also presented. The cooling channel aims to reduce the phase-space volume of the decayed muon beam to a level that can be efficiently injected into the accelerator system. The Muon Ionisation Cooling Experiment (MICE), hosted by the Rutherford Appleton Laboratory, UK, is a proof-of-principle experiment aimed at measuring ionisation cooling. The experiment will run parasitically to the ISIS accelerator and will produce muons from pion decay. The MICE beamline provides muon beams of variable emittance and momentum to the MICE experiment to enable measurement of cooling over a wide range of beam conditions.
Simulation work in the design of this beamline is presented in this thesis, as are results from an experiment to estimate the flux from the target into the beamline acceptance.
A benchmark initiative on mantle convection with melting and melt segregation
NASA Astrophysics Data System (ADS)
Schmeling, Harro; Dohmen, Janik; Wallner, Herbert; Noack, Lena; Tosi, Nicola; Plesa, Ana-Catalina; Maurice, Maxime
2015-04-01
In recent years a number of mantle convection models have been developed which include partial melting within the asthenosphere, estimation of melt volumes, as well as melt extraction with and without redistribution at the surface or within the lithosphere. All these approaches use various simplifying modelling assumptions whose effects on the dynamics of convection including the feedback on melting have not been explored in sufficient detail. To better assess the significance of such assumptions and to provide test cases for the modelling community we initiate a benchmark comparison. In the initial phase of this endeavor we focus on the usefulness of the definitions of the test cases keeping the physics as sound as possible. The reference model is taken from the mantle convection benchmark, case 1b (Blanckenbach et al., 1989), assuming a square box with free slip boundary conditions, the Boussinesq approximation, constant viscosity and a Rayleigh number of 1e5. Melting is modelled assuming a simplified binary solid solution with linearly depth dependent solidus and liquidus temperatures, as well as a solidus temperature depending linearly on depletion. Starting from a plume free initial temperature condition (to avoid melting at the onset time) three cases are investigated: Case 1 includes melting, but without thermal or dynamic feedback on the convection flow. This case provides a total melt generation rate (qm) in a steady state. Case 2 includes batch melting, melt buoyancy (melt Rayleigh number Rm), depletion buoyancy and latent heat, but no melt percolation. Output quantities are the Nusselt number (Nu), root mean square velocity (vrms) and qm approaching a statistical steady state. Case 3 includes two-phase flow, i.e. melt percolation, assuming a constant shear and bulk viscosity of the matrix and various melt retention numbers (Rt). 
These cases should be carried out using the Compaction Boussinesq Approximation (Schmeling, 2000) or the full compaction formulation. Variations of cases 1-3 may be tested, particularly studying the effect of melt extraction. The motivation of this presentation is to summarize first experiences, suggest possible modifications of the case definitions, and call on interested modelers to join this benchmark exercise. References: Blanckenbach, B., Busse, F., Christensen, U., Cserepes, L., Gunkel, D., Hansen, U., Harder, H., Jarvis, G., Koch, M., Marquart, G., Moore, D., Olson, P., and Schmeling, H., 1989: A benchmark comparison for mantle convection codes, Geophys. J., 98, 23-38. Schmeling, H., 2000: Partial melting and melt segregation in a convecting mantle. In: Physics and Chemistry of Partially Molten Rocks, eds. N. Bagdassarov, D. Laporte, and A.B. Thompson, Kluwer Academic Publ., Dordrecht, pp. 141-178.
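The melting parameterization the benchmark defines (a binary solid solution with solidus and liquidus temperatures linear in depth, and a solidus that rises linearly with depletion) can be sketched as follows. All coefficient values are placeholders chosen for illustration, not the benchmark's actual parameters.

```python
# Sketch of a batch-melting melt fraction for a simplified binary solid
# solution, as in the benchmark description. Coefficients are invented.

def melt_fraction(T, z, depletion,
                  T_sol0=1100.0, T_liq0=1400.0,  # surface solidus/liquidus (K), placeholder
                  dTdz=2.0,                      # linear depth gradient (K/km), placeholder
                  dTsol_ddep=200.0):             # solidus shift per unit depletion (K), placeholder
    """Melt fraction F in [0, 1] at temperature T, depth z, and given depletion."""
    T_sol = T_sol0 + dTdz * z + dTsol_ddep * depletion  # depletion raises the solidus
    T_liq = T_liq0 + dTdz * z
    if T <= T_sol:
        return 0.0
    if T >= T_liq:
        return 1.0
    return (T - T_sol) / (T_liq - T_sol)

# Below the depletion-shifted solidus: no melt.
f0 = melt_fraction(T=1150.0, z=0.0, depletion=0.5)
# Halfway between solidus and liquidus of fertile material: half molten.
f1 = melt_fraction(T=1250.0, z=0.0, depletion=0.0)
```

In a convection code this function would be evaluated per grid cell each time step, with the depletion feedback closing the loop between melting and the buoyancy terms of cases 2 and 3.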
Benchmarking and Threshold Standards in Higher Education. Staff and Educational Development Series.
ERIC Educational Resources Information Center
Smith, Helen, Ed.; Armstrong, Michael, Ed.; Brown, Sally, Ed.
This book explores the issues involved in developing standards in higher education, examining the practical issues involved in benchmarking and offering a critical analysis of the problems associated with this developmental tool. The book focuses primarily on experience in the United Kingdom (UK), but looks also at international activity in this…
Internal Quality Assurance Benchmarking. ENQA Workshop Report 20
ERIC Educational Resources Information Center
Blackstock, Douglas; Burquel, Nadine; Comet, Nuria; Kajaste, Matti; dos Santos, Sergio Machado; Marcos, Sandra; Moser, Marion; Ponds, Henri; Scheuthle, Harald; Sixto, Luis Carlos Velon
2012-01-01
The Internal Quality Assurance group of ENQA (IQA Group) has been organising a yearly seminar for its members since 2007. The main objective is to share experiences concerning the internal quality assurance of work processes in the participating agencies. The overarching theme of the 2011 seminar was how to use benchmarking as a tool for…
Suwazono, Yasushi; Dochi, Mirei; Kobayashi, Etsuko; Oishi, Mitsuhiro; Okubo, Yasushi; Tanaka, Kumihiko; Sakata, Kouichi
2008-12-01
The objective of this study was to calculate benchmark durations and lower 95% confidence limits for benchmark durations of working hours associated with subjective fatigue symptoms by applying the benchmark dose approach while adjusting for job-related stress using multiple logistic regression analyses. A self-administered questionnaire was completed by 3,069 male and 412 female daytime workers (age 18-67 years) in a Japanese steel company. The eight dependent variables in the Cumulative Fatigue Symptoms Index were decreased vitality, general fatigue, physical disorders, irritability, decreased willingness to work, anxiety, depressive feelings, and chronic tiredness. Independent variables were daily working hours, four subscales (job demand, job control, interpersonal relationship, and job suitability) of the Brief Job Stress Questionnaire, and other potential covariates. Using significant parameters for working hours and those for other covariates, the benchmark durations of working hours were calculated for the corresponding Index property. Benchmark response was set at 5% or 10%. Assuming a condition of worst job stress, the benchmark duration/lower 95% confidence limit for benchmark duration of working hours per day with a benchmark response of 5% or 10% were 10.0/9.4 or 11.7/10.7 (irritability) and 9.2/8.9 or 10.4/9.8 (chronic tiredness) in men and 8.9/8.4 or 9.8/8.9 (chronic tiredness) in women. The threshold amounts of working hours for fatigue symptoms under the worst job-related stress were very close to the standard daily working hours in Japan. The results strongly suggest that special attention should be paid to employees whose working hours exceed threshold amounts based on individual levels of job-related stress.
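The benchmark dose idea applied here to working hours can be sketched from a fitted logistic model: find the duration at which modelled risk exceeds the risk at some reference by the benchmark response. The coefficients below are invented placeholders, not values from the study, and this toy omits the job-stress covariate adjustment the study performed.

```python
# Hedged sketch of a benchmark duration from a logistic dose-response
# model (additional-risk definition). b0, b1 are hypothetical.
import math

def benchmark_duration(b0, b1, bmr=0.05, ref_hours=0.0):
    """Working hours/day at which modelled risk exceeds the risk at
    ref_hours by an additional bmr."""
    def p(x):
        return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
    target = p(ref_hours) + bmr
    # Invert the logistic: x = (logit(target) - b0) / b1
    return (math.log(target / (1.0 - target)) - b0) / b1

# With these made-up coefficients the 5%-additional-risk benchmark
# duration lands a little above an 8 h/day reference.
bmd05 = benchmark_duration(b0=-4.0, b1=0.25, bmr=0.05, ref_hours=8.0)
```

The study's lower 95% confidence limits on these durations would come from the uncertainty in the fitted coefficients (e.g. a profile-likelihood or delta-method bound), which this sketch does not attempt.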
Exploring New Physics Frontiers Through Numerical Relativity.
Cardoso, Vitor; Gualtieri, Leonardo; Herdeiro, Carlos; Sperhake, Ulrich
2015-01-01
The demand to obtain answers to highly complex problems within strong-field gravity has been met with significant progress in the numerical solution of Einstein's equations - along with some spectacular results - in various setups. We review techniques for solving Einstein's equations in generic spacetimes, focusing on fully nonlinear evolutions but also on how to benchmark those results with perturbative approaches. The results address problems in high-energy physics, holography, mathematical physics, fundamental physics, astrophysics and cosmology.
Benchmarking the SPHINX and CTH shock physics codes for three problems in ballistics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, L.T.; Hertel, E.; Schwalbe, L.
1998-02-01
The CTH Eulerian hydrocode and the SPHINX smooth particle hydrodynamics (SPH) code were used to model a shock tube, two long-rod penetrations into semi-infinite steel targets, and a long-rod penetration into a spaced plate array. The results were then compared to experimental data. Both SPHINX and CTH modeled the one-dimensional shock tube problem well. Both codes did a reasonable job in modeling the outcome of the axisymmetric rod impact problem. Neither code correctly reproduced the depth of penetration in both experiments. In the 3-D problem, both codes reasonably replicated the penetration of the rod through the first plate. After this, however, the predictions of both codes began to diverge from the results seen in the experiment. In terms of computer resources, the run times are problem dependent and are discussed in the text.
Upgrades for the CMS simulation
Lange, D. J.; Hildreth, M.; Ivantchenko, V. N.; ...
2015-05-22
Over the past several years, the CMS experiment has made significant changes to its detector simulation application. The geometry has been generalized to include modifications being made to the CMS detector for 2015 operations, as well as model improvements to the simulation geometry of the current CMS detector and the implementation of a number of approved and possible future detector configurations. These include both completely new tracker and calorimetry systems. We have completed the transition to Geant4 version 10, and we have made significant progress in reducing the CPU resources required to run our Geant4 simulation. These savings have been achieved through both technical improvements and numerical techniques. Substantial speed improvements have been achieved without changing the physics validation benchmarks that the experiment uses to validate our simulation application for use in production. We discuss the methods that we implemented and the corresponding performance improvements deployed for our 2015 simulation application.
Closed-Loop Neuromorphic Benchmarks
Stewart, Terrence C.; DeWolf, Travis; Kleinhans, Ashley; Eliasmith, Chris
2015-01-01
Evaluating the effectiveness and performance of neuromorphic hardware is difficult. It is even more difficult when the task of interest is a closed-loop task; that is, a task where the output from the neuromorphic hardware affects some environment, which then in turn affects the hardware's future input. However, closed-loop situations are one of the primary potential uses of neuromorphic hardware. To address this, we present a methodology for generating closed-loop benchmarks that makes use of a hybrid of real physical embodiment and a type of “minimal” simulation. Minimal simulation has been shown to lead to robust real-world performance, while still maintaining the practical advantages of simulation, such as making it easy for the same benchmark to be used by many researchers. This method is flexible enough to allow researchers to explicitly modify the benchmarks to identify specific task domains where particular hardware excels. To demonstrate the method, we present a set of novel benchmarks that focus on motor control for an arbitrary system with unknown external forces. Using these benchmarks, we show that an error-driven learning rule can consistently improve motor control performance across a randomly generated family of closed-loop simulations, even when there are up to 15 interacting joints to be controlled. PMID:26696820
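As a toy illustration of the kind of closed-loop, error-driven motor-control setting described above (not the authors' actual benchmark suite), the sketch below simulates a 1-D point mass reaching a target under an unknown constant external force. A PD controller is augmented with an adaptive term updated by an error-driven rule; across repeated trials the learned term compensates the disturbance and the accumulated tracking error falls. All parameters are hypothetical.

```python
def run_trial(adapt, w, lr=0.1, steps=200, dt=0.05, force=-1.5):
    """One closed-loop reaching trial for a unit point mass under an
    unknown constant external force. Returns (integrated |error|, w)."""
    x, v, target = 0.0, 0.0, 1.0
    err_sum = 0.0
    for _ in range(steps):
        err = target - x
        u = 2.0 * err - 1.0 * v + w      # PD control plus learned bias term
        if adapt:
            w += lr * err * dt           # error-driven learning rule
        a = u + force                    # unknown disturbance acts on the mass
        v += a * dt                      # semi-implicit Euler integration
        x += v * dt
        err_sum += abs(err) * dt
    return err_sum, w

# Repeated trials: the adaptive term learns to cancel the unknown force
w = 0.0
errors = []
for _ in range(10):
    e, w = run_trial(adapt=True, w=w)
    errors.append(e)
```

The per-trial error decreases as `w` converges toward the magnitude of the disturbance, mirroring the consistent improvement the authors report for their error-driven learning rule.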
Developing a molecular dynamics force field for both folded and disordered protein states.
Robustelli, Paul; Piana, Stefano; Shaw, David E
2018-05-07
Molecular dynamics (MD) simulation is a valuable tool for characterizing the structural dynamics of folded proteins and should be similarly applicable to disordered proteins and proteins with both folded and disordered regions. It has been unclear, however, whether any physical model (force field) used in MD simulations accurately describes both folded and disordered proteins. Here, we select a benchmark set of 21 systems, including folded and disordered proteins, simulate these systems with six state-of-the-art force fields, and compare the results to over 9,000 available experimental data points. We find that none of the tested force fields simultaneously provided accurate descriptions of folded proteins, of the dimensions of disordered proteins, and of the secondary structure propensities of disordered proteins. Guided by simulation results on a subset of our benchmark, however, we modified parameters of one force field, achieving excellent agreement with experiment for disordered proteins, while maintaining state-of-the-art accuracy for folded proteins. The resulting force field, a99SB-disp, should thus greatly expand the range of biological systems amenable to MD simulation. A similar approach could be taken to improve other force fields. Copyright © 2018 the Author(s). Published by PNAS.
Interface Pattern Selection in Directional Solidification
NASA Technical Reports Server (NTRS)
Trivedi, Rohit; Tewari, Surendra N.
2001-01-01
The central focus of this research is to establish key scientific concepts that govern the selection of cellular and dendritic patterns during the directional solidification of alloys. Ground-based studies have established that the conditions under which cellular and dendritic microstructures form are precisely those where convection effects are dominant in bulk samples. Thus, experimental data cannot be obtained terrestrially under a purely diffusive regime. Furthermore, reliable theoretical models that can quantitatively incorporate fluid flow in the pattern selection criterion are not yet possible. Consequently, microgravity experiments on cellular and dendritic growth are designed to obtain benchmark data under diffusive growth conditions that can be quantitatively analyzed and compared with rigorous theoretical models to establish the fundamental principles that govern the selection of a specific microstructure and its length scales. In a cellular structure, different cells in an array are strongly coupled, so that cellular pattern evolution is controlled by complex interactions between thermal diffusion, solute diffusion, and interface effects. These interactions admit an infinity of solutions, and the system selects only a narrow band of them. The aim of this investigation is to obtain benchmark data and develop a rigorous theoretical model that will allow us to quantitatively establish the physics of this selection process.
Leckey, Cara A C; Wheeler, Kevin R; Hafiychuk, Vasyl N; Hafiychuk, Halyna; Timuçin, Doğan A
2018-03-01
Ultrasonic wave methods constitute the leading physical mechanism for nondestructive evaluation (NDE) and structural health monitoring (SHM) of solid composite materials, such as carbon fiber reinforced polymer (CFRP) laminates. Computational models of ultrasonic wave excitation, propagation, and scattering in CFRP composites can be extremely valuable in designing practicable NDE and SHM hardware, software, and methodologies that accomplish the desired accuracy, reliability, efficiency, and coverage. The development and application of ultrasonic simulation approaches for composite materials is an active area of research in the field of NDE. This paper presents comparisons of guided wave simulations for CFRP composites implemented using four different simulation codes: the commercial finite element modeling (FEM) packages ABAQUS, ANSYS, and COMSOL, and a custom code executing the Elastodynamic Finite Integration Technique (EFIT). Benchmark comparisons are made between the simulation tools and both experimental laser Doppler vibrometry data and theoretical dispersion curves. A pristine case and a delamination-type case (Teflon insert in the experimental specimen) are studied. A summary is given of the accuracy of the simulation results and the respective computational performance of the four different simulation tools. Published by Elsevier B.V.
GPI Spectroscopy of the Mass, Age, and Metallicity Benchmark Brown Dwarf HD 4747 B
NASA Astrophysics Data System (ADS)
Crepp, Justin R.; Principe, David A.; Wolff, Schuyler; Giorla Godfrey, Paige A.; Rice, Emily L.; Cieza, Lucas; Pueyo, Laurent; Bechter, Eric B.; Gonzales, Erica J.
2018-02-01
The physical properties of brown dwarf companions found to orbit nearby, solar-type stars can be benchmarked against independent measures of their mass, age, chemical composition, and other parameters, offering insights into the evolution of substellar objects. The TRENDS high-contrast imaging survey has recently discovered a (mass/age/metallicity) benchmark brown dwarf orbiting the nearby (d = 18.69 ± 0.19 pc), G8V/K0V star HD 4747. We have acquired follow-up spectroscopic measurements of HD 4747 B using the Gemini Planet Imager to study its spectral type, effective temperature, surface gravity, and cloud properties. Observations obtained in the H-band and K1-band recover the companion and reveal that it is near the L/T transition (T1 ± 2). Fitting atmospheric models to the companion spectrum, we find strong evidence for the presence of clouds. However, spectral models cannot satisfactorily fit the complete data set: while the shape of the spectrum can be well-matched in individual filters, a joint fit across the full passband results in discrepancies that are a consequence of the inherent color of the brown dwarf. We also find a 2σ tension in the companion mass, age, and surface gravity when comparing to evolutionary models. These results highlight the importance of using benchmark objects to study "secondary effects" such as metallicity, non-equilibrium chemistry, cloud parameters, electron conduction, non-adiabatic cooling, and other subtleties affecting emergent spectra. As a new L/T transition benchmark, HD 4747 B warrants further investigation into the modeling of cloud physics using higher resolution spectroscopy across a broader range of wavelengths, polarimetric observations, and continued Doppler radial velocity and astrometric monitoring.
Thermo-hydro-mechanical-chemical processes in fractured-porous media: Benchmarks and examples
NASA Astrophysics Data System (ADS)
Kolditz, O.; Shao, H.; Görke, U.; Kalbacher, T.; Bauer, S.; McDermott, C. I.; Wang, W.
2012-12-01
The book comprises an assembly of benchmarks and examples for porous media mechanics collected over the last twenty years. Analysis of thermo-hydro-mechanical-chemical (THMC) processes is essential to many applications in environmental engineering, such as geological waste deposition, geothermal energy utilisation, carbon capture and storage, water resources management, hydrology, and even climate change. In order to assess the feasibility as well as the safety of geotechnical applications, process-based modelling is the only tool able to put numbers to, i.e. to quantify, future scenarios. This places a huge responsibility on the reliability of computational tools. Benchmarking is an appropriate methodology to verify the quality of modelling tools based on best practices. Moreover, benchmarking and code comparison foster community efforts. The benchmark book is part of the OpenGeoSys initiative, an open source project to share knowledge and experience in environmental analysis and scientific computation.
Staff confidence in dealing with aggressive patients: a benchmarking exercise.
McGowan, S; Wynaden, D; Harding, N; Yassine, A; Parker, J
1999-09-01
Interacting with potentially aggressive patients is a common occurrence for nurses working in psychiatric intensive care units. Although the literature highlights the need to educate staff in the prevention and management of aggression, often little, or no, training is provided by employers. This article describes a benchmarking exercise conducted in psychiatric intensive care units at two Western Australian hospitals to assess staff confidence in coping with patient aggression. Results demonstrated that staff in the hospital where regular training was undertaken were significantly more confident in dealing with aggression. Following the completion of a safe physical restraint module at the other hospital, staff reported a significant increase in their level of confidence that either matched or bettered the results of their benchmark colleagues.
Open Rotor - Analysis of Diagnostic Data
NASA Technical Reports Server (NTRS)
Envia, Edmane
2011-01-01
NASA is researching open rotor propulsion as part of its technology research and development plan for addressing the subsonic transport aircraft noise, emission and fuel burn goals. The low-speed wind tunnel test for investigating the aerodynamic and acoustic performance of a benchmark blade set at the approach and takeoff conditions has recently concluded. A high-speed wind tunnel diagnostic test campaign has begun to investigate the performance of this benchmark open rotor blade set at the cruise condition. Databases from both speed regimes will comprise a comprehensive collection of benchmark open rotor data for use in assessing/validating aerodynamic and noise prediction tools (component & system level) as well as providing insights into the physics of open rotors to help guide the development of quieter open rotors.
Experimental Data from the Benchmark SuperCritical Wing Wind Tunnel Test on an Oscillating Turntable
NASA Technical Reports Server (NTRS)
Heeg, Jennifer; Piatak, David J.
2013-01-01
The Benchmark SuperCritical Wing (BSCW) wind tunnel model served as a semi-blind test case for the 2012 AIAA Aeroelastic Prediction Workshop (AePW). The BSCW was chosen as a test case due to its geometric simplicity and flow physics complexity. The data sets examined include unforced system information and forced pitching oscillations. The aerodynamic challenges presented by this AePW test case include a strong shock that was observed to be unsteady even for the unforced system cases, shock-induced separation, and trailing edge separation. The current paper quantifies these characteristics at the AePW test condition and at a suggested benchmarking test condition. General characteristics of the model's behavior are examined for the entire available data set.
ERIC Educational Resources Information Center
Mulvey, Patrick; Tyler, John; Nicholson, Starr; Ivie, Rachel
2017-01-01
This report provides data on the size of degree-granting physics and astronomy departments by examining the number of bachelor's degrees awarded and the number of full-time equivalent (FTE) faculty members employed. The benchmarking data in this report is intended to allow physics and astronomy departments to see how they fit in the national…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Der Marck, S. C.
Three nuclear data libraries have been tested extensively using criticality safety benchmark calculations: the new release of the US library ENDF/B-VII.1 (2011), the new release of the Japanese library JENDL-4.0 (2011), and the OECD/NEA library JEFF-3.1 (2006). All calculations were performed with the continuous-energy Monte Carlo code MCNP (version 4C3, as well as version 6-beta1). Around 2000 benchmark cases from the International Handbook of Criticality Safety Benchmark Experiments (ICSBEP) were used. The results were analyzed per ICSBEP category and per element. Overall, the three libraries show similar performance on most criticality safety benchmarks. The largest differences are probably caused by elements such as Be, C, Fe, Zr, and W.
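Library comparisons of this kind rest on a simple statistic: the deviation between calculated and benchmark k-eff, expressed in units of the combined uncertainty. A minimal sketch with hypothetical values (not results from the paper):

```python
def analyze(cases):
    """cases: list of (k_calc, sigma_calc, k_bench, sigma_bench).
    Returns the mean C - E deviation and per-case deviations in sigmas."""
    rows = []
    for k_c, s_c, k_b, s_b in cases:
        diff = k_c - k_b                     # C - E in units of k-eff
        sigma = (s_c**2 + s_b**2) ** 0.5     # combined 1-sigma uncertainty
        rows.append((diff, diff / sigma))    # deviation in sigmas
    mean_diff = sum(d for d, _ in rows) / len(rows)
    return mean_diff, rows

# Hypothetical benchmark cases for illustration only
cases = [
    (1.00120, 0.00030, 1.00000, 0.00100),
    (0.99910, 0.00025, 1.00000, 0.00120),
    (1.00050, 0.00028, 1.00000, 0.00090),
]
mean_diff, rows = analyze(cases)
```

Grouping such deviations per ICSBEP category or per element, as the authors do, then highlights which nuclides drive the differences between libraries.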
Benchmarking initiatives in the water industry.
Parena, R; Smeets, E
2001-01-01
Customer satisfaction and service care are every day pushing professionals in the water industry to seek to improve their performance, lowering costs and increasing the service level provided. Process benchmarking is generally recognised as a systematic mechanism of comparing one's own utility with other utilities or businesses with the intent of self-improvement by adopting structures or methods used elsewhere. The IWA Task Force on Benchmarking, operating inside the Statistics and Economics Committee, has been committed to developing a generally accepted concept of process benchmarking to support water decision-makers in addressing issues of efficiency. In a first step, the Task Force disseminated among the Committee members a questionnaire focused on providing suggestions about the kind, degree of evolution, and main concepts of benchmarking adopted in the countries represented. A comparison among the guidelines adopted in The Netherlands and Scandinavia has recently challenged the Task Force to draft a methodology for worldwide process benchmarking in the water industry. The paper provides a framework of the most interesting benchmarking experiences in the water sector and describes in detail both the final results of the survey and the methodology, focused on the identification of possible improvement areas.
Evaluating virtual hosted desktops for graphics-intensive astronomy
NASA Astrophysics Data System (ADS)
Meade, B. F.; Fluke, C. J.
2018-04-01
Visualisation of data is critical to understanding astronomical phenomena. Today, many instruments produce datasets that are too big to be downloaded to a local computer, yet many of the visualisation tools used by astronomers are deployed only on desktop computers. Cloud computing is increasingly used to provide a computation and simulation platform in astronomy, but it also offers great potential as a visualisation platform. Virtual hosted desktops, with graphics processing unit (GPU) acceleration, allow interactive, graphics-intensive desktop applications to operate co-located with astronomy datasets stored in remote data centres. By combining benchmarking and user experience testing, with a cohort of 20 astronomers, we investigate the viability of replacing physical desktop computers with virtual hosted desktops. In our work, we compare two Apple MacBook computers (one old and one new, representing hardware at opposite ends of its useful lifetime) with two virtual hosted desktops: one commercial (Amazon Web Services) and one in a private research cloud (the Australian NeCTAR Research Cloud). For two-dimensional image-based tasks and graphics-intensive three-dimensional operations - typical of astronomy visualisation workflows - we found that benchmarks do not necessarily provide the best indication of performance. When compared to typical laptop computers, virtual hosted desktops can provide a better user experience, even with lower performing graphics cards. We also found that virtual hosted desktops are equally simple to use, provide greater flexibility in choice of configuration, and may actually be a more cost-effective option for typical usage profiles.
Design and performance of an electromagnetic calorimeter for a FCC-hh experiment
NASA Astrophysics Data System (ADS)
Zaborowska, A.
2018-03-01
The physics reach and feasibility of the Future Circular Collider are currently under investigation. The goal is to collide protons with centre-of-mass energies of up to 100 TeV, extending the research carried out at current HEP facilities. The detectors designed for the FCC experiments need to tackle the harsh conditions of unprecedented collision energy and luminosity. The baseline technology for the calorimeter system of the FCC-hh detector is described. The electromagnetic calorimeter in the barrel, as well as the electromagnetic and hadronic calorimeters in the endcaps and the forward regions, are based on liquid argon as the active material. The detector layout in the barrel region combines the concept of a high-granularity calorimeter with precise energy measurements. The calorimeters have to meet the requirements of high radiation hardness and must be able to deal with a very high number of collisions per bunch crossing (pile-up). Very good energy and angular resolution over a wide range of electron and photon momenta is needed in order to meet the demands of the physics benchmarks. First results of performance studies with the new liquid argon calorimeter are presented and meet the energy resolution goal.
Winning Strategy: Set Benchmarks of Early Success to Build Momentum for the Long Term
ERIC Educational Resources Information Center
Spiro, Jody
2012-01-01
Change is a highly personal experience. Everyone participating in the effort has different reactions to change, different concerns, and different motivations for being involved. The smart change leader sets benchmarks along the way so there are guideposts and pause points instead of an endless change process. "Early wins"--a term used to describe…
A benchmark for subduction zone modeling
NASA Astrophysics Data System (ADS)
van Keken, P.; King, S.; Peacock, S.
2003-04-01
Our understanding of subduction zones hinges critically on the ability to discern their thermal structure and dynamics. Computational modeling has become an essential complement to observational and experimental studies. Accurate modeling of subduction zones is challenging due to the unique geometry, complicated rheological description, and influence of fluid and melt formation. The complicated physics causes problems for the accurate numerical solution of the governing equations. As a consequence, it is essential for the subduction zone community to be able to evaluate the abilities and limitations of various modeling approaches. The participants of a workshop on the modeling of subduction zones, held at the University of Michigan at Ann Arbor, MI, USA in 2002, formulated a number of case studies to be developed into a benchmark similar to previous mantle convection benchmarks (Blankenbach et al., 1989; Busse et al., 1991; Van Keken et al., 1997). Our initial benchmark focuses on the dynamics of the mantle wedge and investigates three different rheologies: constant viscosity, diffusion creep, and dislocation creep. In addition, we investigate the ability of codes to accurately model dynamic pressure and advection-dominated flows. Proceedings of the workshop and the formulation of the benchmark are available at www.geo.lsa.umich.edu/~keken/subduction02.html. We strongly encourage interested research groups to participate in this benchmark. At Nice 2003 we will provide an update and a first set of benchmark results. Interested researchers are encouraged to contact one of the authors for further details.
Performance studies of the P̄ANDA planar GEM-tracking detector in physics simulations
NASA Astrophysics Data System (ADS)
Divani Veis, Nazila; Firoozabadi, Mohammad M.; Karabowicz, Radoslaw; Maas, Frank; Saito, Takehiko R.; Voss, Bernd; P̄ANDA GEM-Tracker Subgroup
2018-03-01
The P̄ANDA experiment will be installed at the Facility for Antiproton and Ion Research (FAIR) in Darmstadt, Germany, to study events from the annihilation of protons and antiprotons. The P̄ANDA detectors cover a wide physics program in baryon spectroscopy and nucleon structure, as well as the study of hadrons and hypernuclear physics, including the study of excited hyperon states. One very specific feature of most hyperon ground states is their long decay length of several centimeters in the forward direction. The central tracking detectors of the P̄ANDA setup are not sufficiently optimized for these long decay lengths; therefore, a set of planar GEM-tracking detectors in the forward region of interest can improve the results in the hyperon physics-benchmark channel. The currently designed P̄ANDA GEM-tracking stations contribute to the measurement of particles emitted at polar angles between about 2 and 22 degrees. Performance and acceptance studies for this detector design have been carried out in physics simulations using one of the important hyperonic decay channels, p̄p → Λ̄Λ → p̄pπ⁺π⁻. The simulations were performed with the PandaRoot software packages based on the FairRoot framework.
Status of Plasma Electron Hose Instability Studies in FACET
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adli, Erik; /U. Oslo; England, Robert Joel
In the FACET plasma-wakefield acceleration experiment at SLAC, a dense 23 GeV electron beam will interact with lithium and cesium plasmas, leading to plasma ion-channel formation. The interaction between the electron beam and the plasma sheath electrons may drive a fast-growing electron hose instability, as first studied by Whittum. While Ref. [2] indicates the possibility of a large instability growth rate for typical beam and plasma parameters, other studies have shown that several physical effects may mitigate the hosing growth rate substantially. So far there has been no quantitative benchmarking of experimentally observed hosing in previous experiments. At FACET we aim to perform such benchmarking by, for example, using optics dispersion knobs to induce a controlled z-x tilt along the beam entering the plasma and observing the transverse behavior of the beam in the plasma as a function of the tilt. The long-term objective of these studies is to quantify limits on the instability and thereby potential limitations on future plasma wakefield accelerators due to the electron hose instability.
Quality management benchmarking: FDA compliance in pharmaceutical industry.
Jochem, Roland; Landgraf, Katja
2010-01-01
By analyzing and comparing industry and business best practice, processes can be optimized and become more successful mainly because efficiency and competitiveness increase. This paper aims to focus on some examples. Case studies are used to show knowledge exchange in the pharmaceutical industry. Best practice solutions were identified in two companies using a benchmarking method and five-stage model. Despite large administrations, there is much potential regarding business process organization. This project makes it possible for participants to fully understand their business processes. The benchmarking method gives an opportunity to critically analyze value chains (a string of companies or players working together to satisfy market demands for a special product). Knowledge exchange is interesting for companies that like to be global players. Benchmarking supports information exchange and improves competitive ability between different enterprises. Findings suggest that the five-stage model improves efficiency and effectiveness. Furthermore, the model increases the chances for reaching targets. The method gives security to partners that did not have benchmarking experience. The study identifies new quality management procedures. Process management and especially benchmarking is shown to support pharmaceutical industry improvements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burke, Timothy P.; Martz, Roger L.; Kiedrowski, Brian C.
New unstructured mesh capabilities in MCNP6 (developmental version during summer 2012) show potential for conducting multi-physics analyses by coupling MCNP to a finite element solver such as Abaqus/CAE [2]. Before these new capabilities can be utilized, the ability of MCNP to accurately estimate eigenvalues and pin powers using an unstructured mesh must first be verified. Previous work to verify the unstructured mesh capabilities in MCNP was accomplished using the Godiva sphere [1], and this work attempts to build on that. To accomplish this, a criticality benchmark and a fuel assembly benchmark were used for calculations in MCNP using both the Constructive Solid Geometry (CSG) native to MCNP and the unstructured mesh geometry generated using Abaqus/CAE. The Big Ten criticality benchmark [3] was modeled because its geometry is similar to that of a reactor fuel pin. The C5G7 3-D Mixed Oxide (MOX) Fuel Assembly Benchmark [4] was modeled to test the unstructured mesh capabilities on a reactor-type problem.
Gude, Wouter T; van Engen-Verheul, Mariëtte M; van der Veer, Sabine N; de Keizer, Nicolette F; Peek, Niels
2017-04-01
To identify factors that influence the intentions of health professionals to improve their practice when confronted with clinical performance feedback, which is an essential first step in the audit and feedback mechanism. We conducted a theory-driven laboratory experiment with 41 individual professionals, and a field study in 18 centres in the context of a cluster-randomised trial of electronic audit and feedback in cardiac rehabilitation. Feedback reports were provided through a web-based application and included performance scores and benchmark comparisons (high, intermediate, or low performance) for a set of process and outcome indicators. From each report, participants selected indicators for improvement into their action plan. Our unit of observation was an indicator presented in a feedback report (selected yes/no); we considered selecting an indicator to reflect an intention to improve. We analysed 767 observations in the laboratory experiment and 614 in the field study. Each 10% decrease in performance score increased the probability of an indicator being selected by 54% (OR 1.54; 95% CI 1.29 to 1.83) in the laboratory experiment and by 25% (OR 1.25; 95% CI 1.13 to 1.39) in the field study. Performance benchmarked as low or intermediate also increased this probability in the laboratory setting. Still, participants ignored the benchmarks in 34% (laboratory experiment) and 48% (field study) of their selections. When confronted with clinical performance feedback, performance scores and benchmark comparisons influenced health professionals' intentions to improve practice. However, there was substantial variation in these intentions, because professionals disagreed with benchmarks, deemed improvement unfeasible, or did not consider the indicator an essential aspect of care quality. These phenomena impede intentions to improve practice and are thus likely to dilute the effects of audit and feedback interventions. NTR3251, pre-results.
Published by the BMJ Publishing Group Limited.
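The reported odds ratios can be made concrete with a small sketch: applying the per-10%-decrease OR to a baseline selection probability shows how the probability of an indicator being selected grows as performance falls (the 20% baseline used here is hypothetical, not a figure from the study):

```python
def selection_prob(base_prob, odds_ratio, steps):
    """Apply an odds ratio repeatedly (one step per 10% performance
    decrease) to a baseline probability of an indicator being selected."""
    odds = base_prob / (1.0 - base_prob)   # convert probability to odds
    odds *= odds_ratio ** steps            # each step multiplies the odds
    return odds / (1.0 + odds)             # convert back to a probability

# Laboratory-experiment OR of 1.54 per 10% decrease, applied three times
# (i.e. performance 30% below the comparator)
p = selection_prob(0.20, 1.54, steps=3)
```

This also illustrates why odds ratios should not be read directly as probability changes: the effect on the probability depends on the baseline.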
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Thomas Martin; Celik, Cihangir; McMahan, Kimberly L.
This benchmark experiment was conducted as a joint venture between the US Department of Energy (DOE) and the French Commissariat à l'Energie Atomique (CEA). Staff at the Oak Ridge National Laboratory (ORNL) in the US and the Centre de Valduc in France planned this experiment. The experiment was conducted on October 11, 2010 in the SILENE critical assembly facility at Valduc. Several other organizations contributed to this experiment and the subsequent evaluation, including CEA Saclay, Lawrence Livermore National Laboratory (LLNL), the Y-12 National Security Complex (NSC), Babcock International Group in the United Kingdom, and Los Alamos National Laboratory (LANL). The goal of this experiment was to measure neutron activation and thermoluminescent dosimeter (TLD) doses from a source similar to a fissile solution critical excursion. The resulting benchmark can be used for validation of computer codes and nuclear data libraries as required when performing analysis of criticality accident alarm systems (CAASs). A secondary goal of this experiment was to qualitatively test the performance of two CAAS detectors similar to those currently and formerly in use in some US DOE facilities. The detectors tested were the CIDAS MkX and the Rocky Flats NCD-91. These detectors were being evaluated to determine whether they would alarm, so they were not expected to generate benchmark-quality data.
Wu, Zhenqin; Ramsundar, Bharath; Feinberg, Evan N.; Gomes, Joseph; Geniesse, Caleb; Pappu, Aneesh S.; Leswing, Karl
2017-01-01
Molecular machine learning has been maturing rapidly over the last few years. Improved methods and the presence of larger datasets have enabled machine learning algorithms to make increasingly accurate predictions about molecular properties. However, algorithmic progress has been limited due to the lack of a standard benchmark to compare the efficacy of proposed methods; most new algorithms are benchmarked on different datasets making it challenging to gauge the quality of proposed methods. This work introduces MoleculeNet, a large scale benchmark for molecular machine learning. MoleculeNet curates multiple public datasets, establishes metrics for evaluation, and offers high quality open-source implementations of multiple previously proposed molecular featurization and learning algorithms (released as part of the DeepChem open source library). MoleculeNet benchmarks demonstrate that learnable representations are powerful tools for molecular machine learning and broadly offer the best performance. However, this result comes with caveats. Learnable representations still struggle to deal with complex tasks under data scarcity and highly imbalanced classification. For quantum mechanical and biophysical datasets, the use of physics-aware featurizations can be more important than choice of particular learning algorithm. PMID:29629118
Benchmarking the Multidimensional Stellar Implicit Code MUSIC
NASA Astrophysics Data System (ADS)
Goffrey, T.; Pratt, J.; Viallet, M.; Baraffe, I.; Popov, M. V.; Walder, R.; Folini, D.; Geroux, C.; Constantino, T.
2017-04-01
We present the results of a numerical benchmark study for the MUltidimensional Stellar Implicit Code (MUSIC) based on widely applicable two- and three-dimensional compressible hydrodynamics problems relevant to stellar interiors. MUSIC is an implicit large eddy simulation code that uses implicit time integration, implemented as a Jacobian-free Newton-Krylov method. A physics-based preconditioning technique, which can be adjusted to target varying physics, is used to improve the performance of the solver. The problems used for this benchmark study include the Rayleigh-Taylor and Kelvin-Helmholtz instabilities and the decay of the Taylor-Green vortex. Additionally, we show a test of hydrostatic equilibrium in a stellar environment that is dominated by radiative effects; in this setting the flexibility of the preconditioning technique is demonstrated. This work aims to bridge the gap between the hydrodynamic test problems typically used during the development of numerical methods and the complex flows of stellar interiors. A series of multidimensional tests were performed and analysed. Each of these test cases was analysed with a simple scalar diagnostic, with the aim of enabling direct code comparisons. As the tests performed do not have analytic solutions, we verify MUSIC by comparing it to established codes, including ATHENA and the PENCIL code. MUSIC is able both to reproduce the behaviour of established and widely used codes and to match results expected from theoretical predictions. This benchmarking study concludes a series of papers describing the development of the MUSIC code and provides confidence in future applications.
Noninterceptive transverse emittance measurements using BPM for Chinese ADS R&D project
NASA Astrophysics Data System (ADS)
Wang, Zhi-Jun; Feng, Chi; He, Yuan; Dou, Weiping; Tao, Yue; Chen, Wei-long; Jia, Huan; Liu, Shu-hui; Wang, Wang-sheng; Zhang, Yong; Wu, Jian-qiang; Zhang, Sheng-hu; Zhang, X. L.
2016-04-01
Noninterceptive four-dimensional transverse emittance measurements are essential for commissioning and operating high-power continuous-wave (CW) proton linacs. Conventional emittance-measuring devices such as slits and wire scanners are not well suited to these conditions because the beam would damage them. Therefore, a method using noninterceptive Beam Position Monitors (BPMs) was developed and demonstrated on Injector Scheme II of the Chinese Accelerator Driven Sub-critical System (China-ADS) demonstration facility at the Institute of Modern Physics (IMP) [1]. The measurement results are in good agreement with wire-scanner and slit measurements for low duty-factor pulsed (LDFP) beams. In this paper, the detailed experimental design, data analysis, and benchmarking of results are presented.
PSO algorithm enhanced with Lozi Chaotic Map - Tuning experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pluhacek, Michal; Senkerik, Roman; Zelinka, Ivan
2015-03-10
This paper investigates the effect of tuning the control parameters of the Lozi chaotic map employed as a chaotic pseudo-random number generator for the particle swarm optimization (PSO) algorithm. Three benchmark functions are selected from the IEEE CEC 2013 competition benchmark set. The Lozi map is extensively tuned and the performance of PSO is evaluated.
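A minimal sketch of the idea, assuming nothing about the paper's actual implementation: the Lozi map is iterated to produce a deterministic chaotic number stream, which replaces the uniform random factors r1, r2 in a plain global-best PSO. The rescaling of the attractor to [0, 1], the toy objective, and all parameter values are our own choices for illustration.

```python
import numpy as np

def lozi_stream(n, a=1.7, b=0.5, x=0.1, y=0.1):
    """n numbers in [0, 1] driven by the Lozi map x' = 1 - a|x| + b*y, y' = x
    (a, b are the classic chaotic parameters; the rescaling is our own)."""
    out = np.empty(n)
    for i in range(n):
        x, y = 1.0 - a * abs(x) + b * y, x
        out[i] = min(max((x + 2.0) / 4.0, 0.0), 1.0)   # crude rescale + clip
    return out

def pso_lozi(f, dim=5, swarm=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Plain global-best PSO whose stochastic factors come from the Lozi map
    instead of a conventional PRNG (the paper's idea, our toy code)."""
    rng = np.random.default_rng(0)               # ordinary RNG for init only
    pos = rng.uniform(-5.0, 5.0, (swarm, dim))
    vel = np.zeros((swarm, dim))
    pbest, pval = pos.copy(), np.array([f(p) for p in pos])
    g = pbest[pval.argmin()].copy()
    chaos = iter(lozi_stream(2 * swarm * iters * dim + 10))
    draw = lambda: np.array([next(chaos) for _ in range(dim)])
    for _ in range(iters):
        for i in range(swarm):
            vel[i] = w*vel[i] + c1*draw()*(pbest[i]-pos[i]) + c2*draw()*(g-pos[i])
            pos[i] += vel[i]
            v = f(pos[i])
            if v < pval[i]:
                pbest[i], pval[i] = pos[i].copy(), v
                if v < f(g):
                    g = pos[i].copy()
    return g, f(g)

sphere = lambda x: float(x @ x)
best, best_val = pso_lozi(sphere)
```

Tuning the map parameters a and b, as the paper does, changes the statistics of the chaotic stream and hence the swarm's exploration behaviour.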
Baseline Architecture of ITER Control System
NASA Astrophysics Data System (ADS)
Wallander, A.; Di Maio, F.; Journeaux, J.-Y.; Klotz, W.-D.; Makijarvi, P.; Yonekawa, I.
2011-08-01
The control system of ITER consists of thousands of computers processing hundreds of thousands of signals. The control system, being the primary tool for operating the machine, shall integrate, control and coordinate all these computers and signals and allow a limited number of staff to operate the machine from a central location with minimum human intervention. The primary functions of the ITER control system are plant control, supervision and coordination, both during experimental pulses and during 24/7 continuous operation. The former can be split into three phases: preparing the experiment by defining all parameters; executing the experiment, including distributed feedback control; and finally collecting, archiving, analyzing and presenting all data produced by the experiment. We define the control system as a set of hardware and software components with well-defined characteristics. The architecture addresses the organization of these components and their relationship to each other. We distinguish between physical and functional architecture, where the former defines the physical connections and the latter the data flow between components. In this paper, we identify the ITER control system based on the plant breakdown structure. Then, the control system is partitioned into a workable set of bounded subsystems. This partition considers at the same time the completeness and the integration of the subsystems. The components making up subsystems are identified and defined, a naming convention is introduced, and the physical networks are defined. Special attention is given to timing and real-time communication for distributed control. Finally, we discuss baseline technologies for implementing the proposed architecture based on analysis, market surveys, prototyping and benchmarking carried out during the last year.
Benchmarking infrastructure for mutation text mining
2014-01-01
Background: Experimental research on the automatic extraction of information about mutations from texts is greatly hindered by the lack of consensus evaluation infrastructure for the testing and benchmarking of mutation text mining systems. Results: We propose a community-oriented annotation and benchmarking infrastructure to support development, testing, benchmarking, and comparison of mutation text mining systems. The design is based on semantic standards, where RDF is used to represent annotations, an OWL ontology provides an extensible schema for the data and SPARQL is used to compute various performance metrics, so that in many cases no programming is needed to analyze results from a text mining system. While large benchmark corpora for biological entity and relation extraction are focused mostly on genes, proteins, diseases, and species, our benchmarking infrastructure fills the gap for mutation information. The core infrastructure comprises (1) an ontology for modelling annotations, (2) SPARQL queries for computing performance metrics, and (3) a sizeable collection of manually curated documents, that can support mutation grounding and mutation impact extraction experiments. Conclusion: We have developed the principal infrastructure for the benchmarking of mutation text mining tasks. The use of RDF and OWL as the representation for corpora ensures extensibility. The infrastructure is suitable for out-of-the-box use in several important scenarios and is ready, in its current state, for initial community adoption. PMID:24568600
Benchmarking infrastructure for mutation text mining.
Klein, Artjom; Riazanov, Alexandre; Hindle, Matthew M; Baker, Christopher Jo
2014-02-25
Experimental research on the automatic extraction of information about mutations from texts is greatly hindered by the lack of consensus evaluation infrastructure for the testing and benchmarking of mutation text mining systems. We propose a community-oriented annotation and benchmarking infrastructure to support development, testing, benchmarking, and comparison of mutation text mining systems. The design is based on semantic standards, where RDF is used to represent annotations, an OWL ontology provides an extensible schema for the data and SPARQL is used to compute various performance metrics, so that in many cases no programming is needed to analyze results from a text mining system. While large benchmark corpora for biological entity and relation extraction are focused mostly on genes, proteins, diseases, and species, our benchmarking infrastructure fills the gap for mutation information. The core infrastructure comprises (1) an ontology for modelling annotations, (2) SPARQL queries for computing performance metrics, and (3) a sizeable collection of manually curated documents, that can support mutation grounding and mutation impact extraction experiments. We have developed the principal infrastructure for the benchmarking of mutation text mining tasks. The use of RDF and OWL as the representation for corpora ensures extensibility. The infrastructure is suitable for out-of-the-box use in several important scenarios and is ready, in its current state, for initial community adoption.
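The performance metrics that the infrastructure's SPARQL queries compute reduce to set arithmetic over annotations. A dependency-free sketch of that arithmetic (plain Python rather than SPARQL over RDF; the document/mutation annotation pairs below are invented for illustration):

```python
# Hypothetical annotations: (document id, normalized mutation mention) pairs.
gold = {("doc1", "p.V600E"), ("doc1", "c.1799T>A"), ("doc2", "p.R273H")}
pred = {("doc1", "p.V600E"), ("doc2", "p.R273H"), ("doc2", "p.R175H")}

def prf(gold, pred):
    """Precision/recall/F1 over annotation sets: the same arithmetic the
    infrastructure expresses as SPARQL queries over RDF-encoded annotations."""
    tp = len(gold & pred)                       # true positives
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

p, r, f = prf(gold, pred)
```

Expressing this in SPARQL instead of host-language code is what lets users evaluate a text mining system against the corpus without writing any programs.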
Qualifying for the Green500: Experience with the newest generation of supercomputers at LANL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yilk, Todd
The High Performance Computing Division of Los Alamos National Laboratory recently brought four new supercomputing platforms online: Trinity, with separate partitions built around the Haswell and Knights Landing CPU architectures, for capability computing, and Grizzly, Fire, and Ice for capacity computing applications. The power monitoring infrastructure of these machines is significantly enhanced over previous supercomputing generations at LANL, and all were qualified at the highest level of the Green500 benchmark. This paper discusses supercomputing at LANL and the Green500 benchmark, with notes on our experience meeting the Green500's reporting requirements.
Short-Term Forecasts Using NU-WRF for the Winter Olympics 2018
NASA Technical Reports Server (NTRS)
Srikishen, Jayanthi; Case, Jonathan L.; Petersen, Walter A.; Iguchi, Takamichi; Tao, Wei-Kuo; Zavodsky, Bradley T.; Molthan, Andrew
2017-01-01
The NASA-Unified Weather Research and Forecasting model (NU-WRF) will be included for testing and evaluation in the forecast demonstration project (FDP) of the International Collaborative Experiment for the PyeongChang 2018 Olympic and Paralympic Winter Games (ICE-POP). An international array of radar and supporting ground-based observations, together with various forecast and nowcast models, will be operational during ICE-POP. In conjunction with personnel from NASA's Goddard Space Flight Center, the NASA Short-term Prediction Research and Transition (SPoRT) Center is developing benchmark simulations for a real-time NU-WRF configuration to run during the FDP. ICE-POP observational datasets will be used to validate model simulations and investigate improved model physics and performance for prediction of snow events during the research phase (RDP) of the project. The NU-WRF model simulations will also support NASA Global Precipitation Measurement (GPM) Mission ground-validation activities, both physical and direct, in relation to verifying, testing and improving satellite-based snowfall retrieval algorithms over complex terrain.
PC as Physics Computer for LHC?
NASA Astrophysics Data System (ADS)
Jarp, Sverre; Simmins, Antony; Tang, Hong; Yaari, R.
In the last five years, we have seen RISC workstations take over the computing scene that was once controlled by mainframes and supercomputers. In this paper we will argue that the same phenomenon might happen again. A project of the Physics Data Processing group of CERN's CN division, active since March of this year, is described in which ordinary desktop PCs running Windows (NT and 3.11) have been used to create an environment for running large LHC batch jobs (initially the DICE simulation job of Atlas). The problems encountered in porting both the CERN library and the specific Atlas codes are described, together with some encouraging benchmark results from comparisons with existing RISC workstations in use by the Atlas collaboration. The issues of establishing the batch environment (batch monitor, staging software, etc.) are also covered. Finally, a quick extrapolation of the commodity computing power available in the future is touched upon, to indicate what kind of cost envelope could be sufficient for the simulation farms required by the LHC experiments.
Gururaj, Anupama E.; Chen, Xiaoling; Pournejati, Saeid; Alter, George; Hersh, William R.; Demner-Fushman, Dina; Ohno-Machado, Lucila
2017-01-01
The rapid proliferation of publicly available biomedical datasets has provided abundant resources that are potentially of value as a means to reproduce prior experiments, and to generate and explore novel hypotheses. However, there are a number of barriers to the re-use of such datasets, which are distributed across a broad array of dataset repositories, focusing on different data types and indexed using different terminologies. New methods are needed to enable biomedical researchers to locate datasets of interest within this rapidly expanding information ecosystem, and new resources are needed for the formal evaluation of these methods as they emerge. In this paper, we describe the design and generation of a benchmark for information retrieval of biomedical datasets, which was developed and used for the 2016 bioCADDIE Dataset Retrieval Challenge. In the tradition of the seminal Cranfield experiments, and as exemplified by the Text Retrieval Conference (TREC), this benchmark includes a corpus (biomedical datasets), a set of queries, and relevance judgments relating these queries to elements of the corpus. This paper describes the process through which each of these elements was derived, with a focus on those aspects that distinguish this benchmark from typical information retrieval reference sets. Specifically, we discuss the origin of our queries in the context of a larger collaborative effort, the biomedical and healthCAre Data Discovery Index Ecosystem (bioCADDIE) consortium, and the distinguishing features of biomedical dataset retrieval as a task. The resulting benchmark set has been made publicly available to advance research in the area of biomedical dataset retrieval. Database URL: https://biocaddie.org/benchmark-data PMID:29220453
Physics at a 100 TeV pp Collider: Standard Model Processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mangano, M. L.; Zanderighi, G.; Aguilar Saavedra, J. A.
This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.
A Combined Experimental and Computational Study on Selected Physical Properties of Aminosilicones
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perry, RJ; Genovese, SE; Farnum, RL
2014-01-29
A number of physical properties of aminosilicones have been determined experimentally and predicted computationally. It was found that COSMO-RS predicted the densities of the materials under study to within about 4% of the experimentally determined values. Vapor pressure measurements were performed, and all of the aminosilicones of interest were found to be significantly less volatile than the benchmark MEA material. COSMO-RS was reasonably accurate for predicting the vapor pressures for aminosilicones that were thermally stable. The heat capacities of all aminosilicones tested were between 2.0 and 2.3 J/(g·°C), again substantially lower than a benchmark 30% aqueous MEA solution. Surface energies for the aminosilicones were found to be 23.3-28.3 dyne/cm and were accurately predicted using the parachor method.
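The parachor method mentioned for surface energies is the standard Sugden relation P = M·σ^(1/4)/(ρ_l − ρ_v), inverted here for σ. The sketch below only demonstrates the arithmetic; the parachor value, molar mass, and density are hypothetical inputs, not the paper's aminosilicone data.

```python
def surface_tension_parachor(parachor, molar_mass, rho_liquid, rho_vapor=0.0):
    """Invert the Sugden parachor relation P = M * sigma**(1/4) / (rho_l - rho_v)
    for sigma (dyn/cm), with densities in g/cm^3 and molar mass in g/mol.
    The vapor density is often negligible for low-volatility liquids."""
    return ((parachor * (rho_liquid - rho_vapor)) / molar_mass) ** 4

# Hypothetical inputs for illustration: parachor 600, M = 250 g/mol,
# liquid density 0.95 g/cm^3.
sigma = surface_tension_parachor(600.0, 250.0, 0.95)
```

In practice the parachor itself is usually estimated from group contributions, so the method needs only the liquid density to predict a surface tension.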
JENDL-4.0/HE Benchmark Test with Concrete and Iron Shielding Experiments at JAEA/TIARA
NASA Astrophysics Data System (ADS)
Konno, Chikara; Matsuda, Norihiro; Kwon, Saerom; Ohta, Masayuki; Sato, Satoshi
2017-09-01
As a benchmark test of JENDL-4.0/HE, released in 2015, we have analyzed the concrete and iron shielding experiments with the quasi-monoenergetic 40 and 65 MeV neutron sources at TIARA in JAEA by using MCNP5 and ACE files processed from JENDL-4.0/HE with NJOY2012. As a result, it was found that the calculation results with JENDL-4.0/HE agreed well with the measurements in the concrete experiment, while in the iron experiment with 65 MeV neutrons they underestimated the measurements, increasingly so for thicker assemblies. We examined the 56Fe data of JENDL-4.0/HE in detail and concluded that too-large non-elastic scattering cross sections of 56Fe caused the underestimation in the calculation with JENDL-4.0/HE for the iron experiment with 65 MeV neutrons.
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacFarlane, Joseph J.; Golovkin, I. E.; Woodruff, P. R.
2009-08-07
This Final Report summarizes work performed under DOE STTR Phase II Grant No. DE-FG02-05ER86258 during the project period from August 2006 to August 2009. The project, “Development of Spectral and Atomic Models for Diagnosing Energetic Particle Characteristics in Fast Ignition Experiments,” was led by Prism Computational Sciences (Madison, WI), and involved collaboration with subcontractors University of Nevada-Reno and Voss Scientific (Albuquerque, NM). In this project, we have:
- Developed and implemented a multi-dimensional, multi-frequency radiation transport model in the LSP hybrid fluid-PIC (particle-in-cell) code [1,2].
- Updated the LSP code to support the use of accurate equation-of-state (EOS) tables generated by Prism's PROPACEOS [3] code to compute more accurate temperatures in high energy density physics (HEDP) plasmas.
- Updated LSP to support the use of Prism's multi-frequency opacity tables.
- Generated equation of state and opacity data for LSP simulations for several materials being used in plasma jet experimental studies.
- Developed and implemented parallel processing techniques for the radiation physics algorithms in LSP.
- Benchmarked the new radiation transport and radiation physics algorithms in LSP and compared simulation results with analytic solutions and results from numerical radiation-hydrodynamics calculations.
- Performed simulations using Prism radiation physics codes to address issues related to radiative cooling and ionization dynamics in plasma jet experiments.
- Performed simulations to study the effects of radiation transport and radiation losses due to electrode contaminants in plasma jet experiments.
- Updated the LSP code to generate output using NetCDF to provide a better, more flexible interface to SPECT3D [4] in order to post-process LSP output.
- Updated the SPECT3D code to better support the post-processing of large-scale 2-D and 3-D datasets generated by simulation codes such as LSP.
- Updated atomic physics modeling to provide for more comprehensive and accurate atomic databases that feed into the radiation physics modeling (spectral simulations and opacity tables).
- Developed polarization spectroscopy modeling techniques suitable for diagnosing energetic particle characteristics in HEDP experiments.
A description of these items is provided in this report. The above efforts lay the groundwork for utilizing the LSP and SPECT3D codes in providing simulation support for DOE-sponsored HEDP experiments, such as plasma jet and fast ignition physics experiments. We believe that, taken together, the LSP and SPECT3D codes have unique capabilities for advancing our understanding of the physics of these HEDP plasmas. Based on conversations early in this project with our DOE program manager, Dr. Francis Thio, our efforts emphasized developing radiation physics and atomic modeling capabilities that can be utilized in the LSP PIC code, and performing radiation physics studies for plasma jets. A relatively minor component focused on the development of methods to diagnose energetic particle characteristics in short-pulse laser experiments related to fast ignition physics. The period of performance for the grant was extended by one year to August 2009 with a one-year no-cost extension, at the request of subcontractor University of Nevada-Reno.
DOE Office of Scientific and Technical Information (OSTI.GOV)
John D. Bess; Barbara H. Dolphin; James W. Sterbentz
2013-03-01
In its deployment as a pebble bed reactor (PBR) critical facility from 1992 to 1996, the PROTEUS facility was designated as HTR-PROTEUS. This experimental program was performed as part of an International Atomic Energy Agency (IAEA) Coordinated Research Project (CRP) on the Validation of Safety Related Physics Calculations for Low Enriched HTGRs. Within this project, critical experiments were conducted for graphite moderated LEU systems to determine core reactivity, flux and power profiles, reaction-rate ratios, the worth of control rods, both in-core and reflector based, the worth of burnable poisons, kinetic parameters, and the effects of moisture ingress on these parameters. Four benchmark experiments were evaluated in this report: Cores 1, 1A, 2, and 3. These core configurations represent the hexagonal close packing (HCP) configurations of the HTR-PROTEUS experiment with a moderator-to-fuel pebble ratio of 1:2. Core 1 represents the only configuration utilizing ZEBRA control rods. Cores 1A, 2, and 3 use withdrawable, hollow, stainless steel control rods. Cores 1 and 1A are similar except for the use of different control rods; Core 1A also has one less layer of pebbles (21 layers instead of 22). Core 2 retains the first 16 layers of pebbles from Cores 1 and 1A and has 16 layers of moderator pebbles stacked above the fueled layers. Core 3 retains the first 17 layers of pebbles but has polyethylene rods inserted between pebbles to simulate water ingress. The additional partial pebble layer (layer 18) for Core 3 was not included as it was used for core operations and not the reported critical configuration. Cores 1, 1A, 2, and 3 were determined to be acceptable benchmark experiments.
Nonparametric estimation of benchmark doses in environmental risk assessment
Piegorsch, Walter W.; Xiong, Hui; Bhattacharya, Rabi N.; Lin, Lizhen
2013-01-01
An important statistical objective in environmental risk analysis is estimation of minimum exposure levels, called benchmark doses (BMDs), that induce a pre-specified benchmark response in a dose-response experiment. In such settings, representations of the risk are traditionally based on a parametric dose-response model. It is a well-known concern, however, that if the chosen parametric form is misspecified, inaccurate and possibly unsafe low-dose inferences can result. We apply a nonparametric approach for calculating benchmark doses, based on an isotonic regression method for dose-response estimation with quantal-response data (Bhattacharya and Kong, 2007). We determine the large-sample properties of the estimator, develop bootstrap-based confidence limits on the BMDs, and explore the confidence limits’ small-sample properties via a short simulation study. An example from cancer risk assessment illustrates the calculations. PMID:23914133
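The isotonic-regression route to a BMD can be sketched without any parametric model: fit a monotone dose-response curve by pool-adjacent-violators, then invert it at the benchmark response. The quantal data and the benchmark response level below are invented, and the simple interpolation-based inversion is our own simplified stand-in for the paper's estimator.

```python
import numpy as np

def pava(y, w):
    """Pool-adjacent-violators: weighted isotonic (non-decreasing) regression."""
    y, w = list(map(float, y)), list(map(float, w))
    out = []  # merged blocks: [mean, weight, count]
    for yi, wi in zip(y, w):
        out.append([yi, wi, 1])
        while len(out) > 1 and out[-2][0] > out[-1][0]:   # violator: pool blocks
            m2, w2, n2 = out.pop()
            m1, w1, n1 = out.pop()
            wt = w1 + w2
            out.append([(m1 * w1 + m2 * w2) / wt, wt, n1 + n2])
    fit = []
    for m, _, n in out:
        fit.extend([m] * n)
    return np.array(fit)

# Hypothetical quantal dose-response data: responders / tested at each dose.
doses = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
resp  = np.array([ 2,   3,   8,  14,  19])
n     = np.array([50,  50,  50,  50,  50])
phat  = pava(resp / n, n)                       # monotone risk estimates

# BMD for extra risk BMR = 0.1: the dose where (p(d) - p(0)) / (1 - p(0))
# first reaches BMR, located by linear interpolation along the isotonic fit.
bmr = 0.10
extra = (phat - phat[0]) / (1.0 - phat[0])
bmd = float(np.interp(bmr, extra, doses))
```

Because the fit is constrained only to be monotone, a misspecified parametric shape cannot distort the low-dose inference, which is the motivation the abstract describes.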
HyspIRI Low Latency Concept and Benchmarks
NASA Technical Reports Server (NTRS)
Mandl, Dan
2010-01-01
Topics include HyspIRI low latency data ops concept, HyspIRI data flow, ongoing efforts, experiment with Web Coverage Processing Service (WCPS) approach to injecting new algorithms into SensorWeb, low fidelity HyspIRI IPM testbed, compute cloud testbed, open cloud testbed environment, Global Lambda Integrated Facility (GLIF) and OCC collaboration with Starlight, delay tolerant network (DTN) protocol benchmarking, and EO-1 configuration for preliminary DTN prototype.
RNA-seq mixology: designing realistic control experiments to compare protocols and analysis methods
Holik, Aliaksei Z.; Law, Charity W.; Liu, Ruijie; Wang, Zeya; Wang, Wenyi; Ahn, Jaeil; Asselin-Labat, Marie-Liesse; Smyth, Gordon K.
2017-01-01
Carefully designed control experiments provide a gold standard for benchmarking different genomics research tools. A shortcoming of many gene expression control studies is that replication involves profiling the same reference RNA sample multiple times. This leads to low, pure technical noise that is atypical of regular studies. To achieve a more realistic noise structure, we generated an RNA-sequencing mixture experiment using two cell lines of the same cancer type. Variability was added by extracting RNA from independent cell cultures and degrading particular samples. The systematic gene expression changes induced by this design allowed benchmarking of different library preparation kits (standard poly-A versus total RNA with Ribozero depletion) and analysis pipelines. Data generated using the total RNA kit had more signal for introns and various RNA classes (ncRNA, snRNA, snoRNA) and less variability after degradation. For differential expression analysis, voom with quality weights marginally outperformed other popular methods, while for differential splicing, DEXSeq was simultaneously the most sensitive and the most inconsistent method. For sample deconvolution analysis, DeMix outperformed IsoPure convincingly. Our RNA-sequencing data set provides a valuable resource for benchmarking different protocols and data pre-processing workflows. The extra noise mimics routine lab experiments more closely, ensuring any conclusions are widely applicable. PMID:27899618
featsel: A framework for benchmarking of feature selection algorithms and cost functions
NASA Astrophysics Data System (ADS)
Reis, Marcelo S.; Estrela, Gustavo; Ferreira, Carlos Eduardo; Barrera, Junior
In this paper, we introduce featsel, a framework for benchmarking of feature selection algorithms and cost functions. This framework allows the user to deal with the search space as a Boolean lattice and has its core coded in C++ for computational efficiency. Moreover, featsel includes Perl scripts to add new algorithms and/or cost functions, generate random instances, plot graphs and organize results into tables. In addition, this framework already comes with dozens of algorithms and cost functions for benchmarking experiments. We also provide illustrative examples in which featsel outperforms the popular Weka workbench in feature selection procedures on data sets from the UCI Machine Learning Repository.
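An exhaustive walk of the Boolean lattice of feature subsets, the search space featsel exposes, can be sketched in a few lines. This is Python rather than featsel's C++, and the coverage-style cost function and tiny instance are toy stand-ins, not any of featsel's built-in cost functions.

```python
from itertools import combinations

# Hypothetical toy instance: each feature "covers" some target elements, and
# the cost function trades residual uncovered targets against subset size.
FEATURES = {"f1": {1, 2}, "f2": {2, 3}, "f3": {3, 4}}
TARGETS = {1, 2, 3, 4}

def cost(subset):
    covered = set().union(*(FEATURES[f] for f in subset)) if subset else set()
    return len(TARGETS - covered) + 0.1 * len(subset)

def exhaustive_search():
    """Visit every node of the Boolean lattice of feature subsets (2^n of
    them, one per element of the power set) and keep the cheapest."""
    best, best_cost = frozenset(), cost(())
    for k in range(1, len(FEATURES) + 1):
        for subset in combinations(sorted(FEATURES), k):
            c = cost(subset)
            if c < best_cost:
                best, best_cost = frozenset(subset), c
    return best, best_cost

best, best_cost = exhaustive_search()
```

Exhaustive search is only feasible for small n, which is why a framework like featsel also ships heuristic algorithms that explore the same lattice partially.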
BACT Simulation User Guide (Version 7.0)
NASA Technical Reports Server (NTRS)
Waszak, Martin R.
1997-01-01
This report documents the structure and operation of a simulation model of the Benchmark Active Control Technology (BACT) Wind-Tunnel Model. The BACT system was designed, built, and tested at NASA Langley Research Center as part of the Benchmark Models Program and was developed to perform wind-tunnel experiments to obtain benchmark quality data to validate computational fluid dynamics and computational aeroelasticity codes, to verify the accuracy of current aeroservoelasticity design and analysis tools, and to provide an active controls testbed for evaluating new and innovative control algorithms for flutter suppression and gust load alleviation. The BACT system has been especially valuable as a control system testbed.
ICSBEP Benchmarks For Nuclear Data Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Briggs, J. Blair
2005-05-24
The International Criticality Safety Benchmark Evaluation Project (ICSBEP) was initiated in 1992 by the United States Department of Energy. The ICSBEP became an official activity of the Organization for Economic Cooperation and Development (OECD) -- Nuclear Energy Agency (NEA) in 1995. Representatives from the United States, United Kingdom, France, Japan, the Russian Federation, Hungary, Republic of Korea, Slovenia, Serbia and Montenegro (formerly Yugoslavia), Kazakhstan, Spain, Israel, Brazil, Poland, and the Czech Republic are now participating. South Africa, India, China, and Germany are considering participation. The purpose of the ICSBEP is to identify, evaluate, verify, and formally document a comprehensive and internationally peer-reviewed set of criticality safety benchmark data. The work of the ICSBEP is published as an OECD handbook entitled ''International Handbook of Evaluated Criticality Safety Benchmark Experiments.'' The 2004 Edition of the Handbook contains benchmark specifications for 3331 critical or subcritical configurations that are intended for use in validation efforts and for testing basic nuclear data. New to the 2004 Edition of the Handbook is a draft criticality-alarm/shielding-type benchmark that should be finalized in 2005, along with two other similar benchmarks. The Handbook is being used extensively for nuclear data testing and is expected to be a valuable resource for code and data validation and improvement efforts for decades to come. Specific benchmarks that are useful for testing structural materials such as iron, chromium, nickel, and manganese; beryllium; lead; thorium; and 238U are highlighted.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hell, Natalie
K-shell transitions in astrophysically abundant metals and L-shell transitions in Fe group elements show characteristic signatures in the soft X-ray spectrum in the energy range 0.1–10 keV. These signatures have great diagnostic value for plasma parameters such as electron and ion temperatures and densities, and can thus help understand the physics controlling the energetic processes in astrophysical sources. This diagnostic power increases with advances in spectral resolution and effective area of the employed X-ray observatories. However, to make optimal use of the diagnostic potential – whether through global spectral modeling or through diagnostics from local modeling of individual lines –more » the underlying atomic physics has to be complete and well known. With the next generation of soft X-ray observatories featuring micro-calorimeters such as the SXS on Astro- H/Hitomi and the X-IFU on Athena, broadband high-resolution spectroscopy with large effective area will become more commonly available in the next decade. With these spectrometers, the accuracy of the plasma parameters derived from spectral modeling will be limited by the uncertainty of the reference atomic data rather than by instrumental factors, as is sometimes already the case for the high-resolution grating observations with Chandra-HETG and XMM-Newton-RGS. To take full advantage of the measured spectra, assessment of the accuracy of and improvements to the available atomic reference data are therefore important. Dedicated measurements in the laboratory are essential to benchmark the theoretical calculations providing the bulk of the reference data used in astrophysics. Experiments at the Lawrence Livermore National Laboratory electron beam ion traps (EBIT-I and SuperEBIT) have a long history of providing this service. 
In this work, I present new measurements of transition energies and absolute electron impact excitation cross sections geared towards currently open atomic physics data needs.
A suite of exercises for verifying dynamic earthquake rupture codes
Harris, Ruth A.; Barall, Michael; Aagaard, Brad T.; Ma, Shuo; Roten, Daniel; Olsen, Kim B.; Duan, Benchun; Liu, Dunyu; Luo, Bin; Bai, Kangchen; Ampuero, Jean-Paul; Kaneko, Yoshihiro; Gabriel, Alice-Agnes; Duru, Kenneth; Ulrich, Thomas; Wollherr, Stephanie; Shi, Zheqiang; Dunham, Eric; Bydlon, Sam; Zhang, Zhenguo; Chen, Xiaofei; Somala, Surendra N.; Pelties, Christian; Tago, Josue; Cruz-Atienza, Victor Manuel; Kozdon, Jeremy; Daub, Eric; Aslam, Khurram; Kase, Yuko; Withers, Kyle; Dalguer, Luis
2018-01-01
We describe a set of benchmark exercises designed to test whether computer codes that simulate dynamic earthquake rupture are working as intended. These codes are often used to understand how earthquakes operate, and they produce simulation results that include earthquake size, amounts of fault slip, and the patterns of ground shaking and crustal deformation. The benchmark exercises examine a range of features that scientists incorporate in their dynamic earthquake rupture simulations, including implementations of simple or complex fault geometry, off-fault rock response to an earthquake, stress conditions, and a variety of formulations for fault friction. Many of the benchmarks were designed to investigate scientific problems at the forefront of earthquake physics and strong ground motion research. The exercises are freely available on our website for use by the scientific community.
Dynamical sensitivity control of a single-spin quantum sensor.
Lazariev, Andrii; Arroyo-Camejo, Silvia; Rahane, Ganesh; Kavatamane, Vinaya Kumar; Balasubramanian, Gopalakrishnan
2017-07-26
The nitrogen-vacancy (NV) defect in diamond is a unique quantum system that offers precision sensing of nanoscale physical quantities at room temperature beyond the current state of the art. The benchmark parameters for nanoscale magnetometry applications are sensitivity, spectral resolution, and dynamic range. Under realistic conditions, NV sensors controlled by conventional sensing schemes suffer from limitations of these parameters. Here we experimentally demonstrate a new method, called dynamical sensitivity control (DYSCO), that boosts the benchmark parameters and thus extends the practical applicability of the NV spin for nanoscale sensing. In contrast to conventional dynamical decoupling schemes, where π pulse trains toggle the spin precession abruptly, the DYSCO method allows for a smooth, analog modulation of the quantum probe's sensitivity. Our method decouples frequency selectivity and spectral resolution, unconstrained over the bandwidth (1.85 MHz–392 Hz in our experiments). Using DYSCO we demonstrate high-accuracy NV magnetometry without 2π ambiguities, an enhancement of the dynamic range by a factor of 4 × 10³, and interrogation times exceeding 2 ms in off-the-shelf diamond. In a broader perspective, the DYSCO method provides a handle on the inherent dynamics of quantum systems, offering decisive advantages for NV-centre-based applications, notably in quantum information and single-molecule NMR/MRI.
JASMIN: Japanese-American study of muon interactions and neutron detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakashima, Hiroshi (JAEA, Ibaraki); Mokhov, N. V.
Experimental studies of shielding and radiation effects at Fermi National Accelerator Laboratory (FNAL) have been carried out under a collaboration between FNAL and Japan, aimed at benchmarking simulation codes and studying irradiation effects for the upgrade and design of new high-energy accelerator facilities. The purposes of this collaboration are (1) acquisition of shielding data in a proton beam energy domain above 100 GeV; (2) further evaluation of the predictive accuracy of the PHITS and MARS codes; (3) modification of physics models and data in these codes if needed; (4) establishment of an irradiation field for radiation effect tests; and (5) development of a code module for improved description of radiation effects. A series of experiments has been performed at the Pbar target station and the NuMI facility, using irradiation of targets with 120 GeV protons for antiproton and neutrino production, as well as the M-test beam line for measuring nuclear data and detector responses. Various nuclear and shielding data have been measured by activation methods with chemical separation techniques as well as by other detectors such as a Bonner ball counter. Analyses of the experimental data are in progress for benchmarking the PHITS and MARS15 codes. In this presentation, recent activities and results are reviewed.
Mandic, D. P.; Ryan, K.; Basu, B.; Pakrashi, V.
2016-01-01
Although vibration monitoring is a popular method to monitor and assess dynamic structures, quantification of the linearity or nonlinearity of dynamic responses remains a challenging problem. We investigate the delay vector variance (DVV) method in this regard in a comprehensive manner to establish the degree to which a change in signal nonlinearity can be related to system nonlinearity, and how a change in system parameters affects the nonlinearity in the dynamic response of the system. A wide range of theoretical situations is considered using a single degree of freedom (SDOF) system to obtain numerical benchmarks. A number of experiments are then carried out using a physical SDOF model in the laboratory. Finally, a composite wind turbine blade is tested under different excitations and the dynamic responses are measured at a number of points to extend the investigation to continuum structures. The dynamic responses were measured using accelerometers, strain gauges and a laser Doppler vibrometer. This comprehensive study creates a numerical and experimental benchmark for structural dynamic systems where output-only information is typically available, especially in the context of DVV. The study also allows for comparative analysis between different systems driven by similar inputs. PMID:26909175
The randomized benchmarking number is not what you think it is
NASA Astrophysics Data System (ADS)
Proctor, Timothy; Rudinger, Kenneth; Blume-Kohout, Robin; Sarovar, Mohan; Young, Kevin
Randomized benchmarking (RB) is a widely used technique for characterizing a gate set, whereby random sequences of gates are used to probe the average behavior of the gate set. The gates are chosen to ideally compose to the identity, and the rate of decay in the survival probability of an initial state with increasing sequence length is extracted from a set of experiments; this is the 'RB number'. For reasonably well-behaved noise and particular gate sets, it has been claimed that the RB number is a reliable estimate of the average gate fidelity (AGF) of each noisy gate to the ideal target unitary, averaged over all gates in the set. Contrary to this widely held view, we show that this is not the case. We show that there are physically relevant situations, in which RB was thought to be provably reliable, where the RB number is many orders of magnitude away from the AGF. These results have important implications for interpreting the RB protocol, and immediate consequences for many advanced RB techniques. Sandia National Laboratories is a multi-mission laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
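As a concrete illustration of the quantity at issue, the following is a minimal sketch of how an RB number is conventionally extracted from the survival-probability decay. The data are synthetic and noiseless, and the parameter values (A, B, p) are invented for the example; the abstract's claim is precisely that this extracted number need not match the AGF.

```python
import numpy as np

# Hypothetical single-qubit RB decay: survival S(m) = A * p**m + B
# over sequence length m (noiseless synthetic data).
A, B, p_true = 0.5, 0.5, 0.98
d = 2  # Hilbert-space dimension for a single qubit
m = np.arange(50)
S = A * p_true**m + B

# Successive differences cancel the offset B:
#   S(m+1) - S(m) = A * p**m * (p - 1),
# so the ratio of consecutive differences equals the decay constant p.
diffs = np.diff(S)
p_est = float(np.mean(diffs[1:] / diffs[:-1]))

# Standard conversion of the decay constant to the RB number r.
r = (d - 1) * (1 - p_est) / d
print(p_est, r)  # ~0.98 and ~0.01
```

With noisy data one would instead fit A, B, and p jointly by nonlinear least squares, but the difference-ratio trick suffices to show where the number comes from.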
Phase field benchmark problems for dendritic growth and linear elasticity
Jokisaari, Andrea M.; Voorhees, P. W.; Guyer, Jonathan E.; ...
2018-03-26
We present the second set of benchmark problems for phase field models that are being jointly developed by the Center for Hierarchical Materials Design (CHiMaD) and the National Institute of Standards and Technology (NIST), along with input from other members of the phase field community. As the integrated computational materials engineering (ICME) approach to materials design has gained traction, there is an increasing need for quantitative phase field results. New algorithms and numerical implementations increase computational capabilities, necessitating standard problems to evaluate their impact on simulated microstructure evolution as well as their computational performance. We propose one benchmark problem for solidification and dendritic growth in a single-component system, and one problem for linear elasticity via the shape evolution of an elastically constrained precipitate. We demonstrate the utility and sensitivity of the benchmark problems by comparing the results of 1) dendritic growth simulations performed with different time integrators and 2) elastically constrained precipitate simulations with different precipitate sizes, initial conditions, and elastic moduli. As a result, these numerical benchmark problems will provide a consistent basis for evaluating different algorithms, both existing and those to be developed in the future, for accuracy and computational efficiency when applied to simulate physics often incorporated in phase field models.
How do I know if my forecasts are better? Using benchmarks in hydrological ensemble prediction
NASA Astrophysics Data System (ADS)
Pappenberger, F.; Ramos, M. H.; Cloke, H. L.; Wetterhall, F.; Alfieri, L.; Bogner, K.; Mueller, A.; Salamon, P.
2015-03-01
The skill of a forecast can be assessed by comparing the relative proximity of both the forecast and a benchmark to the observations. Example benchmarks include climatology or a naïve forecast. Hydrological ensemble prediction systems (HEPS) are currently transforming the hydrological forecasting environment, but in this new field there is little information to guide researchers and operational forecasters on how benchmarks can best be used to evaluate their probabilistic forecasts. In this study, it is shown that the calculated forecast skill can vary depending on the benchmark selected, and that the selection of a benchmark for determining forecasting system skill is sensitive to a number of hydrological and system factors. A benchmark intercomparison experiment is then undertaken using the continuous ranked probability score (CRPS), a reference forecasting system and a suite of 23 different methods to derive benchmarks. The benchmarks are assessed within the operational set-up of the European Flood Awareness System (EFAS) to determine those that are 'toughest to beat' and so give the most robust discrimination of forecast skill, particularly for the spatial average fields that EFAS relies upon. Evaluating against an observed discharge proxy, the benchmark found to have the most utility for EFAS, avoiding the most naïve skill across different hydrological situations, is meteorological persistency. This benchmark uses the latest meteorological observations of precipitation and temperature to drive the hydrological model. Hydrological long-term average benchmarks, which are currently used in EFAS, are very easily beaten by the forecasting system, and their use produces much naïve skill. When decomposed into seasons, the advanced meteorological benchmarks, which make use of meteorological observations from the past 20 years at the same calendar date, have the most skill discrimination.
They are also good at discriminating skill in low flows and for all catchment sizes. Simpler meteorological benchmarks are particularly useful for high flows. Recommendations for EFAS are to move to routine use of meteorological persistency, an advanced meteorological benchmark and a simple meteorological benchmark in order to provide a robust evaluation of forecast skill. This work provides the first comprehensive evidence on how benchmarks can be used in evaluation of skill in probabilistic hydrological forecasts and which benchmarks are most useful for skill discrimination and avoidance of naïve skill in a large scale HEPS. It is recommended that all HEPS use the evidence and methodology provided here to evaluate which benchmarks to employ; so forecasters can have trust in their skill evaluation and will have confidence that their forecasts are indeed better.
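The skill-score construction this study relies on can be sketched as follows. The distributions, sample sizes, and units below are hypothetical, and the CRPS estimator shown is the plain empirical ensemble form E|X − y| − ½ E|X − X′|; a real evaluation would aggregate over many forecast dates and stations.

```python
import numpy as np

rng = np.random.default_rng(0)

def crps_ensemble(members, obs):
    # Empirical CRPS for one observation: E|X - y| - 0.5 * E|X - X'|,
    # with expectations taken over the ensemble members X.
    members = np.asarray(members, float)
    term1 = np.mean(np.abs(members - obs))
    term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
    return term1 - term2

obs = 10.0                              # observed discharge (illustrative units)
forecast = rng.normal(10.2, 1.0, 50)    # sharp, nearly unbiased system forecast
benchmark = rng.normal(9.0, 3.0, 50)    # broad climatology-like benchmark

# Skill relative to the benchmark: > 0 means the system beats the benchmark,
# and a tougher benchmark (lower CRPS) makes positive skill harder to achieve.
skill = 1.0 - crps_ensemble(forecast, obs) / crps_ensemble(benchmark, obs)
print(skill > 0)
```

This makes the paper's point mechanical: the same forecast CRPS yields a different skill value for every benchmark in the denominator, which is why the choice of benchmark matters.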
Benchmarking: a method for continuous quality improvement in health.
Ettorchi-Tardy, Amina; Levif, Marie; Michel, Philippe
2012-05-01
Benchmarking, a management approach for implementing best practices at best cost, is a recent concept in the healthcare system. The objectives of this paper are to better understand the concept and its evolution in the healthcare sector, to propose an operational definition, and to describe some French and international experiences of benchmarking in the healthcare sector. To this end, we reviewed the literature on this approach's emergence in the industrial sector, its evolution, its fields of application and examples of how it has been used in the healthcare sector. Benchmarking is often thought to consist simply of comparing indicators and is not perceived in its entirety, that is, as a tool based on voluntary and active collaboration among several organizations to create a spirit of competition and to apply best practices. The key feature of benchmarking is its integration within a comprehensive and participatory policy of continuous quality improvement (CQI). Conditions for successful benchmarking focus essentially on careful preparation of the process, monitoring of the relevant indicators, staff involvement and inter-organizational visits. Compared to methods previously implemented in France (CQI and collaborative projects), benchmarking has specific features that set it apart as a healthcare innovation. This is especially true for healthcare or medical-social organizations, as the principle of inter-organizational visiting is not part of their culture. Thus, this approach will need to be assessed for feasibility and acceptability before it is more widely promoted.
Benchmarking of venous thromboembolism prophylaxis practice with ENT.UK guidelines.
Al-Qahtani, Ali S
2017-05-01
The aim of this study was to benchmark our guidelines for the prevention of venous thromboembolism (VTE) in the ENT surgical population against ENT.UK guidelines, and also to encourage healthcare providers to utilize benchmarking as an effective method of improving performance. The study design is a prospective descriptive analysis. The setting of this study is a tertiary referral centre (Assir Central Hospital, Abha, Saudi Arabia). In this study, we benchmark our practice guidelines for the prevention of VTE in the ENT surgical population against those of ENT.UK to mitigate any gaps. The ENT.UK guidelines 2010 were downloaded from the ENT.UK website. Our guidelines were compared against them to determine whether our performance meets or falls short of the ENT.UK guidelines; immediate corrective actions take place if there is a quality chasm between the two. The ENT.UK guidelines are evidence-based and up to date, and may serve as a role model for adoption and benchmarking. Our guidelines were accordingly amended to contain all factors required to provide a quality service to ENT surgical patients. While not given appropriate attention, benchmarking is a useful tool for improving the quality of health care. It allows learning from others' practices and experiences, and works towards closing any quality gaps. In addition, benchmarking clinical outcomes is critical for quality improvement and for informing decisions concerning service provision. It is recommended that benchmarking be included on the list of quality improvement methods of healthcare services.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greiner, Miles
Radial hydride formation in high-burnup used fuel cladding has the potential to radically reduce its ductility and suitability for long-term storage and eventual transport. To avoid this formation, the maximum post-reactor temperature must remain sufficiently low to limit the cladding hoop stress, so that hydrogen from the existing circumferential hydrides will not dissolve and become available to re-precipitate into radial hydrides under the slow cooling conditions during drying, transfer and early dry-cask storage. The objective of this research is to develop and experimentally benchmark computational fluid dynamics simulations of heat transfer in post-pool-storage drying operations, when high-burnup fuel cladding is likely to experience its highest temperature. These benchmarked tools can play a key role in evaluating dry cask storage systems for extended storage of high-burnup fuels and post-storage transportation, including fuel retrievability. The benchmarked tools will be used to aid the design of efficient drying processes, as well as to estimate variations of surface temperatures as a means of inferring helium integrity inside the canister or cask. This work will be conducted effectively because the principal investigator has experience developing these types of simulations and has constructed a test facility that can be used to benchmark them.
Grid Computing Environment using a Beowulf Cluster
NASA Astrophysics Data System (ADS)
Alanis, Fransisco; Mahmood, Akhtar
2003-10-01
Custom-made Beowulf clusters built from PCs are currently replacing expensive supercomputers for complex scientific computations. At the University of Texas-Pan American, we built an 8 Gflops Beowulf cluster for HEP research using RedHat Linux 7.3 and the LAM-MPI middleware. We will describe how we built and configured the cluster, which we have named the Sphinx Beowulf Cluster. We will describe the results of our cluster benchmark studies and the run-time plots of several parallel application codes that were compiled in C on the cluster using the LAM-XMPI graphical user environment. We will demonstrate a "simple" prototype grid environment, in which we will submit and run parallel jobs remotely across multiple cluster nodes over the internet from the presentation room at Texas Tech University. The Sphinx Beowulf Cluster will be used for Monte Carlo grid test-bed studies for the LHC-ATLAS high energy physics experiment. The Grid is a new IT concept for the next generation of the "Super Internet" for high-performance computing. The Grid will allow scientists worldwide to view and analyze huge amounts of data flowing from the large-scale experiments in high energy physics. The Grid is expected to bring together geographically and organizationally dispersed computational resources, such as CPUs, storage systems, communication systems, and data sources.
2D imaging of helium ion velocity in the DIII-D divertor
NASA Astrophysics Data System (ADS)
Samuell, C. M.; Porter, G. D.; Meyer, W. H.; Rognlien, T. D.; Allen, S. L.; Briesemeister, A.; Mclean, A. G.; Zeng, L.; Jaervinen, A. E.; Howard, J.
2018-05-01
Two-dimensional imaging of parallel ion velocities is compared to fluid modeling simulations to understand the role of ions in determining divertor conditions and to benchmark the UEDGE fluid modeling code. Pure helium discharges are used so that spectroscopic He+ measurements represent the main-ion population at low electron temperatures. Electron temperatures and densities in the divertor match simulated values to within about 20%-30%, establishing the experiment/model match as being at least as good as those normally obtained in the more regularly simulated deuterium plasmas. He+ brightness (He II) comparison indicates that the degree of detachment is captured well by UEDGE, principally due to the inclusion of E×B drifts. Tomographically inverted coherence imaging spectroscopy measurements are used to determine the He+ parallel velocities, which display excellent agreement between the model and the experiment near the divertor target, where He+ is predicted to be the main-ion species and where electron-dominated physics dictates the parallel momentum balance. Upstream near the X-point, where He+ is a minority species and ion-dominated physics plays a more important role, the flow velocity magnitude is underestimated by a factor of 2-3. These results indicate that more effort is required to correctly predict ion momentum in these challenging regimes.
NASA Astrophysics Data System (ADS)
Lagus, Todd P.; Edd, Jon F.
2013-03-01
Most cell biology experiments are performed in bulk cell suspensions where cell secretions become diluted and mixed in a contiguous sample. Confinement of single cells to small, picoliter-sized droplets within a continuous phase of oil provides chemical isolation of each cell, creating individual microreactors where rare cell qualities are highlighted and otherwise undetectable signals can be concentrated to measurable levels. Recent work in microfluidics has yielded methods for the encapsulation of cells in aqueous droplets and hydrogels at kilohertz rates, creating the potential for millions of parallel single-cell experiments. However, commercial applications of high-throughput microdroplet generation and downstream sensing and actuation methods are still emerging for cells. Using fluorescence-activated cell sorting (FACS) as a benchmark for commercially available high-throughput screening, this focused review discusses the fluid physics of droplet formation, methods for cell encapsulation in liquids and hydrogels, sensors and actuators and notable biological applications of high-throughput single-cell droplet microfluidics.
Study of molecular carbon-hydrogen bond dissociation during shock compression
NASA Astrophysics Data System (ADS)
Hammel, Ben; Hawreliak, James
2017-06-01
Advancements in theory and experiment show that chemical interactions in warm dense mixtures play a non-negligible role in the high-temperature and high-pressure properties of a molecular compound. For example, recent work on polystyrene has observed features suggestive of molecular dissociation: non-linear "kinks" are evident in the material's Hugoniot, consistent with C-H bond breaking. The assumption used in linear mixing models, that species are chemically inert, breaks down in warm dense mixtures. At the Institute for Shock Physics, we are developing the capabilities needed to perform the high-repetition-rate experiments required to map out chemical-reaction features along a material's Hugoniot. Initially, we plan to benchmark our work against the data taken by Barrios et al. by reproducing the observed kink in the polystyrene Hugoniot. We will then extend this capability to explore polypropylene, CH2, where we expect to observe multiple kink features, representative of the dissociation of multiple C-H bonds. Work supported by DOE/NNSA, DOE/SC-OFES and the Murdock Charitable Trust.
Numerical Analysis of Base Flowfield for a Four-Engine Clustered Nozzle Configuration
NASA Technical Reports Server (NTRS)
Wang, Ten-See
1995-01-01
Excessive base heating has been a problem for many launch vehicles. For certain designs, such as the direct dump of turbine exhaust inside and at the lip of the nozzle, the potential burning of the turbine exhaust in the base region can be of great concern. Accurate prediction of the base environment at altitude is therefore very important during the vehicle design phase; otherwise, undesirable consequences may occur. In this study, the turbulent base flowfield of a cold-flow experimental investigation for a four-engine clustered nozzle was numerically benchmarked using a pressure-based computational fluid dynamics (CFD) method. This is a necessary step before benchmarking of hot-flow and combustion-flow tests can be considered. Since the medium was unheated air, reasonable prediction of the base pressure distribution at high altitude was the main goal. Several physical phenomena pertaining to multiengine clustered nozzle base flow physics were deduced from the analysis.
Outage management and health physics issue, 2009
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agnihotri, Newal
2009-05-15
The focus of the May-June issue is on outage management and health physics. Major articles include the following: Planning and scheduling to minimize refueling outage, by Pat McKenna, AmerenUE; Prioritizing safety, quality and schedule, by Tom Sharkey, Dominion; Benchmarking to high standards, by Margie Jepson, Energy Nuclear; Benchmarking against U.S. standards, by Magnox North, United Kingdom; Enabling suppliers for new build activity, by Marcus Harrington, GE Hitachi Nuclear Energy; Identifying, cultivating and qualifying suppliers, by Thomas E. Silva, AREVA NP; Creating new U.S. jobs, by Francois Martineau, AREVA NP. Industry innovation articles include: MSL acoustic source load reduction, by Amir Shahkarami, Exelon Nuclear; Dual methodology NDE of CRDM nozzles, by Michael Stark, Dominion Nuclear; and Electronic circuit board testing, by James Amundsen, FirstEnergy Nuclear Operating Company. The plant profile article is titled The future is now, by Julia Milstead, Progress Energy Service Company, LLC.
On coupling fluid plasma and kinetic neutral physics models
Joseph, I.; Rensink, M. E.; Stotler, D. P.; ...
2017-03-01
The coupled fluid plasma and kinetic neutral physics equations are analyzed through theory and simulation of benchmark cases. It is shown that coupling methods that do not treat the coupling rates implicitly are restricted to short time steps for stability. Fast charge exchange, ionization and recombination coupling rates exist, even after constraining the solution by requiring that the neutrals are at equilibrium. For explicit coupling, the present implementation of Monte Carlo correlated sampling techniques does not allow for complete convergence in slab geometry. For the benchmark case, residuals decay with particle number and increase with grid size, indicating that they scale in a manner that is similar to the theoretical prediction for nonlinear bias error. Progress is reported on implementation of a fully implicit Jacobian-free Newton–Krylov coupling scheme. The present block Jacobi preconditioning method is still sensitive to time step, and methods that better precondition the coupled system are under investigation.
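The time-step restriction on explicit coupling mentioned above can be seen in a scalar toy problem. The rate constant and step size below are arbitrary, chosen only to violate the explicit stability bound dt < 2/k; this is an illustration of the general stiff-coupling phenomenon, not of the UEDGE/kinetic-neutral equations themselves.

```python
# Toy relaxation toward equilibrium at a fast coupling rate k:
#   dy/dt = -k * (y - y_eq)
# Forward (explicit) Euler is stable only for dt < 2/k, while backward
# (implicit) Euler is unconditionally stable.
k, y_eq = 100.0, 1.0
dt, steps = 0.05, 200  # dt * k = 5, deliberately past the explicit limit

y_exp = 2.0
y_imp = 2.0
for _ in range(steps):
    y_exp = y_exp - dt * k * (y_exp - y_eq)           # explicit: error grows by |1 - dt*k| = 4 per step
    y_imp = (y_imp + dt * k * y_eq) / (1.0 + dt * k)  # implicit: error shrinks by 1/(1 + dt*k) per step

print(abs(y_exp - y_eq) > 1e10, abs(y_imp - y_eq) < 1e-6)  # True True
```

The same mechanism is why fast charge-exchange, ionization and recombination rates force either very short explicit time steps or an implicit treatment such as the Jacobian-free Newton–Krylov scheme the abstract describes.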
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Thomas Martin; Celik, Cihangir; Dunn, Michael E
In October 2010, a series of benchmark experiments was conducted at the French Commissariat à l'Énergie Atomique et aux Énergies Alternatives (CEA) Valduc SILENE facility. These experiments were a joint effort between the United States Department of Energy Nuclear Criticality Safety Program and the CEA. The purpose of these experiments was to create three benchmarks for the verification and validation of radiation transport codes and evaluated nuclear data used in the analysis of criticality accident alarm systems. The series consisted of three single-pulsed experiments with the SILENE reactor. For the first experiment, the reactor was bare (unshielded), whereas in the second and third experiments it was shielded by lead and polyethylene, respectively. The polyethylene shield of the third experiment had a cadmium liner on its internal and external surfaces, located vertically near the fuel region of SILENE. During each experiment, several neutron activation foils and thermoluminescent dosimeters (TLDs) were placed around the reactor. Nearly half of the foils and TLDs had additional high-density magnetite concrete, high-density barite concrete, standard concrete, and/or BoroBond shields. CEA Saclay provided all the concrete, and the US Y-12 National Security Complex provided the BoroBond. Measurement data from the experiments were published at the 2011 International Conference on Nuclear Criticality (ICNC 2011) and the 2013 Nuclear Criticality Safety Division (NCSD 2013) topical meeting. Preliminary computational results for the first experiment were presented in the ICNC 2011 paper, which showed poor agreement between the computational results and the measured values of the foils shielded by concrete. Recently the hydrogen content, boron content, and density of these concrete shields were further investigated within the constraints of the previously available data.
New computational results for the first experiment are now available that show much better agreement with the measured values.
Health Physics Innovations Developed During Cassini for Future Space Applications
NASA Technical Reports Server (NTRS)
Nickell, Rodney E.; Rutherford, Theresa M.; Marmaro, George M.
1999-01-01
The long history of space flight includes missions that used Space Nuclear Auxiliary Power devices, starting with the Transit 4A Spacecraft (1961), continuing through the Apollo, Pioneer, Viking, Voyager, Galileo, Ulysses, Mars Pathfinder, and most recently, Cassini (1997). All Major Radiological Source (MRS) missions were processed at Kennedy Space Center/Cape Canaveral Air Station (KSC/CCAS) Launch Site in full compliance with program and regulatory requirements. The cumulative experience gained supporting these past missions has led to significant innovations which will be useful for benchmarking future MRS mission ground processing. Innovations developed during ground support for the Cassini mission include official declaration of sealed-source classifications, utilization of a mobile analytical laboratory, employment of a computerized dosimetry record management system, and cross-utilization of personnel from related disciplines.
Real-time simulation of biological soft tissues: a PGD approach.
Niroomandi, S; González, D; Alfaro, I; Bordeu, F; Leygue, A; Cueto, E; Chinesta, F
2013-05-01
We introduce a novel approach for the numerical simulation of nonlinear, hyperelastic soft tissues at the kilohertz feedback rates necessary for haptic rendering. This approach is based upon the use of proper generalized decomposition (PGD) techniques, a generalization of proper orthogonal decomposition (POD). Proper generalized decomposition can be considered a means of a priori model order reduction and provides a physics-based meta-model without the need for prior computer experiments. The suggested strategy is thus composed of an offline phase, in which a general meta-model is computed, and an online evaluation phase, in which results are obtained in real time. Results are provided that show the potential of the proposed technique, together with benchmark tests that show the accuracy of the method. Copyright © 2013 John Wiley & Sons, Ltd.
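The offline/online split described above can be sketched in a few lines. This sketch uses POD via a snapshot SVD rather than PGD itself (PGD builds its separated basis a priori, without snapshots); the snapshot family, mode count, and tolerance are invented purely to illustrate why the online phase is cheap enough for real-time rates.

```python
import numpy as np

xs = np.linspace(0.0, np.pi, 100)

# Offline phase (expensive, done once): collect precomputed solution
# "snapshots" over a parameter sweep and extract a small reduced basis.
snapshots = np.stack([np.sin((1 + 0.02 * p) * xs) for p in range(10)], axis=1)
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
basis = U[:, :3]  # keep 3 orthonormal modes out of 100 nodal values

# Online phase (cheap, real time): a new parameter case is represented by
# just 3 coefficients, so evaluation cost is tiny.
new_field = np.sin(1.05 * xs)       # "unseen" case inside the sampled range
coeffs = basis.T @ new_field
recon = basis @ coeffs              # orthogonal projection onto the basis
err = np.linalg.norm(recon - new_field) / np.linalg.norm(new_field)
print(err < 0.01)
```

The design point is the same as in the paper: all heavy computation happens offline, and the online evaluation is reduced to small dense linear algebra.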
Flow reversal power limit for the HFBR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheng, Lap Y.; Tichler, P.R.
The High Flux Beam Reactor (HFBR) undergoes a buoyancy-driven reversal of flow in the reactor core following certain postulated accidents. Uncertainties about the afterheat removal capability during the flow reversal have limited the reactor operating power to 30 MW. An experimental and analytical program to address these uncertainties is described in this report. The experiments were single-channel flow reversal tests under a range of conditions. The analytical phase involved simulations of the tests to benchmark the physical models and to develop a criterion for dryout. The criterion was then used in simulations of reactor accidents to determine a safe operating power level. It is concluded that the limit on the HFBR operating power with respect to the issue of flow reversal is in excess of 60 MW.
NASA Astrophysics Data System (ADS)
Bijeljic, Branko; Icardi, Matteo; Prodanović, Maša
2018-05-01
Substantial progress has been made over the last few decades in understanding the physics of multiphase flow and reactive transport phenomena in subsurface porous media. The confluence of advances in experimental techniques (including micromodels, X-ray microtomography, and nuclear magnetic resonance (NMR)) and in computational power has made it possible to observe static and dynamic multi-scale flow, transport and reactive processes, thus stimulating the development of a new generation of modelling tools from pore to field scale. One of the key challenges is to make experiments and models as complementary as possible, with continuously improving experimental methods, in order to increase the predictive capabilities of theoretical models across scales. This creates a need to establish rigorous benchmark studies of flow, transport and reaction in porous media, which can then serve as the basis for introducing more complex phenomena in future developments.
MPI_XSTAR: MPI-based Parallelization of the XSTAR Photoionization Program
NASA Astrophysics Data System (ADS)
Danehkar, Ashkbiz; Nowak, Michael A.; Lee, Julia C.; Smith, Randall K.
2018-02-01
We describe a program for the parallel implementation of multiple runs of XSTAR, a photoionization code that is used to predict the physical properties of an ionized gas from its emission and/or absorption lines. The parallelization program, called MPI_XSTAR, has been developed and implemented in C++ using the Message Passing Interface (MPI) protocol, a conventional standard of parallel computing. We have benchmarked parallel multiprocessing executions of XSTAR, using MPI_XSTAR, against a serial execution of XSTAR, in terms of parallelization speedup and computing resource efficiency. Our experience indicates that the parallel execution runs significantly faster than the serial execution; however, the efficiency in terms of computing resource usage decreases as the number of processors used in the parallel computation increases.
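The two metrics this abstract reports, speedup and efficiency, have standard definitions that can be written down directly. The timing numbers below are hypothetical illustrations, not measured MPI_XSTAR results:

```python
def speedup(t_serial, t_parallel):
    """Parallel speedup S = T_serial / T_parallel."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_procs):
    """Computing-resource efficiency E = S / p: how well each processor is
    used. E falls below 1 when overhead grows with the processor count,
    which is the trend the abstract describes."""
    return speedup(t_serial, t_parallel) / n_procs

# Hypothetical timings (seconds) showing speedup rising while efficiency falls.
timings = {1: 1000.0, 4: 300.0, 16: 110.0}
for p, t in timings.items():
    print(p, round(speedup(timings[1], t), 2),
          round(efficiency(timings[1], t, p), 2))
```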
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cohen, J; Dossa, D; Gokhale, M
Critical data science applications requiring frequent access to storage perform poorly on today's computing architectures. This project addresses efficient computation of data-intensive problems in national security and basic science by exploring, advancing, and applying a new form of computing called storage-intensive supercomputing (SISC). Our goal is to enable applications that simply cannot run on current systems and, for a broad range of data-intensive problems, to deliver an order of magnitude improvement in price/performance over today's data-intensive architectures. This technical report documents much of the work done under LDRD 07-ERD-063, Storage Intensive Supercomputing, during the period 05/07-09/07. The following chapters describe: (1) a new file I/O monitoring tool, iotrace, developed to capture the dynamic I/O profiles of Linux processes; (2) an out-of-core graph benchmark for level-set expansion of scale-free graphs; (3) an entity extraction benchmark consisting of a pipeline of eight components; and (4) an image resampling benchmark drawn from the SWarp program in the LSST data processing pipeline. The performance of the graph and entity extraction benchmarks was measured in three different scenarios: data sets residing on the NFS file server and accessed over the network; data sets stored on local disk; and data sets stored on the Fusion I/O parallel NAND Flash array. The image resampling benchmark compared the performance of a software-only implementation with a GPU-accelerated one. In addition to the work reported here, a text processing application was developed that used an FPGA to accelerate n-gram profiling for language classification. The n-gram application will be presented at SC07 at the High Performance Reconfigurable Computing Technologies and Applications Workshop. The graph and entity extraction benchmarks were run on a Supermicro server housing the 40GB Fusion-io parallel NAND Flash disk array.
The Fusion system specs are as follows: SuperMicro X7DBE Xeon dual-socket Blackford server motherboard; 2 Intel Xeon dual-core 2.66 GHz processors; 1 GB DDR2 PC2-5300 RAM (2 x 512); 80GB hard drive (Seagate SATA II Barracuda). The Fusion board is presently capable of 4X in a PCIe slot. The image resampling benchmark was run on a dual-Xeon workstation with an NVIDIA graphics card (see Chapter 5 for the full specification). An XtremeData Opteron+FPGA system was used for the language classification application. We observed that these benchmarks are not uniformly I/O intensive. The only benchmark that spent greater than 50% of its time in I/O was the graph algorithm when it accessed data files over NFS. When local disk was used, the graph benchmark spent at most 40% of its time in I/O. The other benchmarks were CPU dominated. The image resampling and language classification benchmarks showed order-of-magnitude speedups over software by using co-processor technology to offload the CPU-intensive kernels. Our experiments to date suggest that emerging hardware technologies offer significant benefit in boosting the performance of data-intensive algorithms. Using GPU and FPGA co-processors, we were able to improve performance by more than an order of magnitude on the benchmark algorithms, eliminating the processor bottleneck of CPU-bound tasks. Experiments with a prototype solid-state nonvolatile memory available today show 10X better throughput on random reads than disk, with a 2X speedup on a graph processing benchmark when compared to the use of local SATA disk.
NASA Astrophysics Data System (ADS)
Everett, Samantha
2010-10-01
A transmission curve experiment was carried out to measure the range of beta particles in aluminum in the health physics laboratory located on the campus of Texas Southern University. The transmission count rate through aluminum for varying radiation lengths was measured using beta particles emitted from a low-activity (˜1 μCi) Sr-90 source. The count rate intensity was recorded using a Geiger-Mueller tube (SGC N210/BNC) with an active volume of 61 cm^3, within a systematic detection accuracy of a few percent. We compared these data with a realistic simulation of the experimental setup using the Geant4 Monte Carlo toolkit (version 9.3). The purpose of this study was to benchmark our Monte Carlo simulation for future experiments as part of a more comprehensive research program. Transmission curves were simulated based on the standard and low-energy electromagnetic physics models, using the radioactive decay module for the primary electron energy distribution. To ensure the validity of our measurements, linear extrapolation techniques were employed to determine the in-medium beta particle range from the measured data; the range was found to be 1.87 g/cm^2 (˜0.693 cm), in agreement with literature values. We found that the general shapes of the measured data and simulated curves were comparable; however, a discrepancy in the relative count rates was observed. The origin of this disagreement is still under investigation.
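The linear-extrapolation step described above can be sketched as a least-squares fit to the tail of the transmission curve, with the x-intercept taken as the range estimate. The data points below are invented for illustration and are not the measured Sr-90 counts:

```python
import numpy as np

# Illustrative (not measured) data: absorber areal density vs net count rate.
thickness = np.array([1.2, 1.4, 1.6, 1.8])   # g/cm^2
count_rate = np.array([40.0, 25.0, 12.0, 2.0])  # counts/s, hypothetical

# Straight-line fit to the tail of the transmission curve; where the fitted
# line crosses zero count rate is the extrapolated beta-particle range.
slope, intercept = np.polyfit(thickness, count_rate, 1)
beta_range = -intercept / slope
print(f"extrapolated range ~ {beta_range:.2f} g/cm^2")
```

In practice the background count rate is subtracted first and only the linear portion of the curve near the endpoint is fitted; both refinements are omitted here for brevity.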
Benchmarking image fusion system design parameters
NASA Astrophysics Data System (ADS)
Howell, Christopher L.
2013-06-01
A clear and absolute method for discriminating between image fusion algorithm performances is presented. This method can effectively be used to assist in the design and modeling of image fusion systems. Specifically, it is postulated that quantifying human task performance using image fusion should be benchmarked to whether the fusion algorithm, at a minimum, retained the performance benefit achievable by each independent spectral band being fused. The established benchmark would then clearly represent the threshold that a fusion system should surpass to be considered beneficial to a particular task. A genetic algorithm is employed to characterize the fused system parameters using a Matlab® implementation of NVThermIP as the objective function. By setting the problem up as a mixed-integer constraint optimization problem, one can effectively look backwards through the image acquisition process: optimizing fused system parameters by minimizing the difference between modeled task difficulty measure and the benchmark task difficulty measure. The results of an identification perception experiment are presented, where human observers were asked to identify a standard set of military targets, and used to demonstrate the effectiveness of the benchmarking process.
Importance of inlet boundary conditions for numerical simulation of combustor flows
NASA Technical Reports Server (NTRS)
Sturgess, G. J.; Syed, S. A.; Mcmanus, K. R.
1983-01-01
Fluid dynamic computer codes for the mathematical simulation of problems in gas turbine engine combustion systems are required as design and diagnostic tools. To eventually achieve more than qualitative accuracy with these codes, it is desirable to use benchmark experiments for validation studies. Typical of the fluid dynamic computer codes being developed for combustor simulations is the TEACH (Teaching Elliptic Axisymmetric Characteristics Heuristically) solution procedure. It is difficult to find suitable experiments that satisfy the present definition of benchmark quality; for the majority of the available experiments there is a lack of information concerning the boundary conditions. A standard TEACH-type numerical technique is applied to a number of test-case experiments. It is found that numerical simulations of gas turbine combustor-relevant flows can be sensitive to the plane at which the calculations start and to the spatial distributions of inlet quantities for swirling flows.
The challenges of numerically simulating analogue brittle thrust wedges
NASA Astrophysics Data System (ADS)
Buiter, Susanne; Ellis, Susan
2017-04-01
Fold-and-thrust belts and accretionary wedges form when sedimentary and crustal rocks are compressed into thrusts and folds in the foreland of an orogen or at a subduction trench. For over a century, analogue models have been used to investigate the deformation characteristics of such brittle wedges. These models predict wedge shapes that agree with analytical critical taper theory and internal deformation structures that closely resemble natural observations. In a series of comparison experiments for thrust wedges, called the GeoMod2004 (1,2) and GeoMod2008 (3,4) experiments, it was shown that different numerical solution methods successfully reproduce sandbox thrust wedges. However, the GeoMod2008 benchmark also pointed to the difficulties of representing frictional boundary conditions and sharp velocity discontinuities with continuum numerical methods, in addition to the well-known challenges of numerical plasticity. Here we show how details in the numerical implementation of boundary conditions can substantially impact numerical wedge deformation. We consider experiment 1 of the GeoMod2008 brittle thrust wedge benchmarks. This experiment examines a triangular thrust wedge in the stable field of critical taper theory that should remain stable, that is, without internal deformation, when sliding over a basal frictional surface. The thrust wedge is translated by lateral displacement of a rigid mobile wall. The corner between the mobile wall and the subsurface is a velocity discontinuity. Using our finite-element code SULEC, we show how different approaches to implementing boundary friction (boundary layer or contact elements) and the velocity discontinuity (various smoothing schemes) can cause the wedge either to translate in a stable manner, as required, or to undergo internal deformation (which constitutes a benchmark failure).
We recommend that numerical studies of sandbox setups not only report the details of their implementation of boundary conditions, but also document the modelling attempts that failed.
References:
1. Buiter and the GeoMod2004 Team, 2006. The numerical sandbox: comparison of model results for a shortening and an extension experiment. Geol. Soc. Lond. Spec. Publ. 253, 29-64.
2. Schreurs and the GeoMod2004 Team, 2006. Analogue benchmarks of shortening and extension experiments. Geol. Soc. Lond. Spec. Publ. 253, 1-27.
3. Buiter, Schreurs and the GeoMod2008 Team, 2016. Benchmarking numerical models of brittle thrust wedges. J. Struct. Geol. 92, 140-177.
4. Schreurs, Buiter and the GeoMod2008 Team, 2016. Benchmarking analogue models of brittle thrust wedges. J. Struct. Geol. 92, 116-13.
2D Quantum Simulation of MOSFET Using the Non Equilibrium Green's Function Method
NASA Technical Reports Server (NTRS)
Svizhenko, Alexel; Anantram, M. P.; Govindan, T. R.; Yan, Jerry (Technical Monitor)
2000-01-01
The objectives summarized in this viewgraph presentation include: (1) the development of a quantum mechanical simulator for ultra-short-channel MOSFET simulation, including theory, physical approximations, and computer code; (2) the exploration of physics that is not accessible by semiclassical methods; (3) the benchmarking of semiclassical and classical methods; and (4) the study of other two-dimensional devices and molecular structures, from discretized Hamiltonians to tight-binding Hamiltonians.
ERIC Educational Resources Information Center
Bodner, Michael E.; Rhodes, Ryan E.; Miller, William C.; Dean, Elizabeth
2013-01-01
Health promotion (HP) warrants being a clinical competency for health professionals given the global burden of lifestyle-related conditions; these are largely preventable with lifestyle behavior change. Physical therapists have a practice pattern conducive to HP, including lifestyle behavior change. The extent to which HP content is included in…
Benchmarking of Neutron Production of Heavy-Ion Transport Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Remec, Igor; Ronningen, Reginald M.; Heilbronn, Lawrence
Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in design and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required.
Benchmarking of Heavy Ion Transport Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Remec, Igor; Ronningen, Reginald M.; Heilbronn, Lawrence
Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in design and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required.
Electron Production and Collective Field Generation in Intense Particle Beams
DOE Office of Scientific and Technical Information (OSTI.GOV)
Molvik, A W; Vay, J; Cohen, R
Electron cloud effects (ECEs) are increasingly recognized as important, but incompletely understood, dynamical phenomena which can severely limit the performance of present electron colliders and the next generation of high-intensity rings, such as the PEP-II upgrade, the LHC, the SNS, the SIS 100/200, or future high-intensity heavy ion accelerators such as envisioned for Heavy Ion Inertial Fusion (HIF). Deleterious effects include ion-electron instabilities, emittance growth, particle loss, increase in vacuum pressure, added heat load at the vacuum chamber walls, and interference with certain beam diagnostics. Extrapolation of present experience to significantly higher beam intensities is uncertain given the present level of understanding. With coordinated LDRD projects at LLNL and LBNL, we undertook a comprehensive R&D program including experiments, theory and simulations to better understand the phenomena, establish the essential parameters, and develop mitigating mechanisms. This LDRD project laid the essential groundwork for such a program. We developed insights into the essential processes, modeled the relevant physics, and implemented these models in computational production tools that can be used for self-consistent study of the effect on ion beams. We validated the models and tools through comparison with experimental data, including data from new diagnostics that we developed as part of this work and validated on the High-Current Experiment (HCX) at LBNL. We applied these models to High-Energy Physics (HEP) and other advanced accelerators. This project was highly successful, as evidenced by the two paragraphs above and the six paragraphs that follow, which are taken from our 2003 proposal with minor editing, mostly consisting of changing the tense.
Further benchmarks of outstanding performance are: 13 publications, 8 of them in refereed journals; recognition by the accelerator and plasma physics communities through 8 invited papers, with 5 additional invitations for invited papers at upcoming conferences; collaborators attracted who had SBIR funding; a collaboration with scientists at CERN and GSI Darmstadt on gas desorption physics for submission to Physical Review Letters; and another PRL on absolute measurements of electron cloud density and a Phys. Rev. ST-AB paper on electron emission physics also being readied for submission.
Benchmarks for single-phase flow in fractured porous media
NASA Astrophysics Data System (ADS)
Flemisch, Bernd; Berre, Inga; Boon, Wietse; Fumagalli, Alessio; Schwenck, Nicolas; Scotti, Anna; Stefansson, Ivar; Tatomir, Alexandru
2018-01-01
This paper presents several test cases intended to be benchmarks for numerical schemes for single-phase fluid flow in fractured porous media. A number of solution strategies are compared, including a vertex-centred and two cell-centred finite volume methods, a non-conforming embedded discrete fracture model, a primal and a dual extended finite element formulation, and a mortar discrete fracture model. The proposed benchmarks test the schemes by increasing the difficulties in terms of network geometry, e.g. intersecting fractures, and physical parameters, e.g. low and high fracture-matrix permeability ratios as well as heterogeneous fracture permeabilities. For each problem, the results presented are the number of unknowns, the approximation errors in the porous matrix and in the fractures with respect to a reference solution, and the sparsity and condition number of the discretized linear system. All data and meshes used in this study are publicly available for further comparisons.
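The per-problem metrics reported above (number of unknowns, sparsity, and condition number of the discretized linear system) can be computed as in the following sketch, where a toy 1D diffusion matrix stands in for an actual fracture-flow discretization:

```python
import numpy as np

def system_metrics(A):
    """Sparsity (fraction of zero entries) and 2-norm condition number
    of a discretized linear system -- the quantities reported for each
    benchmark case in the paper."""
    sparsity = 1.0 - np.count_nonzero(A) / A.size
    cond = np.linalg.cond(A)
    return sparsity, cond

# Toy stand-in: tridiagonal 1D diffusion operator with 50 unknowns.
n = 50
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
s, c = system_metrics(A)
print(f"unknowns={n}, sparsity={s:.3f}, condition number={c:.0f}")
```

Real fracture-matrix systems are stored in sparse formats, and their condition numbers are estimated iteratively rather than via a dense decomposition; the dense toy here only illustrates the definitions.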
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Thomas Martin; Celik, Cihangir; McMahan, Kimberly L.
This benchmark experiment was conducted as a joint venture between the US Department of Energy (DOE) and the French Commissariat à l'Energie Atomique (CEA). Staff at the Oak Ridge National Laboratory (ORNL) in the US and the Centre de Valduc in France planned this experiment. The experiment was conducted on October 19, 2010 in the SILENE critical assembly facility at Valduc. Several other organizations contributed to this experiment and the subsequent evaluation, including CEA Saclay, Lawrence Livermore National Laboratory (LLNL), the Y-12 National Security Complex (NSC), Babcock International Group in the United Kingdom, and Los Alamos National Laboratory (LANL). The goal of this experiment was to measure neutron activation and thermoluminescent dosimeter (TLD) doses from a source similar to a fissile solution critical excursion. The resulting benchmark can be used for validation of computer codes and nuclear data libraries as required when performing analysis of criticality accident alarm systems (CAASs). A secondary goal of this experiment was to qualitatively test the performance of two CAAS detectors similar to those currently and formerly in use in some US DOE facilities. The detectors tested were the CIDAS MkX and the Rocky Flats NCD-91. The CIDAS detects gammas with a Geiger-Muller tube, and the Rocky Flats detects neutrons via charged particles produced in a thin 6LiF disc, depositing energy in a Si solid-state detector. These detectors were being evaluated to determine whether they would alarm, so they were not expected to generate benchmark-quality data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Thomas Martin; Celik, Cihangir; Isbell, Kimberly McMahan
This benchmark experiment was conducted as a joint venture between the US Department of Energy (DOE) and the French Commissariat à l'Energie Atomique (CEA). Staff at the Oak Ridge National Laboratory (ORNL) in the US and the Centre de Valduc in France planned this experiment. The experiment was conducted on October 13, 2010 in the SILENE critical assembly facility at Valduc. Several other organizations contributed to this experiment and the subsequent evaluation, including CEA Saclay, Lawrence Livermore National Laboratory (LLNL), the Y-12 National Security Complex (NSC), Babcock International Group in the United Kingdom, and Los Alamos National Laboratory (LANL). The goal of this experiment was to measure neutron activation and thermoluminescent dosimeter (TLD) doses from a source similar to a fissile solution critical excursion. The resulting benchmark can be used for validation of computer codes and nuclear data libraries as required when performing analysis of criticality accident alarm systems (CAASs). A secondary goal of this experiment was to qualitatively test the performance of two CAAS detectors similar to those currently and formerly in use in some US DOE facilities. The detectors tested were the CIDAS MkX and the Rocky Flats NCD-91. The CIDAS detects gammas with a Geiger-Muller tube, and the Rocky Flats detects neutrons via charged particles produced in a thin 6LiF disc, depositing energy in a Si solid-state detector. These detectors were being evaluated to determine whether they would alarm, so they were not expected to generate benchmark-quality data.
Commencing Student Experience: New Insights and Implications for Action
ERIC Educational Resources Information Center
Grebennikov, Leonid; Shah, Mahsood
2012-01-01
In many developed countries, including Australia, it is common practice to regularly survey university students in order to assess their experience inside and beyond the classroom. Governments conduct nationwide surveys to assess the quality of student experience, benchmark outcomes nationally and in some cases reward better performing…
Mergner, Thomas; Lippi, Vittorio
2018-01-01
Posture control is indispensable for both humans and humanoid robots, which becomes especially evident when performing sensorimotor tasks such as moving on compliant terrain or interacting with the environment. Posture control is therefore targeted in recent proposals of robot benchmarking in order to advance their development. This Methods article suggests corresponding robot tests of standing balance, drawing inspirations from the human sensorimotor system and presenting examples from robot experiments. To account for a considerable technical and algorithmic diversity among robots, we focus in our tests on basic posture control mechanisms, which provide humans with an impressive postural versatility and robustness. Specifically, we focus on the mechanically challenging balancing of the whole body above the feet in the sagittal plane around the ankle joints in concert with the upper body balancing around the hip joints. The suggested tests target three key issues of human balancing, which appear equally relevant for humanoid bipeds: (1) four basic physical disturbances (support surface (SS) tilt and translation, field and contact forces) may affect the balancing in any given degree of freedom (DoF). Targeting these disturbances allows us to abstract from the manifold of possible behavioral tasks. (2) Posture control interacts in a conflict-free way with the control of voluntary movements for undisturbed movement execution, both with "reactive" balancing of external disturbances and "proactive" balancing of self-produced disturbances from the voluntary movements. Our proposals therefore target both types of disturbances and their superposition. (3) Relevant for both versatility and robustness of the control, linkages between the posture control mechanisms across DoFs provide their functional cooperation and coordination at will and on functional demands. The suggested tests therefore include ankle-hip coordination. 
Suggested benchmarking criteria build on the evoked sway magnitude, normalized to robot weight and Center of mass (COM) height, in relation to reference ranges that remain to be established. The references may include human likeness features. The proposed benchmarking concept may in principle also be applied to wearable robots, where a human user may command movements, but may not be aware of the additionally required postural control, which then needs to be implemented into the robot.
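The article above leaves the exact normalization of evoked sway open ("reference ranges that remain to be established"). One plausible reading, sketched below under that assumption, is to express the center-of-mass excursion as an angle about the ankle joint so that robots of different heights become comparable; the function name and numbers are illustrative, not from the article:

```python
import math

def sway_angle_deg(com_displacement_m, com_height_m):
    """One plausible sway normalization (an assumption, not the authors'
    formula): express the evoked COM excursion as an angle about the ankle
    joint, removing the direct dependence on robot size."""
    return math.degrees(math.atan2(com_displacement_m, com_height_m))

# Hypothetical example: a 2 cm COM excursion on a robot whose COM sits 0.9 m
# above the ankles.
angle = sway_angle_deg(0.02, 0.9)
print(round(angle, 2))
```

A weight-based normalization of the disturbance amplitude (e.g. scaling applied forces by body weight) would complement this angular measure, matching the article's suggestion to normalize to both weight and COM height.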
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kilkenny, J.; Richau, G.; Sangster, C.
A major goal of the Stockpile Stewardship Program (SSP) is to deliver validated numerical models, benchmarked against experiments that address relevant and important issues and provide data that stress the codes and our understanding. DOE/NNSA has made significant investments in major facilities and high-performance computing to successfully execute the SSP. The more information obtained about the physical state of the plasmas produced, the more stringent the test of theories, models, and codes can be, leading to increased confidence in our predictive capability. To fully exploit the world-leading capabilities of the ICF program, a multi-year program to develop and deploy advanced diagnostics has been developed by the expert scientific community. To formalize these activities, NNSA’s Acting Director for the Inertial Confinement Fusion Program directed the formation and duties of the National Diagnostics Working Group (NDWG) in a Memorandum 11/3/16 (Appendix A). The NDWG identified eight transformational diagnostics, shown in Table 1, that will provide unprecedented information from experiments in support of the SSP at NIF, Z and OMEGA. Table 1 shows how the missions of the SSP experiments, including materials, complex hydrodynamics, radiation flow and effects, and thermonuclear burn and boost, will produce new observables, which will be measured using a variety of largely new diagnostic technologies used in the eight transformational diagnostics. The data provided by these diagnostics will validate and improve the physics contained within the SSP’s simulations and both uncover and quantify important phenomena that lie beyond our present understanding.
Benchmarking: A Method for Continuous Quality Improvement in Health
Ettorchi-Tardy, Amina; Levif, Marie; Michel, Philippe
2012-01-01
Benchmarking, a management approach for implementing best practices at best cost, is a recent concept in the healthcare system. The objectives of this paper are to better understand the concept and its evolution in the healthcare sector, to propose an operational definition, and to describe some French and international experiences of benchmarking in the healthcare sector. To this end, we reviewed the literature on this approach's emergence in the industrial sector, its evolution, its fields of application and examples of how it has been used in the healthcare sector. Benchmarking is often thought to consist simply of comparing indicators and is not perceived in its entirety, that is, as a tool based on voluntary and active collaboration among several organizations to create a spirit of competition and to apply best practices. The key feature of benchmarking is its integration within a comprehensive and participatory policy of continuous quality improvement (CQI). Conditions for successful benchmarking focus essentially on careful preparation of the process, monitoring of the relevant indicators, staff involvement and inter-organizational visits. Compared to methods previously implemented in France (CQI and collaborative projects), benchmarking has specific features that set it apart as a healthcare innovation. This is especially true for healthcare or medical–social organizations, as the principle of inter-organizational visiting is not part of their culture. Thus, this approach will need to be assessed for feasibility and acceptability before it is more widely promoted. PMID:23634166
A quasi two-dimensional benchmark experiment for the solidification of a tin lead binary alloy
NASA Astrophysics Data System (ADS)
Wang, Xiao Dong; Petitpas, Patrick; Garnier, Christian; Paulin, Jean-Pierre; Fautrelle, Yves
2007-05-01
A horizontal solidification benchmark experiment with pure tin and a binary alloy of Sn-10 wt.%Pb is proposed. The experiment consists of solidifying a rectangular sample using two lateral heat exchangers, which allow the application of a controlled horizontal temperature difference. An array of fifty thermocouples placed on the lateral wall permits the determination of the instantaneous temperature distribution. The cases with a temperature gradient G = 0 and cooling rates of 0.02 and 0.04 K/s are studied. The time evolution of the interfacial total heat flux and the temperature field are recorded and analyzed. This allows us to evaluate the heat transfer evolution due to natural convection, as well as its influence on the solidification macrostructure. To cite this article: X.D. Wang et al., C. R. Mecanique 335 (2007).
Comas, J; Rodríguez-Roda, I; Poch, M; Gernaey, K V; Rosen, C; Jeppsson, U
2006-01-01
Wastewater treatment plant operators encounter complex operational problems related to the activated sludge process and usually respond to these by applying their own intuition and by taking advantage of what they have learnt from past experiences of similar problems. However, previous process experiences are not easy to integrate into numerical control, and new tools must be developed to enable re-use of plant operating experience. The aim of this paper is to investigate the usefulness of a case-based reasoning (CBR) approach to apply learning and re-use of knowledge gained during past incidents to confront current complex problems through the IWA/COST Benchmark protocol. A case study shows that the proposed CBR system achieves a significant improvement of the benchmark plant performance when facing a high-flow event disturbance.
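The retrieval step at the heart of a CBR system of this kind can be sketched as a weighted nearest-neighbour search over stored operating cases. The attributes, values, and actions below are invented for illustration and are not taken from the paper's case library:

```python
# Minimal sketch of the retrieval step of a case-based reasoning (CBR)
# system for activated-sludge plant operation. Case attributes, weights,
# and actions are hypothetical.

def retrieve(case_library, problem, weights):
    """Return the stored case most similar to the current problem
    (weighted absolute distance over the shared numeric attributes)."""
    def distance(case):
        return sum(w * abs(case[k] - problem[k]) for k, w in weights.items())
    return min(case_library, key=distance)

library = [
    {"inflow": 30000, "ammonia": 25, "action": "increase aeration"},
    {"inflow": 55000, "ammonia": 18, "action": "activate storm tank"},
]
problem = {"inflow": 52000, "ammonia": 20}          # current plant state
weights = {"inflow": 1 / 50000, "ammonia": 1 / 30}  # rough normalization

best = retrieve(library, problem, weights)
```

In a full CBR cycle the retrieved case's action would then be adapted to the current situation and, after the incident, retained as a new case.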
Resonance Parameter Adjustment Based on Integral Experiments
Sobes, Vladimir; Leal, Luiz; Arbanas, Goran; ...
2016-06-02
Our project seeks to allow coupling of differential and integral data evaluation in a continuous-energy framework and to use the generalized linear least-squares (GLLS) methodology in the TSURFER module of the SCALE code package to update the parameters of a resolved resonance region evaluation. Recognizing that the GLLS methodology in TSURFER is identical to the mathematical description of a Bayesian update in SAMMY, the SAMINT code was created to use the mathematical machinery of SAMMY to update resolved resonance parameters based on integral data. Traditionally, SAMMY used differential experimental data to adjust nuclear data parameters. Integral experimental data, such as in the International Criticality Safety Benchmark Evaluation Project, remain a tool for validation of completed nuclear data evaluations. SAMINT extracts information from integral benchmarks to aid the nuclear data evaluation process. Integral data can then be used to resolve any remaining ambiguity between differential data sets, highlight troublesome energy regions, determine key nuclear data parameters for integral benchmark calculations, and improve the nuclear data covariance matrix evaluation. SAMINT is not intended to bias nuclear data toward specific integral experiments but should be used to supplement the evaluation of differential experimental data; using GLLS ensures proper weight is given to the differential data.
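The GLLS update that TSURFER and SAMINT share can be sketched in a few lines of linear algebra. All matrices and values below are invented toy numbers; a real evaluation involves many resonance parameters, full covariance files, and sensitivities computed by the transport code:

```python
import numpy as np

# Toy illustration of a generalized linear least-squares (GLLS) update,
# the mathematical core shared by TSURFER and a Bayesian update in SAMMY.
# All numbers are invented for illustration.

def glls_update(p, Cp, S, m, Cm):
    """Update parameters p (prior covariance Cp) against integral
    measurements m (covariance Cm) through the sensitivity matrix S."""
    r = S @ p                                   # predicted integral responses
    G = Cp @ S.T @ np.linalg.inv(S @ Cp @ S.T + Cm)  # gain matrix
    p_new = p + G @ (m - r)                     # adjusted parameters
    Cp_new = Cp - G @ S @ Cp                    # reduced posterior covariance
    return p_new, Cp_new

p = np.array([1.0, 2.0])      # prior resonance parameters (arbitrary units)
Cp = np.diag([0.04, 0.09])    # prior parameter covariance
S = np.array([[1.0, 0.5]])    # sensitivity of one hypothetical benchmark
m = np.array([2.1])           # measured integral response
Cm = np.array([[0.01]])       # measurement covariance

p_new, Cp_new = glls_update(p, Cp, S, m, Cm)
```

The update pulls the predicted response toward the measurement while shrinking the parameter covariance, which is exactly why the measurement covariance term is needed to keep proper weight on the differential data.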
A benchmark study of the sea-level equation in GIA modelling
NASA Astrophysics Data System (ADS)
Martinec, Zdenek; Klemann, Volker; van der Wal, Wouter; Riva, Riccardo; Spada, Giorgio; Simon, Karen; Blank, Bas; Sun, Yu; Melini, Daniele; James, Tom; Bradley, Sarah
2017-04-01
The sea-level load in glacial isostatic adjustment (GIA) is described by the so-called sea-level equation (SLE), which represents the mass redistribution between ice sheets and oceans on a deforming earth. Various levels of complexity of the SLE have been proposed in the past, ranging from a simple mean global sea level (the so-called eustatic sea level) to the load with a deforming ocean bottom, migrating coastlines and a changing shape of the geoid. Several approaches to solve the SLE have been derived, from purely analytical formulations to fully numerical methods. Despite various teams independently investigating GIA, there has been no systematic intercomparison amongst the solvers through which the methods may be validated. The goal of this paper is to present a series of benchmark experiments designed for testing and comparing numerical implementations of the SLE. Our approach starts with simple load cases, even though the benchmark will not result in GIA predictions for a realistic loading scenario. In the longer term we aim for a benchmark with a realistic loading scenario, and also for benchmark solutions with rotational feedback. The current benchmark uses an earth model for which Love numbers have been computed and benchmarked in Spada et al. (2011). In spite of the significant differences in the numerical methods employed, the test computations performed so far show a satisfactory agreement between the results provided by the participants. The differences found can often be attributed to the different approximations inherent to the various algorithms. Literature: G. Spada, V. R. Barletta, V. Klemann, R. E. M. Riva, Z. Martinec, P. Gasperini, B. Lund, D. Wolf, L. L. A. Vermeersen, and M. A. King, 2011. A benchmark study for glacial isostatic adjustment codes. Geophys. J. Int. 185: 106-132, doi:10.1111/j.1365-
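The simplest rung of the complexity ladder described above, the eustatic approximation, reduces to one line of arithmetic: spread the meltwater volume uniformly over a fixed ocean area. The densities and ocean area below are round textbook values used only for illustration:

```python
# Back-of-the-envelope sketch of the eustatic sea-level approximation:
# melted grounded ice is spread uniformly over a fixed ocean area.
# Round literature values, for illustration only.

RHO_ICE = 917.0       # ice density, kg/m^3
RHO_WATER = 1000.0    # meltwater density, kg/m^3
OCEAN_AREA = 3.61e14  # global ocean surface area, m^2

def eustatic_rise(ice_volume_m3):
    """Uniform sea-level rise (m) from melting a given grounded ice volume."""
    meltwater_volume = ice_volume_m3 * RHO_ICE / RHO_WATER
    return meltwater_volume / OCEAN_AREA
```

A grounded ice volume of roughly 2.9e15 m^3 (of the order of the Greenland ice sheet) gives a rise of about 7 m; the full SLE corrects such a figure for ocean-bottom deformation, migrating coastlines, and geoid changes.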
Benchmarking of neutron production of heavy-ion transport codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Remec, I.; Ronningen, R. M.; Heilbronn, L.
Document available in abstract form only; full text of document follows: Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in design and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required. (authors)
GENOPT 2016: Design of a generalization-based challenge in global optimization
NASA Astrophysics Data System (ADS)
Battiti, Roberto; Sergeyev, Yaroslav; Brunato, Mauro; Kvasov, Dmitri
2016-10-01
While comparing results on benchmark functions is a widely used practice to demonstrate the competitiveness of global optimization algorithms, fixed benchmarks can lead to a negative data-mining process. To avoid this effect, the GENOPT contest benchmarks can be used, which are based on randomized function generators designed for scientific experiments, with fixed statistical characteristics but individual variation of the generated instances. The generators are available to participants for off-line tests and online tuning schemes, but the final competition is based on random seeds communicated in the last phase through a cooperative process. A brief presentation and discussion of the methods and results obtained in the framework of the GENOPT contest are given in this contribution.
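The generator-based idea can be illustrated with a toy seeded function factory: the statistical character of the instances is fixed by the construction, while each seed yields a distinct instance. The construction below is invented for illustration and is not the GENOPT generator:

```python
import random

# Toy randomized benchmark-function generator: fixed statistical
# characteristics (a minimum over random weighted quadratic bowls),
# per-seed variation of the instance. Invented construction.

def make_instance(seed, dim=2, n_terms=5):
    rng = random.Random(seed)  # deterministic per seed
    centers = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_terms)]
    weights = [rng.uniform(0.5, 2.0) for _ in range(n_terms)]
    def f(x):
        return min(w * sum((xi - ci) ** 2 for xi, ci in zip(x, c))
                   for w, c in zip(weights, centers))
    return f

f1 = make_instance(seed=1)
f2 = make_instance(seed=1)   # same seed -> identical instance
```

Participants can tune off-line against instances from seeds of their choosing, while the final competition seeds remain unknown until the last phase.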
NASA Technical Reports Server (NTRS)
Sen, Subhayu; Stefanescu, Doru M.; Catalina, A. V.; Juretzko, F.; Dhindaw, B. K.; Curreri, P. A.; Whitaker, Ann F. (Technical Monitor)
2001-01-01
The interaction of an insoluble particle with a growing solid-liquid interface (SLI) has been a subject of investigation for the past four decades. For a metallurgist or a materials scientist, understanding the fundamental physics of such an interaction is relevant for applications that include distribution of reinforcement particles in metal matrix composites, inclusion management in castings, and distribution of Y2Ba1Cu1O5 (211) precipitates (flux pinning sites) in Y1Ba2Cu3O7 (123) superconducting crystals. The same physics is also applicable to other areas, including geological applications (frost heaving in soils) and preservation of biological cells. Experimentally, this interaction can be quantified in terms of a critical growth velocity, Vcr, of the SLI below which particles are pushed ahead of the advancing interface, and above which the particles are engulfed. Past experimental evidence suggests that this Vcr is an inverse function of the particle radius, R. In order to isolate the fundamental physics that governs such a relationship, it is necessary to minimize natural convection at the SLI that is inherent in ground-based experiments. Hence, for the purpose of producing benchmark data (Vcr vs. R), PEP is a natural candidate for micro-gravity experimentation. Accordingly, experiments with pure Al containing a dispersion of ZrO2 particles and an organic analogue, succinonitrile (SCN) containing polystyrene particles, have been performed on the LMS and USMP-4 missions, respectively. In this paper we will summarize the experimental data that were obtained during these two micro-gravity missions and show that the results differ compared to terrestrial experiments. We will also discuss the basic elements of our analytical and numerical models and present a comparison of the predictions of these models against micro-gravity experimental data. Finally, we will discuss our future experimental plan, which includes the ISS glovebox and MSRR-1.
Benchmarking worker nodes using LHCb productions and comparing with HEPSpec06
NASA Astrophysics Data System (ADS)
Charpentier, P.
2017-10-01
In order to estimate the capabilities of a computing slot with limited processing time, it is necessary to know its “power” with rather good precision. This allows, for example, pilot jobs to match a task for which the required CPU-work is known, or to define the number of events to be processed knowing the CPU-work per event. Otherwise one always runs the risk that the task is aborted because it exceeds the CPU capabilities of the resource. It also allows a better accounting of the consumed resources. The traditional way CPU power has been estimated in WLCG since 2007 is using the HEP-SPEC06 (HS06) benchmark suite, which was verified at the time to scale properly with a set of typical HEP applications. However, the hardware architecture of processors has evolved, all WLCG experiments have moved to 64-bit applications, and they use different compilation flags from those advertised for running HS06. It is therefore interesting to check the scaling of HS06 with the HEP applications. For this purpose, we have been using CPU-intensive massive simulation productions from the LHCb experiment and compared their event throughput to the HS06 rating of the worker nodes. We also compared it with a much faster benchmark script that is used by the DIRAC framework, used by LHCb for evaluating at run time the performance of the worker nodes. This contribution reports on the findings of these comparisons: the main observation is that the scaling with HS06 is no longer fulfilled, while the fast benchmarks scale better but are less precise. One can also clearly see that some hardware or software features, when enabled on the worker nodes, may enhance their performance beyond expectation from either benchmark, depending on external factors.
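The matching arithmetic that motivates the benchmarking can be sketched as follows; the numbers and the safety margin are illustrative assumptions, not LHCb or DIRAC settings:

```python
# Sketch of the pilot-job matching arithmetic: given a benchmarked node
# "power" and a queue time limit, decide how many events fit in the slot.
# All numbers are illustrative.

def max_events(power_hs06, limit_s, work_per_event_hs06_s, safety=0.9):
    """Events that fit in the slot, with a safety margin against abort."""
    budget = power_hs06 * limit_s * safety   # total CPU-work available
    return int(budget // work_per_event_hs06_s)
```

A pilot on a 10 HS06 core with a 24-hour limit and 500 HS06·s per event would request 1555 events rather than the theoretical 1728, keeping a margin against exactly the benchmark mis-scaling the paper reports.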
Boyce, Maria B; Browne, John P; Greenhalgh, Joanne
2014-06-27
The use of patient-reported outcome measures (PROMs) to provide healthcare professionals with peer-benchmarked feedback is growing. However, there is little evidence on the opinions of professionals on the value of this information in practice. The purpose of this research is to explore surgeons' experiences of receiving peer-benchmarked PROMs feedback and to examine whether this information led to changes in their practice. This qualitative research employed a Framework approach. Semi-structured interviews were undertaken with surgeons who received peer-benchmarked PROMs feedback. The participants included eleven consultant orthopaedic surgeons in the Republic of Ireland. Five themes were identified: conceptual, methodological, practical, attitudinal, and impact. A typology was developed based on the attitudinal and impact themes, from which three distinct groups emerged. 'Advocates' had positive attitudes towards PROMs and confirmed that the information promoted a self-reflective process. 'Converts' were uncertain about the value of PROMs, which reduced their inclination to use the data. 'Sceptics' had negative attitudes towards PROMs and claimed that the information had no impact on their behaviour. The conceptual, methodological and practical factors were linked to the typology. Surgeons had mixed opinions on the value of peer-benchmarked PROMs data. Many appreciated the feedback as it reassured them that their practice was similar to their peers'. However, PROMs information alone was considered insufficient to help identify opportunities for quality improvements. The reasons for the observed reluctance of participants to embrace PROMs can be categorised into conceptual, methodological, and practical factors.
Policy makers and researchers need to increase professionals' awareness of the numerous purposes and benefits of using PROMs, challenge the current methods to measure performance using PROMs, and reduce the burden of data collection and information dissemination on routine practice.
NASA Astrophysics Data System (ADS)
Park, E.; Jeong, J.
2017-12-01
A precise estimation of groundwater fluctuation is studied by considering delayed recharge flux (DRF) and unsaturated zone drainage (UZD). Both DRF and UZD are due to gravitational flow impeded in the unsaturated zone, which may nonnegligibly affect groundwater level changes. In the validation, a previous model without the consideration of unsaturated flow is benchmarked where the actual groundwater level and precipitation data are divided into three periods based on the climatic condition. The estimation capability of the new model is superior to the benchmarked model as indicated by the significantly improved representation of groundwater level with physically interpretable model parameters.
Emergency Management Benchmarking Study: Lessons for Increasing Supply Chain Resilience
2010-03-01
studied if public-private partnerships could improve community resilience. In essence they concluded that in order to achieve community resilience, public... improve community resilience in times of disaster. International Journal of Physical Distribution & Logistics Management, Vol. 39, No. 5, pp. 343
Benchmarking a Visual-Basic based multi-component one-dimensional reactive transport modeling tool
NASA Astrophysics Data System (ADS)
Torlapati, Jagadish; Prabhakar Clement, T.
2013-01-01
We present the details of a comprehensive numerical modeling tool, RT1D, which can be used for simulating biochemical and geochemical reactive transport problems. The code can be run within the standard Microsoft EXCEL Visual Basic platform, and it does not require any additional software tools. The code can be easily adapted by others for simulating different types of laboratory-scale reactive transport experiments. We illustrate the capabilities of the tool by solving five benchmark problems with varying levels of reaction complexity. These literature-derived benchmarks are used to highlight the versatility of the code for solving a variety of practical reactive transport problems. The benchmarks are described in detail to provide a comprehensive database, which can be used by model developers to test other numerical codes. The VBA code presented in the study is a practical tool that can be used by laboratory researchers for analyzing both batch and column datasets within an EXCEL platform.
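A minimal sketch of the class of problems such a tool solves is one-dimensional advection-dispersion with first-order decay, advanced here with a simple explicit upwind scheme. The parameters are arbitrary and the scheme is deliberately minimal; RT1D itself handles operator-split multi-component geochemistry well beyond this:

```python
import numpy as np

# Illustrative 1-D reactive transport: dc/dt = -v dc/dx + D d2c/dx2 - k c,
# explicit in time, upwind advection, central dispersion. Arbitrary
# parameters, chosen to satisfy the explicit stability limit.

def step(c, v, D, k, dx, dt):
    """Advance the concentration profile by one explicit time step."""
    adv = -v * (c[1:-1] - c[:-2]) / dx                      # upwind, v > 0
    disp = D * (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx**2     # central
    c_new = c.copy()
    c_new[1:-1] += dt * (adv + disp - k * c[1:-1])
    c_new[0] = 1.0          # constant-concentration inlet
    c_new[-1] = c_new[-2]   # zero-gradient outlet
    return c_new

c = np.zeros(101)           # 1 m column, dx = 0.01 m
c[0] = 1.0
for _ in range(4000):       # run to an approximate steady state
    c = step(c, v=0.01, D=1e-4, k=0.01, dx=0.01, dt=0.25)
```

At steady state the profile approaches the exponential-type decay between inlet and outlet known from the analytical solution, which is the kind of closed-form check the simpler benchmark problems provide.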
Analyzing the BBOB results by means of benchmarking concepts.
Mersmann, O; Preuss, M; Trautmann, H; Bischl, B; Weihs, C
2015-01-01
We present methods to answer two basic questions that arise when benchmarking optimization algorithms. The first is: which algorithm is the "best" one? The second is: which algorithm should I use for my real-world problem? Both are connected and neither is easy to answer. We present a theoretical framework for designing and analyzing the raw data of such benchmark experiments. This represents a first step in answering the aforementioned questions. The 2009 and 2010 BBOB benchmark results are analyzed by means of this framework and we derive insight regarding the answers to the two questions. Furthermore, we discuss how to properly aggregate rankings from algorithm evaluations on individual problems into a consensus, its theoretical background and which common pitfalls should be avoided. Finally, we address the grouping of test problems into sets with similar optimizer rankings and investigate whether these are reflected by already proposed test problem characteristics, finding that this is not always the case.
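One of the simplest ways to aggregate per-problem rankings into a consensus is Borda counting. The algorithms and rankings below are invented; the paper analyses consensus rankings and their pitfalls in far more depth:

```python
from collections import defaultdict

# Borda-count consensus over per-problem algorithm rankings.
# Rankings and algorithm names are invented for illustration.

def borda_consensus(rankings):
    """rankings: list of per-problem lists, best algorithm first.
    Lower total position score means better consensus rank."""
    score = defaultdict(int)
    for ranking in rankings:
        for position, algorithm in enumerate(ranking):
            score[algorithm] += position
    return sorted(score, key=lambda a: score[a])

per_problem = [
    ["CMA-ES", "BFGS", "RandomSearch"],
    ["CMA-ES", "RandomSearch", "BFGS"],
    ["BFGS", "CMA-ES", "RandomSearch"],
]
consensus = borda_consensus(per_problem)
```

A known pitfall of such positional methods is sensitivity to irrelevant alternatives: adding or removing one algorithm can reorder the others, which is one reason the theoretical background matters.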
ARRAYS OF BOTTLES OF PLUTONIUM NITRATE SOLUTION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Margaret A. Marshall
2012-09-01
In October and November of 1981, thirteen approaches-to-critical were performed on a remote split table machine (RSTM) in the Critical Mass Laboratory of Pacific Northwest Laboratory (PNL) in Richland, Washington, using planar arrays of polyethylene bottles filled with plutonium (Pu) nitrate solution. Arrays of up to sixteen bottles were used to measure the critical number of bottles and critical array spacing with a tight-fitting Plexiglas® reflector on all sides of the arrays except the top. Some experiments used Plexiglas shells fitted around each bottle to determine the effect of moderation on criticality. Each bottle contained approximately 2.4 L of Pu(NO3)4 solution with a Pu content of 105 g Pu/L and a free acid molarity (H+) of 5.1. The plutonium was of low 240Pu content (2.9 wt.%). These experiments were sponsored by Rockwell Hanford Operations because of the lack of experimental data on the criticality of arrays of bottles of Pu solution such as might be found in storage and handling at the Purex Facility at Hanford. The results of these experiments were used “to provide benchmark data to validate calculational codes used in criticality safety assessments of [the] plant configurations” (Ref. 1). Data for this evaluation were collected from the published report (Ref. 1), the approach-to-critical logbook, the experimenter’s logbook, and communication with the primary experimenter, B. Michael Durst. Of the 13 experiments performed, 10 were evaluated. One of the experiments was not evaluated because it had been thrown out by the experimenter, one because it was a repeat of another experiment, and the third because it reported the critical number of bottles as being greater than 25. Seven of the ten evaluated experiments were determined to be acceptable benchmark experiments. A similar experiment using uranyl nitrate was benchmarked as U233-SOL-THERM-014.
NASA Astrophysics Data System (ADS)
Liu, Lei; Li, Zhi-Guo; Dai, Jia-Yu; Chen, Qi-Feng; Chen, Xiang-Rong
2018-06-01
Comprehensive knowledge of physical properties such as the equation of state (EOS), proton exchange, dynamic structures, diffusion coefficients, and viscosities of hydrogen-deuterium mixtures with densities from 0.1 to 5 g/cm3 and temperatures from 1 to 50 kK has been presented via quantum molecular dynamics (QMD) simulations. The existing multi-shock experimental EOS provides an important benchmark to evaluate exchange-correlation functionals. The comparison of simulations with experiments indicates that a nonlocal van der Waals density functional (vdW-DF1) produces excellent results. Fraction analysis of molecules using a weighted integral over pair distribution functions was performed. A dissociation diagram together with a boundary where the proton exchange (H2+D2⇌2 HD) occurs was generated, which shows evidence that the HD molecules form as the H2 and D2 molecules are almost 50% dissociated. The mechanism of proton exchange can be interpreted as a process of dissociation followed by recombination. The ionic structures at extreme conditions were analyzed by the effective coordination number model. High-order cluster, circle, and chain structures can be found in the strongly coupled warm dense regime. The present QMD diffusion coefficients and viscosities can be used to benchmark two analytical one-component plasma (OCP) models: the Coulomb and Yukawa OCP models.
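The "weighted integral over pair distribution functions" used for the fraction analysis has the same form as a coordination-number integral: integrate n(r) = 4*pi*rho*r^2*g(r) out to a cutoff to count neighbours within a bond length. The sketch below checks the integral against the ideal-gas case g(r) = 1, for which the answer is simply the particle count in a sphere; an actual QMD g(r) would instead be a peaked function:

```python
import numpy as np

def coordination_number(r, g, rho, r_cut):
    """Average number of neighbours within r_cut:
    integral of 4*pi*rho*r^2*g(r) dr from 0 to r_cut (trapezoidal rule)."""
    mask = r <= r_cut
    integrand = 4.0 * np.pi * rho * r[mask] ** 2 * g[mask]
    return np.trapz(integrand, r[mask])

# Sanity check with an ideal gas, g(r) = 1: the integral must equal the
# number of particles in a sphere of radius r_cut, (4/3)*pi*rho*r_cut^3.
r = np.linspace(0.0, 3.0, 3001)
g = np.ones_like(r)
n = coordination_number(r, g, rho=0.03, r_cut=3.0)
```

Restricting the cutoff to the first minimum of g(r) and weighting by the appropriate pair species yields the molecular fractions used in the dissociation diagram.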
NASA Astrophysics Data System (ADS)
Giebel, Gregor; Cline, Joel; Frank, Helmut; Shaw, Will; Pinson, Pierre; Hodge, Bri-Mathias; Kariniotakis, Georges; Sempreviva, Anna Maria; Draxl, Caroline
2017-04-01
Wind power forecasts have been used operatively for over 20 years. Despite this fact, there are still several possibilities to improve the forecasts, both from the weather prediction side and from the usage of the forecasts. The new International Energy Agency (IEA) Task on Wind Power Forecasting tries to organise international collaboration among national weather centres with an interest and/or large projects on wind forecast improvements (NOAA, DWD, UK MetOffice, …) and operational forecasters and forecast users. The Task is divided into three work packages. Firstly, a collaboration on the improvement of the scientific basis for the wind predictions themselves. This includes numerical weather prediction model physics, but also widely distributed information on accessible datasets for verification. Secondly, we will be aiming at an international pre-standard (an IEA Recommended Practice) on benchmarking and comparing wind power forecasts, including probabilistic forecasts, aimed at industry and forecasters alike. This WP will also organise benchmarks, in cooperation with the IEA Task WakeBench. Thirdly, we will be engaging end users, aiming at dissemination of best practice in the usage of wind power predictions, especially probabilistic ones. The Operating Agent is Gregor Giebel of DTU; the Co-Operating Agent is Joel Cline of the US Department of Energy. Collaboration in the task is solicited from everyone interested in the forecasting business. We will collaborate with IEA Task 31 WakeBench, which developed the Windbench benchmarking platform that this task will use for forecasting benchmarks. The task runs for three years, 2016-2018.
Main deliverables are an up-to-date list of current projects and main project results, including datasets which can be used by researchers around the world to improve their own models, an IEA Recommended Practice on performance evaluation of probabilistic forecasts, a position paper regarding the use of probabilistic forecasts, and one or more benchmark studies implemented on the Windbench platform hosted at CENER. Additionally, spreading of relevant information in both the forecasters' and the users' communities is paramount. The poster also shows the work done in the first half of the Task, e.g. the collection of available datasets and the lessons learned from a public workshop on 9 June in Barcelona on Experiences with the Use of Forecasts and Gaps in Research. Participation is open for all interested parties in member states of the IEA Annex on Wind Power; see ieawind.org for the up-to-date list. For collaboration, please contact the author (grgi@dtu.dk).
Validation of Shielding Analysis Capability of SuperMC with SINBAD
NASA Astrophysics Data System (ADS)
Chen, Chaobin; Yang, Qi; Wu, Bin; Han, Yuncheng; Song, Jing
2017-09-01
The shielding analysis capability of SuperMC was validated with the Shielding Integral Benchmark Archive Database (SINBAD). SINBAD, compiled by RSICC and the NEA, includes numerous benchmark experiments performed with the D-T fusion neutron source facilities of OKTAVIAN, FNS, IPPE, etc. The results from SuperMC simulation were compared with experimental data and MCNP results. Very good agreement, with deviation lower than 1%, was achieved, suggesting that SuperMC is reliable for shielding calculations.
Is It Time To Consider Global Sharing of Integral Physics Data?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harold F. McFarlane
The innocent days of the Atoms for Peace program vanished with the suicide attack on the World Trade Center in New York City that occurred while the GLOBAL 2001 international nuclear fuel cycle conference was convened in Paris. Today’s reality is that maintaining an inventory of unirradiated highly enriched uranium or plutonium for critical experiments requires a facility to accept substantial security cost and intrusion. In the context of a large collection of benchmark integral experiments collected over several decades and the ongoing rapid advances in computer modeling and simulation, there seems to be ample incentive to reduce both the number of facilities and material inventory quantities worldwide. As a result of ongoing nonproliferation initiatives, there are viable programs that will accept highly enriched uranium for down-blending into commercial fuel. Nevertheless, there are formidable hurdles to overcome before national institutions will voluntarily give up existing nuclear research capabilities. GLOBAL 2005 was the appropriate forum to begin fostering a new spirit of cooperation that could lead to improved international security and better use of precious research and development resources, while ensuring access to existing and future critical experiment data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peng Hansheng
The ICF Program in China has made significant progress with the efforts of multiple labs in the past years. The eight-beam SG-II laser facility, upgraded from the two-beam SG-I facility, is nearly completed for 1.05 {mu}m light output and is about to be operated for experiments. Some benchmark experiments have been conducted for disk targets. Advanced diagnostic techniques, such as an x-ray microscope with a 7-{mu}m spatial resolution and x-ray framing cameras with a temporal resolution better than 65 ps, have been developed. Lower-energy pumping with a prepulse technique for a Ne-like Ti laser at 32.6 nm has succeeded, and shadowgraphy of a fine mesh has been demonstrated with the Ti laser beam. A national project, the SG-III laser facility, has been proposed to produce 60 kJ of blue light for target physics experiments and is being conceptually designed. New laser technology, including multipass amplification, large-aperture plasma electrode switches, and laser glass with fewer platinum grains, has been developed to meet the requirements of the SG-III Project. The Technical Integration Line (TIL), a scientific prototype beamlet of SG-III, will be built first in the next few years.
Gatemon Benchmarking and Two-Qubit Operation
NASA Astrophysics Data System (ADS)
Casparis, Lucas; Larsen, Thorvald; Olsen, Michael; Petersson, Karl; Kuemmeth, Ferdinand; Krogstrup, Peter; Nygard, Jesper; Marcus, Charles
Recent experiments have demonstrated superconducting transmon qubits with semiconductor nanowire Josephson junctions. These hybrid gatemon qubits utilize the field-effect tunability unique to semiconductors to allow complete qubit control using gate voltages, potentially a technological advantage over conventional flux-controlled transmons. Here, we present experiments with a two-qubit gatemon circuit. We characterize qubit coherence and stability and use randomized benchmarking to demonstrate single-qubit gate errors of ~0.5% for all gates, including voltage-controlled Z rotations. We show coherent capacitive coupling between two gatemons and coherent SWAP operations. Finally, we perform a two-qubit controlled-phase gate with an estimated fidelity of ~91%, demonstrating the potential of gatemon qubits for building scalable quantum processors. We acknowledge financial support from Microsoft Project Q and the Danish National Research Foundation.
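The way a single-qubit gate error like the quoted ~0.5% is extracted from randomized benchmarking can be sketched with synthetic data: sequence fidelity decays as F(m) = A*p**m + B with sequence length m, and for a single qubit (d = 2) the average error per gate is r = (1 - p)/2. The decay parameters below are invented, not the gatemon measurements:

```python
import numpy as np

# Randomized-benchmarking fidelity decay and error extraction, on a
# synthetic, noise-free curve. Parameters are invented for illustration.

def rb_error(p, d=2):
    """Average error per Clifford from the RB depolarizing parameter p."""
    return (1.0 - p) * (d - 1) / d

m = np.arange(1, 200, 10)      # sequence lengths
p_true, A, B = 0.99, 0.5, 0.5  # F(m) = A * p**m + B
F = A * p_true**m + B

# Recover p from a log-linear fit of (F - B) versus m.
slope = np.polyfit(m, np.log(F - B), 1)[0]
p_fit = float(np.exp(slope))
```

With p_fit of 0.99 this gives an error of 0.5% per gate; on real data the offset B must itself be fitted and the statistical uncertainty propagated.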
NASA Astrophysics Data System (ADS)
Steefel, C. I.
2015-12-01
Over the last 20 years, we have seen the evolution of multicomponent reactive transport modeling and the expanding range and increasing complexity of subsurface environmental applications it is being used to address. Reactive transport modeling is being asked to provide accurate assessments of engineering performance and risk for important issues with far-reaching consequences. As a result, the complexity and detail of subsurface processes, properties, and conditions that can be simulated have significantly expanded. Closed form solutions are necessary and useful, but limited to situations that are far simpler than typical applications that combine many physical and chemical processes, in many cases in coupled form. In the absence of closed form and yet realistic solutions for complex applications, numerical benchmark problems with an accepted set of results will be indispensable to qualifying codes for various environmental applications. The intent of this benchmarking exercise, now underway for more than five years, is to develop and publish a set of well-described benchmark problems that can be used to demonstrate simulator conformance with norms established by the subsurface science and engineering community. The objective is not to verify this or that specific code--the reactive transport codes play a supporting role in this regard—but rather to use the codes to verify that a common solution of the problem can be achieved. Thus, the objective of each of the manuscripts is to present an environmentally-relevant benchmark problem that tests the conceptual model capabilities, numerical implementation, process coupling, and accuracy. The benchmark problems developed to date include 1) microbially-mediated reactions, 2) isotopes, 3) multi-component diffusion, 4) uranium fate and transport, 5) metal mobility in mining affected systems, and 6) waste repositories and related aspects.
Can Youth Sport Build Character?
ERIC Educational Resources Information Center
Shields, David Light; Bredemeier, Brenda Light; Power, F. Clark
2001-01-01
Participation and competition in some sports are associated with lower stages of moral reasoning. Coaches can foster moral development by starting with the right mental model, holding benchmark meetings about team values, setting goals for physical and character skills, making time for guided discussion sessions, building community, modeling…
FDNS CFD Code Benchmark for RBCC Ejector Mode Operation
NASA Technical Reports Server (NTRS)
Holt, James B.; Ruf, Joe
1999-01-01
Computational Fluid Dynamics (CFD) analysis results are compared with benchmark quality test data from the Propulsion Engineering Research Center's (PERC) Rocket Based Combined Cycle (RBCC) experiments to verify fluid dynamic code and application procedures. RBCC engine flowpath development will rely on CFD applications to capture the multi-dimensional fluid dynamic interactions and to quantify their effect on the RBCC system performance. Therefore, the accuracy of these CFD codes must be determined through detailed comparisons with test data. The PERC experiments build upon the well-known 1968 rocket-ejector experiments of Odegaard and Stroup by employing advanced optical and laser based diagnostics to evaluate mixing and secondary combustion. The Finite Difference Navier Stokes (FDNS) code was used to model the fluid dynamics of the PERC RBCC ejector mode configuration. Analyses were performed for both Diffusion and Afterburning (DAB) and Simultaneous Mixing and Combustion (SMC) test conditions. Results from both the 2D and the 3D models are presented.
Radiation Detection Computational Benchmark Scenarios
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shaver, Mark W.; Casella, Andrew M.; Wittman, Richard S.
2013-09-24
Modeling forms an important component of radiation detection development, allowing for testing of new detector designs, evaluation of existing equipment against a wide variety of potential threat sources, and assessment of the operational performance of radiation detection systems. This can, however, result in large and complex scenarios which are time-consuming to model. A variety of approaches to radiation transport modeling exist with complementary strengths and weaknesses for different problems. This variety of approaches, and the development of promising new tools (such as ORNL's ADVANTG) which combine the benefits of multiple approaches, illustrates the need for a means of evaluating or comparing different techniques for radiation detection problems. This report presents a set of 9 benchmark problems for comparing different types of radiation transport calculations, identifying appropriate tools for classes of problems, and testing and guiding the development of new methods. The benchmarks were drawn primarily from existing or previous calculations, with a preference for scenarios which include experimental data or otherwise have results with a high level of confidence, are non-sensitive, and represent problem sets of interest to NA-22. From a technical perspective, the benchmarks were chosen to span a range of difficulty, to include gamma transport, neutron transport, or both, and to represent different important physical processes and a range of sensitivity to angular or energy fidelity. Following benchmark identification, existing information about geometry, measurements, and previous calculations was assembled. Monte Carlo results (MCNP decks) were reviewed or created and re-run in order to attain accurate computational times and to verify agreement with experimental data, when present. Benchmark information was then conveyed to ORNL in order to guide testing and development of hybrid calculations.
The results of those ADVANTG calculations were then sent to PNNL for compilation. This report describes the details of the selected benchmarks and results from the various transport codes.
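The simplest class of such transport calculations can be sketched in a few lines: an analog Monte Carlo estimate of uncollided gamma transmission through a homogeneous slab, checked against the exponential attenuation law. This is a generic illustration, not one of the report's nine benchmarks; the cross section and thickness are invented values.

```python
import math
import random

def mc_transmission(sigma_t, thickness, n=100_000, seed=1):
    """Analog Monte Carlo estimate of uncollided transmission through a slab.

    Particles enter normally; the distance to first collision is sampled
    from an exponential distribution with total cross section sigma_t.
    """
    rng = random.Random(seed)
    passed = sum(1 for _ in range(n)
                 if -math.log(1.0 - rng.random()) / sigma_t > thickness)
    return passed / n

sigma_t, thickness = 0.5, 2.0            # invented values (cm^-1, cm)
est = mc_transmission(sigma_t, thickness)
exact = math.exp(-sigma_t * thickness)   # attenuation law for uncollided flux
print(est, exact)
```

Production codes like MCNP layer geometry, energy dependence, and variance reduction on top of this sampling loop, which is why hybrid deterministic/Monte Carlo tools are worth benchmarking.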
Revisiting the PLUMBER Experiments from a Process-Diagnostics Perspective
NASA Astrophysics Data System (ADS)
Nearing, G. S.; Ruddell, B. L.; Clark, M. P.; Nijssen, B.; Peters-Lidard, C. D.
2017-12-01
The PLUMBER benchmarking experiments [1] showed that some of the most sophisticated land models (CABLE, CH-TESSEL, COLA-SSiB, ISBA-SURFEX, JULES, Mosaic, Noah, ORCHIDEE) were outperformed - in simulations of half-hourly surface energy fluxes - by instantaneous, out-of-sample, and globally-stationary regressions with no state memory. One criticism of PLUMBER is that the benchmarking methodology was not derived formally, so that applying a similar methodology with different performance metrics can result in qualitatively different results. Another common criticism of model intercomparison projects in general is that they offer little insight into process-level deficiencies in the models, and therefore are of marginal value for helping to improve the models. We address both of these issues by proposing a formal benchmarking methodology that also yields a formal and quantitative method for process-level diagnostics. We apply this to the PLUMBER experiments to show that (1) the PLUMBER conclusions were generally correct - the models use only a fraction of the information available to them from met forcing data (<50% by our analysis), and (2) all of the land models investigated by PLUMBER have similar process-level error structures, and therefore together do not represent a meaningful sample of structural or epistemic uncertainty. We conclude by suggesting two ways to improve the experimental design of model intercomparison and/or model benchmarking studies like PLUMBER. First, PLUMBER did not report model parameter values, and it is necessary to know these values to separate parameter uncertainty from structural uncertainty. This is a first order requirement if we want to use intercomparison studies to provide feedback to model development. Second, technical documentation of land models is inadequate. Future model intercomparison projects should begin with a collaborative effort by model developers to document specific differences between model structures. 
This could be done in a reproducible way using a unified, process-flexible system like SUMMA [2]. [1] Best, M.J. et al. (2015) 'The plumbing of land surface models: benchmarking model performance', J. Hydrometeor. [2] Clark, M.P. et al. (2015) 'A unified approach for process-based hydrologic modeling: 1. Modeling concept', Water Resour. Res.
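The core of a PLUMBER-style benchmark can be made concrete in a few lines: fit a simple regression from forcing to observed flux on one period, then compare its out-of-sample error with a land model's error on a held-out period. The data below are synthetic and purely illustrative, not the PLUMBER sites.

```python
def rmse(pred, obs):
    """Root-mean-square error between predictions and observations."""
    return (sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)) ** 0.5

def linear_fit(x, y):
    """Ordinary least-squares fit y = a + b*x; returns the fitted function."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return lambda xi: a + b * xi

# synthetic illustration: forcing, observations, and a biased "land model"
forcing = [i * 0.1 for i in range(100)]
obs = [2.0 * f + 1.0 for f in forcing]
model = [2.0 * f + 1.5 for f in forcing]

bench = linear_fit(forcing[:50], obs[:50])      # train on first half only
bench_pred = [bench(f) for f in forcing[50:]]   # predict out-of-sample
print(rmse(bench_pred, obs[50:]), rmse(model[50:], obs[50:]))
```

The striking PLUMBER result was that regressions of exactly this kind, with no state memory, beat sophisticated land models on half-hourly flux metrics.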
Investigation of the RbCa molecule: Experiment and theory.
Pototschnig, Johann V; Krois, Günter; Lackner, Florian; Ernst, Wolfgang E
2015-04-01
We present a thorough theoretical and experimental study of the electronic structure of RbCa. The mixed alkali-alkaline earth molecule RbCa was formed on superfluid helium nanodroplets. Excited states of the molecule in the range of 13 000-23 000 cm⁻¹ were recorded by resonance-enhanced multi-photon ionization time-of-flight spectroscopy. The experiment is accompanied by high-level ab initio calculations of ground and excited state properties, utilizing a multireference configuration interaction method based on multiconfigurational self-consistent field calculations. With this approach the potential energy curves and permanent electric dipole moments of 24 electronic states were calculated. In addition, we computed the transition dipole moments for transitions from the ground state into excited states. The combination of experiment and theory allowed the assignment of features in the recorded spectrum to the excited [Formula: see text], [Formula: see text], [Formula: see text], [Formula: see text], [Formula: see text], [Formula: see text], [Formula: see text], and [Formula: see text] states, and allowed the experiment to benchmark the calculations. This is the first experimental work giving insight into the previously unknown RbCa molecule, which offers great prospects in ultracold molecular physics due to its magnetic and electric dipole moment in the [Formula: see text] ground state.
Finite-volume and partial quenching effects in the magnetic polarizability of the neutron
NASA Astrophysics Data System (ADS)
Hall, J. M. M.; Leinweber, D. B.; Young, R. D.
2014-03-01
There has been much progress in the experimental measurement of the electric and magnetic polarizabilities of the nucleon. Similarly, lattice QCD simulations have recently produced dynamical QCD results for the magnetic polarizability of the neutron approaching the chiral regime. In order to compare the lattice simulations with experiment, calculation of partial quenching and finite-volume effects is required prior to an extrapolation in quark mass to the physical point. These dependencies are described using chiral effective field theory. Corrections to the partial quenching effects associated with the sea-quark-loop electric charges are estimated by modeling corrections to the pion cloud. These are compared to the uncorrected lattice results. In addition, the behavior of the finite-volume corrections as a function of pion mass is explored. Box sizes of approximately 7 fm are required to achieve a result within 5% of the infinite-volume result at the physical pion mass. A variety of extrapolations are shown at different box sizes, providing a benchmark to guide future lattice QCD calculations of the magnetic polarizabilities. A relatively precise value for the physical magnetic polarizability of the neutron is presented, βn = 1.93(11)stat(11)sys × 10⁻⁴ fm³, which is in agreement with current experimental results.
Xenakis, A M; Lind, S J; Stansby, P K; Rogers, B D
2017-03-01
Tsunamis caused by landslides may result in significant destruction of the surroundings with both societal and industrial impact. The 1958 Lituya Bay landslide and tsunami is a recent and well-documented terrestrial landslide generating a tsunami with a run-up of 524 m. Although recent computational techniques have shown good performance in the estimation of the run-up height, they fail to capture all the physical processes, in particular, the landslide-entry profile and interaction with the water. Smoothed particle hydrodynamics (SPH) is a versatile numerical technique for describing free-surface and multi-phase flows, particularly those that exhibit highly nonlinear deformation in landslide-generated tsunamis. In the current work, the novel multi-phase incompressible SPH method with shifting is applied to the Lituya Bay tsunami and landslide; this is the first methodology able to realistically reproduce both the run-up and the landslide entry as documented in a benchmark experiment. This work is the first to develop a realistic implementation of the physics that, in addition to the non-Newtonian rheology of the landslide, includes turbulence in the water phase and soil saturation. Sensitivity to the experimental initial conditions is also considered. This work demonstrates the ability of the proposed method in modelling challenging environmental multi-phase, non-Newtonian and turbulent flows.
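SPH methods like the one above rest on kernel-weighted sums over neighboring particles. As a generic sketch using standard textbook formulas (not the paper's incompressible multi-phase scheme), here is a 2D cubic-spline kernel and the density summation it supports:

```python
import math

def w_cubic(r, h):
    """2D cubic-spline SPH kernel (value only), compact support of radius 2h."""
    q = r / h
    sigma = 10.0 / (7.0 * math.pi * h * h)   # 2D normalization constant
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q**2 + 0.75 * q**3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q)**3
    return 0.0

def density(particles, masses, h):
    """SPH density summation: rho_i = sum_j m_j W(|r_i - r_j|, h)."""
    out = []
    for xi, yi in particles:
        rho = 0.0
        for (xj, yj), mj in zip(particles, masses):
            rho += mj * w_cubic(math.hypot(xi - xj, yi - yj), h)
        out.append(rho)
    return out

# demo: densities for two unit-mass particles half a smoothing length apart
print(density([(0.0, 0.0), (0.5, 0.0)], [1.0, 1.0], 1.0))
```

Multi-phase, non-Newtonian variants replace this direct summation with pressure projection and rheology-dependent viscous forces, but the kernel machinery is the same.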
DOE Office of Scientific and Technical Information (OSTI.GOV)
John D. Bess; Leland M. Montierth
2013-03-01
In its deployment as a pebble bed reactor (PBR) critical facility from 1992 to 1996, the PROTEUS facility was designated as HTR-PROTEUS. This experimental program was performed as part of an International Atomic Energy Agency (IAEA) Coordinated Research Project (CRP) on the Validation of Safety Related Physics Calculations for Low Enriched HTGRs. Within this project, critical experiments were conducted for graphite-moderated LEU systems to determine core reactivity, flux and power profiles, reaction-rate ratios, the worth of control rods (both in-core and reflector based), the worth of burnable poisons, kinetic parameters, and the effects of moisture ingress on these parameters. One benchmark experiment was evaluated in this report: Core 4. Core 4 represents the only configuration with random pebble packing in the HTR-PROTEUS series of experiments, and has a moderator-to-fuel pebble ratio of 1:1. Three random configurations were performed. The initial configuration, Core 4.1, was rejected by the experimenters because the method of pebble loading, using separate delivery tubes for the moderator and fuel pebbles, may not have been completely random. Cores 4.2 and 4.3 were loaded using a single delivery tube, eliminating the possibility of systematic ordering effects. The second and third cores differed slightly in the quantity of pebbles loaded (40 each of moderator and fuel pebbles), the stacked height of the pebbles in the core cavity (0.02 m), the withdrawn distance of the stainless steel control rods (20 mm), and the withdrawn distance of the autorod (30 mm). The 34 coolant channels in the upper axial reflector and the 33 coolant channels in the lower axial reflector were open. Additionally, the axial graphite fillers used in all other HTR-PROTEUS configurations to create a 12-sided core cavity were not used in the randomly packed cores.
Instead, graphite fillers were placed on the cavity floor, creating a funnel-like base, to discourage ordering effects during pebble loading. Core 4 was determined to be an acceptable benchmark experiment.
Adaptive unified continuum FEM modeling of a 3D FSI benchmark problem.
Jansson, Johan; Degirmenci, Niyazi Cem; Hoffman, Johan
2017-09-01
In this paper, we address a 3D fluid-structure interaction benchmark problem that represents important characteristics of biomedical modeling. We present a goal-oriented adaptive finite element methodology for incompressible fluid-structure interaction based on a streamline diffusion-type stabilization of the balance equations for mass and momentum for the entire continuum in the domain, which is implemented in the Unicorn/FEniCS software framework. A phase marker function and its corresponding transport equation are introduced to select the constitutive law, where the mesh tracks the discontinuous fluid-structure interface. This results in a unified simulation method for fluids and structures. We present detailed results for the benchmark problem compared with experiments, together with a mesh convergence study. Copyright © 2016 John Wiley & Sons, Ltd.
Model Prediction Results for 2007 Ultrasonic Benchmark Problems
NASA Astrophysics Data System (ADS)
Kim, Hak-Joon; Song, Sung-Jin
2008-02-01
The World Federation of NDE Centers (WFNDEC) has addressed two types of problems for the 2007 ultrasonic benchmark problems: prediction of side-drilled hole responses with 45° and 60° refracted shear waves, and the effects of surface curvature on the ultrasonic responses of flat-bottomed holes. To solve this year's ultrasonic benchmark problems, we applied multi-Gaussian beam models for the calculation of ultrasonic beam fields, and the Kirchhoff approximation and the separation-of-variables method for the calculation of far-field scattering amplitudes of flat-bottomed holes and side-drilled holes, respectively. In this paper, we present comparisons of model predictions to experiments for side-drilled holes, and discuss the effect of interface curvature on ultrasonic responses by comparing peak-to-peak amplitudes of flat-bottomed hole responses with different sizes and interface curvatures.
Voss, Clifford I.; Simmons, Craig T.; Robinson, Neville I.
2010-01-01
This benchmark for three-dimensional (3D) numerical simulators of variable-density groundwater flow and solute or energy transport consists of matching simulation results with the semi-analytical solution for the transition from one steady-state convective mode to another in a porous box. Previous experimental and analytical studies of natural convective flow in an inclined porous layer have shown that there are a variety of convective modes possible depending on system parameters, geometry and inclination. In particular, there is a well-defined transition from the helicoidal mode consisting of downslope longitudinal rolls superimposed upon an upslope unicellular roll to a mode consisting of purely an upslope unicellular roll. Three-dimensional benchmarks for variable-density simulators are currently (2009) lacking and comparison of simulation results with this transition locus provides an unambiguous means to test the ability of such simulators to represent steady-state unstable 3D variable-density physics.
Wheeler, Matthew W; Bailer, A John
2007-06-01
Model averaging (MA) has been proposed as a method of accounting for model uncertainty in benchmark dose (BMD) estimation. The technique has been used to average BMD estimates derived from dichotomous dose-response experiments, microbial dose-response experiments, and observational epidemiological studies. While MA is a promising tool for the risk assessor, a previous study suggested that the simple strategy of averaging individual models' BMD lower limits did not yield interval estimators that met nominal coverage levels in certain situations, and this performance was very sensitive to the underlying model space chosen. We present a different, more computationally intensive approach in which the BMD is estimated using the average dose-response model and the corresponding benchmark dose lower bound (BMDL) is computed by bootstrapping. This method is illustrated with TiO2 rat lung cancer dose-response data, and then systematically studied through an extensive Monte Carlo simulation. The results of this study suggest that the MA-BMD, estimated using this technique, performs better, in terms of bias and coverage, than the previous MA methodology. Further, the MA-BMDL achieves nominal coverage in most cases, and is superior to picking the "best fitting model" when estimating the benchmark dose. Although these results show the utility of MA for benchmark dose risk estimation, they continue to highlight the importance of choosing an adequate model space as well as proper model-fit diagnostics.
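The average-then-invert step described above can be sketched as follows: weight fitted dose-response models by AIC, then find the dose giving a target extra risk on the averaged curve by bisection. This is a toy version with two hypothetical fitted logistic models and made-up AIC values; the bootstrap for the BMDL is omitted for brevity.

```python
import math

def aic_weights(aics):
    """Akaike weights: exp(-ΔAIC/2), normalized to sum to one."""
    m = min(aics)
    w = [math.exp(-0.5 * (a - m)) for a in aics]
    s = sum(w)
    return [wi / s for wi in w]

def bmd(avg_risk, bmr=0.10, lo=0.0, hi=10.0):
    """Benchmark dose for extra risk bmr, found by bisection on the
    (monotone increasing) model-averaged dose-response curve."""
    p0 = avg_risk(0.0)
    target = p0 + bmr * (1.0 - p0)       # extra-risk definition
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if avg_risk(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# hypothetical fitted models and illustrative AIC values (not real data)
models = [lambda d: 1.0 / (1.0 + math.exp(-(-3.0 + 0.8 * d))),
          lambda d: 1.0 / (1.0 + math.exp(-(-2.5 + 0.5 * d)))]
wts = aic_weights([100.2, 101.6])
avg = lambda d: sum(w * m(d) for w, m in zip(wts, models))
print(round(bmd(avg), 3))
```

In the approach the abstract describes, the BMDL would then come from refitting and re-averaging on bootstrap resamples and taking a lower percentile of the resulting BMD distribution.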
Nonlinear viscoplasticity in ASPECT: benchmarking and applications to subduction
NASA Astrophysics Data System (ADS)
Glerum, Anne; Thieulot, Cedric; Fraters, Menno; Blom, Constantijn; Spakman, Wim
2018-03-01
ASPECT (Advanced Solver for Problems in Earth's ConvecTion) is a massively parallel finite element code originally designed for modeling thermal convection in the mantle with a Newtonian rheology. The code is characterized by modern numerical methods, high-performance parallelism and extensibility. This last characteristic is illustrated in this work: we have extended the use of ASPECT from global thermal convection modeling to upper-mantle-scale applications of subduction. Subduction modeling generally requires the tracking of multiple materials with different properties and with nonlinear viscous and viscoplastic rheologies. To this end, we implemented a frictional plasticity criterion that is combined with a viscous diffusion and dislocation creep rheology. Because ASPECT uses compositional fields to represent different materials, all material parameters are made dependent on a user-specified number of fields. The goal of this paper is primarily to describe and verify our implementations of complex, multi-material rheology by reproducing the results of four well-known two-dimensional benchmarks: the indentor benchmark, the brick experiment, the sandbox experiment and the slab detachment benchmark. Furthermore, we aim to provide hands-on examples for prospective users by demonstrating the use of multi-material viscoplasticity with three-dimensional, thermomechanical models of oceanic subduction, putting ASPECT on the map as a community code for high-resolution, nonlinear rheology subduction modeling.
NASA Astrophysics Data System (ADS)
2000-07-01
Among the initiatives to be found at UK universities is a vocational award with the title `University Foundation Degree' at Nottingham Trent University. This qualification will be offered in 14 different subjects including four in the Faculty of Science and Mathematics, in the areas of applied biology, applied sciences, chemistry and physics. The courses will be available on a two-year full-time, three-year sandwich or a part-time basis. Set at a higher standard and specification than the Higher National Diplomas which it replaces, the UFD has been devised in consultation with industry and will cover the technical and specialist areas demanded by employers to combat skills shortages. The UFD in applied sciences concentrates on practical applications through laboratory, IT and project work, supported by lectures and seminars. At the end students can enter the employment market or transfer onto the second year of a degree course. Science-based careers including research and development would be the aim of those taking the UFD in physics. The first year develops the fundamentals of modern physics supported by studies in mathematics, IT and computer programming, whilst year 2 is vocational in nature with industrial problem solving and work experience as well as an academic theme associated with environmental aspects of the subject. Those who complete the UFD will be allowed automatic progression to a specified honours degree course and would normally be expected to study for a further two years for this award. However, those demonstrating an outstanding academic performance can transfer to the linked degree programme at the end of the first year via fast-track modules. Back in May the UK's Quality Assurance Agency (QAA) announced new standard benchmarks for degrees. These will be introduced into higher education institutions from 2002 to outline the knowledge, understanding and skills a student should gain from a particular higher education course. 
These benchmark statements should help students to make informed choices about their degree and subsequent employability, as well as informing employers about the skills and knowledge of the graduates they propose to employ. Academics from each discipline have agreed the statements for their areas of expertise to a common framework.
Multi-hadron spectroscopy in a large physical volume
NASA Astrophysics Data System (ADS)
Bulava, John; Hörz, Ben; Morningstar, Colin
2018-03-01
We demonstrate the efficacy of the stochastic LapH method to treat all-to-all quark propagation on an Nf = 2 + 1 CLS ensemble with large linear spatial extent L = 5.5 fm, allowing us to obtain the benchmark elastic isovector p-wave pion-pion scattering amplitude to good precision already on a relatively small number of gauge configurations. These results hold promise for multi-hadron spectroscopy at close-to-physical pion mass with exponential finite-volume effects under control.
Finding undetected protein associations in cell signaling by belief propagation.
Bailly-Bechet, M; Borgs, C; Braunstein, A; Chayes, J; Dagkessamanskaia, A; François, J-M; Zecchina, R
2011-01-11
External information propagates in the cell mainly through signaling cascades and transcriptional activation, allowing it to react to a wide spectrum of environmental changes. High-throughput experiments identify numerous molecular components of such cascades that may, however, interact through unknown partners. Some of them may be detected using data coming from the integration of a protein-protein interaction network and mRNA expression profiles. This inference problem can be mapped onto the problem of finding appropriate optimal connected subgraphs of a network defined by these datasets. The optimization procedure turns out to be computationally intractable in general. Here we present a new distributed algorithm for this task, inspired by statistical physics, and apply this scheme to alpha factor and drug perturbation data in yeast. We identify the role of the COS8 protein, a member of a gene family of previously unknown function, and validate the results by genetic experiments. The algorithm we present is especially suited for very large datasets, can run in parallel, and can be adapted to other problems in systems biology. On renowned benchmarks it outperforms other algorithms in the field.
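The underlying optimization (a prize-collecting connected-subgraph problem) can be made concrete with a brute-force toy version; real instances need a message-passing approach precisely because this enumeration explodes combinatorially. Node prizes and edge costs below are invented for illustration.

```python
from itertools import combinations

def connected(nodes, edges):
    """Depth-first check that `nodes` form one connected component,
    using only edges whose endpoints both lie in `nodes`."""
    nodes = set(nodes)
    seen, stack = set(), [next(iter(nodes))]
    while stack:
        u = stack.pop()
        if u in seen:
            continue
        seen.add(u)
        for a, b, _ in edges:
            if a == u and b in nodes:
                stack.append(b)
            elif b == u and a in nodes:
                stack.append(a)
    return seen == nodes

def mst_cost(nodes, edges):
    """Kruskal minimum-spanning-tree cost over the induced edges."""
    parent = {v: v for v in nodes}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v
    cost = 0.0
    for a, b, w in sorted(edges, key=lambda e: e[2]):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb
            cost += w
    return cost

def best_subgraph(prizes, edges):
    """Exhaustively score every connected node subset: total node prize
    minus the cheapest edge cost needed to keep the subset connected."""
    best, best_score = None, float("-inf")
    verts = list(prizes)
    for r in range(1, len(verts) + 1):
        for sub in combinations(verts, r):
            inner = [e for e in edges if e[0] in sub and e[1] in sub]
            if connected(sub, inner):
                score = sum(prizes[v] for v in sub) - mst_cost(sub, inner)
                if score > best_score:
                    best, best_score = sub, score
    return best, best_score

# illustrative prizes (e.g. expression evidence) and interaction costs
prizes = {"A": 5.0, "B": 1.0, "C": 4.0}
edges = [("A", "B", 1.0), ("B", "C", 1.0), ("A", "C", 5.0)]
best, score = best_subgraph(prizes, edges)
print(best, score)
```

Here the low-prize node B is still worth including because it cheaply connects the two high-prize nodes, which is exactly the "undetected partner" pattern the paper exploits.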
NASA Technical Reports Server (NTRS)
Chen, Yen-Sen; Liu, Jiwen; Wei, Hong
2000-01-01
The purpose of this study is to establish the technical ground for modeling the physics of the laser-powered pulse detonation phenomenon. The principle of laser-powered propulsion is that when a high-powered laser is focused on a small area near the surface of a thruster, the intense energy causes electrical breakdown of the working fluid (e.g., air), forming a high-speed plasma through the inverse Bremsstrahlung (IB) effect. The intense heat and high pressure created in the plasma then cause the surrounding gas to heat up and expand until thrust-producing shock waves form. This complex process of gas ionization, increasing radiation absorption, and the formation of plasma and shock waves will be investigated in the development of the present numerical model. In the first phase of this study, laser light focusing, radiation absorption, and shock wave propagation over the entire pulsed cycle are modeled. The model geometry and test conditions of known benchmark experiments, such as Myrabo's experiment, will be employed in the numerical model validation simulations. The calculated performance data will be compared to the test data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zylstra, A. B.; Frenje, J. A.; Gatu Johnson, M.
Few-body nuclear physics often relies upon phenomenological models, with new ab initio efforts reported recently; both need high-quality benchmark data, particularly at low center-of-mass energies. We use high-energy-density plasmas to measure the proton spectra from 3He + T and 3He + 3He fusion. The data disagree with R-matrix predictions constrained by neutron spectra from T + T fusion. Here, we present a new analysis of the 3He + 3He proton spectrum; these benchmarked spectral shapes should be used for interpreting low-resolution data, such as solar fusion cross-section measurements.
NASA Astrophysics Data System (ADS)
Alloui, Mebarka; Belaidi, Salah; Othmani, Hasna; Jaidane, Nejm-Eddine; Hochlaf, Majdi
2018-03-01
We performed benchmark studies on the molecular geometry, electronic properties and vibrational analysis of imidazole using semi-empirical, density functional theory and post-Hartree-Fock methods. These studies validated the use of AM1 for the treatment of larger systems. We then examined the structural, physical and chemical relationships for a series of imidazole derivatives acting as angiotensin II AT1 receptor blockers using AM1. QSAR studies were performed for these imidazole derivatives using a combination of various physicochemical descriptors. A multiple linear regression procedure was used to model the relationships between molecular descriptors and the activity of the imidazole derivatives. The results validate the derived QSAR model.
Enhancing neonatal wellness with home visitation.
Parker, Carlo; Warmuskerken, Geene; Sinclair, Lorna
2015-01-01
We planned and implemented an evidence-based program to screen for jaundice and to try to increase the proportion of women breastfeeding for 6 months. The program involved home visitation by a registered nurse to provide education on and support of breastfeeding, and to perform physical assessment of both mothers and newborns, including screening for neonatal jaundice. Quantitative data showed increased breastfeeding rates at 6 months. In addition, readmission rates for jaundice were higher when compared to regional benchmarks. However, the average length of stay for treatment of jaundice was shorter than regional benchmarks. Qualitative data indicated that the program was effective at achieving its goals and was valued by participants. © 2015 AWHONN.
The benchmark aeroelastic models program: Description and highlights of initial results
NASA Technical Reports Server (NTRS)
Bennett, Robert M.; Eckstrom, Clinton V.; Rivera, Jose A., Jr.; Dansberry, Bryan E.; Farmer, Moses G.; Durham, Michael H.
1991-01-01
An experimental effort in aeroelasticity, called the Benchmark Models Program, was implemented. The primary purpose of this program is to provide the necessary data to evaluate computational fluid dynamics codes for aeroelastic analysis. It also focuses on increasing the understanding of the physics of unsteady flows and providing data for empirical design. An overview of this program is given and some results obtained in the initial tests are highlighted. The completed tests include measurement of unsteady pressures during flutter of a rigid wing with a NACA 0012 airfoil section, and dynamic response measurements of a flexible rectangular wing with a thick circular-arc airfoil undergoing shock boundary-layer oscillations.
Adsorption structures and energetics of molecules on metal surfaces: Bridging experiment and theory
NASA Astrophysics Data System (ADS)
Maurer, Reinhard J.; Ruiz, Victor G.; Camarillo-Cisneros, Javier; Liu, Wei; Ferri, Nicola; Reuter, Karsten; Tkatchenko, Alexandre
2016-05-01
Adsorption geometry and stability of organic molecules on surfaces are key parameters that determine the observable properties and functions of hybrid inorganic/organic systems (HIOSs). Despite many recent advances in precise experimental characterization and improvements in first-principles electronic structure methods, reliable databases of structures and energetics for large adsorbed molecules are largely lacking. In this review, we present such a database for a range of molecules adsorbed on metal single-crystal surfaces. The systems we analyze include noble-gas atoms, conjugated aromatic molecules, carbon nanostructures, and heteroaromatic compounds adsorbed on five different metal surfaces. The overall objective is to establish a diverse benchmark dataset that enables an assessment of current and future electronic structure methods, and motivates further experimental studies that provide ever more reliable data. Specifically, the benchmark structures and energetics from experiment are here compared with the recently developed van der Waals (vdW) inclusive density-functional theory (DFT) method, DFT + vdWsurf. Compared with 23 experimental adsorption heights and 17 experimental adsorption energies, we find mean average deviations of 0.06 Å and 0.16 eV, respectively. This confirms the DFT + vdWsurf method as an accurate and efficient approach to treat HIOSs. A detailed discussion identifies remaining challenges to be addressed in future development of electronic structure methods, for which the benchmark database presented here may serve as an important reference.
Development of risk-based nanomaterial groups for occupational exposure control
NASA Astrophysics Data System (ADS)
Kuempel, E. D.; Castranova, V.; Geraci, C. L.; Schulte, P. A.
2012-09-01
Given the almost limitless variety of nanomaterials, it will be virtually impossible to assess the possible occupational health hazard of each nanomaterial individually. The development of science-based hazard and risk categories for nanomaterials is needed for decision-making about exposure control practices in the workplace. A possible strategy would be to select representative (benchmark) materials from various mode of action (MOA) classes, evaluate the hazard and develop risk estimates, and then apply a systematic comparison of new nanomaterials with the benchmark materials in the same MOA class. Poorly soluble particles are used here as an example to illustrate quantitative risk assessment methods for possible benchmark particles and occupational exposure control groups, given mode of action and relative toxicity. Linking such benchmark particles to specific exposure control bands would facilitate the translation of health hazard and quantitative risk information to the development of effective exposure control practices in the workplace. A key challenge is obtaining sufficient dose-response data, based on standard testing, to systematically evaluate the nanomaterials' physical-chemical factors influencing their biological activity. Categorization processes involve both science-based analyses and default assumptions in the absence of substance-specific information. Utilizing data and information from related materials may facilitate initial determinations of exposure control systems for nanomaterials.
New super-computing facility in RIKEN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ohta, Shigemi
1994-12-31
A new supercomputer, Fujitsu VPP500/28, was installed at the Institute of Physical and Chemical Research (RIKEN) at the end of March 1994. It consists of 28 processing elements (PEs) connected by a high-speed crossbar switch. The switch is a combination of GaAs and ECL circuitry with a peak bandwidth of 800 Mbyte per second. Each PE consists of a GaAs/ECL vector processor with 1.6 Gflops peak speed and 256 Mbyte of SRAM local memory. In addition, there are 8 Gbyte of DRAM space, two 100 Gbyte RAID disks, and a 10 Tbyte archive based on the SONY File Bank system. The author ran three major benchmarks on this machine: modified LINPACK, lattice QCD, and FFT. In the modified LINPACK benchmark, a sustained speed of about 28 Gflops is achieved by removing the restriction on the size of the matrices. In the lattice QCD benchmark, a sustained speed of about 30 Gflops is achieved for inverting the staggered fermion propagation matrix on a 32^4 lattice. In the FFT benchmark, real data of 32, 128, 512, and 2048 Mbyte are Fourier-transformed; the sustained speeds are 21, 21, 20, and 19 Gflops, respectively. These numbers were obtained after only a few weeks of coding effort and can be improved further.
Energy benchmarking of commercial buildings: a low-cost pathway toward urban sustainability
NASA Astrophysics Data System (ADS)
Cox, Matt; Brown, Marilyn A.; Sun, Xiaojing
2013-09-01
US cities are beginning to experiment with a regulatory approach to address information failures in the real estate market by mandating the energy benchmarking of commercial buildings. Understanding how a commercial building uses energy has many benefits; for example, it helps building owners and tenants identify poor-performing buildings and subsystems and it enables high-performing buildings to achieve greater occupancy rates, rents, and property values. This paper estimates the possible impacts of a national energy benchmarking mandate through analysis chiefly utilizing the Georgia Tech version of the National Energy Modeling System (GT-NEMS). Correcting input discount rates results in a 4.0% reduction in projected energy consumption for seven major classes of equipment relative to the reference case forecast in 2020, rising to 8.7% in 2035. Thus, the official US energy forecasts appear to overestimate future energy consumption by underestimating investments in energy-efficient equipment. Further discount rate reductions spurred by benchmarking policies yield another 1.3-1.4% in energy savings in 2020, increasing to 2.2-2.4% in 2035. Benchmarking would increase the purchase of energy-efficient equipment, reducing energy bills, CO2 emissions, and conventional air pollution. Achieving comparable CO2 savings would require more than tripling existing US solar capacity. Our analysis suggests that nearly 90% of the energy saved by a national benchmarking policy would benefit metropolitan areas, and the policy’s benefits would outweigh its costs, both to the private sector and society broadly.
Benchmark study on glyphosate-resistant crop systems in the United States. Part 2: Perspectives.
Owen, Micheal D K; Young, Bryan G; Shaw, David R; Wilson, Robert G; Jordan, David L; Dixon, Philip M; Weller, Stephen C
2011-07-01
A six-state, 5 year field project was initiated in 2006 to study weed management methods that foster the sustainability of genetically engineered (GE) glyphosate-resistant (GR) crop systems. The benchmark study field-scale experiments were initiated following a survey, conducted in the winter of 2005-2006, of farmer opinions on weed management practices and their views on GR weeds and management tactics. The main survey findings supported the premise that growers were generally less aware of the significance of evolved herbicide resistance and did not have a high recognition of the strong selection pressure from herbicides on the evolution of herbicide-resistant (HR) weeds. The results of the benchmark study survey indicated that there are educational challenges to implement sustainable GR-based crop systems and helped guide the development of the field-scale benchmark study. Paramount is the need to develop consistent and clearly articulated science-based management recommendations that enable farmers to reduce the potential for HR weeds. This paper provides background perspectives about the use of GR crops, the impact of these crops and an overview of different opinions about the use of GR crops on agriculture and society, as well as defining how the benchmark study will address these issues. Copyright © 2011 Society of Chemical Industry.
TRIPOLI-4® - MCNP5 ITER A-lite neutronic model benchmarking
NASA Astrophysics Data System (ADS)
Jaboulay, J.-C.; Cayla, P.-Y.; Fausser, C.; Lee, Y.-K.; Trama, J.-C.; Li-Puma, A.
2014-06-01
The aim of this paper is to present the capability of TRIPOLI-4®, the CEA Monte Carlo code, to model a large-scale fusion reactor with a complex neutron source and geometry. In the past, numerous benchmarks were conducted to assess TRIPOLI-4® for fusion applications: analyses of experiments (KANT, OKTAVIAN, FNG) and numerical benchmarks (between TRIPOLI-4® and MCNP5) on the HCLL DEMO2007 and ITER models were carried out successively. In the previous ITER benchmark, however, only the neutron wall loading was analyzed; its main purpose was to present the extension of MCAM (the FDS Team CAD import tool) for TRIPOLI-4®. Starting from that work, a more extensive benchmark covering the neutron flux, the nuclear heating in the shielding blankets, and the tritium production rate in the European TBMs (HCLL and HCPB) has been performed and is presented in this paper. The methodology to build the TRIPOLI-4® A-lite model is based on MCAM and the MCNP A-lite model (version 4.1). Simplified TBMs (from KIT) have been integrated in the equatorial port. Comparisons of neutron wall loading, flux, nuclear heating, and tritium production rate show good agreement between the two codes; the discrepancies fall mainly within the statistical errors of the Monte Carlo codes.
Hogan, Bridget; Keating, Matthew; Chambers, Neil A; von Ungern-Sternberg, Britta
2016-05-01
There are no internationally accepted guidelines on what constitutes adequate clinical exposure during pediatric anesthetic training. In Australia, no data have been published on the level of experience obtained by anesthetic trainees in pediatric anesthesia. There is, however, a new ANZCA (Australian and New Zealand College of Anaesthetists) curriculum that quantifies new training requirements. Our aim was to quantify our trainees' exposure to clinical work, in order to assess compliance with the new curriculum and to provide other institutions with a benchmark for pediatric anesthetic training. We performed a prospective audit to estimate and quantify our anesthetic registrars' exposure to pediatric anesthesia during their 6-month rotation at our institution, a tertiary pediatric hospital in Perth, Western Australia. Our data suggest that trainees at our institution will comfortably achieve the new ANZCA training standards in terms of the required volume and breadth of exposure. Exposure to some advanced pediatric anesthetic procedures, however, appears limited. Experience gained at our hospital easily meets the new College requirements, but experience of fiber-optic intubation and regional blocks appears too limited to develop adequate skills or confidence. The study provides other institutions with information to benchmark against their own trainee experience. © 2016 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Grenier, Christophe; Roux, Nicolas; Anbergen, Hauke; Collier, Nathaniel; Costard, Francois; Ferry, Michel; Frampton, Andrew; Frederick, Jennifer; Holmen, Johan; Jost, Anne; Kokh, Samuel; Kurylyk, Barret; McKenzie, Jeffrey; Molson, John; Orgogozo, Laurent; Rivière, Agnès; Rühaak, Wolfram; Selroos, Jan-Olof; Therrien, René; Vidstrand, Patrik
2015-04-01
The impacts of climate change in boreal regions have received considerable attention recently due to the warming trends that have been experienced in recent decades and are expected to intensify in the future. Large portions of these regions, corresponding to permafrost areas, are covered by water bodies (lakes, rivers) that interact with the surrounding permafrost. For example, the thermal state of the surrounding soil influences the energy and water budget of the surface water bodies. Also, these water bodies generate taliks (unfrozen zones below) that disturb the thermal regimes of permafrost and may play a key role in the context of climate change. Recent field studies and modeling exercises indicate that a fully coupled 2D or 3D Thermo-Hydraulic (TH) approach is required to understand and model the past and future evolution of landscapes, rivers, lakes and associated groundwater systems in a changing climate. However, there is presently a paucity of 3D numerical studies of permafrost thaw and associated hydrological changes, which can be partly attributed to the difficulty of verifying multi-dimensional results produced by numerical models. Numerical approaches can only be validated against analytical solutions for a purely thermal 1D equation with phase change (e.g. Neumann, Lunardini). When it comes to the coupled TH system (coupling two highly non-linear equations), the only possible approach is to compare the results from different codes on shared test cases and/or against controlled experiments for validation. Such inter-code comparisons can drive discussions on improving code performance. A benchmark exercise was initiated in 2014 with a kick-off meeting in Paris in November. Participants from the USA, Canada, Germany, Sweden and France convened, representing altogether 13 simulation codes. The benchmark exercises consist of several test cases inspired by the existing literature (e.g. McKenzie et al., 2007) as well as new ones.
They range from simpler, purely thermal cases (benchmark T1) to more complex, coupled 2D TH cases (benchmarks TH1, TH2, and TH3). Some experimental cases conducted in a cold room complement the validation approach. A web site hosted by LSCE (Laboratoire des Sciences du Climat et de l'Environnement) serves as an interaction platform for the participants and hosts the test cases database at the following address: https://wiki.lsce.ipsl.fr/interfrost. The results of the first stage of the benchmark exercise will be presented. We will mainly focus on the inter-comparison of participants' results for the coupled cases (TH1, TH2 & TH3). Further perspectives of the exercise will also be presented. Extensions to more complex physical conditions (e.g. unsaturated conditions and geometrical deformations) are contemplated. In addition, 1D vertical cases of interest to the climate modeling community will be proposed. Keywords: Permafrost; Numerical modeling; River-soil interaction; Arctic systems; soil freeze-thaw
Building Dynamic Conceptual Physics Understanding
ERIC Educational Resources Information Center
Trout, Charlotte; Sinex, Scott A.; Ragan, Susan
2011-01-01
Models are essential to the learning and doing of science, and systems thinking is key to appreciating many environmental issues. The National Science Education Standards include models and systems in their unifying concepts and processes standard, while the AAAS Benchmarks include them in their common themes chapter. Hyerle and Marzano argue for…
Assessment of soil health in the central claypan region, Missouri
USDA-ARS?s Scientific Manuscript database
Assessment of soil health involves determining how well a soil is performing its biological, chemical, and physical functions relative to its inherent potential. Within the Central Claypan Region of Missouri, the Salt River Basin was selected as a benchmark watershed to assess long-term effects of c...
Benchmarking and Modeling of a Conventional Mid-Size Car Using ALPHA (SAE Paper 2015-01-1140)
The Advanced Light-Duty Powertrain and Hybrid Analysis (ALPHA) modeling tool was created by EPA to estimate greenhouse gas (GHG) emissions of light-duty vehicles. ALPHA is a physics-based, forward-looking, full vehicle computer simulation capable of analyzing various vehicle type...
Benchmark studies of induced radioactivity produced in LHC materials, Part II: Remanent dose rates.
Brugger, M; Khater, H; Mayer, S; Prinz, A; Roesler, S; Ulrici, L; Vincke, H
2005-01-01
A new method to estimate remanent dose rates, to be used with the Monte Carlo code FLUKA, was benchmarked against measurements from an experiment performed at the CERN-EU high-energy reference field facility. An extensive collection of samples of different materials was placed downstream of, and laterally to, a copper target intercepting a positively charged mixed hadron beam with a momentum of 120 GeV/c. Emphasis was put on the reduction of uncertainties by taking measures such as careful monitoring of the irradiation parameters, using different instruments to measure dose rates, adopting detailed elemental analyses of the irradiated materials and making detailed simulations of the irradiation experiment. The measured and calculated dose rates are in good agreement.
Physical and numerical studies of a fracture system model
NASA Astrophysics Data System (ADS)
Piggott, Andrew R.; Elsworth, Derek
1989-03-01
Physical and numerical studies of transient flow in a model of discretely fractured rock are presented. The physical model is a thermal analogue to fractured media flow consisting of idealized disc-shaped fractures. The numerical model is used to predict the behavior of the physical model. The use of different insulating materials to encase the physical model allows the effects of differing leakage magnitudes to be examined. A procedure for determining appropriate leakage parameters is documented. These parameters are used in forward analysis to predict the thermal response of the physical model. Knowledge of the leakage parameters and of the temporal variation of the boundary conditions is shown to be essential to an accurate prediction. Favorable agreement is illustrated between numerical and physical results. The physical model provides a data source for the benchmarking of alternative numerical algorithms.
Physical modeling of the effects of climate change on freshwater lenses
NASA Astrophysics Data System (ADS)
Stoeckl, L.; Houben, G.
2012-04-01
The investigation of the fragile equilibrium between fresh and saline water on oceanic islands is of major importance for a sustainable management and protection of freshwater lenses. Overexploitation will lead to salt water intrusion (up-coning), in turn causing damage to or even destruction of a lens in the long term. We have performed a series of experiments on the laboratory scale to investigate and visualize processes of freshwater lenses under different boundary conditions. In addition, these scenarios were numerically simulated using the finite-element model FEFLOW. Results were also compared to analytical solutions for problems regarding e.g. mean travel times of flow paths within a freshwater lens. On the laboratory scale, a cross section of an island was simulated by setting up a sand-box model (200 cm x 50 cm x 5 cm). Lens dynamics are driven by the density contrast between saline and fresh water, the recharge rate and the Kf-values of the medium. We used a time-dependent, sequential application of the tracers uranine, eosine and indigotine to represent different recharge events. With a stepwise increase of freshwater recharge, we could show that the maximum thickness of the lens increased non-linearly. Moreover, we measured that the degradation of a freshwater lens after turning off the precipitation does not follow the same function as its development does. This means that a steady-state freshwater lens does not degrade as fast as it develops under constant recharge. On the other hand, we could show that this does not hold for a partial degradation of the lens due to transient forcings, like anthropogenic pumping or climate change. This is because the recovery to equilibrium is always a quasi-asymptotic process. Thus, re-equilibration to steady state after e.g. a drought will take longer than the degradation during the drought itself. This behavior could also be verified with the numerical finite-element model FEFLOW.
In addition, numerical simulations will be used to close the gap between laboratory results and future field investigations. For example, impacts due to sea level rise induced by climate change can be up-scaled and compared to the results achieved from physical experiments. Analytical models (e.g. Fetter 1972, Vacher et al. 1990, Chesnaux & Allen 2007) were used as benchmarks in our investigations. Models in general are simplifications of a real situation trying to display the relevant processes. For further investigations it is planned to compare different models and generate new benchmark experiments to improve the accuracy of existing models.
Benchmark tests of JENDL-3.2 for thermal and fast reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takano, Hideki; Akie, Hiroshi; Kikuchi, Yasuyuki
1994-12-31
Benchmark calculations for a variety of thermal and fast reactors have been performed using the newly evaluated JENDL-3 Version 2 (JENDL-3.2) file. In the thermal reactor calculations for the uranium- and plutonium-fueled cores of TRX and TCA, k_eff and the lattice parameters were well predicted. The fast reactor calculations for the ZPPR-9 and FCA assemblies showed that k_eff, the reactivity worths of Doppler, sodium void and control rods, and the reaction rate distributions were in very good agreement with the experiments.
The National Practice Benchmark for oncology, 2014 report on 2013 data.
Towle, Elaine L; Barr, Thomas R; Senese, James L
2014-11-01
The National Practice Benchmark (NPB) is a unique tool to measure oncology practices against others across the country in a way that allows meaningful comparisons despite differences in practice size or setting. In today's economic environment every oncology practice, regardless of business structure or affiliation, should be able to produce, monitor, and benchmark basic metrics to meet current business pressures for increased efficiency and efficacy of care. Although we recognize that the NPB survey results do not capture the experience of all oncology practices, practices that can and do participate demonstrate exceptional managerial capability, and this year those practices are recognized for their participation. In this report, we continue to emphasize the methodology introduced last year in which we reported medical revenue net of the cost of the drugs as net medical revenue for the hematology/oncology product line. The effect of this is to capture only the gross margin attributable to drugs as revenue. New this year, we introduce six measures of clinical data density and expand the radiation oncology benchmarks. Copyright © 2014 by American Society of Clinical Oncology.
Personalized recommendation based on preferential bidirectional mass diffusion
NASA Astrophysics Data System (ADS)
Chen, Guilin; Gao, Tianrun; Zhu, Xuzhen; Tian, Hui; Yang, Zhao
2017-03-01
Recommendation system provides a promising way to alleviate the dilemma of information overload. In physical dynamics, mass diffusion has been used to design effective recommendation algorithms on bipartite network. However, most of the previous studies focus overwhelmingly on unidirectional mass diffusion from collected objects to uncollected objects, while overlooking the opposite direction, leading to the risk of similarity estimation deviation and performance degradation. In addition, they are biased towards recommending popular objects which will not necessarily promote the accuracy but make the recommendation lack diversity and novelty that indeed contribute to the vitality of the system. To overcome the aforementioned disadvantages, we propose a preferential bidirectional mass diffusion (PBMD) algorithm by penalizing the weight of popular objects in bidirectional diffusion. Experiments are evaluated on three benchmark datasets (Movielens, Netflix and Amazon) by 10-fold cross validation, and results indicate that PBMD remarkably outperforms the mainstream methods in accuracy, diversity and novelty.
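The mass-diffusion process described in this abstract can be sketched in a few lines. The following is a hypothetical reimplementation from the abstract alone, not the authors' PBMD code: it performs one object-to-user-to-object diffusion round trip on the bipartite adjacency matrix and penalizes popular objects with an assumed exponent `theta`; the paper's bidirectional similarity estimation differs in detail.

```python
import numpy as np

def pbmd_scores(A, theta=0.8):
    """Mass diffusion with a popularity penalty on a user-object
    bipartite network (illustrative sketch, see lead-in).

    A: 0/1 user-object adjacency matrix, shape (n_users, n_objects).
    theta: assumed popularity-penalty exponent.
    Returns recommendation scores with collected objects masked to 0.
    """
    A = np.asarray(A, dtype=float)
    ku = A.sum(axis=1)                       # user degrees
    ko = A.sum(axis=0)                       # object degrees (popularity)
    ku_safe = np.where(ku == 0, 1.0, ku)
    ko_safe = np.where(ko == 0, 1.0, ko)
    # object -> user: each object splits its unit resource equally
    # among the users who collected it
    R = (A / ko_safe) @ A.T                  # (n_users, n_users)
    # user -> object, then penalize popular objects by ko**theta
    F = (R @ (A / ku_safe[:, None])) / ko_safe**theta
    F[A > 0] = 0.0                           # don't re-recommend collected items
    return F

scores = pbmd_scores([[1, 1, 0],
                      [1, 0, 1],
                      [0, 1, 0]])
```

For user 0, only the uncollected object (column 2) receives a positive score; larger `theta` shifts scores further toward unpopular objects, which is the mechanism the paper credits for improved diversity and novelty.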
Progress in Electromagnetic Alteration of Nuclear Decay Properties
NASA Astrophysics Data System (ADS)
Casperson, R. J.; Hughes, R. O.; Burke, J. T.; Scielzo, N. D.; Soufli, R.
2014-03-01
Significant alteration of nuclear decay properties would have important consequences, ranging from novel approaches to nuclear batteries and gamma-ray lasers, to improved viability for physics experiments with short-lived targets. Quantum systems that decay by photon emission must couple to the electromagnetic modes of the local environment, and by modifying these modes, one can manipulate the rate of spontaneous emission. The nuclear isomer 235mU is low-energy, long-lived, and is easily populated through 239Pu α-decay, which makes it an excellent benchmark for this effect. The decay rate of this isomer in a variety of environments is currently under investigation. Implications of this work will be discussed, and first results will be presented. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
Progress in Electromagnetic Alteration of Nuclear Decay Properties
NASA Astrophysics Data System (ADS)
Casperson, R. J.; Burke, J. T.; Hughes, R. O.; Scielzo, N. D.; Soufli, R.
2013-10-01
Significant alteration of nuclear decay properties would have important consequences, ranging from novel approaches to nuclear batteries and gamma-ray lasers, to improved viability for physics experiments with short-lived targets. Quantum systems that decay by photon emission must couple to the electromagnetic modes of the local environment, and by modifying these modes, one can manipulate the rate of spontaneous emission. The nuclear isomer 235mU is low-energy, long-lived, and is easily populated through 239Pu α-decay, which makes it an excellent benchmark for this effect. The decay rate of this isomer in a variety of environments is currently under investigation. Implications of this work will be discussed, and first results will be presented. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
A Test Apparatus for the MAJORANA DEMONSTRATOR Front-end Electronics
NASA Astrophysics Data System (ADS)
Singh, Harjit; Loach, James; Poon, Alan
2012-10-01
One of the most important experimental programs in neutrino physics is the search for neutrinoless double-beta decay. The MAJORANA collaboration is searching for this rare nuclear process in the Ge-76 isotope using HPGe detectors. Each detector is instrumented with high-performance electronics to read out and amplify the signals. The part of the electronics close to the detectors, consisting of a novel front-end circuit, cables and connectors, is made of radio-pure materials and is exceedingly delicate. In this work a dedicated test apparatus was created to benchmark the performance of the electronics before installation in the experiment. The apparatus was designed for cleanroom use, with fixtures to hold the components without contaminating them, and included the electronics necessary for power and readout. In addition to testing, the station will find longer term use in development of future versions of the electronics.
The spectral cell method in nonlinear earthquake modeling
NASA Astrophysics Data System (ADS)
Giraldo, Daniel; Restrepo, Doriam
2017-12-01
This study examines the applicability of the spectral cell method (SCM) to compute the nonlinear earthquake response of complex basins. SCM combines fictitious-domain concepts with the spectral version of the finite element method to solve the wave equations in heterogeneous geophysical domains. Nonlinear behavior is considered by implementing the Mohr-Coulomb and Drucker-Prager yielding criteria. We illustrate the performance of SCM with numerical examples of nonlinear basins exhibiting physically and computationally challenging conditions. The numerical experiments are benchmarked against overkill solutions and against MIDAS GTS NX, a finite element software package for geotechnical applications, and our findings show good agreement between the two sets of results. Traditional spectral element implementations allow points per wavelength as low as PPW = 4.5 for high-order polynomials. Our findings show that in the presence of nonlinearity, high-order polynomials (p ≥ 3) require mesh resolutions of PPW ≥ 10 to keep displacement errors below 10%.
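The PPW requirement above translates directly into a mesh-sizing rule. This hypothetical helper (not from the paper's code) applies the standard spectral-element rule of thumb that an order-p element edge spans p node intervals, so the maximum element size is h = p · λ_min / PPW, with λ_min the shortest wavelength to resolve:

```python
def max_element_size(v_min, f_max, ppw, p):
    """Largest element size h (same length units as v_min/f_max imply)
    that resolves the shortest wavelength with `ppw` points per
    wavelength, for spectral elements of polynomial order `p`.
    Illustrative rule of thumb, not the paper's meshing code.

    v_min: slowest wave speed in the basin (m/s)
    f_max: highest frequency to resolve (Hz)
    """
    lam_min = v_min / f_max        # shortest propagating wavelength (m)
    return p * lam_min / ppw       # p node intervals per element edge

# assumed example values: v_min = 200 m/s, f_max = 10 Hz, order p = 3
h_linear = max_element_size(200.0, 10.0, 4.5, 3)   # PPW = 4.5 rule
h_nonlin = max_element_size(200.0, 10.0, 10.0, 3)  # PPW >= 10 rule
```

With these assumed values, tightening PPW from 4.5 to 10 shrinks the allowable element from about 13.3 m to 6 m, which conveys why the nonlinear requirement is computationally significant.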
Dong, Nianbo; Lipsey, Mark W
2017-01-01
It is unclear whether propensity score analysis (PSA) based on pretest and demographic covariates will meet the ignorability assumption for replicating the results of randomized experiments. This study applies within-study comparisons to assess whether pre-Kindergarten (pre-K) treatment effects on achievement outcomes estimated using PSA based on a pretest and demographic covariates can approximate those found in a randomized experiment. Data: Four studies with samples of pre-K children each provided data on two math achievement outcome measures with baseline pretests and child demographic variables that included race, gender, age, language spoken at home, and mother's highest education. Research Design and Data Analysis: A randomized study of a pre-K math curriculum provided benchmark estimates of effects on achievement measures. Comparison samples from other pre-K studies were then substituted for the original randomized control and the effects were reestimated using PSA. The correspondence was evaluated using multiple criteria. The effect estimates using PSA were in the same direction as the benchmark estimates, had similar but not identical statistical significance, and did not differ from the benchmarks at statistically significant levels. However, the magnitude of the effect sizes differed and displayed both absolute and relative bias larger than required to show statistical equivalence with formal tests, but those results were not definitive because of the limited statistical power. We conclude that treatment effect estimates based on a single pretest and demographic covariates in PSA correspond to those from a randomized experiment on the most general criteria for equivalence.
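The core idea behind PSA-based effect estimation can be illustrated with a minimal inverse-propensity-weighting estimator. This is a generic Horvitz-Thompson sketch, not the study's actual estimator or data; the propensity scores `e` are assumed to have been estimated elsewhere (e.g. by logistic regression on the pretest and demographic covariates):

```python
import numpy as np

def ipw_ate(y, t, e):
    """Inverse-propensity-weighted average treatment effect
    (Horvitz-Thompson form). Illustrative sketch, see lead-in.

    y: outcomes
    t: 0/1 treatment indicator
    e: propensity scores P(treated | covariates), in (0, 1)
    """
    y, t, e = (np.asarray(a, dtype=float) for a in (y, t, e))
    # weight treated outcomes by 1/e and controls by 1/(1-e) so each
    # group stands in for the full covariate distribution
    return np.mean(t * y / e) - np.mean((1 - t) * y / (1 - e))

# toy data with constant propensity 0.5 (as in a randomized experiment)
ate = ipw_ate(y=[3, 1, 2, 0], t=[1, 0, 1, 0], e=[0.5, 0.5, 0.5, 0.5])
```

When the propensity model is correct, this estimate is unbiased for the randomized benchmark; the within-study comparisons in the abstract probe exactly whether a pretest plus demographics makes that model good enough in practice.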
The environment for women in physics in Ireland
NASA Astrophysics Data System (ADS)
McLoughlin, Eilish; Fee, Sandra; McCabe, Eithne
2015-12-01
Physics is contributing strongly to the national Irish economy, with 4.5% of the Irish workforce employed in physics-based or other science, technology, engineering, and math (STEM) sectors. However, a recent national report reveals that women make up less than 25% of the workforce in jobs that utilize STEM skills. We present data collected on the views of 1,000 female secondary school students, young women (age 18-23), secondary-school teachers and parents on what influences secondary school students' choices of subjects, and in particular STEM-related subjects. In addition, benchmarking data on female student and staff ratios over the past five years are presented from all seven Irish university physics departments.
Inter-model analysis of tsunami-induced coastal currents
NASA Astrophysics Data System (ADS)
Lynett, Patrick J.; Gately, Kara; Wilson, Rick; Montoya, Luis; Arcas, Diego; Aytore, Betul; Bai, Yefei; Bricker, Jeremy D.; Castro, Manuel J.; Cheung, Kwok Fai; David, C. Gabriel; Dogan, Gozde Guney; Escalante, Cipriano; González-Vida, José Manuel; Grilli, Stephan T.; Heitmann, Troy W.; Horrillo, Juan; Kânoğlu, Utku; Kian, Rozita; Kirby, James T.; Li, Wenwen; Macías, Jorge; Nicolsky, Dmitry J.; Ortega, Sergio; Pampell-Manis, Alyssa; Park, Yong Sung; Roeber, Volker; Sharghivand, Naeimeh; Shelby, Michael; Shi, Fengyan; Tehranirad, Babak; Tolkova, Elena; Thio, Hong Kie; Velioğlu, Deniz; Yalçıner, Ahmet Cevdet; Yamazaki, Yoshiki; Zaytsev, Andrey; Zhang, Y. J.
2017-06-01
To help produce accurate and consistent maritime hazard products, the National Tsunami Hazard Mitigation Program organized a benchmarking workshop to evaluate the numerical modeling of tsunami currents. Thirteen teams of international researchers, using a set of tsunami models currently utilized for hazard mitigation studies, presented results for a series of benchmarking problems; these results are summarized in this paper. Comparisons focus on physical situations where the currents are shear and separation driven, and are thus de-coupled from the incident tsunami waveform. In general, we find that models of increasing physical complexity provide better accuracy, and that low-order three-dimensional models are superior to high-order two-dimensional models. Inside separation zones and in areas strongly affected by eddies, the magnitude of both model-data errors and inter-model differences can be the same as the magnitude of the mean flow. Thus, we make arguments for the need of an ensemble modeling approach for areas affected by large-scale turbulent eddies, where deterministic simulation may be misleading. As a result of the analyses presented herein, we expect that tsunami modelers now have a better awareness of their ability to accurately capture the physics of tsunami currents, and therefore a better understanding of how to use these simulation tools for hazard assessment and mitigation efforts.
ERIC Educational Resources Information Center
McConeghy, Kevin; Wing, Coady; Wong, Vivian C.
2015-01-01
Randomized experiments have long been established as the gold standard for addressing causal questions. However, experiments are not always feasible or desired, so observational methods are also needed. When multiple observations on the same variable are available, a repeated measures design may be used to assess whether a treatment administered…
Benchmark Eye Movement Effects during Natural Reading in Autism Spectrum Disorder
ERIC Educational Resources Information Center
Howard, Philippa L.; Liversedge, Simon P.; Benson, Valerie
2017-01-01
In 2 experiments, eye tracking methodology was used to assess on-line lexical, syntactic and semantic processing in autism spectrum disorder (ASD). In Experiment 1, lexical identification was examined by manipulating the frequency of target words. Both typically developed (TD) and ASD readers showed normal frequency effects, suggesting that the…
Precision Spectroscopy of Molecular Hydrogen and the Search for New Physics
NASA Astrophysics Data System (ADS)
Ubachs, Wim
2017-06-01
The hydrogen molecule is the smallest neutral chemical entity and a benchmark system of molecular spectroscopy. The comparison of highly accurate measurements of transition frequencies and level energies with quantum calculations including all known phenomena (relativistic effects, vacuum polarization, and self-energy) provides a tool to search for physical phenomena in the realm of the unknown: are there forces beyond the three included in the Standard Model of physics plus gravity [1]? Are there extra dimensions beyond the 3+1 describing spacetime [2]? Laboratory wavelengths of transitions in hydrogen may also be compared with the lines observed during the epoch of the early Universe to verify whether fundamental constants of Nature have varied over cosmological time [3]. These concepts, as well as the precision laboratory experiments and the astronomical observations used for such searches for new physics [4], will be discussed. [1] E.J. Salumbides, J.C.J. Koelemeij, J. Komasa, K. Pachucki, K.S.E. Eikema, W. Ubachs, Bounds on fifth forces from precision measurements on molecules, Phys. Rev. D 87, 112008 (2013). [2] E.J. Salumbides, A.N. Schellekens, B. Gato-Rivera, W. Ubachs, Constraints on extra dimensions from molecular spectroscopy, New J. Phys. 17, 033015 (2015). [3] W. Ubachs, J. Bagdonaite, E.J. Salumbides, M.T. Murphy, L. Kaper, Search for a drifting proton-electron mass ratio from H_2, Rev. Mod. Phys. 88, 021003 (2016). [4] W. Ubachs, J.C.J. Koelemeij, K.S.E. Eikema, E.J. Salumbides, Physics beyond the Standard Model from hydrogen spectroscopy, J. Mol. Spectrosc. 320, 1 (2016).
Guthrie, Nicole; Bradlyn, Andrew; Thompson, Sharon K; Yen, Sophia; Haritatos, Jana; Dillon, Fred; Cole, Steve W
2015-01-01
Most adolescents do not achieve the recommended levels of moderate-to-vigorous physical activity (MVPA), placing them at increased risk for a diverse array of chronic diseases in adulthood. There is a great need for scalable and effective interventions that can increase MVPA in adolescents. Here we report the results of a measurement validation study and a preliminary proof-of-concept experiment testing the impact of Zamzee, an accelerometer-linked online intervention system that combines proximal performance feedback and incentive motivation features to promote MVPA. In a calibration study that parametrically varied levels of physical activity in 31 12-14 year-old children, the Zamzee activity meter was shown to provide a valid measure of MVPA (sensitivity in detecting MVPA = 85.9%, specificity = 97.5%, and r = .94 correspondence with the benchmark RT3 accelerometer system; all p < .0001). In a subsequent randomized controlled multi-site experiment involving 182 middle school-aged children assessed for MVPA over 6 wks, intent-to-treat analyses found that those who received access to the Zamzee intervention had average MVPA levels 54% greater than those of a passive control group (p < 0.0001) and 68% greater than those of an active control group that received access to a commercially available active videogame (p < .0001). Zamzee's effects on MVPA did not diminish significantly over the course of the 6-wk study period, and were statistically significant in both females and males, and in normal- vs. high-BMI subgroups. These results provide promising initial indications that combining the Zamzee activity meter with online proximal performance feedback and incentive motivation features can positively impact MVPA levels in adolescents.
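The validation metrics reported in this abstract (sensitivity and specificity against a criterion accelerometer) are straightforward to compute. The sketch below is not the Zamzee analysis; the minute-by-minute classifications are invented toy data, and it simply shows how such agreement figures are derived from paired binary judgments:

```python
import numpy as np

def sensitivity_specificity(truth, pred):
    """Fraction of criterion positives detected, and criterion negatives rejected."""
    truth, pred = np.asarray(truth, bool), np.asarray(pred, bool)
    sens = (truth & pred).sum() / truth.sum()        # true positives / all positives
    spec = (~truth & ~pred).sum() / (~truth).sum()   # true negatives / all negatives
    return sens, spec

# Hypothetical minute-by-minute classifications: 1 = MVPA, 0 = not MVPA.
reference = [1, 1, 1, 0, 0, 0, 1, 0]   # criterion device (e.g. a reference accelerometer)
candidate = [1, 1, 0, 0, 0, 0, 1, 0]   # device under validation

sens, spec = sensitivity_specificity(reference, candidate)
print(sens, spec)   # sensitivity 0.75, specificity 1.0
```

In a real calibration study each minute of wear time contributes one pair, so these ratios are computed over thousands of epochs rather than eight.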
FDNS CFD Code Benchmark for RBCC Ejector Mode Operation: Continuing Toward Dual Rocket Effects
NASA Technical Reports Server (NTRS)
West, Jeff; Ruf, Joseph H.; Turner, James E. (Technical Monitor)
2000-01-01
Computational Fluid Dynamics (CFD) analysis results are compared with benchmark-quality test data from the Propulsion Engineering Research Center's (PERC) Rocket Based Combined Cycle (RBCC) experiments to verify fluid dynamic code and application procedures. RBCC engine flowpath development will rely on CFD applications to capture the multi-dimensional fluid dynamic interactions and to quantify their effect on the RBCC system performance. Therefore, the accuracy of these CFD codes must be determined through detailed comparisons with test data. The PERC experiments build upon the well-known 1968 rocket-ejector experiments of Odegaard and Stroup by employing advanced optical and laser-based diagnostics to evaluate mixing and secondary combustion. The Finite Difference Navier Stokes (FDNS) code [2] was used to model the fluid dynamics of the PERC RBCC ejector mode configuration. Analyses were performed for the Diffusion and Afterburning (DAB) test conditions at the 200-psia thruster operation point. Results with and without downstream fuel injection are presented.
NASA Technical Reports Server (NTRS)
Lockard, David P.
2011-01-01
Fifteen submissions in the tandem cylinders category of the First Workshop on Benchmark problems for Airframe Noise Computations are summarized. Although the geometry is relatively simple, the problem involves complex physics. Researchers employed various block-structured, overset, unstructured and embedded Cartesian grid techniques and considerable computational resources to simulate the flow. The solutions are compared against each other and experimental data from 2 facilities. Overall, the simulations captured the gross features of the flow, but resolving all the details which would be necessary to compute the noise remains challenging. In particular, how to best simulate the effects of the experimental transition strip, and the associated high Reynolds number effects, was unclear. Furthermore, capturing the spanwise variation proved difficult.
Benchmark solution of the dynamic response of a spherical shell at finite strain
DOE Office of Scientific and Technical Information (OSTI.GOV)
Versino, Daniele; Brock, Jerry S.
2016-09-28
Our paper describes the development of high-fidelity solutions for the study of homogeneous (elastic and inelastic) spherical shells subject to dynamic loading and undergoing finite deformations. The goal of the activity is to provide high-accuracy results that can be used as benchmark solutions for the verification of computational physics codes. The equilibrium equations for the geometrically non-linear problem are solved through mode expansion of the displacement field, and the boundary conditions are enforced in strong form. Time integration is performed through high-order implicit Runge–Kutta schemes. Finally, we evaluate the accuracy and convergence of the proposed method by means of numerical examples with finite deformations, material non-linearities, and inelasticity.
MC21 analysis of the MIT PWR benchmark: Hot zero power results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kelly III, D. J.; Aviles, B. N.; Herman, B. R.
2013-07-01
MC21 Monte Carlo results have been compared with hot zero power measurements from an operating pressurized water reactor (PWR), as specified in a new full-core PWR performance benchmark from the MIT Computational Reactor Physics Group. Included in the comparisons are axially integrated full-core detector measurements, axial detector profiles, control rod bank worths, and temperature coefficients. Power depressions from grid spacers are seen clearly in the MC21 results. Application of Coarse Mesh Finite Difference (CMFD) acceleration within MC21 has been accomplished, resulting in a significant reduction in the number of inactive batches necessary to converge the fission source. CMFD acceleration has also been shown to work seamlessly with the Uniform Fission Site (UFS) variance reduction method.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maekawa, Fujio; Meigo, Shin-ichiro; Kasugai, Yoshimi
2005-05-15
A neutronic benchmark experiment on a simulated spallation neutron target assembly was conducted using the Alternating Gradient Synchrotron at Brookhaven National Laboratory and was analyzed to investigate the prediction capability of Monte Carlo simulation codes used in neutronic designs of spallation neutron sources. The target assembly, consisting of a mercury target, a light water moderator, and a lead reflector, was bombarded by 1.94-, 12-, and 24-GeV protons, and the fast neutron flux distributions around the target and the spectra of thermal neutrons leaking from the moderator were measured in the experiment. In this study, the Monte Carlo particle transport simulation codes NMTC/JAM, MCNPX, and MCNP-4A with associated cross-section data in JENDL and LA-150 were verified based on benchmark analysis of the experiment. As a result, all the calculations predicted the measured quantities adequately; calculated integral fluxes of fast and thermal neutrons agreed within approximately ±40% with the experiments, although the overall energy range encompassed more than 12 orders of magnitude. Accordingly, it was concluded that these simulation codes and cross-section data were adequate for neutronics designs of spallation neutron sources.
Barry, Heather E; Campbell, John L; Asprey, Anthea; Richards, Suzanne H
2016-11-01
English National Quality Requirements mandate out-of-hours primary care services to routinely audit patient experience, but do not state how it should be done. We explored how providers collect patient feedback data and use it to inform service provision. We also explored staff views on the utility of out-of-hours questions from the English General Practice Patient Survey (GPPS). A qualitative study was conducted with 31 staff (comprising service managers, general practitioners and administrators) from 11 out-of-hours primary care providers in England, UK. Staff responsible for patient experience audits within their service were sampled and data collected via face-to-face semistructured interviews. Although most providers regularly audited their patients' experiences by using patient surveys, many participants expressed a strong preference for additional qualitative feedback. Staff provided examples of small changes to service delivery resulting from patient feedback, but service-wide changes were not instigated. Perceptions that patients lacked sufficient understanding of the urgent care system in which out-of-hours primary care services operate were common and a barrier to using feedback to enable change. Participants recognised the value of using patient experience feedback to benchmark services, but perceived weaknesses in the out-of-hours items from the GPPS led them to question the validity of using these data for benchmarking in its current form. The lack of clarity around how out-of-hours providers should audit patient experience hinders the utility of the National Quality Requirements. Although surveys were common, patient feedback data had only a limited role in service change. Data derived from the GPPS may be used to benchmark service providers, but refinement of the out-of-hours items is needed.
Technology for the Struggling Reader: Free and Easily Accessible Resources
ERIC Educational Resources Information Center
Berkeley, Sheri; Lindstrom, Jennifer H.
2011-01-01
A fundamental problem for many struggling readers, their parents, and their teachers is that there are few benchmarks to guide decision making about assistive technological supports when the nature of a disability is cognitive (e.g., specific learning disability, SLD) rather than physical. However, resources such as the National Center on…
Design of an instrument to measure the quality of care in Physical Therapy.
Cavalheiro, Leny Vieira; Eid, Raquel Afonso Caserta; Talerman, Claudia; Prado, Cristiane do; Gobbi, Fátima Cristina Martorano; Andreoli, Paola Bruno de Araujo
2015-01-01
To design an instrument composed of domains that demonstrate physical therapy activities and generate a consistent index representing the quality of care in physical therapy. The Lean Six Sigma methodology was used to design the tool. The discussion involved staff from seven different management groups. By means of brainstorming and a Cause & Effect Matrix, we set up the process map. After application of the Cause & Effect Matrix tool, five requirements composed the quality-of-care index in physical therapy. The following requirements were assessed: physical therapist performance, care outcome indicator, adherence to physical therapy protocols, whether the prognosis and treatment outcome were achieved, and infrastructure. The proposed design allowed evaluation of several items related to the physical therapy service, enabling customization, reproducibility, and benchmarking with other organizations. For management, this index provides the opportunity to identify areas for improvement and the strengths of the team and of the physical therapy care process.
Ali, F; Waker, A J; Waller, E J
2014-10-01
Tissue-equivalent proportional counters (TEPC) can potentially be used as a portable and personal dosemeter in mixed neutron and gamma-ray fields, but what hinders this use is their typically large physical size. To formulate compact TEPC designs, the use of a Monte Carlo transport code is necessary to predict the performance of compact designs in these fields. To perform this modelling, three candidate codes were assessed: MCNPX 2.7.E, FLUKA 2011.2 and PHITS 2.24. In each code, benchmark simulations were performed involving the irradiation of a 5-in. TEPC with monoenergetic neutron fields and a 4-in. wall-less TEPC with monoenergetic gamma-ray fields. The frequency and dose mean lineal energies and dose distributions calculated from each code were compared with experimentally determined data. For the neutron benchmark simulations, PHITS produces data closest to the experimental values, and for the gamma-ray benchmark simulations, FLUKA yields data closest to the experimentally determined quantities.
Colliders as a simultaneous probe of supersymmetric dark matter and Terascale cosmology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barenboim, Gabriela; Lykken, Joseph D.
2006-08-01
Terascale supersymmetry has the potential to provide a natural explanation of the dominant dark matter component of the standard ΛCDM cosmology. However, once we impose the constraints on minimal supersymmetry parameters from current particle physics data, a satisfactory dark matter abundance is no longer prima facie natural. This Neutralino Tuning Problem could be a hint of nonstandard cosmology during and/or after the Terascale era. To quantify this possibility, we introduce an alternative cosmological benchmark based upon a simple model of quintessential inflation. This benchmark has no free parameters, so for a given supersymmetry model it allows an unambiguous prediction of the dark matter relic density. As an example, we scan over the parameter space of the CMSSM, comparing the neutralino relic density predictions with the bounds from WMAP. We find that the WMAP-allowed regions of the CMSSM are an order of magnitude larger if we use the alternative cosmological benchmark, as opposed to ΛCDM. Initial results from the CERN Large Hadron Collider will distinguish between the two allowed regions.
The Star Schema Benchmark and Augmented Fact Table Indexing
NASA Astrophysics Data System (ADS)
O'Neil, Patrick; O'Neil, Elizabeth; Chen, Xuedong; Revilak, Stephen
We provide a benchmark measuring star schema queries retrieving data from a fact table with Where clause column restrictions on dimension tables. Clustering is crucial to performance with modern disk technology, since retrievals with filter factors down to 0.0005 are now performed most efficiently by sequential table search rather than by indexed access. DB2’s Multi-Dimensional Clustering (MDC) provides methods to "dice" the fact table along a number of orthogonal "dimensions", but only when these dimensions are columns in the fact table. The diced cells cluster fact rows on several of these "dimensions" at once so queries restricting several such columns can access crucially localized data, with much faster query response. Unfortunately, columns of dimension tables of a star schema are not usually represented in the fact table. In this paper, we show a simple way to adjoin physical copies of dimension columns to the fact table, dicing data to effectively cluster query retrieval, and explain how such dicing can be achieved on database products other than DB2. We provide benchmark measurements to show successful use of this methodology on three commercial database products.
Cosmological Parameters from the QUAD CMB Polarization Experiment
NASA Astrophysics Data System (ADS)
Castro, P. G.; Ade, P.; Bock, J.; Bowden, M.; Brown, M. L.; Cahill, G.; Church, S.; Culverhouse, T.; Friedman, R. B.; Ganga, K.; Gear, W. K.; Gupta, S.; Hinderks, J.; Kovac, J.; Lange, A. E.; Leitch, E.; Melhuish, S. J.; Memari, Y.; Murphy, J. A.; Orlando, A.; Pryke, C.; Schwarz, R.; O'Sullivan, C.; Piccirillo, L.; Rajguru, N.; Rusholme, B.; Taylor, A. N.; Thompson, K. L.; Turner, A. H.; Wu, E. Y. S.; Zemcov, M.; QUaD Collaboration
2009-08-01
In this paper, we present a parameter estimation analysis of the polarization and temperature power spectra from the second and third season of observations with the QUaD experiment. QUaD has for the first time detected multiple acoustic peaks in the E-mode polarization spectrum with high significance. Although QUaD-only parameter constraints are not competitive with previous results for the standard six-parameter ΛCDM cosmology, they do allow meaningful polarization-only parameter analyses for the first time. In a standard six-parameter ΛCDM analysis, we find the QUaD TT power spectrum to be in good agreement with previous results. However, the QUaD polarization data show some tension with ΛCDM. The origin of this 1σ-2σ tension remains unclear, and may point to new physics, residual systematics, or simple random chance. We also combine QUaD with the five-year WMAP data set and the SDSS luminous red galaxies 4th data release power spectrum, and extend our analysis to constrain individual isocurvature mode fractions, constraining cold dark matter density, αcdmi < 0.11 (95% confidence limit (CL)), neutrino density, αndi < 0.26 (95% CL), and neutrino velocity, αnvi < 0.23 (95% CL), modes. Our analysis sets a benchmark for future polarization experiments.
A novel hybrid meta-heuristic technique applied to the well-known benchmark optimization problems
NASA Astrophysics Data System (ADS)
Abtahi, Amir-Reza; Bijari, Afsane
2017-03-01
In this paper, a hybrid meta-heuristic algorithm based on the imperialistic competition algorithm (ICA), harmony search (HS), and simulated annealing (SA) is presented. The body of the proposed hybrid algorithm is based on ICA. The proposed hybrid algorithm inherits the advantages of the harmony-creation process of the HS algorithm to improve the exploitation phase of ICA. In addition, the proposed hybrid algorithm uses SA to strike a balance between the exploration and exploitation phases. The proposed hybrid algorithm is compared with several meta-heuristic methods, including the genetic algorithm (GA), HS, and ICA, on several well-known benchmark instances. Comprehensive experiments and statistical analysis on standard benchmark functions confirm the superiority of the proposed method over the other algorithms. The efficacy of the proposed hybrid algorithm is promising, and it can be applied to several real-life engineering and management problems.
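The simulated-annealing ingredient of hybrids like this one can be illustrated in isolation. The sketch below is not the authors' ICA-HS-SA algorithm; it is a plain Metropolis/SA loop on a one-dimensional Rastrigin function (a standard benchmark), with an assumed initial temperature and cooling rate, showing the acceptance rule such hybrids borrow to balance exploration and exploitation:

```python
import math
import random

def rastrigin_1d(x):
    """A standard continuous benchmark function; global minimum is f(0) = 0."""
    return x * x - 10.0 * math.cos(2.0 * math.pi * x) + 10.0

random.seed(1)
x, fx = 3.0, rastrigin_1d(3.0)      # start in a poor local basin (f(3) = 9)
best_x, best_f = x, fx
T = 5.0                              # initial temperature (assumed, not from the paper)
for _ in range(5000):
    cand = x + random.uniform(-0.5, 0.5)
    fc = rastrigin_1d(cand)
    # Metropolis rule: always accept improvements; accept worse moves
    # with probability exp(-(fc - fx) / T), which shrinks as T cools.
    if fc < fx or random.random() < math.exp((fx - fc) / T):
        x, fx = cand, fc
    if fx < best_f:
        best_x, best_f = x, fx
    T *= 0.999                       # geometric cooling schedule
print(best_x, best_f)
```

Early on, the high temperature lets the walk climb out of the starting basin (exploration); as T decays, only improvements survive (exploitation), which is exactly the balance the hybrid delegates to SA.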
A note on bound constraints handling for the IEEE CEC'05 benchmark function suite.
Liao, Tianjun; Molina, Daniel; de Oca, Marco A Montes; Stützle, Thomas
2014-01-01
The benchmark functions and some of the algorithms proposed for the special session on real parameter optimization of the 2005 IEEE Congress on Evolutionary Computation (CEC'05) have played and still play an important role in the assessment of the state of the art in continuous optimization. In this article, we show that if bound constraints are not enforced for the final reported solutions, state-of-the-art algorithms produce infeasible best candidate solutions for the majority of functions of the IEEE CEC'05 benchmark function suite. This occurs even though the optima of the CEC'05 functions are within the specified bounds. This phenomenon has important implications on algorithm comparisons, and therefore on algorithm designs. This article's goal is to draw the attention of the community to the fact that some authors might have drawn wrong conclusions from experiments using the CEC'05 problems.
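The feasibility issue this note raises is easy to demonstrate. The sketch below uses an invented [-5, 5] box and a simple sphere objective (not one of the actual CEC'05 functions): a perturbation step pushes a candidate out of bounds, and clamping projects it back into the feasible region before it can be reported as a best solution:

```python
import numpy as np

# Hypothetical search domain for a CEC'05-style benchmark: each coordinate in [-5, 5].
LOWER, UPPER = -5.0, 5.0

def sphere(x):
    """Simple stand-in objective (not an actual CEC'05 function)."""
    return float(np.sum(x ** 2))

def clamp(x, lo=LOWER, hi=UPPER):
    """Project a candidate back into the feasible box before evaluating/reporting it."""
    return np.clip(x, lo, hi)

# A mutation or velocity step can carry a candidate outside the bounds ...
candidate = np.array([4.0, -4.8]) + np.array([2.0, -1.0])   # -> [6.0, -5.8]
assert (candidate > UPPER).any() or (candidate < LOWER).any()

# ... so feasibility must be enforced before the solution is reported as "best".
feasible = clamp(candidate)
print(feasible.tolist())   # [5.0, -5.0]
```

Clamping is only one repair strategy (reflection and resampling are common alternatives); the article's point is that some strategy must be applied to the final reported solutions, or comparisons across algorithms are not meaningful.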
Experimental benchmark of kinetic simulations of capacitively coupled plasmas in molecular gases
NASA Astrophysics Data System (ADS)
Donkó, Z.; Derzsi, A.; Korolov, I.; Hartmann, P.; Brandt, S.; Schulze, J.; Berger, B.; Koepke, M.; Bruneau, B.; Johnson, E.; Lafleur, T.; Booth, J.-P.; Gibson, A. R.; O'Connell, D.; Gans, T.
2018-01-01
We discuss the origin of uncertainties in the results of numerical simulations of low-temperature plasma sources, focusing on capacitively coupled plasmas. These sources can be operated in various gases/gas mixtures, over a wide domain of excitation frequency, voltage, and gas pressure. At low pressures, the non-equilibrium character of the charged particle transport prevails and particle-based simulations become the primary tools for their numerical description. The particle-in-cell method, complemented with a Monte Carlo-type description of collision processes, is a well-established approach for this purpose. Codes based on this technique have been developed by several authors/groups, and have been benchmarked with each other in some cases. Such benchmarking demonstrates the correctness of the codes, but the underlying physical model remains unvalidated. This is a key point, as this model should ideally account for all important plasma chemical reactions as well as for the plasma-surface interaction via including specific surface reaction coefficients (electron yields, sticking coefficients, etc). In order to test the models rigorously, comparison with experimental ‘benchmark data’ is necessary. Examples will be given regarding the studies of electron power absorption modes in O2 and CF4-Ar discharges, as well as on the effect of modifications of the parameters of certain elementary processes on the computed discharge characteristics in O2 capacitively coupled plasmas.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Margaret A. Marshall
In the early 1970s Dr. John T. Mihalczo (team leader), J.J. Lynn, and J.R. Taylor performed experiments at the Oak Ridge Critical Experiments Facility (ORCEF) with highly enriched uranium (HEU) metal (called Oak Ridge Alloy or ORALLOY) in an attempt to recreate GODIVA I results with greater accuracy than those performed at Los Alamos National Laboratory in the 1950s (HEU-MET-FAST-001). The purpose of the Oak Ridge ORALLOY Sphere (ORSphere) experiments was to estimate the unreflected and unmoderated critical mass of an idealized sphere of uranium metal corrected to a density, purity, and enrichment such that it could be compared with the GODIVA I experiments. “The very accurate description of this sphere, as assembled, establishes it as an ideal benchmark for calculational methods and cross-section data files.” (Reference 1) While performing the ORSphere experiments, care was taken to accurately document component dimensions (±0.0001 in. for non-spherical parts), masses (±0.01 g), and material data. The experiment was also set up to minimize the amount of structural material in the sphere proximity. A three-part sphere was initially assembled with an average radius of 3.4665 in. and was then machined down to an average radius of 3.4420 in. (3.4425 in. nominal). These two spherical configurations were evaluated and judged to be acceptable benchmark experiments; however, the two experiments are highly correlated.
Benchmark gas core critical experiment.
NASA Technical Reports Server (NTRS)
Kunze, J. F.; Lofthouse, J. H.; Cooper, C. G.; Hyland, R. E.
1972-01-01
A critical experiment with spherical symmetry has been conducted on the gas core nuclear reactor concept. The nonspherical perturbations in the experiment were evaluated experimentally and produce corrections to the observed eigenvalue of approximately 1% delta k. The reactor consisted of a low density, central uranium hexafluoride gaseous core, surrounded by an annulus of void or low density hydrocarbon, which in turn was surrounded with a 97-cm-thick heavy water reflector.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bezler, P.; Hartzman, M.; Reich, M.
1980-08-01
A set of benchmark problems and solutions have been developed for verifying the adequacy of computer programs used for dynamic analysis and design of nuclear piping systems by the Response Spectrum Method. The problems range from simple to complex configurations which are assumed to experience linear elastic behavior. The dynamic loading is represented by uniform support motion, assumed to be induced by seismic excitation in three spatial directions. The solutions consist of frequencies, participation factors, nodal displacement components and internal force and moment components. Solutions to associated anchor point motion static problems are not included.
Turbofan forced mixer-nozzle internal flowfield. Volume 1: A benchmark experimental study
NASA Technical Reports Server (NTRS)
Paterson, R. W.
1982-01-01
An experimental investigation of the flow field within a model turbofan forced mixer nozzle is described. Velocity and thermodynamic state variable data are provided for use in assessing the accuracy, and assisting the further development, of computational procedures for predicting the flow field within mixer nozzles. Velocity and temperature data suggested that the nozzle mixing process was dominated by circulations (secondary flows) with a length scale on the order of the lobe dimensions, which were associated with strong radial velocities observed near the lobe exit plane. The 'benchmark' model mixer experiment conducted for code assessment purposes is discussed.
Multitasking and microtasking experience on the NAS Cray-2 and ACF Cray X-MP
NASA Technical Reports Server (NTRS)
Raiszadeh, Farhad
1987-01-01
The fast Fourier transform (FFT) kernel of the NAS benchmark program has been utilized to experiment with the multitasking library on the Cray-2 and Cray X-MP/48, and microtasking directives on the Cray X-MP. Some performance figures are shown, and the state of multitasking software is described.
ERIC Educational Resources Information Center
Brandt Brecheisen, Shannon M.
2014-01-01
The purpose of this national, quantitative study was to (1) provide psychometrics for the ACUHO-I/EBI RA Survey, a joint project between Educational Benchmarking, Inc. (EBI) and The Association of College and University Housing Officers--International (ACUHO-I), and (2) explore the sophomore resident assistant (RA) experience. This study…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dewald, E; Kozioziemski, B; Moody, J
2008-06-26
We use x-ray phase contrast imaging to characterize the inner surface roughness of DT ice layers in capsules planned for future ignition experiments. It is therefore important to quantify how well the x-ray data correlate with the actual ice roughness. We benchmarked the accuracy of our system using surrogates with fabricated roughness characterized with high-precision standard techniques. Cylindrical artifacts with azimuthally uniform sinusoidal perturbations of 100 um period and 1 um amplitude demonstrated 0.02 um accuracy, limited by the resolution of the imager and the source size of our phase contrast system. Spherical surrogates with random roughness close to that required for the DT ice for a successful ignition experiment were used to correlate the actual surface roughness to that obtained from the x-ray measurements. When comparing average power spectra of individual measurements, the accuracy mode number limits of the x-ray phase contrast system, benchmarked against surface characterization performed by atomic force microscopy, are 60 and 90 for surrogates smoother and rougher than the required roughness for the ice. These agreement mode number limits are >100 when comparing matching individual measurements. We will discuss the implications for interpreting DT ice roughness data derived from phase-contrast x-ray imaging.
Performance Evaluation and Benchmarking of Intelligent Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
del Pobil, Angel; Madhavan, Raj; Bonsignorio, Fabio
Performance Evaluation and Benchmarking of Intelligent Systems presents research dedicated to the subject of performance evaluation and benchmarking of intelligent systems by drawing from the experiences and insights of leading experts gained both through theoretical development and practical implementation of intelligent systems in a variety of diverse application domains. This contributed volume offers a detailed and coherent picture of state-of-the-art, recent developments, and further research areas in intelligent systems. The chapters cover a broad range of applications, such as assistive robotics, planetary surveying, urban search and rescue, and line tracking for automotive assembly. Subsystems or components described in this book include human-robot interaction, multi-robot coordination, communications, perception, and mapping. Chapters are also devoted to simulation support and open source software for cognitive platforms, providing examples of the type of enabling underlying technologies that can help intelligent systems to propagate and increase in capabilities. Performance Evaluation and Benchmarking of Intelligent Systems serves as a professional reference for researchers and practitioners in the field. This book is also applicable to advanced courses for graduate level students and robotics professionals in a wide range of engineering and related disciplines including computer science, automotive, healthcare, manufacturing, and service robotics.
Using Microsoft Excel to Assess Standards: A "Techtorial". Article #2 in a 6-Part Series
ERIC Educational Resources Information Center
Mears, Derrick
2009-01-01
Standards-based assessment is a term currently being used quite often in educational reform discussions. The philosophy behind this initiative is to utilize "standards" or "benchmarks" to focus instruction and assessments of student learning. The National Standards for Physical Education (NASPE, 2004) provide a framework to guide this process for…
NASA Technical Reports Server (NTRS)
Hargis, W. J., Jr.
1981-01-01
The social and economic importance of estuaries is discussed. Major focus is on the Chesapeake Bay and its interaction with the adjacent waters of the Virginia Sea. Associated multiple-use development and management problems, as well as their internal physical, geological, chemical, and biological complexities, are described.
Kids Count Data Book 1996: State Profiles of Child Well-Being.
ERIC Educational Resources Information Center
Annie E. Casey Foundation, Baltimore, MD.
This book provides a national and state-by-state (including the District of Columbia) compilation of benchmarks of the educational, social, economic, and physical well-being of children in the United States. Ten indicators of children's well-being are taken from government sources: (1) percent low birth-weight babies; (2) infant mortality rate;…
KIDS COUNT Data Book 1997: State Profiles of Child Well-Being.
ERIC Educational Resources Information Center
Annie E. Casey Foundation, Baltimore, MD.
This 1997 KIDS COUNT data book provides a national and state-by-state (including the District of Columbia) compilation of benchmarks of the educational, social, economic, and physical well-being of children in the United States. Ten indicators of children's well-being are taken from government sources: (1) percent of low birth-weight babies; (2)…
Assessment Competence through In Situ Practice for Preservice Educators
ERIC Educational Resources Information Center
Hurley, Kimberly S.
2018-01-01
Effective assessment is the cornerstone of the teaching and learning process and a benchmark of teaching competency. P-12 assessment in physical activity can be complex and dynamic, often requiring a set of skills developed over time through trial and error. Novice teachers have limited time to hone an assessment process that can showcase their…
Watkinson, William; Raison, Nicholas; Abe, Takashige; Harrison, Patrick; Khan, Shamim; Van der Poel, Henk; Dasgupta, Prokar; Ahmed, Kamran
2018-05-01
To establish objective benchmarks at the level of a competent robotic surgeon across different exercises and metrics for the RobotiX Mentor virtual reality (VR) simulator, suitable for use within a robotic surgical training curriculum. This retrospective observational study analysed results from multiple data sources, all of which used the RobotiX Mentor VR simulator. 123 participants with experience ranging from novice to expert completed the exercises. Competency was established as the 25th centile of the mean advanced-intermediate score. Three basic skill exercises and two advanced skill exercises were used. King's College London. 84 novices, 26 beginner intermediates, 9 advanced intermediates, and 4 experts took part in this retrospective observational study. Objective benchmarks derived from the 25th centile of the mean scores of the advanced intermediates provided suitably challenging yet achievable targets for training surgeons. The disparity in scores was greatest for the advanced exercises. Novice surgeons are able to achieve the benchmarks across all exercises in the majority of metrics. We have successfully created this proof-of-concept study, which requires validation in a larger cohort. Objective benchmarks obtained from the 25th centile of the mean scores of advanced intermediates provide clinically relevant benchmarks at the standard of a competent robotic surgeon that are challenging yet attainable. These benchmarks can be used within a VR training curriculum, allowing participants to track and monitor their progress through five exercises in a structured, progressive manner, and providing clearly defined targets to ensure a universal training standard across training surgeons. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Boulesteix, Anne-Laure; Wilson, Rory; Hapfelmeier, Alexander
2017-09-09
The goal of medical research is to develop interventions that are in some sense superior, with respect to patient outcome, to interventions currently in use. Similarly, the goal of research in methodological computational statistics is to develop data analysis tools that are themselves superior to the existing tools. The methodology of the evaluation of medical interventions continues to be discussed extensively in the literature and it is now well accepted that medicine should be at least partly "evidence-based". Although we statisticians are convinced of the importance of unbiased, well-thought-out study designs and evidence-based approaches in the context of clinical research, we tend to ignore these principles when designing our own studies for evaluating statistical methods in the context of our methodological research. In this paper, we draw an analogy between clinical trials and real-data-based benchmarking experiments in methodological statistical science, with datasets playing the role of patients and methods playing the role of medical interventions. Through this analogy, we suggest directions for improvement in the design and interpretation of studies which use real data to evaluate statistical methods, in particular with respect to dataset inclusion criteria and the reduction of various forms of bias. More generally, we discuss the concept of "evidence-based" statistical research, its limitations and its impact on the design and interpretation of real-data-based benchmark experiments. We suggest that benchmark studies, a means of assessing statistical methods using real-world datasets, might benefit from adopting (some) concepts from evidence-based medicine towards the goal of more evidence-based statistical research.
Food Recognition: A New Dataset, Experiments, and Results.
Ciocca, Gianluigi; Napoletano, Paolo; Schettini, Raimondo
2017-05-01
We propose a new dataset for the evaluation of food recognition algorithms that can be used in dietary monitoring applications. Each image depicts a real canteen tray with dishes and foods arranged in different ways. Each tray contains multiple instances of food classes. The dataset contains 1027 canteen trays for a total of 3616 food instances belonging to 73 food classes. The food on the tray images has been manually segmented using carefully drawn polygonal boundaries. We have benchmarked the dataset by designing an automatic tray analysis pipeline that takes a tray image as input, finds the regions of interest, and predicts for each region the corresponding food class. We have experimented with three different classification strategies, also using several visual descriptors. We achieve about 79% food and tray recognition accuracy using convolutional-neural-networks-based features. The dataset, as well as the benchmark framework, are available to the research community.
All Entangled States can Demonstrate Nonclassical Teleportation.
Cavalcanti, Daniel; Skrzypczyk, Paul; Šupić, Ivan
2017-09-15
Quantum teleportation, the process by which Alice can transfer an unknown quantum state to Bob by using preshared entanglement and classical communication, is one of the cornerstones of quantum information. The standard benchmark for certifying quantum teleportation consists in surpassing the maximum average fidelity between the teleported and the target states that can be achieved classically. According to this figure of merit, not all entangled states are useful for teleportation. Here we propose a new benchmark that uses the full information available in a teleportation experiment and prove that all entangled states can implement a quantum channel which cannot be reproduced classically. We introduce the idea of nonclassical teleportation witness to certify if a teleportation experiment is genuinely quantum and discuss how to quantify this phenomenon. Our work provides new techniques for studying teleportation that can be immediately applied to certify the quality of quantum technologies.
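For qubits, the standard benchmark the abstract refers to is the classical limit of 2/3 on average teleportation fidelity, and a known result (due to Horodecki et al.) relates the fidelity achieved by the standard protocol to the singlet fidelity F of the shared state. A minimal numeric sketch:

```python
# Known result (Horodecki et al.): with a two-qubit resource state of
# singlet fidelity F, standard teleportation achieves average fidelity
# f = (2F + 1)/3, so beating the classical limit f_cl = 2/3 needs F > 1/2.
def avg_teleport_fidelity(F):
    """Average teleportation fidelity for a two-qubit resource state
    with singlet fidelity F, under the standard protocol."""
    return (2.0 * F + 1.0) / 3.0

f_classical = 2.0 / 3.0   # best average fidelity without entanglement
```

The point of the abstract is precisely that this fidelity benchmark is not the whole story: states with F <= 1/2 can still enable a nonclassical teleportation channel under the proposed witness-based benchmark.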
CFD validation experiments for hypersonic flows
NASA Technical Reports Server (NTRS)
Marvin, Joseph G.
1992-01-01
A roadmap for CFD code validation is introduced. The elements of the roadmap are consistent with air-breathing vehicle design requirements and related to the important flow path components: forebody, inlet, combustor, and nozzle. Building-block and benchmark validation experiments are identified along with their test conditions and measurements. Based on evaluation criteria, recommendations for an initial CFD validation database are given, and gaps are identified where future experiments could provide new validation data.
Calculated criticality for ²³⁵U/graphite systems using the VIM Monte Carlo code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, P.J.; Grasseschi, G.L.; Olsen, D.N.
1992-01-01
Calculations for highly enriched uranium and graphite systems gained renewed interest recently for the new production modular high-temperature gas-cooled reactor (MHTGR). Experiments to validate the physics calculations for these systems are being prepared for the Transient Reactor Test Facility (TREAT) reactor at Argonne National Laboratory (ANL-West) and in the Compact Nuclear Power Source facility at Los Alamos National Laboratory. The continuous-energy Monte Carlo code VIM, or equivalently the MCNP code, can utilize fully detailed models of the MHTGR and serve as benchmarks for the approximate multigroup methods necessary in full reactor calculations. Validation of these codes and their associated nuclear data did not exist for highly enriched {sup 235}U/graphite systems. Experimental data, used in the development of more approximate methods, date back to the 1960s. The authors have selected two independent sets of experiments for calculation with the VIM code. The carbon-to-uranium (C/U) ratios encompass the range from 2,000, representative of the new production MHTGR, to 10,000 in the fuel of TREAT. Calculations used ENDF/B-V data.
Long-duration planar direct-drive hydrodynamics experiments on the NIF
NASA Astrophysics Data System (ADS)
Casner, A.; Mailliet, C.; Khan, S. F.; Martinez, D.; Izumi, N.; Kalantar, D.; Di Nicola, P.; Di Nicola, J. M.; Le Bel, E.; Igumenshchev, I.; Tikhonchuk, V. T.; Remington, B. A.; Masse, L.; Smalyuk, V. A.
2018-01-01
The advent of high-power laser facilities such as the National Ignition Facility (NIF) and the Laser Megajoule provides unique platforms to study the physics of turbulent mixing flows in high-energy-density plasmas. We report here on the commissioning of a novel planar direct-drive platform on the NIF, which allows the acceleration of targets over 30 ns. Planar plastic samples were directly irradiated by 300-450 kJ of UV laser light (351 nm), and very good planarity of the laser drive is demonstrated. No detrimental effect of imprint is observed for these thick plastic targets (300 μm), which is beneficial for future academic experiments requiring similar irradiation conditions. The long-duration direct-drive (DD) platform is thereafter harnessed to study the ablative Rayleigh-Taylor instability (RTI) in DD. The growth of two-dimensional pre-imposed perturbations is quantified through time-resolved face-on x-ray radiography and used as a benchmark for radiative hydrocode simulations. The ablative RTI is then quantified in its highly nonlinear stage, starting from intentionally large 3D imprinted broadband modulations. Two generations of bubble mergers are observed for the first time in DD, as a result of the unprecedentedly long laser acceleration.
New eye phantom for ophthalmic surgery
NASA Astrophysics Data System (ADS)
Fogli, Gessica; Orsi, Gianni; De Maria, Carmelo; Montemurro, Francesca; Palla, Michele; Rizzo, Stanislao; Vozzi, Giovanni
2014-06-01
In this work, we designed and realized a new phantom able to mimic the principal mechanical, rheological, and physical cues of the human eye and that can be used as a common benchmark to validate new surgical procedures, innovative vitrectomes, and as a training system for surgeons. This phantom, in particular its synthetic humor vitreous, had the aim of reproducing diffusion properties of the natural eye and can be used as a system to evaluate the pharmacokinetics of drugs and optimization of their dose, limiting animal experiments. The eye phantom was built layer-by-layer starting from the sclera up to the retina, using low cost and easy to process polymers. The validation of the phantom was carried out by mechanical characterization of each layer, by diffusion test with commercial drugs into a purposely developed apparatus, and finally by a team of ophthalmic surgeons. Experiments demonstrated that polycaprolactone, polydimethylsiloxane, and gelatin, properly prepared, are the best materials to mimic the mechanical properties of sclera, choroid, and retina, respectively. A polyvinyl alcohol-gelatin polymeric system is the best for mimicking the viscosity of the human humor vitreous, even if the bevacizumab half-life is lower than in the human eye.
NASA Astrophysics Data System (ADS)
Masti, Robert; Srinivasan, Bhuvana; King, Jacob; Stoltz, Peter; Hansen, David; Held, Eric
2017-10-01
Recent results from experiments and simulations of magnetically driven pulsed-power liners have explored the role of the early-time electrothermal instability in the evolution of the MRT (magneto-Rayleigh-Taylor) instability. Understanding the development of these instabilities can lead to potential stabilization mechanisms, thereby playing a significant role in the success of fusion concepts such as MagLIF (Magnetized Liner Inertial Fusion). For MagLIF, the MRT instability is the most detrimental instability to achieving fusion energy production. Experiments on high-energy-density plasmas from wire-array implosions have shown the need for more advanced physics modeling than ideal magnetohydrodynamics. The overall focus of this project is on using a multi-fluid extended-MHD model with kinetic closures for thermal conductivity, resistivity, and viscosity. The extended-MHD model has been updated to include the SESAME equation-of-state tables, and numerical benchmarks with this implementation will be presented. Simulations of MRT growth and evolution for MagLIF-relevant parameters will be presented using this extended-MHD model with the SESAME equation-of-state tables. This work is supported by the Department of Energy Office of Science under Grant Number DE-SC0016515.
NASA Astrophysics Data System (ADS)
Boyarinov, V. F.; Grol, A. V.; Fomichenko, P. A.; Ternovykh, M. Yu
2017-01-01
This work is aimed at improving HTGR neutron physics design calculations by applying uncertainty analysis using cross-section covariance information. Methodology and codes for preparation of multigroup libraries of covariance information for individual isotopes from the basic 44-group library of the SCALE-6 code system were developed. A 69-group library of covariance information in a special format for the main isotopes and elements typical of high temperature gas cooled reactors (HTGR) was generated. This library can be used for estimation of uncertainties, associated with nuclear data, in the analysis of HTGR neutron physics with design codes. As an example, calculations of one-group cross-section uncertainties for fission and capture reactions for the main isotopes of the MHTGR-350 benchmark, as well as uncertainties of the multiplication factor (k∞) for the MHTGR-350 fuel compact cell model and fuel block model, were performed. These uncertainties were estimated by the developed technology with the use of the WIMS-D code and modules of the SCALE-6 code system, namely TSUNAMI, KENO-VI, and SAMS. The eight most important reactions on isotopes for the MHTGR-350 benchmark were identified, namely: 10B(capt), 238U(n,γ), ν5, 235U(n,γ), 238U(el), natC(el), 235U(fiss)-235U(n,γ), 235U(fiss).
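The propagation step behind such uncertainty estimates is the "sandwich rule", which TSUNAMI/SAMS-type tools compute from a sensitivity profile and a covariance library. A minimal sketch with invented numbers:

```python
import numpy as np

# Sketch of the "sandwich rule" for nuclear-data uncertainty propagation:
# var(R) = S C S^T, where S holds the relative sensitivities of a response
# R (such as k-infinity) to group-wise nuclear data and C is their relative
# covariance matrix. All numbers below are invented for illustration.
S = np.array([[-0.12, 0.35, 0.08]])            # 1 x 3 sensitivity profile
C = np.array([[0.0004, 0.0001, 0.0000],
              [0.0001, 0.0009, 0.0002],
              [0.0000, 0.0002, 0.0016]])       # 3 x 3 relative covariance

var_R = (S @ C @ S.T)[0, 0]
rel_unc = np.sqrt(var_R)   # relative uncertainty in R from nuclear data
```

In a real analysis S spans all reactions, isotopes, and energy groups, and C is assembled from an evaluated covariance library such as the 44-group SCALE data mentioned above.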
NASA Technical Reports Server (NTRS)
Radovcich, N. A.
1984-01-01
The design experience associated with a benchmark aeroelastic design of an out-of-production transport aircraft is discussed. Current work being performed on a high-aspect-ratio wing design is reported. The Preliminary Aeroelastic Design of Structures (PADS) system is briefly summarized, and some operational aspects of generating the design in an automated aeroelastic design environment are discussed.
Robust performance of multiple tasks by a mobile robot
NASA Technical Reports Server (NTRS)
Beckerman, Martin; Barnett, Deanna L.; Dickens, Mike; Weisbin, Charles R.
1989-01-01
While there have been many successful mobile robot experiments, only a few papers have addressed issues pertaining to the range of applicability, or robustness, of robotic systems. The purpose of this paper is to report results of a series of benchmark experiments done to determine and quantify the robustness of an integrated hardware and software system of a mobile robot.
Present Status and Extensions of the Monte Carlo Performance Benchmark
NASA Astrophysics Data System (ADS)
Hoogenboom, J. Eduard; Petrovic, Bojan; Martin, William R.
2014-06-01
The NEA Monte Carlo Performance benchmark started in 2011 with the aim of monitoring, over the years, the ability to perform a full-size Monte Carlo reactor core calculation with detailed power production for each fuel pin, including axial distribution. This paper gives an overview of the results contributed thus far. It shows that reaching a statistical accuracy of 1% for most of the small fuel zones requires about 100 billion neutron histories. The efficiency of parallel execution of Monte Carlo codes on a large number of processor cores shows clear limitations for computer clusters with common compute nodes. On true supercomputers, however, the speedup of parallel calculations continues to increase up to large numbers of processor cores. More experience is needed with calculations on true supercomputers using large numbers of processors in order to predict whether the requested calculations can be done in a short time. As the specifications of the reactor geometry for this benchmark are well suited for further investigations of full-core Monte Carlo calculations, and a need is felt for testing issues other than computational performance, proposals are presented for extending the benchmark to a suite of problems: evaluating fission-source convergence for a system with a high dominance ratio, coupling with thermal-hydraulics calculations to evaluate the use of different temperatures and coolant densities, and studying the correctness and effectiveness of burnup calculations. Moreover, other contemporary proposals for a full-core calculation with realistic geometry and material composition will be discussed.
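The quoted figure of about 100 billion histories follows from the usual 1/sqrt(N) convergence of Monte Carlo tallies. A back-of-envelope check, where the zone fraction is an illustrative assumption rather than a number from the benchmark specification:

```python
# With relative standard error scaling as 1/sqrt(f*N), where f is the
# fraction of all histories scored in a small fuel zone, reaching 1%
# statistics in a zone that sees one history in ten million takes on the
# order of 1e11 histories. The value of f is illustrative only.
def histories_needed(rel_err, zone_fraction):
    """Smallest N with 1/sqrt(zone_fraction * N) <= rel_err."""
    return 1.0 / (zone_fraction * rel_err**2)

N = histories_needed(0.01, 1e-7)   # on the order of 100 billion histories
```

Halving the target error quadruples the required histories, which is why full-core pin-by-pin statistics are so expensive.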
A new numerical benchmark for variably saturated variable-density flow and transport in porous media
NASA Astrophysics Data System (ADS)
Guevara, Carlos; Graf, Thomas
2016-04-01
In subsurface hydrological systems, spatial and temporal variations in solute concentration and/or temperature may affect fluid density and viscosity. These variations can lead to potentially unstable situations in which a dense fluid overlies a less dense fluid. Such situations can produce instabilities that appear as dense plume fingers migrating downwards, counteracted by vertical upward flow of freshwater (Simmons et al., Transport in Porous Media, 2002). As a result of unstable variable-density flow, solute transport rates are increased over large distances and times compared to constant-density flow. The numerical simulation of variable-density flow in saturated and unsaturated media requires corresponding benchmark problems against which a computer model can be validated (Diersch and Kolditz, Adv. Water Resour., 2002). Recorded data from a laboratory-scale experiment of variable-density flow and solute transport in saturated and unsaturated porous media (Simmons et al., 2002) are used to define a new numerical benchmark. The HydroGeoSphere code (Therrien et al., 2004) coupled with PEST (www.pesthomepage.org) is used to obtain an optimized parameter set capable of adequately representing the data set of Simmons et al. (2002). Fingering in the numerical model is triggered using random hydraulic conductivity fields. Due to the inherent randomness, a large number of simulations were conducted in this study. The optimized benchmark model adequately predicts the plume behavior and the fate of solutes. This benchmark is useful for model verification of variable-density flow problems in saturated and/or unsaturated media.
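The perturbation approach mentioned above, triggering fingering with a random hydraulic conductivity field, can be sketched minimally. Here K is lognormal and uncorrelated cell to cell; real studies typically also impose a correlation length, and all parameter values are invented:

```python
import numpy as np

# Minimal sketch of generating a random hydraulic conductivity field to
# trigger fingering. K is lognormally distributed and uncorrelated here;
# parameter values below are illustrative, not from the benchmark.
rng = np.random.default_rng(42)

mean_lnK = np.log(1e-4)    # ln of geometric-mean conductivity (m/s)
sigma_lnK = 0.5            # std dev of ln K sets the perturbation strength
nz, nx = 50, 100           # vertical and horizontal grid cells

K = np.exp(rng.normal(mean_lnK, sigma_lnK, size=(nz, nx)))  # m/s, all > 0
```

Because each realization of K produces a different fingering pattern, conclusions are drawn from ensembles of such runs, which is why the study reports a large number of simulations.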
Numerical benchmarking of a Coarse-Mesh Transport (COMET) Method for medical physics applications
NASA Astrophysics Data System (ADS)
Blackburn, Megan Satterfield
2009-12-01
Radiation therapy has become a very important method for treating cancer patients. Thus, it is extremely important to accurately determine the location of energy deposition during these treatments, maximizing dose to the tumor region and minimizing it to healthy tissue. A Coarse-Mesh Transport Method (COMET) has been developed at the Georgia Institute of Technology in the Computational Reactor and Medical Physics Group and used very successfully with neutron transport to analyze whole-core criticality. COMET works by decomposing a large, heterogeneous system into a set of smaller fixed-source problems. For each unique local problem, a solution is obtained that we call a response function. These response functions are pre-computed and stored in a library for future use. The overall solution to the global problem can then be found by a linear superposition of these local solutions. This method has now been extended to the transport of photons and electrons for use in medical physics problems to determine energy deposition from radiation therapy treatments. The main goal of this work was to develop benchmarks for evaluating the COMET code and determining its strengths and weaknesses for these medical physics applications. For response function calculations, Legendre polynomial expansions are necessary in space and angle (polar and azimuthal). An initial sensitivity study was done to determine the best orders for future testing. After the expansion orders were found, three simple benchmarks were tested: a water phantom, a simplified lung phantom, and a non-clinical slab phantom. Each of these benchmarks was decomposed into 1 cm x 1 cm and 0.5 cm x 0.5 cm coarse meshes. Three more clinically relevant problems were developed from patient CT scans. These benchmarks modeled a lung patient, a prostate patient, and a beam re-entry situation. As before, the problems were divided into 1 cm x 1 cm, 0.5 cm x 0.5 cm, and 0.25 cm x 0.25 cm coarse-mesh cases. 
Multiple beam energies were also tested for each case. The COMET solutions for each case were compared to a reference solution obtained from pure Monte Carlo results computed with EGSnrc. When comparing the COMET results to the reference cases, a pattern of differences appeared in each phantom case. Better results were obtained for lower-energy incident photon beams as well as for larger mesh sizes. Changes may need to be made to the expansion orders used for energy and angle to better model high-energy secondary electrons. Heterogeneity did not pose a problem for the COMET methodology: heterogeneous results were obtained in an amount of time comparable to the homogeneous water phantom. The COMET results were typically obtained in minutes to hours of computational time, whereas the reference cases typically required hundreds or thousands of hours. A second sensitivity study was also performed on a more stringent problem with smaller coarse meshes. Previously, the same expansion order was used for each incident photon beam energy so that better comparisons could be made. From this second study, it was found to be optimal to use different expansion orders based on the incident beam energy. Recommendations for future work with this method include more testing of higher expansion orders, or possible code modification to better handle secondary electrons. The method also needs to handle more clinically relevant beam descriptions with energy and angular distributions associated with them.
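The decompose-and-superpose idea behind coarse-mesh response methods can be caricatured in a few lines. This is a schematic toy, not the actual COMET formulation: two 1D coarse meshes exchange face partial currents, each mesh's precomputed response matrix maps incoming to outgoing currents, and the global solution is the fixed point of sweeping the local responses. All numbers are invented:

```python
import numpy as np

def solve_global(R, P, s_in, n_iter=100):
    """Fixed-point sweep: outgoing face currents satisfy j = R @ (P @ j + s_in)."""
    j = np.zeros_like(s_in)
    for _ in range(n_iter):
        j = R @ (P @ j + s_in)
    return j

# Face ordering: [m1.left, m1.right, m2.left, m2.right].
# Block-diagonal R: each mesh's local response (reflection/transmission),
# standing in for the precomputed response-function library.
R = np.array([[0.2, 0.5, 0.0, 0.0],
              [0.5, 0.2, 0.0, 0.0],
              [0.0, 0.0, 0.3, 0.4],
              [0.0, 0.0, 0.4, 0.3]])
# P routes outgoing currents to the neighbor's incoming face:
# m1.right <-> m2.left; outer faces are vacuum boundaries.
P = np.zeros((4, 4))
P[1, 2] = 1.0
P[2, 1] = 1.0
s_in = np.array([1.0, 0.0, 0.0, 0.0])  # unit external current on m1.left

j = solve_global(R, P, s_in)  # converged outgoing partial currents
```

The speed advantage reported above comes from this structure: the expensive transport work is done once per unique local problem when building R, and the global sweep is cheap linear algebra.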
Smoking cessation and counseling: knowledge and views of Canadian physical therapists.
Bodner, Michael E; Miller, William C; Rhodes, Ryan E; Dean, Elizabeth
2011-07-01
Physical therapists are uniquely positioned in health care to initiate or support smoking cessation (SC). Little is known, however, about their knowledge and views of SC as part of their practices. The purpose of this study was to assess Canadian physical therapists' knowledge about the health effects of smoking, their views about addressing SC in practice, and their self-efficacy in enabling patients to quit smoking. This study was a cross-sectional survey. Licensed physical therapists in Canada were surveyed with postal methods. A total of 738 survey questionnaires were returned. The mean age and years of clinical experience of the respondents were 41.9 (SD=10.8) years and 17.4 (SD=11.0), respectively. Most respondents (78.6%) were women. Canadian physical therapists are largely informed about the negative effects of smoking on health. Although 76.9% of the physical therapists agreed or strongly agreed that the profession should be more involved in helping people who smoke quit, only 56.8% agreed or strongly agreed that they should receive training on SC. More than 70% reported that they were not prepared to provide counseling and, overall, the level of self-efficacy regarding counseling about SC was low. Lack of resources and time were reported to be key barriers to counseling patients to quit smoking. The findings of this study are limited to Canadian physical therapists. Response bias and social desirability bias also are potential limitations of this study. Overall, the majority of physical therapists expressed the view that advising people who smoke to quit is a clinical responsibility and endorsed greater involvement of the profession in helping people who smoke quit. Discordance existed, however, between these views and the physical therapists' interest in receiving training on counseling about SC. 
This is a benchmark study that has practical implications for targeting training consistent with the profession's mission to improve health by increasing physical therapists' preparedness and self-efficacy regarding counseling about SC.
Information System Implementation: Benchmarking the Stages.
ERIC Educational Resources Information Center
Calbos, Dennis P.
1984-01-01
The evolution of administrative data processing systems at the University of Georgia is summarized. Nolan's revised stage model was used as a framework to present the university's experience and relate it to the growing body of system implementation research. (Author/MLW)
Code of Federal Regulations, 2014 CFR
2014-10-01
... adjustments made pursuant to the benchmark standards described in § 156.110 of this subchapter. Benefit design... this subchapter. Enrollee satisfaction survey vendor means an organization that has relevant survey administration experience (for example, CAHPS® surveys), organizational survey capacity, and quality control...
NASA Astrophysics Data System (ADS)
Hess, Alexander Jay
Science and agriculture professional organizations have argued for agricultural literacy as a goal for K-12 public education. Due to the complexity of our modern agri-food system, with social, economic, and environmental concerns embedded, an agriculturally literate society is needed for informed decision making, democratic participation, and system reform. While grade-span-specific benchmarks for gauging agri-food system literacy have been developed, little attention has been paid to existing ideas individuals hold about the agri-food system, how these existing ideas relate to benchmarks, how experience shapes such ideas, or how ideas change over time. Developing a body of knowledge on students' agri-food system understandings as they develop across K-12 grades can ground efforts seeking to promote a learning progression toward agricultural literacy. This study compares existing perceptions held by 18 upper elementary students from a large urban center in California to agri-food system literacy benchmarks and examines the perceptions against student background and experiences. Data were collected via semi-structured interviews and analyzed using the constant comparative method. Constructivist theoretical perspectives framed the study. No student had ever grown their own food, raised a plant, or cared for an animal. Participation in school fieldtrips to farms or visits to a relative's garden were the agricultural experiences most frequently mentioned. Students were able to identify common food items, but could not elaborate on their origins, especially those that were highly processed. Students' understanding of post-production activities (i.e., food processing, manufacturing, or food marketing) was not apparent. Students' understanding of farms reflected a 1900s subsistence farming operation commonly found in literature written for the primary grades. Students were unaware that plants and animals were selected for production based on desired genetic traits.
Obtaining food from areas with favorable growing conditions, supported by technology such as transportation and refrigeration, was an understanding lacking across the group. Furthermore, most spoilage-prevention technologies employed today were not an expressed part of students' schemas. Students' backgrounds and experiences did not appear to support the development of a robust agri-food system schema. An agricultural science and technology schema appears poorly developed in each of the students.
Exploring theory space with Monte Carlo reweighting
Gainer, James S.; Lykken, Joseph; Matchev, Konstantin T.; ...
2014-10-13
Theories of new physics often involve a large number of unknown parameters which need to be scanned. Additionally, a putative signal in a particular channel may be due to a variety of distinct models of new physics. This makes experimental attempts to constrain the parameter space of motivated new physics models with a high degree of generality quite challenging. We describe how the reweighting of events may allow this challenge to be met, as fully simulated Monte Carlo samples generated for arbitrary benchmark models can be effectively re-used. Specifically, we suggest procedures that allow more efficient collaboration between theorists and experimentalists in exploring large theory parameter spaces in a rigorous way at the LHC.
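The reweighting idea in this abstract can be sketched with a toy one-dimensional example: events simulated once under a benchmark parameter point are re-used for a different target point by weighting each event with the ratio of the target and benchmark densities. Everything here is illustrative and assumed (the parametric `density`, the parameter `slope`, the sample size); a real analysis would form the ratio from matrix elements, not a toy pdf.

```python
import numpy as np

rng = np.random.default_rng(0)

def density(x, slope):
    # Normalized exponential density on [0, 10]; `slope` plays the role
    # of the theory parameter being scanned (toy stand-in only).
    return slope * np.exp(-slope * x) / (1.0 - np.exp(-10.0 * slope))

# "Fully simulate" once, at a single benchmark parameter point.
slope_benchmark = 0.5
x = rng.exponential(1.0 / slope_benchmark, size=100_000)
x = x[x < 10.0]  # restrict to the region where the density is defined

def reweighted_mean(slope_target):
    # Per-event weight = target density / benchmark density; the weighted
    # sample then estimates expectations under the target model without
    # re-running the expensive simulation.
    w = density(x, slope_target) / density(x, slope_benchmark)
    return float(np.average(x, weights=w))
```

Reweighting to the benchmark point itself gives unit weights and reproduces the plain sample mean, which is a useful sanity check before scanning other parameter values.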
Search for new physics with a monojet and missing transverse energy in pp collisions at √s = 7 TeV.
Chatrchyan, S; Khachatryan, V; Sirunyan, A M; Tumasyan, A; Adam, W; Bergauer, T; Dragicevic, M; Erö, J; Fabjan, C; Friedl, M; Frühwirth, R; Ghete, V M; Hammer, J; Hänsel, S; Hoch, M; Hörmann, N; Hrubec, J; Jeitler, M; Kiesenhofer, W; Krammer, M; Liko, D; Mikulec, I; Pernicka, M; Rohringer, H; Schöfbeck, R; Strauss, J; Taurok, A; Teischinger, F; Wagner, P; Waltenberger, W; Walzel, G; Widl, E; Wulz, C-E; Mossolov, V; Shumeiko, N; Gonzalez, J Suarez; Bansal, S; Benucci, L; De Wolf, E A; Janssen, X; Maes, J; Maes, T; Mucibello, L; Ochesanu, S; Roland, B; Rougny, R; Selvaggi, M; Van Haevermaet, H; Van Mechelen, P; Van Remortel, N; Blekman, F; Blyweert, S; D'Hondt, J; Devroede, O; Suarez, R Gonzalez; Kalogeropoulos, A; Maes, M; Van Doninck, W; Van Mulders, P; Van Onsem, G P; Villella, I; Charaf, O; Clerbaux, B; De Lentdecker, G; Dero, V; Gay, A P R; Hammad, G H; Hreus, T; Marage, P E; Thomas, L; Velde, C Vander; Vanlaer, P; Adler, V; Cimmino, A; Costantini, S; Grunewald, M; Klein, B; Lellouch, J; Marinov, A; Mccartin, J; Ryckbosch, D; Thyssen, F; Tytgat, M; Vanelderen, L; Verwilligen, P; Walsh, S; Zaganidis, N; Basegmez, S; Bruno, G; Caudron, J; Ceard, L; Gil, E Cortina; De Favereau De Jeneret, J; Delaere, C; Favart, D; Giammanco, A; Grégoire, G; Hollar, J; Lemaitre, V; Liao, J; Militaru, O; Nuttens, C; Ovyn, S; Pagano, D; Pin, A; Piotrzkowski, K; Schul, N; Beliy, N; Caebergs, T; Daubie, E; Alves, G A; Brito, L; De Jesus Damiao, D; Pol, M E; Souza, M H G; Aldá Júnior, W L; Carvalho, W; Da Costa, E M; Martins, C De Oliveira; Fonseca De Souza, S; Mundim, L; Nogima, H; Oguri, V; Prado Da Silva, W L; Santoro, A; Silva Do Amaral, S M; Sznajder, A; Bernardes, C A; Dias, F A; Tomei, T R Fernandez Perez; Gregores, E M; Lagana, C; Marinho, F; Mercadante, P G; Novaes, S F; Padula, Sandra S; Darmenov, N; Genchev, V; Iaydjiev, P; Piperov, S; Rodozov, M; Stoykova, S; Sultanov, G; Tcholakov, V; Trayanov, R; Dimitrov, A; Hadjiiska, R; Karadzhinova, A; Kozhuharov, V; Litov, L; Mateev, 
M; Pavlov, B; Petkov, P; Bian, J G; Chen, G M; Chen, H S; Jiang, C H; Liang, D; Liang, S; Meng, X; Tao, J; Wang, J; Wang, J; Wang, X; Wang, Z; Xiao, H; Xu, M; Zang, J; Zhang, Z; Ban, Y; Guo, S; Guo, Y; Li, W; Mao, Y; Qian, S J; Teng, H; Zhu, B; Zou, W; Cabrera, A; Moreno, B Gomez; Rios, A A Ocampo; Oliveros, A F Osorio; Sanabria, J C; Godinovic, N; Lelas, D; Lelas, K; Plestina, R; Polic, D; Puljak, I; Antunovic, Z; Dzelalija, M; Brigljevic, V; Duric, S; Kadija, K; Morovic, S; Attikis, A; Galanti, M; Mousa, J; Nicolaou, C; Ptochos, F; Razis, P A; Finger, M; Finger, M; Awad, A; Khalil, S; Radi, A; Hektor, A; Kadastik, M; Müntel, M; Raidal, M; Rebane, L; Tiko, A; Azzolini, V; Eerola, P; Fedi, G; Czellar, S; Härkönen, J; Heikkinen, A; Karimäki, V; Kinnunen, R; Kortelainen, M J; Lampén, T; Lassila-Perini, K; Lehti, S; Lindén, T; Luukka, P; Mäenpää, T; Tuominen, E; Tuominiemi, J; Tuovinen, E; Ungaro, D; Wendland, L; Banzuzi, K; Karjalainen, A; Korpela, A; Tuuva, T; Sillou, D; Besancon, M; Choudhury, S; Dejardin, M; Denegri, D; Fabbro, B; Faure, J L; Ferri, F; Ganjour, S; Gentit, F X; Givernaud, A; Gras, P; Hamel de Monchenault, G; Jarry, P; Locci, E; Malcles, J; Marionneau, M; Millischer, L; Rander, J; Rosowsky, A; Shreyber, I; Titov, M; Verrecchia, P; Baffioni, S; Beaudette, F; Benhabib, L; Bianchini, L; Bluj, M; Broutin, C; Busson, P; Charlot, C; Dahms, T; Dobrzynski, L; Elgammal, S; Granier de Cassagnac, R; Haguenauer, M; Miné, P; Mironov, C; Ochando, C; Paganini, P; Sabes, D; Salerno, R; Sirois, Y; Thiebaux, C; Wyslouch, B; Zabi, A; Agram, J-L; Andrea, J; Bloch, D; Bodin, D; Brom, J-M; Cardaci, M; Chabert, E C; Collard, C; Conte, E; Drouhin, F; Ferro, C; Fontaine, J-C; Gelé, D; Goerlach, U; Greder, S; Juillot, P; Karim, M; Le Bihan, A-C; Mikami, Y; Van Hove, P; Fassi, F; Mercier, D; Baty, C; Beauceron, S; Beaupere, N; Bedjidian, M; Bondu, O; Boudoul, G; Boumediene, D; Brun, H; Chasserat, J; Chierici, R; Contardo, D; Depasse, P; El Mamouni, H; Fay, J; Gascon, S; Ille, 
B; Kurca, T; Le Grand, T; Lethuillier, M; Mirabito, L; Perries, S; Sordini, V; Tosi, S; Tschudi, Y; Verdier, P; Lomidze, D; Anagnostou, G; Beranek, S; Edelhoff, M; Feld, L; Heracleous, N; Hindrichs, O; Jussen, R; Klein, K; Merz, J; Mohr, N; Ostapchuk, A; Perieanu, A; Raupach, F; Sammet, J; Schael, S; Sprenger, D; Weber, H; Weber, M; Wittmer, B; Ata, M; Dietz-Laursonn, E; Erdmann, M; Hebbeker, T; Hinzmann, A; Hoepfner, K; Klimkovich, T; Klingebiel, D; Kreuzer, P; Lanske, D; Lingemann, J; Magass, C; Merschmeyer, M; Meyer, A; Papacz, P; Pieta, H; Reithler, H; Schmitz, S A; Sonnenschein, L; Steggemann, J; Teyssier, D; Bontenackels, M; Davids, M; Duda, M; Flügge, G; Geenen, H; Giffels, M; Ahmad, W Haj; Heydhausen, D; Hoehle, F; Kargoll, B; Kress, T; Kuessel, Y; Linn, A; Nowack, A; Perchalla, L; Pooth, O; Rennefeld, J; Sauerland, P; Stahl, A; Thomas, M; Tornier, D; Zoeller, M H; Martin, M Aldaya; Behrenhoff, W; Behrens, U; Bergholz, M; Bethani, A; Borras, K; Cakir, A; Campbell, A; Castro, E; Dammann, D; Eckerlin, G; Eckstein, D; Flossdorf, A; Flucke, G; Geiser, A; Hauk, J; Jung, H; Kasemann, M; Katkov, I; Katsas, P; Kleinwort, C; Kluge, H; Knutsson, A; Krämer, M; Krücker, D; Kuznetsova, E; Lange, W; Lohmann, W; Mankel, R; Marienfeld, M; Melzer-Pellmann, I-A; Meyer, A B; Mnich, J; Mussgiller, A; Olzem, J; Petrukhin, A; Pitzl, D; Raspereza, A; Raval, A; Rosin, M; Schmidt, R; Schoerner-Sadenius, T; Sen, N; Spiridonov, A; Stein, M; Tomaszewska, J; Walsh, R; Wissing, C; Autermann, C; Blobel, V; Bobrovskyi, S; Draeger, J; Enderle, H; Gebbert, U; Görner, M; Kaschube, K; Kaussen, G; Kirschenmann, H; Klanner, R; Lange, J; Mura, B; Naumann-Emme, S; Nowak, F; Pietsch, N; Sander, C; Schettler, H; Schleper, P; Schlieckau, E; Schröder, M; Schum, T; Schwandt, J; Stadie, H; Steinbrück, G; Thomsen, J; Barth, C; Bauer, J; Berger, J; Buege, V; Chwalek, T; De Boer, W; Dierlamm, A; Dirkes, G; Feindt, M; Gruschke, J; Hackstein, C; Hartmann, F; Heinrich, M; Held, H; Hoffmann, K H; Honc, S; 
Komaragiri, J R; Kuhr, T; Martschei, D; Mueller, S; Müller, Th; Niegel, M; Oberst, O; Oehler, A; Ott, J; Peiffer, T; Quast, G; Rabbertz, K; Ratnikov, F; Ratnikova, N; Renz, M; Saout, C; Scheurer, A; Schieferdecker, P; Schilling, F-P; Schott, G; Simonis, H J; Stober, F M; Troendle, D; Wagner-Kuhr, J; Weiler, T; Zeise, M; Zhukov, V; Ziebarth, E B; Daskalakis, G; Geralis, T; Kesisoglou, S; Kyriakis, A; Loukas, D; Manolakos, I; Markou, A; Markou, C; Mavrommatis, C; Ntomari, E; Petrakou, E; Gouskos, L; Mertzimekis, T J; Panagiotou, A; Stiliaris, E; Evangelou, I; Foudas, C; Kokkas, P; Manthos, N; Papadopoulos, I; Patras, V; Triantis, F A; Aranyi, A; Bencze, G; Boldizsar, L; Hajdu, C; Hidas, P; Horvath, D; Kapusi, A; Krajczar, K; Sikler, F; Veres, G I; Vesztergombi, G; Beni, N; Molnar, J; Palinkas, J; Szillasi, Z; Veszpremi, V; Raics, P; Trocsanyi, Z L; Ujvari, B; Beri, S B; Bhatnagar, V; Dhingra, N; Gupta, R; Jindal, M; Kaur, M; Kohli, J M; Mehta, M Z; Nishu, N; Saini, L K; Sharma, A; Singh, A P; Singh, J; Singh, S P; Ahuja, S; Choudhary, B C; Gupta, P; Jain, S; Jain, S; Kumar, A; Kumar, A; Naimuddin, M; Ranjan, K; Shivpuri, R K; Banerjee, S; Bhattacharya, S; Dutta, S; Gomber, B; Khurana, R; Sarkar, S; Choudhury, R K; Dutta, D; Kailas, S; Kumar, V; Mehta, P; Mohanty, A K; Pant, L M; Shukla, P; Aziz, T; Guchait, M; Gurtu, A; Maity, M; Majumder, D; Majumder, G; Mazumdar, K; Mohanty, G B; Saha, A; Sudhakar, K; Wickramage, N; Banerjee, S; Dugad, S; Mondal, N K; Arfaei, H; Bakhshiansohi, H; Etesami, S M; Fahim, A; Hashemi, M; Jafari, A; Khakzad, M; Mohammadi, A; Najafabadi, M Mohammadi; Mehdiabadi, S Paktinat; Safarzadeh, B; Zeinali, M; Abbrescia, M; Barbone, L; Calabria, C; Colaleo, A; Creanza, D; De Filippis, N; De Palma, M; Fiore, L; Iaselli, G; Lusito, L; Maggi, G; Maggi, M; Manna, N; Marangelli, B; My, S; Nuzzo, S; Pacifico, N; Pierro, G A; Pompili, A; Pugliese, G; Romano, F; Roselli, G; Selvaggi, G; Silvestris, L; Trentadue, R; Tupputi, S; Zito, G; Abbiendi, G; 
Benvenuti, A C; Bonacorsi, D; Braibant-Giacomelli, S; Brigliadori, L; Capiluppi, P; Castro, A; Cavallo, F R; Cuffiani, M; Dallavalle, G M; Fabbri, F; Fanfani, A; Fasanella, D; Giacomelli, P; Giunta, M; Grandi, C; Marcellini, S; Masetti, G; Meneghelli, M; Montanari, A; Navarria, F L; Odorici, F; Perrotta, A; Primavera, F; Rossi, A M; Rovelli, T; Siroli, G; Travaglini, R; Albergo, S; Cappello, G; Chiorboli, M; Costa, S; Tricomi, A; Tuve, C; Barbagli, G; Ciulli, V; Civinini, C; D'Alessandro, R; Focardi, E; Frosali, S; Gallo, E; Gonzi, S; Lenzi, P; Meschini, M; Paoletti, S; Sguazzoni, G; Tropiano, A; Benussi, L; Bianco, S; Colafranceschi, S; Fabbri, F; Piccolo, D; Fabbricatore, P; Musenich, R; Benaglia, A; De Guio, F; Di Matteo, L; Gennai, S; Ghezzi, A; Malvezzi, S; Martelli, A; Massironi, A; Menasce, D; Moroni, L; Paganoni, M; Pedrini, D; Ragazzi, S; Redaelli, N; Sala, S; Tabarelli de Fatis, T; Buontempo, S; Montoya, C A Carrillo; Cavallo, N; De Cosa, A; Fabozzi, F; Iorio, A O M; Lista, L; Merola, M; Paolucci, P; Azzi, P; Bacchetta, N; Bellan, P; Biasotto, M; Bisello, D; Branca, A; Carlin, R; Checchia, P; Dorigo, T; Gasparini, F; Gozzelino, A; Gulmini, M; Lacaprara, S; Lazzizzera, I; Margoni, M; Maron, G; Meneguzzo, A T; Nespolo, M; Perrozzi, L; Pozzobon, N; Ronchese, P; Simonetto, F; Torassa, E; Tosi, M; Triossi, A; Vanini, S; Zotto, P; Zumerle, G; Baesso, P; Berzano, U; Ratti, S P; Riccardi, C; Torre, P; Vitulo, P; Viviani, C; Biasini, M; Bilei, G M; Caponeri, B; Fanò, L; Lariccia, P; Lucaroni, A; Mantovani, G; Menichelli, M; Nappi, A; Romeo, F; Santocchia, A; Taroni, S; Valdata, M; Azzurri, P; Bagliesi, G; Bernardini, J; Boccali, T; Broccolo, G; Castaldi, R; D'Agnolo, R T; Dell'Oso, R; Fiori, F; Foà, L; Giassi, A; Kraan, A; Ligabue, F; Lomtadze, T; Martini, L; Messineo, A; Palla, F; Segneri, G; Serban, A T; Spagnolo, P; Tenchini, R; Tonelli, G; Venturi, A; Verdini, P G; Barone, L; Cavallari, F; Del Re, D; Di Marco, E; Diemoz, M; Franci, D; Grassi, M; Longo, E; 
Meridiani, P; Nourbakhsh, S; Organtini, G; Pandolfi, F; Paramatti, R; Rahatlou, S; Rovelli, C; Amapane, N; Arcidiacono, R; Argiro, S; Arneodo, M; Biino, C; Botta, C; Cartiglia, N; Castello, R; Costa, M; Demaria, N; Graziano, A; Mariotti, C; Marone, M; Maselli, S; Migliore, E; Mila, G; Monaco, V; Musich, M; Obertino, M M; Pastrone, N; Pelliccioni, M; Potenza, A; Romero, A; Ruspa, M; Sacchi, R; Sola, V; Solano, A; Staiano, A; Pereira, A Vilela; Belforte, S; Cossutti, F; Della Ricca, G; Gobbo, B; Montanino, D; Penzo, A; Heo, S G; Nam, S K; Chang, S; Chung, J; Kim, D H; Kim, G N; Kim, J E; Kong, D J; Park, H; Ro, S R; Son, D; Son, D C; Son, T; Kim, Zero; Kim, J Y; Song, S; Choi, S; Hong, B; Jo, M; Kim, H; Kim, J H; Kim, T J; Lee, K S; Moon, D H; Park, S K; Sim, K S; Choi, M; Kang, S; Kim, H; Park, C; Park, I C; Park, S; Ryu, G; Choi, Y; Choi, Y K; Goh, J; Kim, M S; Lee, J; Lee, S; Seo, H; Yu, I; Bilinskas, M J; Grigelionis, I; Janulis, M; Martisiute, D; Petrov, P; Sabonis, T; Castilla-Valdez, H; De La Cruz-Burelo, E; Heredia-de La Cruz, I; Lopez-Fernandez, R; Villalba, R Magaña; Sánchez-Hernández, A; Villasenor-Cendejas, L M; Moreno, S Carrillo; Valencia, F Vazquez; Ibarguen, H A Salazar; Linares, E Casimiro; Pineda, A Morelos; Reyes-Santos, M A; Krofcheck, D; Tam, J; Butler, P H; Doesburg, R; Silverwood, H; Ahmad, M; Ahmed, I; Asghar, M I; Hoorani, H R; Khan, W A; Khurshid, T; Qazi, S; Brona, G; Cwiok, M; Dominik, W; Doroba, K; Kalinowski, A; Konecki, M; Krolikowski, J; Frueboes, T; Gokieli, R; Górski, M; Kazana, M; Nawrocki, K; Romanowska-Rybinska, K; Szleper, M; Wrochna, G; Zalewski, P; Almeida, N; Bargassa, P; David, A; Faccioli, P; Parracho, P G Ferreira; Gallinaro, M; Musella, P; Nayak, A; Pela, J; Ribeiro, P Q; Seixas, J; Varela, J; Afanasiev, S; Belotelov, I; Bunin, P; Golutvin, I; Karjavin, V; Kozlov, G; Lanev, A; Moisenz, P; Palichik, V; Perelygin, V; Savina, M; Shmatov, S; Smirnov, V; Volodko, A; Zarubin, A; Golovtsov, V; Ivanov, Y; Kim, V; Levchenko, P; 
Murzin, V; Oreshkin, V; Smirnov, I; Sulimov, V; Uvarov, L; Vavilov, S; Vorobyev, A; Vorobyev, An; Andreev, Yu; Dermenev, A; Gninenko, S; Golubev, N; Kirsanov, M; Krasnikov, N; Matveev, V; Pashenkov, A; Toropin, A; Troitsky, S; Epshteyn, V; Gavrilov, V; Kaftanov, V; Kossov, M; Krokhotin, A; Lychkovskaya, N; Popov, V; Safronov, G; Semenov, S; Stolin, V; Vlasov, E; Zhokin, A; Boos, E; Dubinin, M; Dudko, L; Ershov, A; Gribushin, A; Kodolova, O; Lokhtin, I; Markina, A; Obraztsov, S; Perfilov, M; Petrushanko, S; Sarycheva, L; Savrin, V; Snigirev, A; Andreev, V; Azarkin, M; Dremin, I; Kirakosyan, M; Leonidov, A; Rusakov, S V; Vinogradov, A; Azhgirey, I; Bayshev, I; Bitioukov, S; Grishin, V; Kachanov, V; Konstantinov, D; Korablev, A; Krychkine, V; Petrov, V; Ryutin, R; Sobol, A; Tourtchanovitch, L; Troshin, S; Tyurin, N; Uzunian, A; Volkov, A; Adzic, P; Djordjevic, M; Krpic, D; Milosevic, J; Aguilar-Benitez, M; Maestre, J Alcaraz; Arce, P; Battilana, C; Calvo, E; Cepeda, M; Cerrada, M; Llatas, M Chamizo; Colino, N; De La Cruz, B; Peris, A Delgado; Pardos, C Diez; Vázquez, D Domínguez; Bedoya, C Fernandez; Ramos, J P Fernández; Ferrando, A; Flix, J; Fouz, M C; Garcia-Abia, P; Lopez, O Gonzalez; Lopez, S Goy; Hernandez, J M; Josa, M I; Merino, G; Pelayo, J Puerta; Redondo, I; Romero, L; Santaolalla, J; Soares, M S; Willmott, C; Albajar, C; Codispoti, G; de Trocóniz, J F; Cuevas, J; Menendez, J Fernandez; Folgueras, S; Caballero, I Gonzalez; Iglesias, L Lloret; Garcia, J M Vizan; Cifuentes, J A Brochero; Cabrillo, I J; Calderon, A; Chuang, S H; Campderros, J Duarte; Felcini, M; Fernandez, M; Gomez, G; Sanchez, J Gonzalez; Jorda, C; Pardo, P Lobelle; Virto, A Lopez; Marco, J; Marco, R; Rivero, C Martinez; Matorras, F; Sanchez, F J Munoz; Gomez, J Piedra; Rodrigo, T; Rodríguez-Marrero, A Y; Ruiz-Jimeno, A; Scodellaro, L; Sanudo, M Sobron; Vila, I; Cortabitarte, R Vilar; Abbaneo, D; Auffray, E; Auzinger, G; Baillon, P; Ball, A H; Barney, D; Bell, A J; Benedetti, D; Bernet, C; 
Bialas, W; Bloch, P; Bocci, A; Bolognesi, S; Bona, M; Breuker, H; Bunkowski, K; Camporesi, T; Cerminara, G; Christiansen, T; Perez, J A Coarasa; Curé, B; D'Enterria, D; De Roeck, A; Di Guida, S; Dupont-Sagorin, N; Elliott-Peisert, A; Frisch, B; Funk, W; Gaddi, A; Georgiou, G; Gerwig, H; Gigi, D; Gill, K; Giordano, D; Glege, F; Garrido, R Gomez-Reino; Gouzevitch, M; Govoni, P; Gowdy, S; Guiducci, L; Hansen, M; Hartl, C; Harvey, J; Hegeman, J; Hegner, B; Hoffmann, H F; Honma, A; Innocente, V; Janot, P; Kaadze, K; Karavakis, E; Lecoq, P; Lourenço, C; Mäki, T; Malberti, M; Malgeri, L; Mannelli, M; Masetti, L; Maurisset, A; Meijers, F; Mersi, S; Meschi, E; Moser, R; Mozer, M U; Mulders, M; Nesvold, E; Nguyen, M; Orimoto, T; Orsini, L; Perez, E; Petrilli, A; Pfeiffer, A; Pierini, M; Pimiä, M; Piparo, D; Polese, G; Racz, A; Antunes, J Rodrigues; Rolandi, G; Rommerskirchen, T; Rovere, M; Sakulin, H; Schäfer, C; Schwick, C; Segoni, I; Sharma, A; Siegrist, P; Simon, M; Sphicas, P; Spiropulu, M; Stoye, M; Tropea, P; Tsirou, A; Vichoudis, P; Voutilainen, M; Zeuner, W D; Bertl, W; Deiters, K; Erdmann, W; Gabathuler, K; Horisberger, R; Ingram, Q; Kaestli, H C; König, S; Kotlinski, D; Langenegger, U; Meier, F; Renker, D; Rohe, T; Sibille, J; Starodumov, A; Bäni, L; Bortignon, P; Caminada, L; Chanon, N; Chen, Z; Cittolin, S; Dissertori, G; Dittmar, M; Eugster, J; Freudenreich, K; Grab, C; Hintz, W; Lecomte, P; Lustermann, W; Marchica, C; Ruiz del Arbol, P Martinez; Milenovic, P; Moortgat, F; Nägeli, C; Nef, P; Nessi-Tedaldi, F; Pape, L; Pauss, F; Punz, T; Rizzi, A; Ronga, F J; Rossini, M; Sala, L; Sanchez, A K; Sawley, M-C; Stieger, B; Tauscher, L; Thea, A; Theofilatos, K; Treille, D; Urscheler, C; Wallny, R; Weber, M; Wehrli, L; Weng, J; Aguilo, E; Amsler, C; Chiochia, V; De Visscher, S; Favaro, C; Rikova, M Ivova; Mejias, B Millan; Otiougova, P; Regenfus, C; Robmann, P; Schmidt, A; Snoek, H; Chang, Y H; Chen, K H; Kuo, C M; Li, S W; Lin, W; Liu, Z K; Lu, Y J; Mekterovic, D; 
Volpe, R; Wu, J H; Yu, S S; Bartalini, P; Chang, P; Chang, Y H; Chang, Y W; Chao, Y; Chen, K F; Hou, W-S; Hsiung, Y; Kao, K Y; Lei, Y J; Lu, R-S; Shiu, J G; Tzeng, Y M; Wang, M; Adiguzel, A; Bakirci, M N; Cerci, S; Dozen, C; Dumanoglu, I; Eskut, E; Girgis, S; Gokbulut, G; Hos, I; Kangal, E E; Topaksu, A Kayis; Onengut, G; Ozdemir, K; Ozturk, S; Polatoz, A; Sogut, K; Cerci, D Sunar; Tali, B; Topakli, H; Uzun, D; Vergili, L N; Vergili, M; Akin, I V; Aliev, T; Bilin, B; Bilmis, S; Deniz, M; Gamsizkan, H; Guler, A M; Ocalan, K; Ozpineci, A; Serin, M; Sever, R; Surat, U E; Yildirim, E; Zeyrek, M; Deliomeroglu, M; Demir, D; Gülmez, E; Isildak, B; Kaya, M; Kaya, O; Ozbek, M; Ozkorucuklu, S; Sonmez, N; Levchuk, L; Bostock, F; Brooke, J J; Cheng, T L; Clement, E; Cussans, D; Frazier, R; Goldstein, J; Grimes, M; Hansen, M; Hartley, D; Heath, G P; Heath, H F; Kreczko, L; Metson, S; Newbold, D M; Nirunpong, K; Poll, A; Senkin, S; Smith, V J; Ward, S; Basso, L; Bell, K W; Belyaev, A; Brew, C; Brown, R M; Camanzi, B; Cockerill, D J A; Coughlan, J A; Harder, K; Harper, S; Jackson, J; Kennedy, B W; Olaiya, E; Petyt, D; Radburn-Smith, B C; Shepherd-Themistocleous, C H; Tomalin, I R; Womersley, W J; Worm, S D; Bainbridge, R; Ball, G; Ballin, J; Beuselinck, R; Buchmuller, O; Colling, D; Cripps, N; Cutajar, M; Davies, G; Della Negra, M; Ferguson, W; Fulcher, J; Futyan, D; Gilbert, A; Bryer, A Guneratne; Hall, G; Hatherell, Z; Hays, J; Iles, G; Jarvis, M; Karapostoli, G; Lyons, L; MacEvoy, B C; Magnan, A-M; Marrouche, J; Mathias, B; Nandi, R; Nash, J; Nikitenko, A; Papageorgiou, A; Pesaresi, M; Petridis, K; Pioppi, M; Raymond, D M; Rogerson, S; Rompotis, N; Rose, A; Ryan, M J; Seez, C; Sharp, P; Sparrow, A; Tapper, A; Tourneur, S; Acosta, M Vazquez; Virdee, T; Wakefield, S; Wardle, N; Wardrope, D; Whyntie, T; Barrett, M; Chadwick, M; Cole, J E; Hobson, P R; Khan, A; Kyberd, P; Leslie, D; Martin, W; Reid, I D; Teodorescu, L; Hatakeyama, K; Liu, H; Henderson, C; Bose, T; Jarrin, E 
Carrera; Fantasia, C; Heister, A; St John, J; Lawson, P; Lazic, D; Rohlf, J; Sperka, D; Sulak, L; Avetisyan, A; Bhattacharya, S; Chou, J P; Cutts, D; Ferapontov, A; Heintz, U; Jabeen, S; Kukartsev, G; Landsberg, G; Luk, M; Narain, M; Nguyen, D; Segala, M; Sinthuprasith, T; Speer, T; Tsang, K V; Breedon, R; Breto, G; Calderon De La Barca Sanchez, M; Chauhan, S; Chertok, M; Conway, J; Cox, P T; Dolen, J; Erbacher, R; Friis, E; Ko, W; Kopecky, A; Lander, R; Liu, H; Maruyama, S; Miceli, T; Nikolic, M; Pellett, D; Robles, J; Salur, S; Schwarz, T; Searle, M; Smith, J; Squires, M; Tripathi, M; Sierra, R Vasquez; Veelken, C; Andreev, V; Arisaka, K; Cline, D; Cousins, R; Deisher, A; Duris, J; Erhan, S; Farrell, C; Hauser, J; Ignatenko, M; Jarvis, C; Plager, C; Rakness, G; Schlein, P; Tucker, J; Valuev, V; Babb, J; Chandra, A; Clare, R; Ellison, J; Gary, J W; Giordano, F; Hanson, G; Jeng, G Y; Kao, S C; Liu, F; Liu, H; Long, O R; Luthra, A; Nguyen, H; Shen, B C; Stringer, R; Sturdy, J; Sumowidagdo, S; Wilken, R; Wimpenny, S; Andrews, W; Branson, J G; Cerati, G B; Evans, D; Golf, F; Holzner, A; Kelley, R; Lebourgeois, M; Letts, J; Mangano, B; Padhi, S; Palmer, C; Petrucciani, G; Pi, H; Pieri, M; Ranieri, R; Sani, M; Sharma, V; Simon, S; Sudano, E; Tadel, M; Tu, Y; Vartak, A; Wasserbaech, S; Würthwein, F; Yagil, A; Yoo, J; Barge, D; Bellan, R; Campagnari, C; D'Alfonso, M; Danielson, T; Flowers, K; Geffert, P; Incandela, J; Justus, C; Kalavase, P; Koay, S A; Kovalskyi, D; Krutelyov, V; Lowette, S; Mccoll, N; Pavlunin, V; Rebassoo, F; Ribnik, J; Richman, J; Rossin, R; Stuart, D; To, W; Vlimant, J R; Apresyan, A; Bornheim, A; Bunn, J; Chen, Y; Gataullin, M; Ma, Y; Mott, A; Newman, H B; Rogan, C; Shin, K; Timciuc, V; Traczyk, P; Veverka, J; Wilkinson, R; Yang, Y; Zhu, R Y; Akgun, B; Carroll, R; Ferguson, T; Iiyama, Y; Jang, D W; Jun, S Y; Liu, Y F; Paulini, M; Russ, J; Vogel, H; Vorobiev, I; Cumalat, J P; Dinardo, M E; Drell, B R; Edelmaier, C J; Ford, W T; Gaz, A; Heyburn, B; 
Lopez, E Luiggi; Nauenberg, U; Smith, J G; Stenson, K; Ulmer, K A; Wagner, S R; Zang, S L; Agostino, L; Alexander, J; Cassel, D; Chatterjee, A; Das, S; Eggert, N; Gibbons, L K; Heltsley, B; Hopkins, W; Khukhunaishvili, A; Kreis, B; Kaufman, G Nicolas; Patterson, J R; Puigh, D; Ryd, A; Salvati, E; Shi, X; Sun, W; Teo, W D; Thom, J; Thompson, J; Vaughan, J; Weng, Y; Winstrom, L; Wittich, P; Biselli, A; Cirino, G; Winn, D; Abdullin, S; Albrow, M; Anderson, J; Apollinari, G; Atac, M; Bakken, J A; Bauerdick, L A T; Beretvas, A; Berryhill, J; Bhat, P C; Bloch, I; Borcherding, F; Burkett, K; Butler, J N; Chetluru, V; Cheung, H W K; Chlebana, F; Cihangir, S; Cooper, W; Eartly, D P; Elvira, V D; Esen, S; Fisk, I; Freeman, J; Gao, Y; Gottschalk, E; Green, D; Gunthoti, K; Gutsche, O; Hanlon, J; Harris, R M; Hirschauer, J; Hooberman, B; Jensen, H; Johnson, M; Joshi, U; Khatiwada, R; Klima, B; Kousouris, K; Kunori, S; Kwan, S; Leonidopoulos, C; Limon, P; Lincoln, D; Lipton, R; Lykken, J; Maeshima, K; Marraffino, J M; Mason, D; McBride, P; Miao, T; Mishra, K; Mrenna, S; Musienko, Y; Newman-Holmes, C; O'Dell, V; Pordes, R; Prokofyev, O; Saoulidou, N; Sexton-Kennedy, E; Sharma, S; Spalding, W J; Spiegel, L; Tan, P; Taylor, L; Tkaczyk, S; Uplegger, L; Vaandering, E W; Vidal, R; Whitmore, J; Wu, W; Yang, F; Yumiceva, F; Yun, J C; Acosta, D; Avery, P; Bourilkov, D; Chen, M; De Gruttola, M; Di Giovanni, G P; Dobur, D; Drozdetskiy, A; Field, R D; Fisher, M; Fu, Y; Furic, I K; Gartner, J; Kim, B; Konigsberg, J; Korytov, A; Kropivnitskaya, A; Kypreos, T; Matchev, K; Mitselmakher, G; Muniz, L; Prescott, C; Remington, R; Schmitt, M; Scurlock, B; Sellers, P; Skhirtladze, N; Snowball, M; Wang, D; Yelton, J; Zakaria, M; Ceron, C; Gaultney, V; Kramer, L; Lebolo, L M; Linn, S; Markowitz, P; Martinez, G; Mesa, D; Rodriguez, J L; Adams, T; Askew, A; Bochenek, J; Chen, J; Diamond, B; Gleyzer, S V; Haas, J; Hagopian, S; Hagopian, V; Jenkins, M; Johnson, K F; Prosper, H; Quertenmont, L; Sekmen, S; 
Veeraraghavan, V; Baarmand, M M; Dorney, B; Guragain, S; Hohlmann, M; Kalakhety, H; Ralich, R; Vodopiyanov, I; Adams, M R; Anghel, I M; Apanasevich, L; Bai, Y; Bazterra, V E; Betts, R R; Callner, J; Cavanaugh, R; Dragoiu, C; Gauthier, L; Gerber, C E; Hofman, D J; Khalatyan, S; Kunde, G J; Lacroix, F; Malek, M; O'Brien, C; Silkworth, C; Silvestre, C; Smoron, A; Strom, D; Varelas, N; Akgun, U; Albayrak, E A; Bilki, B; Clarida, W; Duru, F; Lae, C K; McCliment, E; Merlo, J-P; Mermerkaya, H; Mestvirishvili, A; Moeller, A; Nachtman, J; Newsom, C R; Norbeck, E; Olson, J; Onel, Y; Ozok, F; Sen, S; Wetzel, J; Yetkin, T; Yi, K; Barnett, B A; Blumenfeld, B; Bonato, A; Eskew, C; Fehling, D; Giurgiu, G; Gritsan, A V; Guo, Z J; Hu, G; Maksimovic, P; Rappoccio, S; Swartz, M; Tran, N V; Whitbeck, A; Baringer, P; Bean, A; Benelli, G; Grachov, O; Kenny, R P; Murray, M; Noonan, D; Sanders, S; Wood, J S; Zhukova, V; Barfuss, A F; Bolton, T; Chakaberia, I; Ivanov, A; Khalil, S; Makouski, M; Maravin, Y; Shrestha, S; Svintradze, I; Wan, Z; Gronberg, J; Lange, D; Wright, D; Baden, A; Boutemeur, M; Eno, S C; Ferencek, D; Gomez, J A; Hadley, N J; Kellogg, R G; Kirn, M; Lu, Y; Mignerey, A C; Rossato, K; Rumerio, P; Santanastasio, F; Skuja, A; Temple, J; Tonjes, M B; Tonwar, S C; Twedt, E; Alver, B; Bauer, G; Bendavid, J; Busza, W; Butz, E; Cali, I A; Chan, M; Dutta, V; Everaerts, P; Ceballos, G Gomez; Goncharov, M; Hahn, K A; Harris, P; Kim, Y; Klute, M; Lee, Y-J; Li, W; Loizides, C; Luckey, P D; Ma, T; Nahn, S; Paus, C; Ralph, D; Roland, C; Roland, G; Rudolph, M; Stephans, G S F; Stöckli, F; Sumorok, K; Sung, K; Wenger, E A; Wolf, R; Xie, S; Yang, M; Yilmaz, Y; Yoon, A S; Zanetti, M; Cooper, S I; Cushman, P; Dahmes, B; De Benedetti, A; Dudero, P R; Franzoni, G; Haupt, J; Klapoetke, K; Kubota, Y; Mans, J; Pastika, N; Rekovic, V; Rusack, R; Sasseville, M; Singovsky, A; Tambe, N; Cremaldi, L M; Godang, R; Kroeger, R; Perera, L; Rahmat, R; Sanders, D A; Summers, D; Bloom, K; Bose, S; Butt, J; 
Claes, D R; Dominguez, A; Eads, M; Keller, J; Kelly, T; Kravchenko, I; Lazo-Flores, J; Malbouisson, H; Malik, S; Snow, G R; Baur, U; Godshalk, A; Iashvili, I; Jain, S; Kharchilava, A; Kumar, A; Shipkowski, S P; Smith, K; Zennamo, J; Alverson, G; Barberis, E; Baumgartel, D; Boeriu, O; Chasco, M; Reucroft, S; Swain, J; Trocino, D; Wood, D; Zhang, J; Anastassov, A; Kubik, A; Odell, N; Ofierzynski, R A; Pollack, B; Pozdnyakov, A; Schmitt, M; Stoynev, S; Velasco, M; Won, S; Antonelli, L; Berry, D; Brinkerhoff, A; Hildreth, M; Jessop, C; Karmgard, D J; Kolb, J; Kolberg, T; Lannon, K; Luo, W; Lynch, S; Marinelli, N; Morse, D M; Pearson, T; Ruchti, R; Slaunwhite, J; Valls, N; Wayne, M; Ziegler, J; Bylsma, B; Durkin, L S; Gu, J; Hill, C; Killewald, P; Kotov, K; Ling, T Y; Rodenburg, M; Williams, G; Adam, N; Berry, E; Elmer, P; Gerbaudo, D; Halyo, V; Hebda, P; Hunt, A; Jones, J; Laird, E; Pegna, D Lopes; Marlow, D; Medvedeva, T; Mooney, M; Olsen, J; Piroué, P; Quan, X; Saka, H; Stickland, D; Tully, C; Werner, J S; Zuranski, A; Acosta, J G; Huang, X T; Lopez, A; Mendez, H; Oliveros, S; Vargas, J E Ramirez; Zatserklyaniy, A; Alagoz, E; Barnes, V E; Bolla, G; Borrello, L; Bortoletto, D; De Mattia, M; Everett, A; Garfinkel, A F; Gutay, L; Hu, Z; Jones, M; Koybasi, O; Kress, M; Laasanen, A T; Leonardo, N; Liu, C; Maroussov, V; Merkel, P; Miller, D H; Neumeister, N; Shipsey, I; Silvers, D; Svyatkovskiy, A; Yoo, H D; Zablocki, J; Zheng, Y; Jindal, P; Parashar, N; Boulahouache, C; Ecklund, K M; Geurts, F J M; Padley, B P; Redjimi, R; Roberts, J; Zabel, J; Betchart, B; Bodek, A; Chung, Y S; Covarelli, R; de Barbaro, P; Demina, R; Eshaq, Y; Flacher, H; Garcia-Bellido, A; Goldenzweig, P; Gotra, Y; Han, J; Harel, A; Miner, D C; Orbaker, D; Petrillo, G; Sakumoto, W; Vishnevskiy, D; Zielinski, M; Bhatti, A; Ciesielski, R; Demortier, L; Goulianos, K; Lungu, G; Malik, S; Mesropian, C; Yan, M; Atramentov, O; Barker, A; Duggan, D; Gershtein, Y; Gray, R; Halkiadakis, E; Hidas, D; Hits, D; 
Lath, A; Panwalkar, S; Patel, R; Rose, K; Schnetzer, S; Somalwar, S; Stone, R; Thomas, S; Cerizza, G; Hollingsworth, M; Spanier, S; Yang, Z C; York, A; Eusebi, R; Flanagan, W; Gilmore, J; Gurrola, A; Kamon, T; Khotilovich, V; Montalvo, R; Osipenkov, I; Pakhotin, Y; Pivarski, J; Safonov, A; Sengupta, S; Tatarinov, A; Toback, D; Weinberger, M; Akchurin, N; Bardak, C; Damgov, J; Jeong, C; Kovitanggoon, K; Lee, S W; Libeiro, T; Mane, P; Roh, Y; Sill, A; Volobouev, I; Wigmans, R; Yazgan, E; Appelt, E; Brownson, E; Engh, D; Florez, C; Gabella, W; Issah, M; Johns, W; Kurt, P; Maguire, C; Melo, A; Sheldon, P; Snook, B; Tuo, S; Velkovska, J; Arenton, M W; Balazs, M; Boutle, S; Cox, B; Francis, B; Hirosky, R; Ledovskoy, A; Lin, C; Neu, C; Yohay, R; Gollapinni, S; Harr, R; Karchin, P E; Lamichhane, P; Mattson, M; Milstène, C; Sakharov, A; Anderson, M; Bachtis, M; Bellinger, J N; Carlsmith, D; Dasu, S; Efron, J; Flood, K; Gray, L; Grogg, K S; Grothe, M; Hall-Wilton, R; Herndon, M; Hervé, A; Klabbers, P; Klukas, J; Lanaro, A; Lazaridis, C; Leonard, J; Loveless, R; Mohapatra, A; Palmonari, F; Reeder, D; Ross, I; Savin, A; Smith, W H; Swanson, J; Weinberg, M
2011-11-11
A study of events with missing transverse energy and an energetic jet is performed using pp collision data at a center-of-mass energy of 7 TeV. The data were collected by the CMS detector at the LHC, and correspond to an integrated luminosity of 36 pb⁻¹. An excess of these events over standard model contributions is a signature of new physics such as large extra dimensions and unparticles. The number of observed events is in good agreement with the prediction of the standard model, and significant extension of the current limits on parameters of new physics benchmark models is achieved.
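The logic of such a counting search (observed events consistent with the background prediction, translated into limits) can be illustrated with a minimal classical Poisson upper-limit sketch. This is not the CMS statistical procedure, which is far more sophisticated; the scan step and the toy inputs below are assumptions for illustration only.

```python
import math

def poisson_cdf(n_obs, mu):
    # P(N <= n_obs) for N ~ Poisson(mu)
    return sum(math.exp(-mu) * mu**k / math.factorial(k)
               for k in range(n_obs + 1))

def upper_limit(n_obs, background, cl=0.95):
    # Classical counting-experiment upper limit on the signal yield s:
    # the smallest s for which P(N <= n_obs | b + s) drops to 1 - cl.
    s, step = 0.0, 0.01
    while poisson_cdf(n_obs, background + s) > 1.0 - cl:
        s += step
    return s
```

With zero observed events and zero background this scan recovers the familiar 95% CL value of about 3 signal events; a nonzero background prediction tightens the limit on the signal accordingly.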
Sensitivity and Uncertainty Analysis of the GFR MOX Fuel Subassembly
NASA Astrophysics Data System (ADS)
Lüley, J.; Vrban, B.; Čerba, Š.; Haščík, J.; Nečas, V.; Pelloni, S.
2014-04-01
We performed sensitivity and uncertainty analysis, as well as benchmark similarity assessment, of the MOX fuel subassembly designed for the Gas-Cooled Fast Reactor (GFR) as a representative core material. The material composition was defined for each assembly ring separately, allowing us to decompose the sensitivities not only by isotope and reaction but also by spatial region. This approach was confirmed by direct perturbation calculations for chosen materials and isotopes. The similarity assessment identified only ten partly comparable benchmark experiments that can be utilized in the field of GFR development. Based on the determined uncertainties, we also identified the main contributors to the calculation bias.
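The direct perturbation check mentioned above has a simple first-order form: perturb one cross section by a known relative amount, recompute the eigenvalue, and compare the resulting sensitivity coefficient with the adjoint-based value. A minimal sketch, with illustrative k-eigenvalues rather than values from the GFR study:

```python
# Hedged sketch: first-order sensitivity coefficient from a direct
# perturbation. The k values and the 1% perturbation are illustrative,
# not results from the GFR MOX subassembly analysis.

def sensitivity(k_ref, k_pert, rel_pert):
    """S = (dk/k) / (dsigma/sigma), first-order finite difference."""
    return (k_pert - k_ref) / k_ref / rel_pert

# Example: a +1% perturbation of a capture cross section that lowers k
k_ref, k_pert = 1.00230, 1.00150
S = sensitivity(k_ref, k_pert, 0.01)   # negative, as expected for capture
```

Agreement between such direct-perturbation values and the decomposed adjoint sensitivities is what confirms the region-wise decomposition.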
NASA Astrophysics Data System (ADS)
Zhirkin, A. V.; Alekseev, P. N.; Batyaev, V. F.; Gurevich, M. I.; Dudnikov, A. A.; Kuteev, B. V.; Pavlov, K. V.; Titarenko, Yu. E.; Titarenko, A. Yu.
2017-06-01
In this report, the calculational accuracy requirements for the main parameters of the fusion neutron source, and of thermonuclear blankets with a DT fusion power of more than 10 MW, are formulated. To conduct the benchmark experiments, technical documentation and calculation models were developed for two blanket micro-models: the molten salt and the heavy water solid-state blankets. The calculations of the neutron spectra, and of 37 dosimetric reaction rates widely used for the registration of thermal, resonance, and threshold (0.25-13.45 MeV) neutrons, were performed for each blanket micro-model. The MCNP code and the ENDF/B-VII neutron data library were used for the calculations. All the calculations were performed for two kinds of neutron source: source I is the fusion source, and source II is neutrons generated by a 7Li target irradiated by 24.6 MeV protons. Spectral index ratios were calculated to describe the spectrum variation between the two neutron sources. The obtained results demonstrate the advantage of using the fusion neutron source in future experiments.
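A spectral index is the ratio of two reaction rates obtained by folding the neutron spectrum with dosimetry cross sections; a minimal sketch, using an invented 3-group spectrum rather than the fine-group ENDF/B-VII data used in the study:

```python
import numpy as np

# Hedged sketch: folding a group-wise neutron spectrum with two dosimetry
# cross sections and forming their spectral index (ratio of reaction
# rates). The 3-group spectrum and cross sections are invented for
# illustration only.

phi = np.array([0.2, 0.5, 0.3])        # group fluxes (thermal, resonance, fast)
sigma_thr = np.array([0.0, 0.1, 1.5])  # threshold reaction, fast-sensitive
sigma_res = np.array([2.0, 0.8, 0.1])  # resonance/thermal-sensitive reaction

rate_thr = float(np.dot(sigma_thr, phi))
rate_res = float(np.dot(sigma_res, phi))
spectral_index = rate_thr / rate_res   # shifts with spectrum hardness
```

Because the two reactions weight different energy ranges, the index changes when the source spectrum changes, which is what makes it useful for comparing the fusion and 7Li(p,n) sources.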
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, Timothy F. G., E-mail: tim.green@materials.ox.ac.uk; Yates, Jonathan R., E-mail: jonathan.yates@materials.ox.ac.uk
2014-06-21
We present a method for the first-principles calculation of nuclear magnetic resonance (NMR) J-coupling in extended systems using state-of-the-art ultrasoft pseudopotentials and including scalar-relativistic effects. The use of ultrasoft pseudopotentials is allowed by extending the projector augmented wave (PAW) method of Joyce et al. [J. Chem. Phys. 127, 204107 (2007)]. We benchmark it against existing local-orbital quantum chemical calculations and experiments for small molecules containing light elements, with good agreement. Scalar-relativistic effects are included at the zeroth-order regular approximation level of theory and benchmarked against existing local-orbital quantum chemical calculations and experiments for a number of small molecules containing the heavy row-six elements W, Pt, Hg, Tl, and Pb, with good agreement. Finally, ¹J(P-Ag) and ²J(P-Ag-P) couplings are calculated in some larger molecular crystals and compared against solid-state NMR experiments. Some remarks are also made as to improving the numerical stability of dipole perturbations using PAW.
The Hohlraum Drive Campaign on the National Ignition Facility
NASA Astrophysics Data System (ADS)
Moody, John D.
2013-10-01
The Hohlraum drive effort on the National Ignition Facility (NIF) laser has three primary goals: 1) improve hohlraum performance by improving laser beam propagation, reducing backscatter from laser plasma interactions (LPI), controlling x-ray and electron preheat, and modifying the x-ray drive spectrum; 2) improve understanding of crossbeam energy transfer physics to better evaluate this as a symmetry tuning method; and 3) improve modeling in order to find optimum designs. Our experimental strategy for improving performance explores the impact of significant changes to the hohlraum shape, wall material, gasfill composition, and gasfill density on integrated implosion experiments. We are investigating the performance of a rugby-shaped design that has a significantly larger diameter (7 mm) at the waist than our standard 5.75 mm diameter cylindrical-shaped hohlraum but maintains approximately the same wall area. We are also exploring changes to the gasfill composition in cylindrical hohlraums by using neopentane at room temperature to compare with our standard helium gasfill. In addition, we are also investigating higher He gasfill density (1.6 mg/cc vs nominal 0.96 mg/cc) and increased x-ray drive very early in the pulse. Besides these integrated experiments, our strategy includes experiments testing separate aspects of the hohlraum physics. These include time-resolved and time-integrated measurements of cross-beam transfer rates and laser-beam spatial power distribution at early and late times using modified targets. Non-local thermal equilibrium modeling and heat transport relevant to ignition experiments are being studied using sphere targets on the Omega laser system. These simpler targets provide benchmarks for improving our modeling tools. This talk will summarize the results of the Hohlraum Drive campaign and discuss future directions. This work was performed under the auspices of the U.S. 
Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
Protein docking prediction using predicted protein-protein interface.
Li, Bin; Kihara, Daisuke
2012-01-10
Many important cellular processes are carried out by protein complexes. To provide physical pictures of interacting proteins, many computational protein docking prediction methods have been developed in the past. However, it is still difficult to identify the correct docking complex structure within top ranks among alternative conformations. We present a novel protein docking algorithm that utilizes imperfect protein-protein binding interface prediction for guiding protein docking. Since the accuracy of protein binding site prediction varies depending on cases, the challenge is to develop a method which does not deteriorate but improves docking results by using a binding site prediction which may not be 100% accurate. The algorithm, named PI-LZerD (using Predicted Interface with Local 3D Zernike descriptor-based Docking algorithm), is based on a pairwise protein docking prediction algorithm, LZerD, which we developed earlier. PI-LZerD starts by performing docking prediction using the provided protein-protein binding interface prediction as constraints, followed by a second round of docking with updated docking interface information to further improve the docking conformation. Benchmark results on bound and unbound cases show that PI-LZerD consistently improves docking prediction accuracy as compared with docking without using binding site prediction or using the binding site prediction only as post-filtering. We have developed PI-LZerD, a pairwise docking algorithm, which uses imperfect protein-protein binding interface prediction to improve docking accuracy. PI-LZerD consistently showed better prediction accuracy than alternative methods in a series of benchmark experiments, including docking using actual docking interface site predictions as well as unbound docking cases.
NRL 1989 Beam Propagation Studies in Support of the ATA Multi-Pulse Propagation Experiment
1990-08-31
The papers presented here were all written prior to the completion of the experiment. The first of these papers presents simulation results which modeled … beam stability and channel evolution for an entire five-pulse burst. The second paper describes a new air chemistry model used in the SARLAC … Experiment: A new air chemistry model for use in the propagation codes simulating the MPPE was developed by making analytic fits to benchmark runs with
Design of an instrument to measure the quality of care in Physical Therapy
Cavalheiro, Leny Vieira; Eid, Raquel Afonso Caserta; Talerman, Claudia; do Prado, Cristiane; Gobbi, Fátima Cristina Martorano; Andreoli, Paola Bruno de Araujo
2015-01-01
Objective: To design an instrument composed of domains that would demonstrate physical therapy activities and generate a consistent index to represent the quality of care in physical therapy. Methods: The Lean Six Sigma methodology was used to design the tool. The discussion involved staff from seven different management groups. By means of brainstorming and a Cause & Effect Matrix, we set up the process map. Results: After application of the Cause & Effect Matrix, five requirements composed the quality-of-care index in physical therapy. The following requirements were assessed: physical therapist performance, care outcome indicator, adherence to physical therapy protocols, achievement of the expected prognosis and treatment outcome, and infrastructure. Conclusion: The proposed design allowed evaluating several items related to the physical therapy service, enabling customization, reproducibility, and benchmarking with other organizations. For management, this index provides the opportunity to identify areas for improvement and the strengths of the team and of the physical therapy care process. PMID:26154548
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harms, Gary A.; Ford, John T.; Barber, Allison Delo
2010-11-01
Sandia National Laboratories (SNL) has conducted radiation effects testing for the Department of Energy (DOE) and other contractors supporting the DOE since the 1960s. Over this period, the research reactor facilities at Sandia have had a primary mission to provide appropriate nuclear radiation environments for radiation testing and qualification of electronic components and other devices. The current generation of reactors includes the Annular Core Research Reactor (ACRR), a water-moderated pool-type reactor, fueled by elements constructed from UO2-BeO ceramic fuel pellets, and the Sandia Pulse Reactor III (SPR-III), a bare metal fast burst reactor utilizing a uranium-molybdenum alloy fuel. The SPR-III is currently defueled. The SPR Facility (SPRF) has hosted a series of critical experiments. A purpose-built critical experiment was first operated at the SPRF in the late 1980s. This experiment, called the Space Nuclear Thermal Propulsion Critical Experiment (CX), was designed to explore the reactor physics of a nuclear thermal rocket motor. This experiment was fueled with highly-enriched uranium carbide fuel in annular water-moderated fuel elements. The experiment program was completed and the fuel for the experiment was moved off-site. A second critical experiment, the Burnup Credit Critical Experiment (BUCCX), was operated at Sandia in 2002. The critical assembly for this experiment was based on the assembly used in the CX, modified to accommodate low-enriched pin-type fuel in water moderator. This experiment was designed as a platform in which the reactivity effects of specific fission product poisons could be measured. Experiments were carried out on rhodium, an important fission product poison. The fuel and assembly hardware for the BUCCX remain at Sandia and are available for future experimentation. The critical experiment currently in operation at the SPRF is the Seven Percent Critical Experiment (7uPCX).
This experiment is designed to provide benchmark reactor physics data to support validation of the reactor physics codes used to design commercial reactor fuel elements in an enrichment range above the current 5% enrichment cap. A first set of critical experiments in the 7uPCX has been completed. More experiments are planned in the 7uPCX series. The critical experiments at Sandia National Laboratories are currently funded by the US Department of Energy Nuclear Criticality Safety Program (NCSP). The NCSP has committed to maintain the critical experiment capability at Sandia and to support the development of a critical experiments training course at the facility. The training course is intended to provide hands-on experiment experience for the training of new, and the re-training of practicing, Nuclear Criticality Safety Engineers. The current plans are for the development of the course to continue through the first part of fiscal year 2011, with the development culminating in the delivery of a prototype of the course in the latter part of the fiscal year. The course will be available in fiscal year 2012.
2011-03-25
number one and Nebulae at number three. Both systems rely on GPU co-processing and use Intel Xeon processors and NVIDIA Tesla C2050 GPUs. In spite of a theoretical peak capability of almost 3 Petaflop/s, Nebulae clocked at 1.271 PFlop/s when running the Linpack benchmark, which puts it
ERIC Educational Resources Information Center
Stern, Luli; Roseman, Jo Ellen
2004-01-01
The transfer of matter and energy from one organism to another and between organisms and their physical setting is a fundamental concept in life science. Not surprisingly, this concept is common to the "Benchmarks for Science Literacy" (American Association for the Advancement of Science, [1993]), the "National Science Education Standards"…
A reduced-order, single-bubble cavitation model with applications to therapeutic ultrasound
Kreider, Wayne; Crum, Lawrence A.; Bailey, Michael R.; Sapozhnikov, Oleg A.
2011-01-01
Cavitation often occurs in therapeutic applications of medical ultrasound such as shock-wave lithotripsy (SWL) and high-intensity focused ultrasound (HIFU). Because cavitation bubbles can affect an intended treatment, it is important to understand the dynamics of bubbles in this context. The relevant context includes very high acoustic pressures and frequencies as well as elevated temperatures. Relative to much of the prior research on cavitation and bubble dynamics, such conditions are unique. To address the relevant physics, a reduced-order model of a single, spherical bubble is proposed that incorporates phase change at the liquid-gas interface as well as heat and mass transport in both phases. Based on the energy lost during the inertial collapse and rebound of a millimeter-sized bubble, experimental observations were used to tune and test model predictions. In addition, benchmarks from the published literature were used to assess various aspects of model performance. Benchmark comparisons demonstrate that the model captures the basic physics of phase change and diffusive transport, while it is quantitatively sensitive to specific model assumptions and implementation details. Given its performance and numerical stability, the model can be used to explore bubble behaviors across a broad parameter space relevant to therapeutic ultrasound. PMID:22088026
Development of a Computing Cluster At the University of Richmond
NASA Astrophysics Data System (ADS)
Carbonneau, J.; Gilfoyle, G. P.; Bunn, E. F.
2010-11-01
The University of Richmond has developed a computing cluster to support the massive simulation and data analysis requirements for programs in intermediate-energy nuclear physics and cosmology. It is a 20-node, 240-core system running Red Hat Enterprise Linux 5. We have built and installed the physics software packages (Geant4, gemc, MADmap...) and developed shell and Perl scripts for running those programs on the remote nodes. The system has a theoretical processing peak of about 2500 GFLOPS. Testing with the High Performance Linpack (HPL) benchmarking program (one of the standard benchmarks used by the TOP500 list of fastest supercomputers) resulted in speeds of over 900 GFLOPS. The difference between the maximum and measured speeds is due to limitations in the communication speed among the nodes, creating a bottleneck for large memory problems. As HPL sends data between nodes, the gigabit Ethernet connection cannot keep up with the processing power. We will show how both the theoretical and actual performance of the cluster compare with other current and past clusters, as well as the cost per GFLOP. We will also examine the scaling of the performance when distributed to increasing numbers of nodes.
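The gap between theoretical peak and measured HPL throughput is simple arithmetic; a sketch with an assumed clock rate and flops-per-cycle chosen to be consistent with the quoted ~2500 GFLOPS figure (they are not published specs of this cluster):

```python
# Hedged sketch: theoretical peak vs. measured HPL throughput. The clock
# and flops/cycle values are illustrative assumptions back-calculated to
# match the quoted ~2500 GFLOPS peak, not hardware specs of the cluster.

cores = 240
ghz = 2.6              # assumed clock, GHz
flops_per_cycle = 4    # assumed double-precision flops/cycle/core

peak_gflops = cores * ghz * flops_per_cycle      # ~2500 GFLOPS
measured_gflops = 900
efficiency = measured_gflops / peak_gflops       # fraction of peak achieved
```

An efficiency in the 35-40% range on gigabit Ethernet is consistent with the interconnect bottleneck the abstract describes; tightly coupled clusters with faster interconnects typically achieve much higher HPL efficiency.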
A reduced-order, single-bubble cavitation model with applications to therapeutic ultrasound.
Kreider, Wayne; Crum, Lawrence A; Bailey, Michael R; Sapozhnikov, Oleg A
2011-11-01
Cavitation often occurs in therapeutic applications of medical ultrasound such as shock-wave lithotripsy (SWL) and high-intensity focused ultrasound (HIFU). Because cavitation bubbles can affect an intended treatment, it is important to understand the dynamics of bubbles in this context. The relevant context includes very high acoustic pressures and frequencies as well as elevated temperatures. Relative to much of the prior research on cavitation and bubble dynamics, such conditions are unique. To address the relevant physics, a reduced-order model of a single, spherical bubble is proposed that incorporates phase change at the liquid-gas interface as well as heat and mass transport in both phases. Based on the energy lost during the inertial collapse and rebound of a millimeter-sized bubble, experimental observations were used to tune and test model predictions. In addition, benchmarks from the published literature were used to assess various aspects of model performance. Benchmark comparisons demonstrate that the model captures the basic physics of phase change and diffusive transport, while it is quantitatively sensitive to specific model assumptions and implementation details. Given its performance and numerical stability, the model can be used to explore bubble behaviors across a broad parameter space relevant to therapeutic ultrasound.
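The reduced-order model described above extends the classic Rayleigh-Plesset equation with phase change and diffusive transport. As a point of reference, here is a minimal sketch of the unextended Rayleigh-Plesset dynamics (water-like properties, polytropic gas, no phase change or heat/mass transport, so this is not the paper's model):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hedged sketch: classic Rayleigh-Plesset equation for a single spherical
# gas bubble. Water-like properties; polytropic gas; no vapor exchange.
rho, mu, S = 998.0, 1.0e-3, 0.0725      # density, viscosity, surface tension
p_inf, gamma = 101325.0, 1.4            # ambient pressure, polytropic index
R0 = 1.0e-3                             # 1 mm equilibrium radius
p_g0 = p_inf + 2.0 * S / R0             # equilibrium gas pressure

def rp(t, y):
    R, Rdot = y
    p_gas = p_g0 * (R0 / R) ** (3.0 * gamma)
    Rddot = ((p_gas - p_inf - 2.0 * S / R - 4.0 * mu * Rdot / R) / rho
             - 1.5 * Rdot ** 2) / R
    return [Rdot, Rddot]

# release the bubble slightly compressed and let it oscillate
sol = solve_ivp(rp, (0.0, 2.0e-4), [0.9 * R0, 0.0], rtol=1e-8, atol=1e-12)
```

The reduced-order model adds terms for interfacial phase change and for heat and mass transport in both phases, which is precisely what governs the energy lost during the inertial collapse and rebound used to tune it.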
A hybridizable discontinuous Galerkin method for modeling fluid-structure interaction
NASA Astrophysics Data System (ADS)
Sheldon, Jason P.; Miller, Scott T.; Pitt, Jonathan S.
2016-12-01
This work presents a novel application of the hybridizable discontinuous Galerkin (HDG) finite element method to the multi-physics simulation of coupled fluid-structure interaction (FSI) problems. Recent applications of the HDG method have primarily been for single-physics problems including both solids and fluids, which are necessary building blocks for FSI modeling. Utilizing these established models, HDG formulations for linear elastostatics, a nonlinear elastodynamic model, and arbitrary Lagrangian-Eulerian Navier-Stokes are derived. The elasticity formulations are written in a Lagrangian reference frame, with the nonlinear formulation restricted to hyperelastic materials. With these individual solid and fluid formulations, the remaining challenge in FSI modeling is coupling together their disparate mathematics on the fluid-solid interface. This coupling is presented, along with the resultant HDG FSI formulation. Verification of the component models, through the method of manufactured solutions, is performed and each model is shown to converge at the expected rate. The individual components, along with the complete FSI model, are then compared to the benchmark problems proposed by Turek and Hron [1]. The solutions from the HDG formulation presented in this work trend towards the benchmark as the spatial polynomial order and the temporal order of integration are increased.
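The observed convergence rate in a manufactured-solutions verification study is recovered from errors on successively refined meshes; a minimal sketch with invented error values:

```python
import math

# Hedged sketch: estimating the observed order of accuracy from two
# manufactured-solution errors on meshes refined by a factor of two,
# the same verification procedure used for the HDG component models.
# The error values are invented to illustrate a ~2nd-order scheme.

def observed_order(e_coarse, e_fine, refinement=2.0):
    """p = log(e_coarse/e_fine) / log(h_coarse/h_fine)."""
    return math.log(e_coarse / e_fine) / math.log(refinement)

p = observed_order(4.0e-3, 1.0e-3)   # error drops 4x when h halves
```

A model "converges at the expected rate" when this observed p matches the formal order of the discretization as the mesh is refined.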
A hybridizable discontinuous Galerkin method for modeling fluid–structure interaction
Sheldon, Jason P.; Miller, Scott T.; Pitt, Jonathan S.
2016-08-31
This study presents a novel application of the hybridizable discontinuous Galerkin (HDG) finite element method to the multi-physics simulation of coupled fluid–structure interaction (FSI) problems. Recent applications of the HDG method have primarily been for single-physics problems including both solids and fluids, which are necessary building blocks for FSI modeling. Utilizing these established models, HDG formulations for linear elastostatics, a nonlinear elastodynamic model, and arbitrary Lagrangian–Eulerian Navier–Stokes are derived. The elasticity formulations are written in a Lagrangian reference frame, with the nonlinear formulation restricted to hyperelastic materials. With these individual solid and fluid formulations, the remaining challenge in FSI modeling is coupling together their disparate mathematics on the fluid–solid interface. This coupling is presented, along with the resultant HDG FSI formulation. Verification of the component models, through the method of manufactured solutions, is performed and each model is shown to converge at the expected rate. The individual components, along with the complete FSI model, are then compared to the benchmark problems proposed by Turek and Hron [1]. The solutions from the HDG formulation presented in this work trend towards the benchmark as the spatial polynomial order and the temporal order of integration are increased.
Experimental benchmarking of quantum control in zero-field nuclear magnetic resonance.
Jiang, Min; Wu, Teng; Blanchard, John W; Feng, Guanru; Peng, Xinhua; Budker, Dmitry
2018-06-01
Demonstration of coherent control and characterization of the control fidelity is important for the development of quantum architectures such as nuclear magnetic resonance (NMR). We introduce an experimental approach to realize universal quantum control, and benchmarking thereof, in zero-field NMR, an analog of conventional high-field NMR that features less-constrained spin dynamics. We design a composite pulse technique for both arbitrary one-spin rotations and a two-spin controlled-not (CNOT) gate in a heteronuclear two-spin system at zero field, which experimentally demonstrates universal quantum control in such a system. Moreover, using quantum information-inspired randomized benchmarking and partial quantum process tomography, we evaluate the quality of the control, achieving single-spin control for 13C with an average fidelity of 0.9960(2) and two-spin control via a CNOT gate with a fidelity of 0.9877(2). Our method can also be extended to more general multispin heteronuclear systems at zero field. The realization of universal quantum control in zero-field NMR is important for quantum state/coherence preparation, pulse sequence design, and is an essential step toward applications to materials science, chemical analysis, and fundamental physics.
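The randomized-benchmarking analysis fits the average sequence fidelity to an exponential decay in sequence length and converts the decay constant into an average gate fidelity; a minimal sketch on synthetic noiseless data (the decay constant below is illustrative, not the paper's measurement):

```python
import numpy as np
from scipy.optimize import curve_fit

# Hedged sketch: the standard randomized-benchmarking fit, F(m) = A*p**m + B,
# followed by conversion of the decay p to an average gate fidelity.
# The synthetic data below is illustrative, not zero-field NMR data.

def rb_model(m, A, p, B):
    return A * p ** m + B

m = np.arange(1, 60, 4)
y = rb_model(m, 0.5, 0.992, 0.5)       # noiseless synthetic decay

(A, p, B), _ = curve_fit(rb_model, m, y, p0=(0.5, 0.98, 0.5))
d = 2                                   # dimension of a single spin-1/2
avg_gate_fidelity = 1 - (1 - p) * (d - 1) / d
```

The virtue of this protocol is that the fitted p is insensitive to state-preparation and measurement errors, which are absorbed into A and B.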
Experimental benchmarking of quantum control in zero-field nuclear magnetic resonance
Feng, Guanru
2018-01-01
Demonstration of coherent control and characterization of the control fidelity is important for the development of quantum architectures such as nuclear magnetic resonance (NMR). We introduce an experimental approach to realize universal quantum control, and benchmarking thereof, in zero-field NMR, an analog of conventional high-field NMR that features less-constrained spin dynamics. We design a composite pulse technique for both arbitrary one-spin rotations and a two-spin controlled-not (CNOT) gate in a heteronuclear two-spin system at zero field, which experimentally demonstrates universal quantum control in such a system. Moreover, using quantum information–inspired randomized benchmarking and partial quantum process tomography, we evaluate the quality of the control, achieving single-spin control for 13C with an average fidelity of 0.9960(2) and two-spin control via a CNOT gate with a fidelity of 0.9877(2). Our method can also be extended to more general multispin heteronuclear systems at zero field. The realization of universal quantum control in zero-field NMR is important for quantum state/coherence preparation, pulse sequence design, and is an essential step toward applications to materials science, chemical analysis, and fundamental physics. PMID:29922714
ForceGen 3D structure and conformer generation: from small lead-like molecules to macrocyclic drugs
NASA Astrophysics Data System (ADS)
Cleves, Ann E.; Jain, Ajay N.
2017-05-01
We introduce the ForceGen method for 3D structure generation and conformer elaboration of drug-like small molecules. ForceGen is novel, avoiding use of distance geometry, molecular templates, or simulation-oriented stochastic sampling. The method is primarily driven by the molecular force field, implemented using an extension of MMFF94s and a partial charge estimator based on electronegativity-equalization. The force field is coupled to algorithms for direct sampling of realistic physical movements made by small molecules. Results are presented on a standard benchmark from the Cambridge Crystallographic Database of 480 drug-like small molecules, including full structure generation from SMILES strings. Reproduction of protein-bound crystallographic ligand poses is demonstrated on four carefully curated data sets: the ConfGen Set (667 ligands), the PINC cross-docking benchmark (1062 ligands), a large set of macrocyclic ligands (182 total with typical ring sizes of 12-23 atoms), and a commonly used benchmark for evaluating macrocycle conformer generation (30 ligands total). Results compare favorably to alternative methods, and performance on macrocyclic compounds approaches that observed on non-macrocycles while yielding a roughly 100-fold speed improvement over alternative MD-based methods with comparable performance.
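Pose-reproduction results like those above are typically scored as heavy-atom RMSD after optimal superposition; a minimal Kabsch-algorithm sketch on toy coordinates (not ligands from the benchmark sets):

```python
import numpy as np

# Hedged sketch: RMSD between two coordinate sets after optimal (Kabsch)
# rotation, the usual metric for judging whether a generated conformer
# reproduces a crystallographic pose. Toy coordinates, not benchmark data.

def kabsch_rmsd(P, Q):
    """RMSD between N x 3 coordinate sets after centering and optimal rotation."""
    P = P - P.mean(axis=0)
    Q = Q - Q.mean(axis=0)
    H = P.T @ Q                          # covariance of the two point sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T   # proper rotation (no reflection)
    return float(np.sqrt(np.mean(np.sum((P @ R.T - Q) ** 2, axis=1))))

pts = np.array([[0.0, 0.0, 0.0],
                [1.5, 0.0, 0.0],
                [0.0, 1.4, 0.0],
                [0.3, 0.4, 1.2]])
rmsd_self = kabsch_rmsd(pts, pts)        # identical sets give ~0
```

Conformer-generation benchmarks usually count a pose as "reproduced" when this RMSD falls below a fixed cutoff, commonly around 1-2 Å.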
MPI, HPF or OpenMP: A Study with the NAS Benchmarks
NASA Technical Reports Server (NTRS)
Jin, Hao-Qiang; Frumkin, Michael; Hribar, Michelle; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)
1999-01-01
Porting applications to new high performance parallel and distributed platforms is a challenging task. Writing parallel code by hand is time consuming and costly, but the task can be simplified by high level languages and would even better be automated by parallelizing tools and compilers. The definition of the HPF (High Performance Fortran, based on the data parallel model) and OpenMP (based on the shared memory parallel model) standards has offered great opportunity in this respect. Both provide simple and clear interfaces to languages like FORTRAN and simplify many tedious tasks encountered in writing message passing programs. In our study we implemented the parallel versions of the NAS Benchmarks with HPF and OpenMP directives. Comparison of their performance with the MPI implementation, and the pros and cons of the different approaches, will be discussed along with experience of using computer-aided tools to help parallelize these benchmarks. Based on the study, the potential of applying some of these techniques to realistic aerospace applications will be presented.
MPI, HPF or OpenMP: A Study with the NAS Benchmarks
NASA Technical Reports Server (NTRS)
Jin, H.; Frumkin, M.; Hribar, M.; Waheed, A.; Yan, J.; Saini, Subhash (Technical Monitor)
1999-01-01
Porting applications to new high performance parallel and distributed platforms is a challenging task. Writing parallel code by hand is time consuming and costly, but this task can be simplified by high level languages and would even better be automated by parallelizing tools and compilers. The definition of the HPF (High Performance Fortran, based on the data parallel model) and OpenMP (based on the shared memory parallel model) standards has offered great opportunity in this respect. Both provide simple and clear interfaces to languages like FORTRAN and simplify many tedious tasks encountered in writing message passing programs. In our study, we implemented the parallel versions of the NAS Benchmarks with HPF and OpenMP directives. Comparison of their performance with the MPI implementation, and the pros and cons of the different approaches, will be discussed along with experience of using computer-aided tools to help parallelize these benchmarks. Based on the study, the potential of applying some of these techniques to realistic aerospace applications will be presented.
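The data-parallel decomposition that HPF and OpenMP directives express can be illustrated in miniature: split a loop's index range into chunks and reduce the partial results. A hedged sketch (not NAS benchmark code; numpy releases the GIL inside np.sum, so threads provide real concurrency here):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

# Hedged sketch of the data-parallel pattern behind OpenMP's
# "parallel for ... reduction": partition the iteration space, compute
# partial sums concurrently, then combine them. Illustration only.

def parallel_sum(a, nworkers=4):
    chunks = np.array_split(a, nworkers)          # static loop partitioning
    with ThreadPoolExecutor(max_workers=nworkers) as pool:
        return float(sum(pool.map(np.sum, chunks)))

a = np.arange(1_000_000, dtype=np.float64)
total = parallel_sum(a)
```

The directive-based approaches automate exactly this partitioning and reduction bookkeeping, which is the tedium the abstract contrasts with hand-written message passing.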
Benchmarking comparison and validation of MCNP photon interaction data
NASA Astrophysics Data System (ADS)
Colling, Bethany; Kodeli, I.; Lilley, S.; Packer, L. W.
2017-09-01
The objective of the research was to test available photoatomic data libraries for fusion-relevant applications, comparing against experimental and computational neutronics benchmarks. Photon flux and heating were compared using the photon interaction data libraries (mcplib 04p, 05t, 84p and 12p). Suitable benchmark experiments (iron and water) were selected from the SINBAD database and analysed to compare experimental values with MCNP calculations using mcplib 04p, 84p and 12p. In both the computational and experimental comparisons, the majority of results with the 04p, 84p and 12p photon data libraries were within 1σ of the mean MCNP statistical uncertainty. Larger differences were observed when comparing computational results with the 05t test photon library. The Doppler broadening sampling bug in MCNP-5 is shown to be corrected for fusion-relevant problems through use of the 84p photon data library. The recommended libraries for fusion neutronics are 84p (or 04p) with MCNP6, and 84p if using MCNP-5.
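The 1σ consistency test used in such calculation-to-experiment comparisons combines calculational and experimental uncertainties in quadrature; a minimal sketch with illustrative numbers, not SINBAD results:

```python
# Hedged sketch: is a calculated value C consistent with an experimental
# value E within one combined standard deviation? Numbers are invented.

def within_1sigma(C, E, sigma_C, sigma_E):
    combined = (sigma_C ** 2 + sigma_E ** 2) ** 0.5   # quadrature sum
    return abs(C - E) <= combined

# e.g. C/E = 0.97 with 2% calculational and 3% experimental uncertainty
ok = within_1sigma(0.97, 1.00, 0.02, 0.03)
```

Library validation then amounts to checking what fraction of the benchmark responses pass this test for each candidate photon data library.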
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lell, R.; Grimm, K.; McKnight, R.
The Zero Power Physics Reactor (ZPPR) fast critical facility was built at the Argonne National Laboratory-West (ANL-W) site in Idaho in 1969 to obtain neutron physics information necessary for the design of fast breeder reactors. The ZPPR-20D Benchmark Assembly was part of a series of cores built in Assembly 20 (References 1 through 3) of the ZPPR facility to provide data for developing a nuclear power source for space applications (SP-100). The assemblies were beryllium oxide reflected and had core fuel compositions containing enriched uranium fuel, niobium and rhenium. ZPPR-20 Phase C (HEU-MET-FAST-075) was built as the reference flight configuration. Two other configurations, Phases D and E, simulated accident scenarios. Phase D modeled the water immersion scenario during a launch accident, and Phase E (SUB-HEU-MET-FAST-001) modeled the earth burial scenario during a launch accident. Two configurations were recorded for the simulated water immersion accident scenario (Phase D): the critical configuration, documented here, and the subcritical configuration (SUB-HEU-MET-MIXED-001). Experiments in Assembly 20 Phases 20A through 20F were performed in 1988. The reference water immersion configuration for the ZPPR-20D assembly was obtained as reactor loading 129 on October 7, 1988 with a fissile mass of 167.477 kg and a reactivity of -4.626 ± 0.044 ¢ (k ≈ 0.9997). The SP-100 core was to be constructed of highly enriched uranium nitride, niobium, rhenium and depleted lithium. The core design called for two enrichment zones with niobium-1% zirconium alloy fuel cladding and core structure. Rhenium was to be used as a fuel pin liner to provide shutdown in the event of water immersion and flooding. The core coolant was to be depleted lithium metal (⁷Li). The core was to be surrounded radially with a niobium reactor vessel and bypass which would carry the lithium coolant to the forward inlet plenum.
Immediately inside the reactor vessel was a rhenium baffle which would act as a neutron curtain in the event of water immersion. A fission gas plenum and coolant inlet plenum were located axially forward of the core. Some material substitutions had to be made in mocking up the SP-100 design. The ZPPR-20 critical assemblies were fueled by 93% enriched uranium metal because uranium nitride, which was the SP-100 fuel type, was not available. ZPPR Assembly 20D was designed to simulate a water immersion accident. The water was simulated by polyethylene (CH2), which contains a similar amount of hydrogen and has a similar density. A very accurate transformation to a simplified model is needed to make any of the ZPPR assemblies a practical criticality-safety benchmark. There is simply too much geometric detail in an exact model of a ZPPR assembly, particularly as complicated an assembly as ZPPR-20D. The transformation must reduce the detail to a practical level without masking any of the important features of the critical experiment. And it must do this without increasing the total uncertainty far beyond that of the original experiment. Such a transformation will be described in a later section. First, Assembly 20D was modeled in full detail: every plate, drawer, matrix tube, and air gap was modeled explicitly. Then the regionwise compositions and volumes from this model were converted to an RZ model. ZPPR Assembly 20D has been determined to be an acceptable criticality-safety benchmark experiment.
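Collapsing plate-level detail into regionwise compositions for an RZ model is, at its core, volume-weighted homogenization; a minimal sketch with invented number densities and volumes, not ZPPR-20D data:

```python
import numpy as np

# Hedged sketch: volume-weighted homogenization of the kind used to
# collapse plate-by-plate detail into regionwise compositions for an
# RZ model. Number densities (atoms/barn-cm) and volumes are invented.

def homogenize(volumes, densities):
    """Volume-weighted average number density for each nuclide."""
    v = np.asarray(volumes, float)
    n = np.asarray(densities, float)          # shape: (region, nuclide)
    return (v[:, None] * n).sum(axis=0) / v.sum()

# three plate regions, two nuclides (e.g. a fuel isotope and niobium)
vols = [2.0, 1.0, 1.0]
dens = [[4.5e-2, 0.0],
        [0.0,    5.5e-2],
        [1.0e-2, 1.0e-2]]
n_hom = homogenize(vols, dens)
```

The benchmark-quality question is then whether this smearing preserves the eigenvalue and important reaction rates of the fully detailed model within a quantified uncertainty.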
Frontal Polymerization in Microgravity
NASA Technical Reports Server (NTRS)
Pojman, John A.
1999-01-01
Frontal polymerization systems, with their inherent large thermal and compositional gradients, are greatly affected by buoyancy-driven convection. Sounding rocket experiments allowed the preparation of benchmark materials and demonstrated that methods to suppress the Rayleigh-Taylor instability in ground-based research did not significantly affect the molecular weight of the polymer. Experiments under weightlessness show clearly that bubbles produced during the reaction interact very differently than under 1 g.
ERIC Educational Resources Information Center
Brandon, Paul R.; Harrison, George M.; Lawton, Brian E.
2013-01-01
When evaluators plan site-randomized experiments, they must conduct the appropriate statistical power analyses. These analyses are most likely to be valid when they are based on data from the jurisdictions in which the studies are to be conducted. In this method note, we provide software code, in the form of a SAS macro, for producing statistical…
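The abstract does not reproduce the SAS macro, but the kind of power analysis it supports can be illustrated with the standard minimum-detectable-effect-size approximation for a balanced site-randomized (cluster-randomized) design. This is a generic textbook formula, not the authors' code, and the multiplier and example inputs are illustrative assumptions:

```python
import math

def mdes_cluster_randomized(n_sites, n_per_site, icc, multiplier=2.8):
    """Minimum detectable effect size (in SD units) for a balanced
    site-randomized design, using the common approximation
    MDES ~= M * sqrt(4 * (icc + (1 - icc) / n_per_site) / n_sites).
    multiplier ~ 2.8 corresponds to alpha = .05 (two-tailed) and
    power = .80 with generous degrees of freedom."""
    var = 4.0 * (icc + (1.0 - icc) / n_per_site) / n_sites
    return multiplier * math.sqrt(var)

# Example: 40 sites, 50 students per site, intraclass correlation 0.15
print(round(mdes_cluster_randomized(40, 50, 0.15), 3))  # -> 0.362
```

The intraclass correlation dominates the result at realistic site sizes, which is why the method note emphasizes estimating it from data in the jurisdiction where the study will run.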
Benchmarking Gas Path Diagnostic Methods: A Public Approach
NASA Technical Reports Server (NTRS)
Simon, Donald L.; Bird, Jeff; Davison, Craig; Volponi, Al; Iverson, R. Eugene
2008-01-01
Recent technology reviews have identified the need for objective assessments of engine health management (EHM) technology. The need is two-fold: technology developers require relevant data and problems to design and validate new algorithms and techniques while engine system integrators and operators need practical tools to direct development and then evaluate the effectiveness of proposed solutions. This paper presents a publicly available gas path diagnostic benchmark problem that has been developed by the Propulsion and Power Systems Panel of The Technical Cooperation Program (TTCP) to help address these needs. The problem is coded in MATLAB (The MathWorks, Inc.) and coupled with a non-linear turbofan engine simulation to produce "snap-shot" measurements, with relevant noise levels, as if collected from a fleet of engines over their lifetime of use. Each engine within the fleet will experience unique operating and deterioration profiles, and may encounter randomly occurring relevant gas path faults including sensor, actuator and component faults. The challenge to the EHM community is to develop gas path diagnostic algorithms to reliably perform fault detection and isolation. An example solution to the benchmark problem is provided along with associated evaluation metrics. A plan is presented to disseminate this benchmark problem to the engine health management technical community and invite technology solutions.
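As a loose illustration of the detection half of such a gas path diagnostic (not the TTCP benchmark code itself), a snap-shot measurement can be flagged when its residual against a healthy-engine baseline exceeds a z-score threshold. The sensor values below are invented:

```python
import statistics

def detect_fault(measurement, baseline_mean, baseline_std, threshold=3.0):
    """Flag a measurement whose normalized residual exceeds a z-score
    threshold. A deliberately simple stand-in for the detection step;
    the actual benchmark couples detection to an engine simulation and
    fault isolation logic."""
    z = abs(measurement - baseline_mean) / baseline_std
    return z > threshold

# Toy baseline from healthy-engine snapshots of one sensor (assumed data)
healthy = [598.2, 601.1, 599.5, 600.4, 600.8, 599.0]
mu = statistics.mean(healthy)
sigma = statistics.stdev(healthy)
print(detect_fault(630.0, mu, sigma))  # large shift -> True
print(detect_fault(600.2, mu, sigma))  # within noise -> False
```

Real algorithms must also isolate which sensor, actuator, or component caused the shift, and distinguish faults from gradual deterioration, which is what makes the benchmark problem challenging.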
Lance, Blake W.; Smith, Barton L.
2016-06-23
Transient convection has been investigated experimentally for the purpose of providing Computational Fluid Dynamics (CFD) validation benchmark data. A specialized facility for validation benchmark experiments called the Rotatable Buoyancy Tunnel was used to acquire thermal and velocity measurements of flow over a smooth, vertical heated plate. The initial condition was forced convection downward with subsequent transition to mixed convection, ending with natural convection upward after a flow reversal. Data acquisition through the transient was repeated for ensemble-averaged results. With simple flow geometry, validation data were acquired at the benchmark level. All boundary conditions (BCs) were measured and their uncertainties quantified. Temperature profiles on all four walls and the inlet were measured, as well as as-built test section geometry. Inlet velocity profiles and turbulence levels were quantified using Particle Image Velocimetry. System Response Quantities (SRQs) were measured for comparison with CFD outputs and include velocity profiles, wall heat flux, and wall shear stress. Extra effort was invested in documenting and preserving the validation data. Details about the experimental facility, instrumentation, experimental procedure, materials, BCs, and SRQs are made available through this paper. As a result, the latter two are available for download and the other details are included in this work.
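Ensemble averaging of a repeated transient, as used here to produce repeatable benchmark data, reduces to a sample-by-sample mean across runs on a common time base. A minimal sketch with synthetic data:

```python
import statistics

def ensemble_average(runs):
    """Average repeated realizations of a transient, sample by sample.
    Each run is a list of measurements on a common time base; the result
    is the ensemble mean at each time step."""
    return [statistics.mean(samples) for samples in zip(*runs)]

# Three synthetic repeats of a short transient signal
runs = [
    [1.0, 2.25, 4.0, 5.25],
    [1.25, 2.0, 4.25, 4.75],
    [0.75, 1.75, 3.75, 5.0],
]
print(ensemble_average(runs))  # -> [1.0, 2.0, 4.0, 5.0]
```

In practice each run must be aligned to a common trigger before averaging, and the run-to-run spread at each time step provides the random-uncertainty estimate that validation comparisons require.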
Evaluation of the Pool Critical Assembly Benchmark with Explicitly-Modeled Geometry using MCNP6
Kulesza, Joel A.; Martz, Roger Lee
2017-03-01
Despite being one of the most widely used benchmarks for qualifying light water reactor (LWR) radiation transport methods and data, no benchmark calculation of the Oak Ridge National Laboratory (ORNL) Pool Critical Assembly (PCA) pressure vessel wall benchmark facility (PVWBF) using MCNP6 with explicitly modeled core geometry exists. As such, this paper provides results for such an analysis. First, a criticality calculation is used to construct the fixed source term. Next, ADVANTG-generated variance reduction parameters are used within the final MCNP6 fixed source calculations. These calculations provide unadjusted dosimetry results using three sets of dosimetry reaction cross sections of varying ages (those packaged with MCNP6, from the IRDF-2002 multi-group library, and from the ACE-formatted IRDFF v1.05 library). These results are then compared to two different sets of measured reaction rates. The comparison agrees in an overall sense within 2% and on a specific reaction- and dosimetry location-basis within 5%. Except for the neptunium dosimetry, the individual foil raw calculation-to-experiment comparisons usually agree within 10% but are typically greater than unity. Finally, in the course of developing these calculations, geometry that has previously not been completely specified is provided herein for the convenience of future analysts.
Characterizing a four-qubit planar lattice for arbitrary error detection
NASA Astrophysics Data System (ADS)
Chow, Jerry M.; Srinivasan, Srikanth J.; Magesan, Easwar; Córcoles, A. D.; Abraham, David W.; Gambetta, Jay M.; Steffen, Matthias
2015-05-01
Quantum error correction will be a necessary component towards realizing scalable quantum computers with physical qubits. Theoretically, it is possible to perform arbitrarily long computations if the error rate is below a threshold value. The two-dimensional surface code permits relatively high fault-tolerant thresholds at the ~1% level, and only requires a latticed network of qubits with nearest-neighbor interactions. Superconducting qubits have continued to steadily improve in coherence, gate, and readout fidelities, to become a leading candidate for implementation into larger quantum networks. Here we describe characterization experiments and calibration of a system of four superconducting qubits arranged in a planar lattice, amenable to the surface code. Insights into the particular qubit design and comparison between simulated parameters and experimentally determined parameters are given. Single- and two-qubit gate tune-up procedures are described and results for simultaneously benchmarking pairs of two-qubit gates are given. All controls are eventually used for an arbitrary error detection protocol described in separate work [Corcoles et al., Nature Communications, 6, 2015].
Progress towards Low Energy Neutrino Spectroscopy (LENS)
NASA Astrophysics Data System (ADS)
Blackmon, Jeff
2011-10-01
The Low-Energy Neutrino Spectroscopy (LENS) experiment will precisely measure the energy spectrum of low-energy solar neutrinos via charged-current neutrino reactions on indium. LENS will test solar physics through the fundamental equality of the neutrino fluxes and the precisely known solar luminosity in photons, will probe the metallicity of the solar core through the CNO neutrino fluxes, and will test for the existence of mass-varying neutrinos. The LENS detector concept applies indium-loaded scintillator in an optically-segmented lattice geometry to achieve precise time and spatial resolution and unprecedented sensitivity for low-energy neutrino events. The LENS collaboration is currently developing a prototype, miniLENS, in the Kimballton Underground Research Facility (KURF). The miniLENS program aims to demonstrate the performance and selectivity of the technology and to benchmark Monte Carlo simulations that will guide scaling to the full LENS instrument. We will present the motivation and concept for LENS and will provide an overview of the R&D efforts currently centered around miniLENS at KURF.
Proceedings of the 2004 NASA/ONR Circulation Control Workshop, Part 1
NASA Technical Reports Server (NTRS)
Jones, Gregory S. (Editor); Joslin, Ronald D. (Editor)
2005-01-01
As technological advances influence the efficiency and effectiveness of aerodynamic and hydrodynamic applications, designs and operations, this workshop was intended to address the technologies, systems, challenges and successes specific to Coanda driven circulation control in aerodynamics and hydrodynamics. A major goal of this workshop was to determine the 2004 state-of-the-art in circulation control and understand the roadblocks to its application. The workshop addressed applications, CFD, and experiments related to circulation control, emphasizing fundamental physics, systems analysis, and applied research. The workshop consisted of 34 single session oral presentations and written papers that focused on Naval hydrodynamic vehicles (e.g. submarines), Fixed Wing Aviation, V/STOL platforms, propulsion systems (including wind turbine systems), ground vehicles (automotive and trucks) and miscellaneous applications (e.g., poultry exhaust systems and vacuum systems). Several advanced CFD codes were benchmarked using a two-dimensional NCCR circulation control airfoil. The CFD efforts highlighted inconsistencies in turbulence modeling, separation and performance predictions.
A 2D Array of 100's of Ions for Quantum Simulation and Many-Body Physics in a Penning Trap
NASA Astrophysics Data System (ADS)
Bohnet, Justin; Sawyer, Brian; Britton, Joseph; Bollinger, John
2015-05-01
Quantum simulations promise to reveal new materials and phenomena for experimental study, but few systems have demonstrated the capability to control ensembles in which quantum effects cannot be directly computed. One possible platform for intractable quantum simulations may be a system of hundreds of ⁹Be⁺ ions in a Penning trap, where the valence electron spins are coupled with an effective Ising interaction in a 2D geometry. Here we report on results from a new Penning trap designed for 2D quantum simulations. We characterize the ion crystal stability and describe progress towards benchmarking quantum effects of the spin-spin coupling using a spin-squeezing witness. We also report on the successful photodissociation of BeH⁺ contaminant molecular ions that impede the use of such crystals for quantum simulation. This work lays the foundation for future experiments such as the observation of spin dynamics under the quantum Ising Hamiltonian with a transverse field. Supported by a NIST-NRC Research Associateship.
Aerothermodynamic testing requirements for future space transportation systems
NASA Technical Reports Server (NTRS)
Paulson, John W., Jr.; Miller, Charles G., III
1995-01-01
Aerothermodynamics, encompassing aerodynamics, aeroheating, and fluid dynamic and physical processes, is the genesis for the design and development of advanced space transportation vehicles. It provides crucial information to other disciplines involved in the development process such as structures, materials, propulsion, and avionics. Sources of aerothermodynamic information include ground-based facilities, computational fluid dynamic (CFD) and engineering computer codes, and flight experiments. Utilization of this triad is required to provide the optimum requirements while reducing undue design conservatism, risk, and cost. This paper discusses the role of ground-based facilities in the design of future space transportation system concepts. Testing methodology is addressed, including the iterative approach often required for the assessment and optimization of configurations from an aerothermodynamic perspective. The influence of vehicle shape and the transition from parametric studies for optimization to benchmark studies for final design and establishment of the flight data book is discussed. Future aerothermodynamic testing requirements including the need for new facilities are also presented.
Strongly Interacting Fermi Gases: Non-Equilibrium Dynamics and Dimensional Crossover
NASA Astrophysics Data System (ADS)
Sommer, Ariel
2015-05-01
Strongly interacting atomic Fermi gases near Feshbach resonances give access to a rich variety of phenomena in many-fermion physics and superfluidity. This flexible and microscopically well-characterized system provides a pristine platform in which to benchmark many-body theories. I will describe three experiments on gases of fermionic 6Li atoms. In the first experiment, we study spin transport in the return to equilibrium after a spin excitation. From the dynamics near equilibrium, we obtain spin transport coefficients over a range of temperatures and interaction strengths, and observe quantum-limited spin diffusion at unitarity. In separate experiments, we study the effect of dimensionality on the binding of pairs of fermions. We tune the system from three to two dimensions by adjusting the strength of a one-dimensional optical lattice, and measure the binding energy of fermion pairs using radio-frequency spectroscopy. In a third set of experiments, we study nonlinear excitations of a fermionic superfluid. Imprinting a phase jump on the superfluid order parameter causes a long-lived, localized density depletion that oscillates through the cloud. We measure the oscillation period and find that it corresponds to an emergent particle with an effective mass of up to several hundred times the bare mass. This excitation has been identified as a solitonic vortex that results from the decay of a planar soliton. This work was performed at the Massachusetts Institute of Technology under the supervision of Prof. Martin Zwierlein.
NASA Astrophysics Data System (ADS)
Krak, Michael D.; Dreyer, Jason T.; Singh, Rajendra
2016-03-01
A vehicle clutch damper is intentionally designed to contain multiple discontinuous non-linearities, such as multi-staged springs, clearances, pre-loads, and multi-staged friction elements. The main purpose of this practical torsional device is to transmit a wide range of torque while isolating torsional vibration between an engine and transmission. Improved understanding of the dynamic behavior of the device could be facilitated by laboratory measurement, and thus a refined vibratory experiment is proposed. The experiment is conceptually described as a single degree of freedom non-linear torsional system that is excited by an external step torque. The single torsional inertia (consisting of a shaft and torsion arm) is coupled to ground through parallel production clutch dampers, which are characterized by quasi-static measurements provided by the manufacturer. Other experimental objectives address physical dimensions, system actuation, flexural modes, instrumentation, and signal processing issues. Typical measurements show that the step response of the device is characterized by three distinct non-linear regimes (double-sided impact, single-sided impact, and no-impact). Each regime is directly related to the non-linear features of the device and can be described by peak angular acceleration values. Predictions of a simplified single degree of freedom non-linear model verify that the experiment performs well and as designed. Accordingly, the benchmark measurements could be utilized to validate non-linear models and simulation codes, as well as characterize dynamic parameters of the device including its dissipative properties.
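The described single degree of freedom non-linear torsional system can be sketched with a clearance-type (dead zone) stiffness under a step torque. The parameter values below are illustrative, not the production clutch-damper characteristics from the paper:

```python
import math

def simulate_step_response(torque, clearance, k_stage, inertia, damping,
                           dt=1e-4, t_end=0.5):
    """Integrate a single-DOF torsional oscillator with a clearance-type
    (dead zone) stiffness under a step torque, using semi-implicit Euler.
    Returns the final angle and the peak angular acceleration, the
    quantity the paper uses to characterize the non-linear regimes."""
    theta, omega = 0.0, 0.0
    peak_accel = 0.0
    for _ in range(int(t_end / dt)):
        # Restoring torque: zero inside the clearance, linear outside it
        if abs(theta) <= clearance:
            restoring = 0.0
        else:
            restoring = k_stage * (theta - math.copysign(clearance, theta))
        alpha = (torque - restoring - damping * omega) / inertia
        omega += alpha * dt
        theta += omega * dt
        peak_accel = max(peak_accel, abs(alpha))
    return theta, peak_accel

theta_end, peak = simulate_step_response(
    torque=50.0, clearance=0.02, k_stage=2000.0, inertia=0.1, damping=0.5)
print(peak >= 50.0 / 0.1)  # initial acceleration is torque/inertia -> True
```

A production damper adds multiple spring stages, pre-loads, and friction hysteresis on top of this skeleton, which is what produces the distinct impact regimes observed in the measurements.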
NASA Astrophysics Data System (ADS)
Fasel, Markus
2016-10-01
High-Performance Computing Systems are powerful tools tailored to support large- scale applications that rely on low-latency inter-process communications to run efficiently. By design, these systems often impose constraints on application workflows, such as limited external network connectivity and whole node scheduling, that make more general-purpose computing tasks, such as those commonly found in high-energy nuclear physics applications, more difficult to carry out. In this work, we present a tool designed to simplify access to such complicated environments by handling the common tasks of job submission, software management, and local data management, in a framework that is easily adaptable to the specific requirements of various computing systems. The tool, initially constructed to process stand-alone ALICE simulations for detector and software development, was successfully deployed on the NERSC computing systems, Carver, Hopper and Edison, and is being configured to provide access to the next generation NERSC system, Cori. In this report, we describe the tool and discuss our experience running ALICE applications on NERSC HPC systems. The discussion will include our initial benchmarks of Cori compared to other systems and our attempts to leverage the new capabilities offered with Cori to support data-intensive applications, with a future goal of full integration of such systems into ALICE grid operations.
Using SPARK as a Solver for Modelica
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wetter, Michael; Wetter, Michael; Haves, Philip
Modelica is an object-oriented acausal modeling language that is well positioned to become a de-facto standard for expressing models of complex physical systems. To simulate a model expressed in Modelica, it needs to be translated into executable code. For generating run-time efficient code, such a translation needs to employ algebraic formula manipulations. As the SPARK solver has been shown to be competitive for generating such code but currently cannot be used with the Modelica language, we report in this paper how SPARK's symbolic and numerical algorithms can be implemented in OpenModelica, an open-source implementation of a Modelica modeling and simulation environment. We also report benchmark results that show that for our air flow network simulation benchmark, the SPARK solver is competitive with Dymola, which is believed to provide the best solver for Modelica.
Radiochemical analyses of surface water from U.S. Geological Survey hydrologic bench-mark stations
Janzer, V.J.; Saindon, L.G.
1972-01-01
The U.S. Geological Survey's program for collecting and analyzing surface-water samples for radiochemical constituents at hydrologic bench-mark stations is described. Analytical methods used during the study are described briefly, and data obtained from 55 of the network stations in the United States during the period from 1967 to 1971 are given in tabular form. Concentration values are reported for dissolved uranium, radium, gross alpha and gross beta radioactivity. Values are also given for suspended gross alpha radioactivity in terms of natural uranium. Suspended gross beta radioactivity is expressed both as the equilibrium mixture of strontium-90/yttrium-90 and as cesium-137. Other physical parameters reported which describe the samples include the concentrations of dissolved and suspended solids, the water temperature, and stream discharge at the time of the sample collection.
Diffusion-based recommendation with trust relations on tripartite graphs
NASA Astrophysics Data System (ADS)
Wang, Ximeng; Liu, Yun; Zhang, Guangquan; Xiong, Fei; Lu, Jie
2017-08-01
The diffusion-based recommendation approach is a vital branch in recommender systems, which successfully applies physical dynamics to make recommendations for users on bipartite or tripartite graphs. Trust links indicate users’ social relations and can provide the benefit of reducing data sparsity. However, traditional diffusion-based algorithms only consider rating links when making recommendations. In this paper, the complementarity of users’ implicit and explicit trust is exploited, and a novel resource-allocation strategy is proposed, which integrates these two kinds of trust relations on tripartite graphs. Through empirical studies on three benchmark datasets, our proposed method obtains better performance than most of the benchmark algorithms in terms of accuracy, diversity and novelty. According to the experimental results, our method is an effective and reasonable way to integrate additional features into the diffusion-based recommendation approach.
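The basic physical-dynamics step such diffusion-based recommenders build on is two-stage resource spreading (ProbS) on the user-item bipartite graph. A minimal sketch, omitting the paper's trust-relation extension on tripartite graphs; the toy user-item data are invented:

```python
def probs_recommend(adj, target_user):
    """Two-step probabilistic spreading (ProbS) on a user-item bipartite
    graph. adj[u] is the set of items user u has collected. Returns the
    target's uncollected items, highest diffusion score first."""
    users = list(adj)
    items = sorted({i for its in adj.values() for i in its})
    item_degree = {i: sum(1 for u in users if i in adj[u]) for i in items}
    # Step 1: each collected item sends its unit resource equally to the
    # users that collected it.
    user_resource = {
        u: sum(1.0 / item_degree[i] for i in adj[target_user] if i in adj[u])
        for u in users
    }
    # Step 2: each user redistributes the received resource equally over
    # their own collected items.
    scores = {i: 0.0 for i in items}
    for u in users:
        if adj[u]:
            share = user_resource[u] / len(adj[u])
            for i in adj[u]:
                scores[i] += share
    return sorted((i for i in items if i not in adj[target_user]),
                  key=lambda i: -scores[i])

adj = {"u1": {"a", "b"}, "u2": {"b", "c"}, "u3": {"c", "d"}}
print(probs_recommend(adj, "u1"))  # -> ['c', 'd']
```

The paper's contribution sits in how the initial resource allocation is modified by implicit and explicit trust links; this sketch shows only the underlying diffusion mechanism those modifications act on.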
CERN Computing in Commercial Clouds
NASA Astrophysics Data System (ADS)
Cordeiro, C.; Field, L.; Garrido Bear, B.; Giordano, D.; Jones, B.; Keeble, O.; Manzi, A.; Martelli, E.; McCance, G.; Moreno-García, D.; Traylen, S.
2017-10-01
By the end of 2016 more than 10 Million core-hours of computing resources have been delivered by several commercial cloud providers to the four LHC experiments to run their production workloads, from simulation to full chain processing. In this paper we describe the experience gained at CERN in procuring and exploiting commercial cloud resources for the computing needs of the LHC experiments. The mechanisms used for provisioning, monitoring, accounting, alarming and benchmarking will be discussed, as well as the involvement of the LHC collaborations in terms of managing the workflows of the experiments within a multicloud environment.
Nano-Transistor Modeling: Two Dimensional Green's Function Method
NASA Technical Reports Server (NTRS)
Svizhenko, Alexei; Anantram, M. P.; Govindan, T. R.; Biegel, Bryan
2001-01-01
Two quantum mechanical effects that impact the operation of nanoscale transistors are inversion layer energy quantization and ballistic transport. While the qualitative effects of these features are reasonably understood, a comprehensive study of device physics in two dimensions is lacking. Our work addresses this shortcoming and provides: (a) a framework to quantitatively explore device physics issues such as the source-drain and gate leakage currents, DIBL (Drain Induced Barrier Lowering), and threshold voltage shift due to quantization, and (b) a means of benchmarking quantum corrections to semiclassical models (such as density-gradient and quantum-corrected MEDICI).
Will Allis Prize Talk: Electron Collisions - Experiment, Theory and Applications
NASA Astrophysics Data System (ADS)
Bartschat, Klaus
2016-05-01
Electron collisions with atoms, ions, and molecules represent one of the very early topics of quantum mechanics. In spite of the field's maturity, a number of recent developments in detector technology (e.g., the ``reaction microscope'' or the ``magnetic-angle changer'') and the rapid increase in computational resources have resulted in significant progress in the measurement, understanding, and theoretical/computational description of few-body Coulomb problems. Close collaborations between experimentalists and theorists worldwide continue to produce high-quality benchmark data, which allow for thoroughly testing and further developing a variety of theoretical approaches. As a result, it has now become possible to reliably calculate the vast amount of atomic data needed for detailed modelling of the physics and chemistry of planetary atmospheres, the interpretation of astrophysical data, optimizing the energy transport in reactive plasmas, and many other topics - including light-driven processes, in which electrons are produced by continuous or short-pulse ultra-intense electromagnetic radiation. In this talk, I will highlight some of the recent developments that have had a major impact on the field. This will be followed by showcasing examples, in which accurate electron collision data enabled applications in fields beyond traditional AMO physics. Finally, open problems and challenges for the future will be outlined. I am very grateful for fruitful scientific collaborations with many colleagues, and the long-term financial support by the NSF through the Theoretical AMO and Computational Physics programs, as well as supercomputer resources through TeraGrid and XSEDE.
TerraFERMA: Harnessing Advanced Computational Libraries in Earth Science
NASA Astrophysics Data System (ADS)
Wilson, C. R.; Spiegelman, M.; van Keken, P.
2012-12-01
Many important problems in Earth sciences can be described by non-linear coupled systems of partial differential equations. These "multi-physics" problems include thermo-chemical convection in Earth and planetary interiors, interactions of fluids and magmas with the Earth's mantle and crust and coupled flow of water and ice. These problems are of interest to a large community of researchers but are complicated to model and understand. Much of this complexity stems from the nature of multi-physics where small changes in the coupling between variables or constitutive relations can lead to radical changes in behavior, which in turn affect critical computational choices such as discretizations, solvers and preconditioners. To make progress in understanding such coupled systems requires a computational framework where multi-physics problems can be described at a high-level while maintaining the flexibility to easily modify the solution algorithm. Fortunately, recent advances in computational science provide a basis for implementing such a framework. Here we present the Transparent Finite Element Rapid Model Assembler (TerraFERMA), which leverages several advanced open-source libraries for core functionality. FEniCS (fenicsproject.org) provides a high level language for describing the weak forms of coupled systems of equations, and an automatic code generator that produces finite element assembly code. PETSc (www.mcs.anl.gov/petsc) provides a wide range of scalable linear and non-linear solvers that can be composed into effective multi-physics preconditioners. SPuD (amcg.ese.ic.ac.uk/Spud) is an application neutral options system that provides both human and machine-readable interfaces based on a single xml schema. Our software integrates these libraries and provides the user with a framework for exploring multi-physics problems. A single options file fully describes the problem, including all equations, coefficients and solver options. 
Custom compiled applications are generated from this file but share an infrastructure for services common to all models, e.g. diagnostics, checkpointing and global non-linear convergence monitoring. This maximizes code reusability, reliability and longevity ensuring that scientific results and the methods used to acquire them are transparent and reproducible. TerraFERMA has been tested against many published geodynamic benchmarks including 2D/3D thermal convection problems, the subduction zone benchmarks and benchmarks for magmatic solitary waves. It is currently being used in the investigation of reactive cracking phenomena with applications to carbon sequestration, but we will principally discuss its use in modeling the migration of fluids in subduction zones. Subduction zones require an understanding of the highly nonlinear interactions of fluids with solids and thus provide an excellent scientific driver for the development of multi-physics software.
Student Learning: Education's Field of Dreams.
ERIC Educational Resources Information Center
Blackwell, Peggy L.
2003-01-01
Discusses seven research-based benchmarks providing a framework for the student-learning-focused reform of teacher education: knowledge and understanding based on previous experience, usable content knowledge, transfer of learning/the learning context, strategic thinking, motivation and affect, development and individual differences, and standards…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oterkus, Selda; Madenci, Erdogan, E-mail: madenci@email.arizona.edu; Agwai, Abigail
This study presents the derivation of ordinary state-based peridynamic heat conduction equation based on the Lagrangian formalism. The peridynamic heat conduction parameters are related to those of the classical theory. An explicit time stepping scheme is adopted for numerical solution of various benchmark problems with known solutions. It paves the way for applying the peridynamic theory to other physical fields such as neutronic diffusion and electrical potential distribution.
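For intuition about the explicit time stepping scheme mentioned above, here is a forward-Euler update of the classical (local) 1D heat equation; the peridynamic formulation instead integrates the temperature difference over a nonlocal horizon, which is omitted in this sketch:

```python
def step_heat_1d(temps, alpha, dx, dt):
    """One explicit (forward-Euler) update of the classical 1D heat
    equation T_t = alpha * T_xx with fixed-temperature end nodes.
    Illustrates explicit time stepping only; the peridynamic version
    replaces the local second difference with a nonlocal sum."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit scheme stability limit"
    new = temps[:]
    for j in range(1, len(temps) - 1):
        new[j] = temps[j] + r * (temps[j - 1] - 2 * temps[j] + temps[j + 1])
    return new

# A hot interior node diffusing between cold fixed ends
T = [0.0, 0.0, 100.0, 0.0, 0.0]
for _ in range(10):
    T = step_heat_1d(T, alpha=1.0, dx=1.0, dt=0.25)
print(max(T) < 100.0)  # the peak decays as heat spreads -> True
```

The stability restriction on the time step (here r ≤ 1/2) is the standard trade-off of explicit schemes; the peridynamic analogue has a similar restriction tied to the horizon size.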
Validation of Tendril TrueHome Using Software-to-Software Comparison
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maguire, Jeffrey B; Horowitz, Scott G; Moore, Nathan
This study performed comparative evaluation of EnergyPlus version 8.6 and Tendril TrueHome, two physics-based home energy simulation models, to identify differences in energy consumption predictions between the two programs and resolve discrepancies between them. EnergyPlus is considered a benchmark, best-in-class software tool for building energy simulation. This exercise sought to improve both software tools through additional evaluation/scrutiny.
Podar, Mircea; Shakya, Migun; D'Amore, Rosalinda; ...
2016-01-14
In the last 5 years, the rapid pace of innovations and improvements in sequencing technologies has completely changed the landscape of metagenomic and metagenetic experiments. Therefore, it is critical to benchmark the various methodologies for interrogating the composition of microbial communities, so that we can assess their strengths and limitations. Here, the most common phylogenetic marker for microbial community diversity studies is the 16S ribosomal RNA gene, and in the last 10 years the field has moved from sequencing a small number of amplicons and samples to more complex studies where thousands of samples and multiple different gene regions are interrogated.
A Benchmark Problem for Development of Autonomous Structural Modal Identification
NASA Technical Reports Server (NTRS)
Pappa, Richard S.; Woodard, Stanley E.; Juang, Jer-Nan
1996-01-01
This paper summarizes modal identification results obtained using an autonomous version of the Eigensystem Realization Algorithm on a dynamically complex, laboratory structure. The benchmark problem uses 48 of 768 free-decay responses measured in a complete modal survey test. The true modal parameters of the structure are well known from two previous, independent investigations. Without user involvement, the autonomous data analysis identified 24 to 33 structural modes with good to excellent accuracy in 62 seconds of CPU time (on a DEC Alpha 4000 computer). The modal identification technique described in the paper is the baseline algorithm for NASA's Autonomous Dynamics Determination (ADD) experiment scheduled to fly on International Space Station assembly flights in 1997-1999.
Benchmarking and performance analysis of the CM-2. [SIMD computer
NASA Technical Reports Server (NTRS)
Myers, David W.; Adams, George B., II
1988-01-01
A suite of benchmarking routines testing communication, basic arithmetic operations, and selected kernel algorithms written in LISP and PARIS was developed for the CM-2. Experiment runs are automated via a software framework that sequences individual tests, allowing for unattended overnight operation. Multiple measurements are made and treated statistically to generate well-characterized results from the noisy values given by cm:time. The results obtained provide a comparison with similar, but less extensive, testing done on a CM-1. Tests were chosen to aid the algorithmist in constructing fast, efficient, and correct code on the CM-2, as well as gain insight into what performance criteria are needed when evaluating parallel processing machines.
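The statistical treatment of repeated, noisy timing measurements described above can be sketched in Python (an illustration only; the original harness was written in LISP and PARIS and used the noisy values given by cm:time):

```python
import statistics
import time

def benchmark(fn, repeats=30):
    """Run fn repeatedly and return robust summary statistics,
    mimicking the repeated-measurement treatment of noisy timings."""
    samples = []
    for _ in range(repeats):
        start = time.perf_counter()
        fn()
        samples.append(time.perf_counter() - start)
    return {
        "median": statistics.median(samples),
        "mean": statistics.mean(samples),
        "stdev": statistics.stdev(samples),
        "min": min(samples),
    }

# Time a trivial kernel; in practice each test in the suite is sequenced
# by a framework so a full run can proceed unattended overnight.
result = benchmark(lambda: sum(range(10000)))
```

Reporting the median and minimum alongside the mean makes the result robust to scheduler noise, which is the point of treating the samples statistically rather than trusting a single timer value.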
Reactor Testing and Qualification: Prioritized High-level Criticality Testing Needs
DOE Office of Scientific and Technical Information (OSTI.GOV)
S. Bragg-Sitton; J. Bess; J. Werner
2011-09-01
Researchers at the Idaho National Laboratory (INL) were tasked with reviewing possible criticality testing needs to support development of the fission surface power system reactor design. Reactor physics testing can provide significant information to aid in development of technologies associated with small, fast spectrum reactors that could be applied for non-terrestrial power systems, leading to eventual system qualification. Several studies have been conducted in recent years to assess the data and analyses required to design and build a space fission power system with high confidence that the system will perform as designed [Marcille, 2004a, 2004b; Weaver, 2007; Parry et al., 2008]. This report will provide a summary of previous critical tests and physics measurements that are potentially applicable to the current reactor design (both those that have been benchmarked and those not yet benchmarked), summarize recent studies of potential nuclear testing needs for space reactor development and their applicability to the current baseline fission surface power (FSP) system design, and provide an overview of a suite of tests (separate effects, sub-critical or critical) that could fill in the information database to improve the accuracy of physics modeling efforts as the FSP design is refined. Some recommendations for tasks that could be completed in the near term are also included. Specific recommendations on critical test configurations will be reserved until after the sensitivity analyses being conducted by Los Alamos National Laboratory (LANL) are completed (due August 2011).
Coupled two-dimensional edge plasma and neutral gas modeling of tokamak scrape-off-layers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maingi, Rajesh
1992-08-01
The objective of this study is to devise a detailed description of the tokamak scrape-off-layer (SOL), which includes the best available models of both the plasma and neutral species and the strong coupling between the two in many SOL regimes. A good estimate of both particle flux and heat flux profiles at the limiter/divertor target plates is desired. Peak heat flux is one of the limiting factors in determining the survival probability of plasma-facing-components at high power levels. Plate particle flux affects the neutral flux to the pump, which determines the particle exhaust rate. A technique which couples a two-dimensional (2-D) plasma and a 2-D neutral transport code has been developed (coupled code technique), but this procedure requires large amounts of computer time. Relevant physics has been added to an existing two-neutral-species model which takes the SOL plasma/neutral coupling into account in a simple manner (molecular physics model), and this model is compared with the coupled code technique mentioned above. The molecular physics model is benchmarked against experimental data from a divertor tokamak (DIII-D), and a similar model (single-species model) is benchmarked against data from a pump-limiter tokamak (Tore Supra). The models are then used to examine two key issues: free-streaming-limits (ion energy conduction and momentum flux) and the effects of the non-orthogonal geometry of magnetic flux surfaces and target plates on edge plasma parameter profiles.
Madi, Mahmoud K; Karameh, Fadi N
2018-05-11
Many physical models of biological processes including neural systems are characterized by parametric nonlinear dynamical relations between driving inputs, internal states, and measured outputs of the process. Fitting such models using experimental data (data assimilation) is a challenging task since the physical process often operates in a noisy, possibly non-stationary environment; moreover, conducting multiple experiments under controlled and repeatable conditions can be impractical, time consuming or costly. The accuracy of model identification, therefore, is dictated principally by the quality and dynamic richness of collected data over single or few experimental sessions. Accordingly, it is highly desirable to design efficient experiments that, by exciting the physical process with smart inputs, yield fast convergence and increased accuracy of the model. We herein introduce an adaptive framework in which optimal input design is integrated with Square root Cubature Kalman Filters (OID-SCKF) to develop an online estimation procedure that, first, converges significantly faster, thereby permitting model fitting over shorter time windows, and second, enhances model accuracy when only few process outputs are accessible. The methodology is demonstrated on common nonlinear models and on a four-area neural mass model with noisy and limited measurements. Estimation quality (speed and accuracy) is benchmarked against high-performance SCKF-based methods that commonly employ dynamically rich informed inputs for accurate model identification. For all the tested models, simulated single-trial and ensemble averages showed that OID-SCKF exhibited (i) faster convergence of parameter estimates and (ii) lower dependence on inter-trial noise variability, with gains up to around 1000 msec in speed and 81% increase in variability for the neural mass models.
In terms of accuracy, OID-SCKF estimation was superior, and exhibited considerably less variability across experiments, in identifying model parameters of (a) systems with challenging model inversion dynamics and (b) systems with fewer measurable outputs that directly relate to the underlying processes. Fast and accurate identification therefore carries particular promise for modeling of transient (short-lived) neuronal network dynamics using a spatially under-sampled set of noisy measurements, as is commonly encountered in neural engineering applications. © 2018 IOP Publishing Ltd.
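The square-root cubature Kalman filter underlying OID-SCKF propagates a deterministic set of 2n equally weighted cubature points through the model dynamics. A minimal sketch of the point generation (assuming, for illustration only, a diagonal square-root covariance; the full SCKF uses a triangular square-root factor updated at every step):

```python
import math

def cubature_points(mean, sqrt_cov_diag):
    """Generate the 2n equally weighted cubature points of a
    square-root cubature Kalman filter: x_i = mean +/- sqrt(n) * S e_i.
    S is assumed diagonal here, given by its diagonal entries."""
    n = len(mean)
    scale = math.sqrt(n)
    points = []
    for i in range(n):
        for sign in (+1.0, -1.0):
            p = list(mean)
            p[i] += sign * scale * sqrt_cov_diag[i]
            points.append(p)
    weights = [1.0 / (2 * n)] * (2 * n)  # equal weights summing to 1
    return points, weights

# Two-dimensional state with mean (0, 1) and sqrt-covariance diag(1, 0.5)
pts, w = cubature_points([0.0, 1.0], [1.0, 0.5])
```

By construction, the weighted mean of the cubature points recovers the state mean, which is what makes the rule exact for the first two moments of a Gaussian.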
Little Boy replication: justification and construction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malenfant, R.E.
A reconstruction of the Little Boy weapon allowed experiments to evaluate yield, leakage measurements for comparison with calculations, and phenomenological measurements to evaluate various in-situ dosimeters. The reconstructed weapon was operated at sustained delayed critical at the Los Alamos Critical Assembly Facility. The present experiments provide a wealth of information to benchmark calculations and demonstrate that the 1965 measurements on the Ichiban assembly (a spherical mockup of Little Boy) were in error.
Benchmarking government action for obesity prevention--an innovative advocacy strategy.
Martin, J; Peeters, A; Honisett, S; Mavoa, H; Swinburn, B; de Silva-Sanigorski, A
2014-01-01
Successful obesity prevention will require a leading role for governments, but internationally they have been slow to act. League tables of benchmark indicators of action can be a valuable advocacy and evaluation tool. To develop a benchmarking tool for government action on obesity prevention, implement it across Australian jurisdictions and to publicly award the best and worst performers. A framework was developed which encompassed nine domains, reflecting best practice government action on obesity prevention: whole-of-government approaches; marketing restrictions; access to affordable, healthy food; school food and physical activity; food in public facilities; urban design and transport; leisure and local environments; health services; and social marketing. A scoring system was used by non-government key informants to rate the performance of their government. National rankings were generated and the results were communicated to all Premiers/Chief Ministers, the media and the national obesity research and practice community. Evaluation of the initial tool in 2010 showed it to be feasible to implement and able to discriminate the better and worse performing governments. Evaluation of the rubric in 2011 confirmed this to be a robust and useful method. In relation to government action, the best performing governments were those with whole-of-government approaches, had extended common initiatives and demonstrated innovation and strong political will. This new benchmarking tool, the Obesity Action Award, has enabled identification of leading government action on obesity prevention and the key characteristics associated with their success. We recommend this tool for other multi-state/country comparisons. Copyright © 2013 Asian Oceanian Association for the Study of Obesity. Published by Elsevier Ltd. All rights reserved.
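A league-table ranking of the kind described can be sketched as follows (the scoring scale, jurisdiction names, and domain scores below are hypothetical; the actual tool aggregated non-government informant ratings across all nine domains):

```python
def rank_governments(scores):
    """Rank jurisdictions by total score summed across policy domains
    (higher total = better performance); returns (name, total) pairs
    sorted best-first."""
    totals = {gov: sum(domains.values()) for gov, domains in scores.items()}
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical informant ratings (0-5) for two of the nine domains
scores = {
    "State A": {"marketing_restrictions": 4, "school_food": 5},
    "State B": {"marketing_restrictions": 2, "school_food": 3},
}
ranking = rank_governments(scores)  # best performer first
```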
Rethinking key–value store for parallel I/O optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kougkas, Anthony; Eslami, Hassan; Sun, Xian-He
2015-01-26
Key-value stores are being widely used as the storage system for large-scale internet services and cloud storage systems. However, they are rarely used in HPC systems, where parallel file systems are the dominant storage solution. In this study, we examine the architecture differences and performance characteristics of parallel file systems and key-value stores. We propose using key-value stores to optimize overall Input/Output (I/O) performance, especially for workloads that parallel file systems cannot handle well, such as the cases with intense data synchronization or heavy metadata operations. We conducted experiments with several synthetic benchmarks, an I/O benchmark, and a real application. We modeled the performance of these two systems using collected data from our experiments, and we provide a predictive method to identify which system offers better I/O performance given a specific workload. The results show that we can optimize the I/O performance in HPC systems by utilizing key-value stores.
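The abstract does not specify the form of the predictive method; purely as an illustration, a workload-based chooser built on simple linear performance models (all coefficients, feature names, and values below are hypothetical, not fit from the paper's data) might look like:

```python
def predict_io_time(coeffs, workload):
    """Linear performance model: predicted I/O time is the sum of
    coefficient * workload-feature products."""
    return sum(coeffs[k] * workload.get(k, 0.0) for k in coeffs)

def choose_storage(workload, pfs_coeffs, kv_coeffs):
    """Pick the storage system with the lower predicted I/O time."""
    pfs = predict_io_time(pfs_coeffs, workload)
    kv = predict_io_time(kv_coeffs, workload)
    return "key-value store" if kv < pfs else "parallel file system"

# Hypothetical coefficients: the PFS is assumed cheaper per bulk MB but
# more expensive per metadata operation, matching the workloads the
# study identifies as problematic for parallel file systems.
pfs_coeffs = {"bulk_mb": 0.01, "metadata_ops": 0.005}
kv_coeffs = {"bulk_mb": 0.02, "metadata_ops": 0.001}
choice = choose_storage({"bulk_mb": 100, "metadata_ops": 5000},
                        pfs_coeffs, kv_coeffs)
```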
Investigation of the transient fuel preburner manifold and combustor
NASA Technical Reports Server (NTRS)
Wang, Ten-See; Chen, Yen-Sen; Farmer, Richard C.
1989-01-01
A computational fluid dynamics (CFD) model with finite rate reactions, FDNS, was developed to study the start transient of the Space Shuttle Main Engine (SSME) fuel preburner (FPB). FDNS is a time accurate, pressure based CFD code. An upwind scheme was employed for spatial discretization. The upwind scheme was based on second and fourth order central differencing with adaptive artificial dissipation. A state-of-the-art two-equation k-epsilon (T) turbulence model was employed for the turbulence calculation. A Padé Rational Solution (PARASOL) chemistry algorithm was coupled with the point implicit procedure. FDNS was benchmarked with three well documented experiments: a confined swirling coaxial jet, a non-reactive ramjet dump combustor, and a reactive ramjet dump combustor. Excellent comparisons were obtained for the benchmark cases. The code was then used to study the start transient of an axisymmetric SSME fuel preburner. Predicted transient operation of the preburner agrees well with experiment. Furthermore, it was also found that an appreciable amount of unburned oxygen entered the turbine stages.
Benchmarking reference services: step by step.
Buchanan, H S; Marshall, J G
1996-01-01
This article is a companion to an introductory article on benchmarking published in an earlier issue of Medical Reference Services Quarterly. Librarians interested in benchmarking often ask the following questions: How do I determine what to benchmark; how do I form a benchmarking team; how do I identify benchmarking partners; what's the best way to collect and analyze benchmarking information; and what will I do with the data? Careful planning is a critical success factor of any benchmarking project, and these questions must be answered before embarking on a benchmarking study. This article summarizes the steps necessary to conduct benchmarking research. Relevant examples of each benchmarking step are provided.
New Models and Methods for the Electroweak Scale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carpenter, Linda
2017-09-26
This is the Final Technical Report to the US Department of Energy for grant DE-SC0013529, New Models and Methods for the Electroweak Scale, covering the time period April 1, 2015 to March 31, 2017. The goal of this project was to maximize the understanding of fundamental weak scale physics in light of current experiments, mainly the ongoing run of the Large Hadron Collider and the space-based satellite experiments searching for signals of Dark Matter annihilation or decay. This research program focused on the phenomenology of supersymmetry, Higgs physics, and Dark Matter. The properties of the Higgs boson are currently being measured by the Large Hadron Collider, and could be a sensitive window into new physics at the weak scale. Supersymmetry is the leading theoretical candidate to explain the naturalness of the electroweak theory; however, new model space must be explored as the Large Hadron Collider has disfavored much minimal model parameter space. In addition, the nature of Dark Matter, the mysterious particle that makes up 25% of the mass of the universe, is still unknown. This project sought to address measurements of the Higgs boson couplings to the Standard Model particles, new LHC discovery scenarios for supersymmetric particles, and new measurements of Dark Matter interactions with the Standard Model, both in collider production and annihilation in space.
Accomplishments include creating new tools for the analysis of models in which Dark Matter annihilates into multiple Standard Model particles, including new visualizations of bounds for models with various Dark Matter branching ratios; benchmark studies for new discovery scenarios of Dark Matter at the Large Hadron Collider for Higgs-Dark Matter and gauge boson-Dark Matter interactions; new target analyses to detect direct decays of the Higgs boson into challenging final states like pairs of light jets; and new phenomenological analysis of non-minimal supersymmetric models, namely the set of Dirac Gaugino models.
NASA Astrophysics Data System (ADS)
Sergeev, D. A.; Kandaurov, A. A.; Troitskaya, Yu I.
2017-11-01
In this paper we describe a PIV system specially designed for the study of hydrophysical processes in a large-scale benchmark setup of a promising fast reactor. The system enables PIV measurements under the difficult conditions of the reactor benchmark: a complicated geometry, reflections and distortions of the laser sheet, blackout regions, and a closed volume. The use of filtering techniques and an image-mask method reduced the number of incorrectly measured flow velocity vectors by an order of magnitude. A method was implemented for converting image coordinates and velocity fields into the reference frame of the reactor model using virtual 3D calibration targets, without loss of accuracy compared with placing physical targets in the filming area. Velocity fields were measured in various regimes, both stationary (operating) and non-stationary (emergency).
Physical properties of the benchmark models program supercritical wing
NASA Technical Reports Server (NTRS)
Dansberry, Bryan E.; Durham, Michael H.; Bennett, Robert M.; Turnock, David L.; Silva, Walter A.; Rivera, Jose A., Jr.
1993-01-01
The goal of the Benchmark Models Program is to provide data useful in the development and evaluation of aeroelastic computational fluid dynamics (CFD) codes. To that end, a series of three similar wing models are being flutter tested in the Langley Transonic Dynamics Tunnel. These models are designed to simultaneously acquire model response data and unsteady surface pressure data during wing flutter conditions. The supercritical wing is the second model of this series. It is a rigid semispan model with a rectangular planform and a NASA SC(2)-0414 supercritical airfoil shape. The supercritical wing model was flutter tested on a flexible mount, called the Pitch and Plunge Apparatus, that provides a well-defined, two-degree-of-freedom dynamic system. The supercritical wing model and associated flutter test apparatus is described and experimentally determined wind-off structural dynamic characteristics of the combined rigid model and flexible mount system are included.
Benchmark Shock Tube Experiments for Radiative Heating Relevant to Earth Re-Entry
NASA Technical Reports Server (NTRS)
Brandis, A. M.; Cruden, B. A.
2017-01-01
Detailed spectrally and spatially resolved radiance has been measured in the Electric Arc Shock Tube (EAST) facility for conditions relevant to high speed entry into a variety of atmospheres, including Earth, Venus, Titan, Mars and the Outer Planets. The tests that measured radiation relevant for Earth re-entry are the focus of this work and are taken from campaigns 47, 50, 52 and 57. These tests covered conditions from 8 km/s to 15.5 km/s at initial pressures ranging from 0.05 Torr to 1 Torr, of which shots at 0.1 and 0.2 Torr are analyzed in this paper. These conditions cover a range of points of interest for potential flight missions, including return from Low Earth Orbit, the Moon and Mars. The large volume of testing available from EAST is useful for statistical analysis of radiation data, but is problematic for identifying representative experiments for performing detailed analysis. Therefore, the intent of this paper is to select a subset of benchmark test data that can be considered for further detailed study. These benchmark shots are intended to provide more accessible data sets for future code validation studies and facility-to-facility comparisons. The shots that have been selected as benchmark data are the ones in closest agreement to a line of best fit through all of the EAST results, whilst also showing the best experimental characteristics, such as test time and convergence to equilibrium. The EAST data are presented in different formats for analysis. These data include the spectral radiance at equilibrium, the spatial dependence of radiance over defined wavelength ranges and the mean non-equilibrium spectral radiance (so-called 'spectral non-equilibrium metric'). All the information needed to simulate each experimental trace, including free-stream conditions, shock time of arrival (i.e. x-t) relation, and the spectral and spatial resolution functions, are provided.
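The selection criterion described above (shots in closest agreement with a line of best fit through all EAST results) can be sketched as follows; the shot data below are invented for illustration, not taken from the EAST campaigns:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def pick_benchmarks(shots, k):
    """Return the k shots whose radiance lies closest to the trend
    line fit through all shots (smallest absolute residual)."""
    xs = [s["velocity"] for s in shots]
    ys = [s["radiance"] for s in shots]
    a, b = fit_line(xs, ys)
    return sorted(shots,
                  key=lambda s: abs(s["radiance"] - (a + b * s["velocity"])))[:k]

# Hypothetical (velocity km/s, radiance) pairs; the last shot is an outlier
shots = [
    {"velocity": 8.0, "radiance": 10.0},
    {"velocity": 10.0, "radiance": 20.0},
    {"velocity": 12.0, "radiance": 30.0},
    {"velocity": 10.0, "radiance": 25.0},
]
best = pick_benchmarks(shots, k=3)
```

In the real selection, residual distance would be combined with qualitative screening (test time, convergence to equilibrium) rather than used alone.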
DOE Office of Scientific and Technical Information (OSTI.GOV)
Risner, J.M.; Wiarda, D.; Miller, T.M.
2011-07-01
The U.S. Nuclear Regulatory Commission's Regulatory Guide 1.190 states that calculational methods used to estimate reactor pressure vessel (RPV) fluence should use the latest version of the evaluated nuclear data file (ENDF). The VITAMIN-B6 fine-group library and BUGLE-96 broad-group library, which are widely used for RPV fluence calculations, were generated using ENDF/B-VI.3 data, which was the most current data when Regulatory Guide 1.190 was issued. We have developed new fine-group (VITAMIN-B7) and broad-group (BUGLE-B7) libraries based on ENDF/B-VII.0. These new libraries, which were processed using the AMPX code system, maintain the same group structures as the VITAMIN-B6 and BUGLE-96 libraries. Verification and validation of the new libraries were accomplished using diagnostic checks in AMPX, 'unit tests' for each element in VITAMIN-B7, and a diverse set of benchmark experiments including critical evaluations for fast and thermal systems, a set of experimental benchmarks that are used for SCALE regression tests, and three RPV fluence benchmarks. The benchmark evaluation results demonstrate that VITAMIN-B7 and BUGLE-B7 are appropriate for use in RPV fluence calculations and meet the calculational uncertainty criterion in Regulatory Guide 1.190. (authors)
NASA Astrophysics Data System (ADS)
Rodriguez, Tony F.; Cushman, David A.
2003-06-01
With the growing commercialization of watermarking techniques in various application scenarios it has become increasingly important to quantify the performance of watermarking products. The quantification of the relative merits of various products is not only essential in enabling further adoption of the technology by society as a whole, but will also drive the industry to develop testing plans/methodologies to ensure quality and minimize cost (to both vendors and customers). While the research community understands the theoretical need for a publicly available benchmarking system to quantify performance, there has been less discussion on the practical application of these systems. By providing a standard set of acceptance criteria, benchmarking systems can dramatically increase the quality of a particular watermarking solution, validating product performance when they are used efficiently and frequently during the design process. In this paper we describe how to leverage specific design-of-experiments techniques to increase the quality of a watermarking scheme, to be used with the benchmark tools being developed by the Ad-Hoc Watermark Verification Group. A Taguchi Loss Function is proposed for an application, and orthogonal arrays are used to isolate optimal levels for a multi-factor experimental situation. Finally, the results are generalized to a population of cover works and validated through an exhaustive test.
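The proposed Taguchi Loss Function is, in its standard nominal-is-best form, a quadratic penalty on deviation from a target quality characteristic. A minimal sketch (the loss constant k and the population values are left as free, illustrative parameters):

```python
def taguchi_loss(y, target, k=1.0):
    """Quadratic (nominal-is-best) Taguchi loss: L(y) = k * (y - target)**2.
    Loss grows with any deviation from the target characteristic,
    not only when a pass/fail tolerance is exceeded."""
    return k * (y - target) ** 2

def average_loss(values, target, k=1.0):
    """Expected loss over a population, e.g. a population of cover works."""
    return sum(taguchi_loss(v, target, k) for v in values) / len(values)
```

Averaging the loss over a population is what lets the per-image result generalize, mirroring the paper's final validation step over a population of cover works.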
DOE Office of Scientific and Technical Information (OSTI.GOV)
Risner, Joel M; Wiarda, Dorothea; Miller, Thomas Martin
2011-01-01
The U.S. Nuclear Regulatory Commission's Regulatory Guide 1.190 states that calculational methods used to estimate reactor pressure vessel (RPV) fluence should use the latest version of the Evaluated Nuclear Data File (ENDF). The VITAMIN-B6 fine-group library and BUGLE-96 broad-group library, which are widely used for RPV fluence calculations, were generated using ENDF/B-VI data, which was the most current data when Regulatory Guide 1.190 was issued. We have developed new fine-group (VITAMIN-B7) and broad-group (BUGLE-B7) libraries based on ENDF/B-VII. These new libraries, which were processed using the AMPX code system, maintain the same group structures as the VITAMIN-B6 and BUGLE-96 libraries. Verification and validation of the new libraries were accomplished using diagnostic checks in AMPX, unit tests for each element in VITAMIN-B7, and a diverse set of benchmark experiments including critical evaluations for fast and thermal systems, a set of experimental benchmarks that are used for SCALE regression tests, and three RPV fluence benchmarks. The benchmark evaluation results demonstrate that VITAMIN-B7 and BUGLE-B7 are appropriate for use in LWR shielding applications, and meet the calculational uncertainty criterion in Regulatory Guide 1.190.
Learning Communities: An Untapped Sustainable Competitive Advantage for Higher Education
ERIC Educational Resources Information Center
Dawson, Shane; Burnett, Bruce; O' Donohue, Mark
2006-01-01
Purpose: This paper demonstrates the need for the higher education sector to develop and implement scaleable, quantitative measures that evaluate community and establish organisational benchmarks in order to guide the development of future practices designed to enhance the student learning experience. Design/methodology/approach: Literature…
Desperately Seeking Standards: Bridging the Gap from Concept to Reality.
ERIC Educational Resources Information Center
Jones, A. James; Gardner, Carrie; Zaenglein, Judith L.
1998-01-01
Discussion of national standards for information-and-technology literacy focuses on experiences at one school where national standards were synthesized by library media specialists to develop local standards as well as a series of benchmarks by which student achievement could be measured. (Author/LRW)
Regalia, Giulia; Biffi, Emilia; Ferrigno, Giancarlo; Pedrocchi, Alessandra
2015-01-01
The collection of good quality extracellular neuronal spikes from neuronal cultures coupled to Microelectrode Arrays (MEAs) is a binding requirement to gather reliable data. Due to physical constraints, low power requirement, or the need of customizability, commercial recording platforms are not fully adequate for the development of experimental setups integrating MEA technology with other equipment needed to perform experiments under climate controlled conditions, like environmental chambers or cell culture incubators. To address this issue, we developed a custom MEA interfacing system featuring low noise, low power, and the capability to be readily integrated inside an incubator-like environment. Two stages, a preamplifier and a filter amplifier, were designed, implemented on printed circuit boards, and tested. The system is characterized by a low input-referred noise (<1 μV RMS), a high channel separation (>70 dB), and signal-to-noise ratio values of neuronal recordings comparable to those obtained with the benchmark commercial MEA system. In addition, the system was successfully integrated with an environmental MEA chamber, without harming cell cultures during experiments and without being damaged by the high humidity level. The devised system is of practical value in the development of in vitro platforms to study temporally extended neuronal network dynamics by means of MEAs. PMID:25977683
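The noise and separation figures quoted above (input-referred noise below 1 μV RMS, channel separation above 70 dB) are amplitude ratios expressed in decibels; the conversion can be sketched as:

```python
import math

def db(ratio):
    """Convert an amplitude ratio to decibels: 20 * log10(ratio)."""
    return 20.0 * math.log10(ratio)

def snr_db(signal_rms_uv, noise_rms_uv):
    """Signal-to-noise ratio in dB from RMS amplitudes (microvolts)."""
    return db(signal_rms_uv / noise_rms_uv)

# A hypothetical 100 uV extracellular spike over a 1 uV RMS noise floor
print(round(snr_db(100.0, 1.0)))  # -> 40
```

The spike amplitude here is an assumed, illustrative value; the paper compares SNR of actual recordings against the benchmark commercial MEA system rather than quoting a single figure.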
Pore-scale and continuum simulations of solute transport micromodel benchmark experiments
Oostrom, M.; Mehmani, Y.; Romero-Gomez, P.; ...
2014-06-18
Four sets of nonreactive solute transport experiments were conducted with micromodels. Each set comprised three experiments and varied a single parameter: flow velocity, grain diameter, pore-aspect ratio, or flow-focusing heterogeneity. The data sets were offered to pore-scale modeling groups to test their numerical simulators. Each set consisted of two learning experiments, for which our results were made available, and one challenge experiment, for which only the experimental description and base input parameters were provided. The experimental results showed a nonlinear dependence of the transverse dispersion coefficient on the Peclet number, a negligible effect of the pore-aspect ratio on transverse mixing, and considerably enhanced mixing due to flow focusing. Five pore-scale models and one continuum-scale model were used to simulate the experiments. Of the pore-scale models, two used a pore-network (PN) method, two others are based on a lattice Boltzmann (LB) approach, and one used a computational fluid dynamics (CFD) technique. The PN models used the learning experiments to modify the standard perfect-mixing approach in pore bodies into approaches that simulate the observed incomplete mixing. The LB and CFD models used the learning experiments to appropriately discretize the spatial grid representations. For the continuum modeling, the required dispersivity input values were estimated based on published nonlinear relations between transverse dispersion coefficients and Peclet number. Comparisons between experimental and numerical results for the four challenge experiments show that all pore-scale models were able to satisfactorily simulate the experiments. The continuum model underestimated the required dispersivity values, resulting in reduced dispersion.
The PN models were able to complete the simulations in a few minutes, whereas the direct models, which account for the micromodel geometry and underlying flow and transport physics, needed up to several days on supercomputers to resolve the more complex problems.
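Published relations between the transverse dispersion coefficient and the Peclet number are commonly expressed as power laws; the nonlinear dependence reported above can be characterized by fitting such a relation via least squares in log-log space. A sketch (the coefficients and data below are synthetic, not values from the experiments):

```python
import math

def fit_power_law(pe, dt):
    """Fit D_T = a * Pe**b by ordinary least squares in log-log space,
    where linearity holds: log(D_T) = log(a) + b * log(Pe)."""
    lx = [math.log(p) for p in pe]
    ly = [math.log(d) for d in dt]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    b = sum((x - mx) * (y - my) for x, y in zip(lx, ly)) / \
        sum((x - mx) ** 2 for x in lx)
    a = math.exp(my - b * mx)
    return a, b

# Synthetic data following D_T = 0.1 * Pe**0.8 exactly
pe = [1.0, 10.0, 100.0]
dt = [0.1 * p ** 0.8 for p in pe]
a, b = fit_power_law(pe, dt)
```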
NASA Astrophysics Data System (ADS)
Kirstetter, P. E.; Petersen, W. A.; Gourley, J. J.; Kummerow, C. D.; Huffman, G. J.; Turk, J.; Tanelli, S.; Maggioni, V.; Anagnostou, E. N.; Hong, Y.; Schwaller, M.
2016-12-01
Comparing vacuumed and saturated states yields image-based measurements of the distribution and time scales of imbibition. We also characterize nm-scale structure via focused ion beam tomography to quantify sub-voxel porosity and connectivity. The multi-scale image and flow data is used to develop a framework to upscale and benchmark pore-scale models.
Visualizing and measuring flow in shale matrix using in situ synchrotron X-ray microtomography
NASA Astrophysics Data System (ADS)
Kohli, A. H.; Kiss, A. M.; Kovscek, A. R.; Bargar, J.
2017-12-01
Natural gas production via hydraulic fracturing of shale has proliferated on a global scale, yet recovery factors remain low because production strategies are not based on the physics of flow in shale reservoirs. In particular, the physical mechanisms and time scales of depletion from the matrix into the simulated fracture network are not well understood, limiting the potential to optimize operations and reduce environmental impacts. Studying matrix flow is challenging because shale is heterogeneous and has porosity from the μm- to nm-scale. Characterizing nm-scale flow paths requires electron microscopy but the limited field of view does not capture the connectivity and heterogeneity observed at the mm-scale. Therefore, pore-scale models must link to larger volumes to simulate flow on the reservoir-scale. Upscaled models must honor the physics of flow, but at present there is a gap between cm-scale experiments and μm-scale simulations based on ex situ image data. To address this gap, we developed a synchrotron X-ray microscope with an in situ cell to simultaneously visualize and measure flow. We perform coupled flow and microtomography experiments on mm-scale samples from the Barnett, Eagle Ford and Marcellus reservoirs. We measure permeability at various pressures via the pulse-decay method to quantify effective stress dependence and the relative contributions of advective and diffusive mechanisms. Images at each pressure step document how microfractures, interparticle pores, and organic matter change with effective stress. Linking changes in the pore network to flow measurements motivates a physical model for depletion. To directly visualize flow, we measure imbibition rates using inert, high atomic number gases and image periodically with monochromatic beam. By imaging above/below X-ray adsorption edges, we magnify the signal of gas saturation in μm-scale porosity and nm-scale, sub-voxel features. 
Comparing vacuumed and saturated states yields image-based measurements of the distribution and time scales of imbibition. We also characterize nm-scale structure via focused ion beam tomography to quantify sub-voxel porosity and connectivity. The multi-scale image and flow data are used to develop a framework for upscaling and benchmarking pore-scale models.
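The pulse-decay permeability measurement described above can be sketched in a few lines. This is an illustrative sketch only, assuming the late-time single-exponential (Brace-type) solution in which the upstream/downstream pressure difference decays as dp(t) = dp0·exp(−αt); the function name, symbols, and reservoir geometry are assumptions, not the authors' analysis code.

```python
import numpy as np

def pulse_decay_permeability(t, dp, mu, c, L, A, Vu, Vd):
    """Estimate permeability k from pulse-decay data.

    Assumes the late-time Brace-type solution, where the up/downstream
    pressure difference decays as dp(t) = dp0 * exp(-alpha * t) with
        alpha = (k * A / (mu * c * L)) * (1/Vu + 1/Vd),
    for fluid viscosity mu, compressibility c, sample length L and
    cross-section A, and reservoir volumes Vu, Vd.
    """
    # Fit a line to ln(dp) vs t; the slope is -alpha.
    slope, _ = np.polyfit(t, np.log(dp), 1)
    alpha = -slope
    return alpha * mu * c * L / (A * (1.0 / Vu + 1.0 / Vd))
```

With synthetic noiseless data generated from a known permeability, the fit recovers k essentially exactly; in practice one would restrict the fit to the late-time portion of the decay.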
NASA Astrophysics Data System (ADS)
Gong, K.; Fritsch, D.
2018-05-01
Nowadays, multiple-view stereo satellite imagery has become a valuable data source for digital surface model (DSM) generation and 3D reconstruction. In 2016, a well-organized, publicly available multiple-view stereo benchmark for commercial satellite imagery was released by the Johns Hopkins University Applied Physics Laboratory, USA. This benchmark motivated us to explore methods that can generate accurate digital surface models from a large number of high-resolution satellite images. In this paper, we propose a pipeline for processing the benchmark data into digital surface models. As a preprocessing step, we filter all possible image pairs according to incidence angle and capture date. With the selected image pairs, the relative bias-compensated model is applied for relative orientation. After epipolar image pair generation, dense image matching, and triangulation, the 3D point clouds and DSMs are acquired. The DSMs are aligned to a quasi-ground plane by the relative bias-compensated model. We apply a median filter to generate the fused point cloud and DSM. By comparison with the reference LiDAR DSM, the accuracy, completeness, and robustness are evaluated. The results show that the point cloud reconstructs the surface including small structures, and that the fused DSM generated by our pipeline is accurate and robust.
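The per-pixel median fusion of pairwise DSMs can be sketched in a few lines. This is a minimal illustration assuming co-registered DSM rasters of equal size, with NaN marking holes from failed matches; the paper's actual fusion step and outlier handling may differ.

```python
import numpy as np

def fuse_dsms(dsms):
    """Fuse a list of co-registered DSM rasters (one per stereo pair)
    into a single DSM by taking the per-pixel median, ignoring NaN
    holes left by failed dense-matching."""
    stack = np.stack(dsms, axis=0)          # shape: (n_pairs, rows, cols)
    return np.nanmedian(stack, axis=0)      # robust to per-pair outliers
```

The median is preferred over the mean here because a single grossly wrong height from one stereo pair does not pull the fused value off the surface.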
Little Boy replication: justification and construction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malenfant, R.E.
A reconstruction of the Little Boy weapon allowed experiments to evaluate yield, leakage measurements for comparison with calculations, and phenomenological measurements to evaluate various in-situ dosimeters. The reconstructed weapon was operated at sustained delayed critical at the Los Alamos Critical Assembly Facility. The present experiments provide a wealth of information to benchmark calculations and demonstrate that the 1965 measurements on the Ichiban assembly (a spherical mockup of Little Boy) were in error.
Electric load shape benchmarking for small- and medium-sized commercial buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, Xuan; Hong, Tianzhen; Chen, Yixing
2017-07-28
Owners of small- and medium-sized commercial buildings and utility managers often look for opportunities for energy cost savings through energy efficiency and energy waste minimization. However, they currently lack easy access to low-cost tools that help interpret the massive amount of data needed to improve understanding of their energy use behaviors. Benchmarking is one of the techniques used in energy audits to identify which buildings are priorities for an energy analysis. Traditional energy performance indicators, such as energy use intensity (annual energy per unit of floor area), consider only total annual energy consumption and ignore the fluctuation of energy use over time, which reveals time-of-use information and represents distinct energy use behaviors during different time spans. To fill this gap, this study developed a general statistical method that uses 24-hour electric load shape benchmarking to compare a building or business/tenant space against its peers. Specifically, the study developed new benchmarking metrics and data analysis methods to infer the energy performance of a building from its load shape. We first performed a data experiment with smart meter data collected from over 2,000 small- and medium-sized businesses in California. We then conducted a cluster analysis of the source data, and determined and interpreted the load shape features and parameters through peer group analysis. Finally, we implemented the load shape benchmarking feature in an open-access web-based toolkit (the Commercial Building Energy Saver) to provide straightforward and practical recommendations to users. The analysis techniques are generic and flexible enough to apply to future datasets of other building types and in other utility territories.
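The cluster analysis of daily load shapes described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the unit-peak normalization, the plain k-means variant with farthest-point initialization, and the function names are all assumptions.

```python
import numpy as np

def normalize_profiles(loads):
    """Scale each daily 24-hour load profile to unit peak so that
    clustering compares load *shape* rather than magnitude.
    (Illustrative preprocessing; the paper's exact normalization
    and feature set are not reproduced here.)"""
    loads = np.asarray(loads, dtype=float)
    peaks = loads.max(axis=1, keepdims=True)
    return loads / np.where(peaks == 0, 1.0, peaks)

def kmeans_load_shapes(profiles, k, iters=50):
    """Plain k-means over normalized load shapes. Each centroid is a
    representative 24-hour shape; buildings assigned to the same
    centroid form a peer group for benchmarking."""
    # Deterministic farthest-point initialization.
    centroids = [profiles[0]]
    for _ in range(1, k):
        dist = np.min([np.linalg.norm(profiles - c, axis=1)
                       for c in centroids], axis=0)
        centroids.append(profiles[dist.argmax()])
    centroids = np.array(centroids)
    # Standard assign/update iterations.
    for _ in range(iters):
        d = np.linalg.norm(profiles[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = profiles[labels == j].mean(axis=0)
    return labels, centroids
```

For example, daytime-peaking businesses and flat, always-on businesses of very different sizes end up in separate peer groups because magnitude is normalized away before clustering.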
PFLOTRAN-RepoTREND Source Term Comparison Summary.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frederick, Jennifer M.
Code inter-comparison studies are useful exercises to verify and benchmark independently developed software to ensure proper function, especially when the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment. This summary describes the results of the first portion of the code inter-comparison between PFLOTRAN and RepoTREND, which compares the radionuclide source term used in a typical performance assessment.
High-Order Methods for Computational Physics
1999-03-01
...computation is running in parallel. Instead we use the concept of a voxel database (VDB) of geometric positions in the mesh [85]... Fig. 4.19: Connectivity and communications are established by building a voxel database (VDB) of positions. A VDB maps each position to a... studies such as the highly accurate stability computations considered here help expand the database for this benchmark problem. The two-dimensional linear...
Mobility Research at TARDEC (Briefing Charts)
2015-03-10
UWM; UIC... Gap collaboration. ARC & RIF fund: $255k + $250k. New ANCF shell element; fiber-reinforced composite rubber; validation and benchmark, 2013... U.S. Army Tank Automotive Research, Development and Engineering Center: Mobility Research at TARDEC. Dr. P. Jayakumar, S. Arepally. Analytics... Drucker-Prager elasto-plastic soil; elastic soil... A physics-based high-performance...
Lifelong Transfer Learning for Heterogeneous Teams of Agents in Sequential Decision Processes
2016-06-01
...making (SDM) tasks in dynamic environments with simulated and physical robots. Subject terms: sequential decision making, lifelong learning, transfer... sequential decision-making (SDM) tasks in dynamic environments with both simple benchmark tasks and more complex aerial and ground robot tasks. Our work... and ground robots in the presence of disturbances: we applied our methods to the problem of learning controllers for robots with novel disturbances in...
What Randomized Benchmarking Actually Measures
Proctor, Timothy; Rudinger, Kenneth; Young, Kevin; ...
2017-09-28
Randomized benchmarking (RB) is widely used to measure an error rate of a set of quantum gates, by performing random circuits that would do nothing if the gates were perfect. In the limit of no finite-sampling error, the exponential decay rate of the observable survival probabilities, versus circuit length, yields a single error metric r. For Clifford gates with arbitrary small errors described by process matrices, r was believed to reliably correspond to the mean, over all Clifford gates, of the average gate infidelity between the imperfect gates and their ideal counterparts. We show that this quantity is not a well-defined property of a physical gate set. It depends on the representations used for the imperfect and ideal gates, and the variant typically computed in the literature can differ from r by orders of magnitude. We present new theories of the RB decay that are accurate for all small errors describable by process matrices, and show that the RB decay curve is a simple exponential for all such errors. These theories allow explicit computation of the error rate that RB measures (r), but as far as we can tell it does not correspond to the infidelity of a physically allowed (completely positive) representation of the imperfect gates.
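The extraction of r from the decay curve can be sketched as follows. This is an illustrative fit of the standard model P(m) = A·p^m + B, assuming SciPy is available; the conversion r = (d − 1)(1 − p)/d is the conventional mapping for Hilbert-space dimension d (d = 2 for a single qubit), and the function name is an assumption.

```python
import numpy as np
from scipy.optimize import curve_fit

def rb_error_rate(lengths, survival, d=2):
    """Fit the standard RB decay P(m) = A * p**m + B to measured
    survival probabilities vs. circuit length m, then convert the
    decay parameter p to the RB error rate r = (d-1)*(1-p)/d."""
    def model(m, A, B, p):
        return A * p**m + B

    (A, B, p), _ = curve_fit(model, lengths, survival,
                             p0=(0.5, 0.5, 0.99), bounds=(0.0, 1.0))
    return (d - 1) * (1.0 - p) / d
```

On noiseless synthetic data with p = 0.98 and d = 2 the fit returns r = 0.01; with finite-sampling noise the fitted r acquires an uncertainty that the covariance output of curve_fit can quantify.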
Hybrid parallel code acceleration methods in full-core reactor physics calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Courau, T.; Plagne, L.; Ponicot, A.
2012-07-01
When dealing with nuclear reactor calculation schemes, the need for three-dimensional (3D) transport-based reference solutions is essential for both validation and optimization purposes. Considering a benchmark problem, this work investigates the potential of discrete ordinates (Sn) transport methods applied to 3D pressurized water reactor (PWR) full-core calculations. First, the benchmark problem is described. It involves a pin-by-pin description of a 3D PWR first core, and uses an 8-group cross-section library prepared with the DRAGON cell code. Then, a convergence analysis is performed using the PENTRAN parallel Sn Cartesian code. It discusses the spatial refinement and the associated angular quadrature required to properly describe the problem physics. It also shows that initializing the Sn solution with the EDF SPN solver COCAGNE reduces the number of iterations required to converge by nearly a factor of 6. Using a best-estimate model, PENTRAN results are then compared to multigroup Monte Carlo results obtained with the MCNP5 code. Good consistency is observed between the two methods (Sn and Monte Carlo), with discrepancies of less than 25 pcm for k-eff, and less than 2.1% and 1.6% for the flux at the pin-cell level and for the pin-power distribution, respectively. (authors)
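For readers unfamiliar with the pcm unit quoted above (1 pcm = 10⁻⁵ in reactivity), the k-eff discrepancy between two codes can be computed as follows. The sign convention (reference minus comparison, in inverse-k form) and the function name are illustrative assumptions.

```python
def keff_discrepancy_pcm(k_ref, k_other):
    """Reactivity difference between two k-eff values, in pcm.

    Uses the common convention d_rho = 1/k_ref - 1/k_other, scaled by
    1e5 so that 1 pcm corresponds to 1e-5 in reactivity.
    """
    return 1e5 * (1.0 / k_ref - 1.0 / k_other)
```

For example, k-eff values of 1.00000 (Monte Carlo) and 1.00025 (Sn) differ by about 25 pcm, the level of agreement reported in the abstract.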