Geometry creation for MCNP by Sabrina and XSM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Riper, K.A.
The Monte Carlo N-Particle transport code MCNP is based on a surface description of 3-dimensional geometry. Cells are defined in terms of boolean operations on signed quadratic surfaces. MCNP geometry is entered as a card image file containing coefficients of the surface equations and a list of surfaces and operators describing cells. Several programs are available to assist in creation of the geometry specification, among them Sabrina and the new "Smart Editor" code XSM. We briefly describe geometry creation in Sabrina and then discuss XSM in detail. XSM is under development; our discussion is based on the state of XSM as of January 1, 1994.
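For readers unfamiliar with the card-image format described above, the following is a minimal sketch, written as a Python helper, of the kind of deck such geometry editors produce. The specific cells, surfaces, and file name are illustrative assumptions, not content from the paper.

    # Writes a minimal, generic MCNP card-image deck: a title card, cell cards,
    # a blank line, surface cards, a blank line, then data cards.
    deck = """sphere of material 1 in a void -- illustrative deck only
    1  1 -1.0  -10   imp:n=1    $ cell 1: material 1 at 1.0 g/cm3, inside surface 10
    2  0        10   imp:n=0    $ cell 2: outside world, zero importance

    10 so 5.0                   $ surface 10: sphere at the origin, radius 5 cm

    sdef                        $ default point source at the origin
    m1 1001 2  8016 1           $ material 1: water, by atom fraction
    nps 1e5
    """

    with open("example.i", "w") as f:   # "example.i" is an arbitrary file name
        f.write(deck)

The cell cards combine signed surface numbers with boolean operators (here simply the inside and outside of surface 10), which is exactly the specification Sabrina and XSM help the user build graphically.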
Using NJOY to Create MCNP ACE Files and Visualize Nuclear Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kahler, Albert Comstock
We provide lecture materials that describe the input requirements to create various MCNP ACE files (Fast, Thermal, Dosimetry, Photo-nuclear and Photo-atomic) with the NJOY Nuclear Data Processing code system. Input instructions to visualize nuclear data with NJOY are also provided.
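As a rough orientation to what such an NJOY run involves, the sketch below lists typical module chains for fast and thermal ACE files. The module names are real NJOY modules, but the numbered card entries of a working input deck (tape units, temperatures, tolerances) are omitted and should be taken from the lecture materials or the NJOY manual.

    # Illustrative NJOY module sequences only; not complete input decks.
    ACE_CHAINS = {
        "fast":    ["moder", "reconr", "broadr", "heatr", "purr", "acer"],
        "thermal": ["moder", "reconr", "broadr", "thermr", "acer"],
    }
    # moder: convert ENDF tapes between ASCII and binary
    # reconr: reconstruct pointwise cross sections from resonance parameters
    # broadr: Doppler-broaden to the target temperature
    # heatr: generate heating (KERMA) and damage data
    # thermr: thermal scattering data, including S(alpha,beta)
    # purr: unresolved-resonance probability tables
    # acer: write and check the ACE file for MCNP
    for ace_type, modules in ACE_CHAINS.items():
        print(f"{ace_type:8s}: {' -> '.join(modules)}")

The other file types listed in the abstract (dosimetry, photo-nuclear, photo-atomic) are also written by acer with different options; the specific cards are covered in the lecture materials.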
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mendoza, Paul Michael
2016-08-31
The project seeks to develop applications that automate MCNP criticality benchmark execution; create a dataset containing static benchmark information; combine MCNP output with benchmark information; and fit and visually represent the data.
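A minimal sketch of this kind of automation is shown below, assuming hypothetical file names and that an MCNP executable is on the path; the exact wording of the keff summary line varies between MCNP versions, so the regular expression is only illustrative.

    import re
    import subprocess
    from pathlib import Path

    def run_benchmark(input_deck: Path) -> float | None:
        """Run MCNP on one criticality benchmark deck and pull keff from the output.

        Hypothetical sketch: executable name, file layout, and the output-parsing
        pattern are assumptions, not taken from the project itself.
        """
        outfile = input_deck.with_suffix(".out")
        subprocess.run(["mcnp6", f"i={input_deck}", f"o={outfile}"], check=True)
        text = outfile.read_text()
        # MCNP prints a combined keff estimate near the end of the output;
        # the exact phrasing differs between versions, so this pattern may need tuning.
        match = re.search(r"keff\s*=\s*([0-9.]+)", text)
        return float(match.group(1)) if match else None

    results = {deck.name: run_benchmark(deck) for deck in Path("benchmarks").glob("*.i")}

The collected dictionary could then be merged with the static benchmark metadata and fed to whatever fitting and plotting tools the project uses.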
DeviceEditor visual biological CAD canvas
2012-01-01
Background Biological Computer Aided Design (bioCAD) assists the de novo design and selection of existing genetic components to achieve a desired biological activity, as part of an integrated design-build-test cycle. To meet the emerging needs of Synthetic Biology, bioCAD tools must address the increasing prevalence of combinatorial library design, design rule specification, and scar-less multi-part DNA assembly. Results We report the development and deployment of web-based bioCAD software, DeviceEditor, which provides a graphical design environment that mimics the intuitive visual whiteboard design process practiced in biological laboratories. The key innovations of DeviceEditor include visual combinatorial library design, direct integration with scar-less multi-part DNA assembly design automation, and a graphical user interface for the creation and modification of design specification rules. We demonstrate how biological designs are rendered on the DeviceEditor canvas, and we present effective visualizations of genetic component ordering and combinatorial variations within complex designs. Conclusions DeviceEditor liberates researchers from DNA base-pair manipulation, and enables users to create successful prototypes using standardized, functional, and visual abstractions. Open and documented software interfaces support further integration of DeviceEditor with other bioCAD tools and software platforms. DeviceEditor saves researcher time and institutional resources through correct-by-construction design, the automation of tedious tasks, design reuse, and the minimization of DNA assembly costs. PMID:22373390
A Visual Editor in Java for View
NASA Technical Reports Server (NTRS)
Stansifer, Ryan
2000-01-01
In this project we continued the development of a visual editor in the Java programming language to create screens on which to display real-time data. The data comes from the numerous systems monitoring the operation of the space shuttle while on the ground and in space, and from the many tests of subsystems. The data can be displayed on any computer platform running a Java-enabled World Wide Web (WWW) browser and connected to the Internet. Previously a special-purpose program had been written to display data on emulations of character-based display screens used for many years at NASA. The goal now is to display bit-mapped screens created by a visual editor. We report here on the visual editor that creates the display screens. This project continues the work we had done previously. Previously we had followed the design of the 'beanbox,' a prototype visual editor created by Sun Microsystems. We abandoned this approach and implemented a prototype using a more direct approach. In addition, our prototype is based on newly released Java 2 graphical user interface (GUI) libraries. The result has been a visually more appealing appearance and a more robust application.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mendoza, Paul Michael
The Monte Carlo N-Particle (MCNP) transport code developed at Los Alamos National Laboratory (LANL) utilizes nuclear cross-section data in a compact ENDF (ACE) format. The accuracy of MCNP calculations depends on the accuracy of nuclear ACE data tables, which depends on the accuracy of the original ENDF files. There are some noticeable differences in ENDF files from one generation to the next, even among the more common fissile materials. As the next generation of ENDF files is being prepared, several software tools were developed to simulate a large number of benchmarks in MCNP (over 1000), collect data from these simulations, and visually represent the results.
SABRINA - An interactive geometry modeler for MCNP (Monte Carlo Neutron Photon)
DOE Office of Scientific and Technical Information (OSTI.GOV)
West, J.T.; Murphy, J.
SABRINA is an interactive three-dimensional geometry modeler developed to produce complicated models for the Los Alamos Monte Carlo Neutron Photon program MCNP. SABRINA produces line drawings and color-shaded drawings for a wide variety of interactive graphics terminals. It is used as a geometry preprocessor in model development and as a Monte Carlo particle-track postprocessor in the visualization of complicated particle transport problems. SABRINA is written in Fortran 77 and is based on the Los Alamos Common Graphics System, CGS. 5 refs., 2 figs.
Extended Hückel Calculations on Solids Using the Avogadro Molecular Editor and Visualizer
ERIC Educational Resources Information Center
Avery, Patrick; Ludoweig, Herbert; Autschbach, Jochen; Zurek, Eva
2018-01-01
The "Yet Another extended Hu¨ckel Molecular Orbital Package" (YAeHMOP) has been merged with the Avogadro open-source molecular editor and visualizer. It is now possible to perform YAeHMOP calculations directly from the Avogadro graphical user interface for materials that are periodic in one, two, or three dimensions, and to visualize…
MRAPs, Irregular Warfare, and Pentagon Reform
2009-06-01
Distributed visualization framework architecture
NASA Astrophysics Data System (ADS)
Mishchenko, Oleg; Raman, Sundaresan; Crawfis, Roger
2010-01-01
An architecture for distributed and collaborative visualization is presented. The design goals of the system are to create a lightweight, easy-to-use and extensible framework for research in scientific visualization. The system provides both single-user and collaborative distributed environments. The system architecture employs a client-server model. Visualization projects can be synchronously accessed and modified from different client machines. We present a set of visualization use cases that illustrate the flexibility of our system. The framework provides a rich set of reusable components for creating new applications. These components make heavy use of leading design patterns. All components are based on the functionality of a small set of interfaces. This allows new components to be integrated seamlessly with little to no effort. All user input and higher-level control functionality interface with proxy objects supporting a concrete implementation of these interfaces. These light-weight objects can be easily streamed across the web and even integrated with smart clients running on a user's cell phone. The back-end is supported by concrete implementations wherever needed (for instance for rendering). A middle-tier manages any communication and synchronization with the proxy objects. In addition to the data components, we have developed several first-class GUI components for visualization. These include a layer compositor editor, a programmable shader editor, a material editor, and various drawable editors. These GUI components interact strictly with the interfaces. Access to the various entities in the system is provided by an AssetManager. The asset manager keeps track of all of the registered proxies and responds to queries on the overall system. This allows all user components to be populated automatically. Hence if a new component is added that supports the IMaterial interface, any instances of this can be used in the various GUI components that work with this interface. One of the main features is an interactive shader designer. This allows rapid prototyping of new visualization renderings that are shader-based and greatly accelerates the development and debug cycle.
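The interface/proxy/asset-manager pattern described above can be pictured with a short sketch. The class names below (other than IMaterial and AssetManager, which the abstract mentions) and all details are hypothetical; the actual framework is not written in Python.

    from abc import ABC, abstractmethod

    class IMaterial(ABC):
        """Small interface that material-like components expose."""
        @abstractmethod
        def shader_parameters(self) -> dict: ...

    class MaterialProxy(IMaterial):
        """Light-weight stand-in that could be streamed to a remote client."""
        def __init__(self, name: str, params: dict):
            self.name, self._params = name, params
        def shader_parameters(self) -> dict:
            return dict(self._params)

    class AssetManager:
        """Registry of proxies; GUI components query it instead of concrete back-ends."""
        def __init__(self):
            self._assets: list[object] = []
        def register(self, asset: object) -> None:
            self._assets.append(asset)
        def query(self, interface: type) -> list[object]:
            return [a for a in self._assets if isinstance(a, interface)]

    manager = AssetManager()
    manager.register(MaterialProxy("phong_red", {"diffuse": (1, 0, 0)}))
    materials = manager.query(IMaterial)   # a material editor would list these

Because GUI components talk only to the interface and the registry, any newly registered proxy shows up in the relevant editors automatically, which is the populate-by-query behavior the abstract describes.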
Proceedings-1979 third annual practical conference on communication
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1981-04-01
Topics covered at the meeting include: nonacademic writing, writer and editor training in technical publications, readability of technical documents, guide for beginning technical editors, a visual aids data base, newsletter publishing, style guide for a project management organization, word processing, computer graphics, text management for technical documentation, and typographical terminology.
Verification of MCNP6.2 for Nuclear Criticality Safety Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise
2017-05-10
Several suites of verification/validation benchmark problems were run in early 2017 to verify that the new production release of MCNP6.2 performs correctly for nuclear criticality safety (NCS) applications. MCNP6.2 results for several NCS validation suites were compared to the results from MCNP6.1 [1] and MCNP6.1.1 [2]. MCNP6.1 is the production version of MCNP® released in 2013, and MCNP6.1.1 is the update released in 2014. MCNP6.2 includes all of the standard features for NCS calculations that have been available for the past 15 years, along with new features for sensitivity-uncertainty based methods for NCS validation [3]. Results from the benchmark suites were compared with results from previous verification testing [4-8]. Criticality safety analysts should consider testing MCNP6.2 on their particular problems and validation suites. No further development of MCNP5 is planned. MCNP6.1 is now 4 years old, and MCNP6.1.1 is now 3 years old. In general, released versions of MCNP are supported only for about 5 years, due to resource limitations. All future MCNP improvements, bug fixes, user support, and new capabilities are targeted only to MCNP6.2 and beyond.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rolison, L; Samant, S; Baciak, J
Purpose: To develop a Monte Carlo N-Particle (MCNP) model for the validation of a prototype backscatter x-ray (BSX) imager, and for optimization of BSX technology for medical applications, including selective object-plane imaging. Methods: BSX is an emerging technology that represents an alternative to conventional computed tomography (CT) and projective digital radiography (DR). It employs detectors located on the same side as the incident x-ray source, making use of backscatter and avoiding ring geometry to enclose the imaging object. Current BSX imagers suffer from low spatial resolution. An MCNP model was designed to replicate a BSX prototype used for flaw detection in industrial materials. This prototype consisted of a 1.5mm diameter 60kVp pencil beam surrounded by a ring of four 5.0cm diameter NaI scintillation detectors. The imaging phantom consisted of a 2.9cm thick aluminum plate with five 0.6cm diameter holes drilled halfway. The experimental image was created using a raster scanning motion (in 1.5mm increments). Results: A qualitative comparison between the physical and simulated images showed very good agreement with 1.5mm spatial resolution in the plane perpendicular to the incident x-ray beam. The MCNP model developed the concept of radiography by selective plane detection (RSPD) for BSX, whereby specific object planes can be imaged by varying kVp. 10keV increments in mean x-ray energy yielded 4mm thick slice resolution in the phantom. Image resolution in the MCNP model can be further increased by increasing the number of detectors and decreasing the raster step size. Conclusion: MCNP modelling was used to validate a prototype BSX imager and introduce the RSPD concept, allowing for selective object-plane imaging. There was very good visual agreement between the experimental and MCNP imaging. Beyond optimizing system parameters for the existing prototype, new geometries can be investigated for volumetric image acquisition in medical applications. This material is based upon work supported under an Integrated University Program Graduate Fellowship sponsored by the Department of Energy Office of Nuclear Energy.
CMS Configuration Editor: GUI based application for user analysis job
NASA Astrophysics Data System (ADS)
de Cosa, A.
2011-12-01
We present the user interface and the software architecture of the Configuration Editor for the CMS experiment. The analysis workflow is organized in a modular way and integrated within the CMS framework, which organizes user analysis code in a flexible way. The Python scripting language is adopted to define the job configuration that drives the analysis workflow. It could be a challenging task for users, especially for newcomers, to develop analysis jobs managing the configuration of many required modules. For this reason a graphical tool has been conceived in order to edit and inspect configuration files. A set of common analysis tools defined in the CMS Physics Analysis Toolkit (PAT) can be steered and configured using the Config Editor. A user-defined analysis workflow can be produced starting from a standard configuration file, applying and configuring PAT tools according to the specific user requirements. CMS users can adopt this tool, the Config Editor, to create their analyses while visualizing in real time the effects of their actions. They can visualize the structure of their configuration, look at the modules included in the workflow, inspect the dependences existing among the modules, and check the data flow. They can see which values parameters are set to and change them according to what is required by their analysis task. The integration of common tools in the GUI required an object-oriented structure in the Python definition of the PAT tools and a layer of abstraction from which all PAT tools inherit.
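For orientation, below is a stripped-down example of the kind of Python configuration such an editor manipulates. The process label and input file name are placeholders, and a real PAT workflow configures many more modules; the cms.Process/cms.Source constructs are standard CMSSW configuration syntax and require the CMSSW environment to run.

    import FWCore.ParameterSet.Config as cms   # standard CMSSW configuration module

    process = cms.Process("USERANALYSIS")      # process label is arbitrary
    process.source = cms.Source(
        "PoolSource",
        fileNames=cms.untracked.vstring("file:input.root"),   # placeholder input file
    )
    process.maxEvents = cms.untracked.PSet(input=cms.untracked.int32(1000))
    # A real PAT-based workflow would add and configure many analysis modules here;
    # the Config Editor visualizes such modules and their dependencies graphically.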
Testing of ENDF71x: A new ACE-formatted neutron data library based on ENDF/B-VII.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gardiner, S. J.; Conlin, J. L.; Kiedrowski, B. C.
The ENDF71x library [1] is the most thoroughly tested set of ACE-format data tables ever released by the Nuclear Data Team at Los Alamos National Laboratory (LANL). It is based on ENDF/B-VII.1, the most recently released set of evaluated nuclear data files produced by the US Cross Section Evaluation Working Group (CSEWG). A variety of techniques were used to test and verify the ENDF71x library before its public release. These include the use of automated checking codes written by members of the Nuclear Data Team, visual inspections of key neutron data, MCNP6 calculations designed to test data for every included combination of isotope and temperature as comprehensively as possible, and direct comparisons between ENDF71x and previous ACE library releases. Visual inspection of some of the most important neutron data revealed energy balance problems and unphysical discontinuities in the cross sections for some nuclides. Doppler broadening of the total cross sections with increasing temperature was found to be qualitatively correct. Test calculations performed using MCNP prompted two modifications to the MCNP6 source code and also exposed bad secondary neutron yields for 231,233Pa that are present in both ENDF/B-VII.1 and ENDF/B-VII.0. A comparison of ENDF71x with its predecessor ACE library, ENDF70, showed that dramatic changes have been made in the neutron cross section data for a number of isotopes between ENDF/B-VII.0 and ENDF/B-VII.1. Based on the results of these verification tests and the validation tests performed by Kahler et al. [2], the ENDF71x library is recommended for use in all Monte Carlo applications. (authors)
Cai, Yao; Hu, Huasi; Pan, Ziheng; Hu, Guang; Zhang, Tao
2018-05-17
To obtain a compact and lightweight shield for neutrons and gamma rays, a method that optimizes the structure and components together was established, employing genetic algorithms and the MCNP code. As a typical case, the fission energy spectrum of 235U, which mixes neutrons and gamma rays, was adopted in this study. Six types of materials were presented and optimized by the method. Spherical geometry was adopted in the optimization after checking the geometry effect. Simulations were made to verify the reliability of the optimization method and the efficiency of the optimized materials. To compare the materials visually and conveniently, the volume and weight needed to build a shield are employed. The results showed that the composite multilayer material has the best performance. Copyright © 2018 Elsevier Ltd. All rights reserved.
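A schematic of this kind of optimization loop is sketched below with a stand-in fitness function; in the actual work each candidate layer arrangement is scored by an MCNP transport calculation rather than the toy exponential-attenuation model used here, and the material list and coefficients are purely illustrative.

    import random

    MATERIALS = ["poly", "B4C", "W", "Fe", "LiH"]          # illustrative choices only
    TOY_MU = {"poly": 0.10, "B4C": 0.12, "W": 0.30, "Fe": 0.22, "LiH": 0.08}  # fake per-cm coefficients

    def fitness(layers):
        """Higher is better: attenuation per unit thickness.  In the paper the dose
        behind the shield would come from an MCNP run, not this toy model."""
        thickness = sum(t for _, t in layers)
        attenuation = sum(TOY_MU[m] * t for m, t in layers)
        return attenuation / thickness

    def random_shield(n_layers=4):
        return [(random.choice(MATERIALS), random.uniform(1.0, 10.0)) for _ in range(n_layers)]

    def evolve(generations=50, pop_size=20):
        population = [random_shield() for _ in range(pop_size)]
        for _ in range(generations):
            population.sort(key=fitness, reverse=True)
            parents = population[: pop_size // 2]
            children = []
            for _ in range(pop_size - len(parents)):
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, len(a))
                child = a[:cut] + b[cut:]                   # one-point crossover
                if random.random() < 0.2:                   # mutation
                    i = random.randrange(len(child))
                    child[i] = (random.choice(MATERIALS), random.uniform(1.0, 10.0))
                children.append(child)
            population = parents + children
        return max(population, key=fitness)

    best = evolve()

Replacing the toy fitness with a call that writes an MCNP input for the candidate layer stack and reads back the transmitted dose gives the structure-plus-composition optimization the abstract describes.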
NASA Astrophysics Data System (ADS)
Ducasse, J.; Macé, M.; Jouffrais, C.
2015-08-01
Visual maps must be transcribed into (interactive) raised-line maps to be accessible to visually impaired people. However, these tactile maps suffer from several shortcomings: they are slow and expensive to produce, they cannot display a large amount of information, and they are not dynamically modifiable. A number of methods have been developed to automate the production of raised-line maps, but there is not yet any tactile map editor on the market. Tangible interactions have proved to be an efficient way to help a visually impaired user manipulate spatial representations. Contrary to raised-line maps, tangible maps can be autonomously constructed and edited. In this paper, we present the scenarios and the main expected contributions of the AccessiMap project, which is based on the availability of many sources of open spatial data: 1/ facilitating the production of interactive tactile maps with the development of an open-source web-based editor; 2/ investigating the use of tangible interfaces for the autonomous construction and exploration of a map by a visually impaired user.
NASA Astrophysics Data System (ADS)
Metzger, Robert; Riper, Kenneth Van; Lasche, George
2017-09-01
A new method for analysis of uranium and radium in soils by gamma spectroscopy has been developed using VRF ("Visual RobFit") which, unlike traditional peak-search techniques, fits full-spectrum nuclide shapes with non-linear least-squares minimization of the chi-squared statistic. Gamma efficiency curves were developed for a 500 mL Marinelli beaker geometry as a function of soil density using MCNP. Collected spectra were then analyzed using the MCNP-generated efficiency curves and VRF to deconvolute the 90 keV peak complex of uranium and obtain 238U and 235U activities. 226Ra activity was determined either from the radon daughters if the equilibrium status is known, or directly from the deconvoluted 186 keV line. 228Ra values were determined from the 228Ac daughter activity. The method was validated by analysis of radium, thorium and uranium soil standards and by inter-comparison with other methods for radium in soils. The method allows for a rapid determination of whether a sample has been impacted by a man-made activity by comparison of the uranium and radium concentrations to those that would be expected from a natural equilibrium state.
Pölz, Stefan; Laubersheimer, Sven; Eberhardt, Jakob S; Harrendorf, Marco A; Keck, Thomas; Benzler, Andreas; Breustedt, Bastian
2013-08-21
The basic idea of Voxel2MCNP is to provide a framework supporting users in modeling radiation transport scenarios using voxel phantoms and other geometric models, generating corresponding input for the Monte Carlo code MCNPX, and evaluating simulation output. Applications at Karlsruhe Institute of Technology are primarily whole and partial body counter calibration and calculation of dose conversion coefficients. A new generic data model describing data related to radiation transport, including phantom and detector geometries and their properties, sources, tallies and materials, has been developed. It is modular and generally independent of the targeted Monte Carlo code. The data model has been implemented as an XML-based file format to facilitate data exchange, and integrated with Voxel2MCNP to provide a common interface for modeling, visualization, and evaluation of data. Also, extensions to allow compatibility with several file formats, such as ENSDF for nuclear structure properties and radioactive decay data, SimpleGeo for solid geometry modeling, ImageJ for voxel lattices, and MCNPX's MCTAL for simulation results, have been added. The framework is presented and discussed in this paper, and example workflows for body counter calibration and calculation of dose conversion coefficients are given to illustrate its application.
Activity Scratchpad Prototype: Simplifying the Rover Activity Planning Cycle
NASA Technical Reports Server (NTRS)
Abramyan, Lucy
2005-01-01
The Mars Exploration Rover mission depends on the Science Activity Planner as its primary interface to the Spirit and Opportunity Rovers. Scientists alternate between a series of mouse clicks and keyboard inputs to create a set of instructions for the rovers. To accelerate planning by minimizing mouse usage, a rover planning editor should receive the majority of inputted commands from the keyboard. A thorough investigation of the Eclipse platform's Java editor provided an understanding of the base model for the Activity Scratchpad. Desirable Eclipse features can be mapped to specific rover planning commands, such as auto-completion for activity titles and content assist for target names. A custom editor imitating the Java editor's features was created with an XML parser for experimentation. The prototype editor minimized effort for redundant tasks and significantly improved the visual representation of XML syntax by highlighting keywords, coloring rules, folding projections, and providing hover assist, templates, and an outline view of the code.
MCNP Output Data Analysis with ROOT (MODAR)
NASA Astrophysics Data System (ADS)
Carasco, C.
2010-06-01
MCNP Output Data Analysis with ROOT (MODAR) is a tool based on CERN's ROOT software. MODAR has been designed to handle time-energy data issued by MCNP simulations of neutron inspection devices using the associated particle technique. MODAR exploits ROOT's Graphical User Interface and functionalities to visualize and process MCNP simulation results in a fast and user-friendly way. MODAR allows the user to take into account the detection system time resolution (which is not possible with MCNP) as well as detector energy response functions and counting statistics in a straightforward way.
Program summary
Program title: MODAR
Catalogue identifier: AEGA_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGA_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 155 373
No. of bytes in distributed program, including test data, etc.: 14 815 461
Distribution format: tar.gz
Programming language: C++
Computer: Most Unix workstations and PCs
Operating system: Most Unix systems, Linux and Windows, provided the ROOT package has been installed. Examples were tested under Suse Linux and Windows XP.
RAM: Depends on the size of the MCNP output file. The example presented in the article, which involves three two-dimensional 139×740-bin histograms, allocates about 60 MB. These figures were obtained running under ROOT and include consumption by ROOT itself.
Classification: 17.6
External routines: ROOT version 5.24.00 (http://root.cern.ch/drupal/)
Nature of problem: The output of an MCNP simulation is an ASCII file. The data processing is usually performed by copying and pasting the relevant parts of the ASCII file into Microsoft Excel. Such an approach is satisfactory when the quantity of data is small but is not efficient when the size of the simulated data is large, for example when time-energy correlations are studied in detail, such as in problems involving the associated particle technique. In addition, since the finite time resolution of the simulated detector cannot be modeled with MCNP, systems in which time-energy correlation is crucial cannot be described in a satisfactory way. Finally, realistic particle energy deposit in detectors is calculated with MCNP in a two-step process involving type-5 then type-8 tallies. In the first step, the photon flux energy spectrum associated with a time region is selected and serves as a source energy distribution for the second step. Thus, several files must be manipulated before getting the result, which can be time consuming if one needs to study several time regions or different detector performances. In the same way, modeling counting statistics obtained in a limited acquisition time requires several steps and can also be time consuming.
Solution method: In order to overcome the previous limitations, the MODAR C++ code has been written to make use of CERN's ROOT data analysis software. MCNP output data are read from the MCNP output file with dedicated routines. Two-dimensional histograms are filled and can be handled efficiently within the ROOT framework. To keep a user-friendly analysis tool, all processing and data display can be done by means of the ROOT Graphical User Interface. Specific routines have been written to include detectors' finite time resolution and energy response function as well as counting statistics in a straightforward way.
Additional comments: The possibility of adding tallies has also been incorporated in MODAR in order to describe systems in which the signal from several detectors can be summed. Moreover, MODAR can be adapted to handle other problems involving two-dimensional data.
Running time: The CPU time needed to smear a two-dimensional histogram depends on the size of the histogram. In the presented example, the time-energy smearing of one of the 139×740 two-dimensional histograms takes 3 minutes on a DELL computer equipped with an INTEL Core 2.
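MODAR itself is a C++/ROOT tool, but the time-resolution step it adds on top of MCNP can be sketched in a few lines of Python/NumPy: convolve the simulated time spectrum with a Gaussian whose width is the detector's time resolution. The bin width and resolution values below are arbitrary.

    import numpy as np

    def smear_time_spectrum(counts: np.ndarray, bin_width_ns: float, fwhm_ns: float) -> np.ndarray:
        """Convolve a simulated time spectrum with a Gaussian detector response.

        Illustrative only: MODAR applies this kind of smearing (in time and energy)
        to two-dimensional MCNP tally histograms inside ROOT.
        """
        sigma_bins = fwhm_ns / 2.355 / bin_width_ns          # FWHM -> sigma, in bins
        half_width = int(np.ceil(4 * sigma_bins))
        x = np.arange(-half_width, half_width + 1)
        kernel = np.exp(-0.5 * (x / sigma_bins) ** 2)
        kernel /= kernel.sum()
        return np.convolve(counts, kernel, mode="same")

    spectrum = np.zeros(200); spectrum[100] = 1000.0          # a sharp MCNP time peak
    smeared = smear_time_spectrum(spectrum, bin_width_ns=1.0, fwhm_ns=3.0)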
2011-03-01
Functions of the vignette editor include visualizing the state of the UAS team, creating T&E scenarios, and monitoring the UAS team performance. A state machine mission editor allows mission builders to use behaviors, which are then executed by the robot sequentially (Figure 2).
PWR Facility Dose Modeling Using MCNP5 and the CADIS/ADVANTG Variance-Reduction Methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blakeman, Edward D; Peplow, Douglas E.; Wagner, John C
2007-09-01
The feasibility of modeling a pressurized-water-reactor (PWR) facility and calculating dose rates at all locations within the containment and adjoining structures using MCNP5 with mesh tallies is presented. Calculations of dose rates resulting from neutron and photon sources from the reactor (operating and shut down for various periods) and the spent fuel pool, as well as for the photon source from the primary coolant loop, were all of interest. Identification of the PWR facility, development of the MCNP-based model and automation of the run process, calculation of the various sources, and development of methods for visually examining mesh tally files and extracting dose rates were all a significant part of the project. Advanced variance reduction, which was required because of the size of the model and the large amount of shielding, was performed via the CADIS/ADVANTG approach. This methodology uses an automatically generated three-dimensional discrete ordinates model to calculate adjoint fluxes from which MCNP weight windows and source bias parameters are generated. Investigative calculations were performed using a simple block model and a simplified full-scale model of the PWR containment, in which the adjoint source was placed in various regions. In general, it was shown that placement of the adjoint source on the periphery of the model provided adequate results for regions reasonably close to the source (e.g., within the containment structure for the reactor source). A modification to the CADIS/ADVANTG methodology was also studied in which a global adjoint source is weighted by the reciprocal of the dose response calculated by an earlier forward discrete ordinates calculation. This method showed improved results over those using the standard CADIS/ADVANTG approach, and its further investigation is recommended for future efforts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zehtabian, M; Zaker, N; Sina, S
2015-06-15
Purpose: Different versions of MCNP code are widely used for dosimetry purposes. The purpose of this study is to compare different versions of the MCNP codes in dosimetric evaluation of different brachytherapy sources. Methods: The TG-43 parameters such as dose rate constant, radial dose function, and anisotropy function of different brachytherapy sources, i.e. Pd-103, I-125, Ir-192, and Cs-137, were calculated in water phantom. The results obtained by three versions of Monte Carlo codes (MCNP4C, MCNPX, MCNP5) were compared for low and high energy brachytherapy sources. Then the cross section library of MCNP4C code was changed to ENDF/B-VI release 8, which is used in MCNP5 and MCNPX codes. Finally, the TG-43 parameters obtained using the MCNP4C-revised code were compared with other codes. Results: The results of these investigations indicate that for high energy sources, the differences in TG-43 parameters between the codes are less than 1% for Ir-192 and less than 0.5% for Cs-137. However for low energy sources like I-125 and Pd-103, large discrepancies are observed in the g(r) values obtained by MCNP4C and the two other codes. The differences between g(r) values calculated using MCNP4C and MCNP5 at the distance of 6cm were found to be about 17% and 28% for I-125 and Pd-103 respectively. The results obtained with MCNP4C-revised and MCNPX were similar. However, the maximum difference between the results obtained with the MCNP5 and MCNP4C-revised codes was 2% at 6cm. Conclusion: The results indicate that using MCNP4C code for dosimetry of low energy brachytherapy sources can cause large errors in the results. Therefore it is recommended not to use this code for low energy sources, unless its cross section library is changed. Since the results obtained with MCNP4C-revised and MCNPX were similar, it is concluded that the difference between MCNP4C and MCNPX is their cross section libraries.
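For context, the quantities compared above are the factors of the AAPM TG-43 dose-rate formalism; the tiny sketch below combines them as in the TG-43 report. The function and parameter names are illustrative, not code from the study.

    def tg43_dose_rate(air_kerma_strength, dose_rate_constant,
                       geometry_ratio, radial_dose, anisotropy):
        """Dose rate at (r, theta) per the TG-43 formalism (sketch only).

        geometry_ratio = G(r, theta) / G(r0, theta0), with the reference point
        conventionally at r0 = 1 cm, theta0 = 90 degrees; radial_dose = g(r);
        anisotropy = F(r, theta); dose_rate_constant = Lambda.
        """
        return (air_kerma_strength * dose_rate_constant *
                geometry_ratio * radial_dose * anisotropy)

Differences of 17-28% in g(r), as reported for the low-energy sources, therefore translate almost directly into the same relative differences in the calculated dose rate at that distance.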
NASA Technical Reports Server (NTRS)
Bingle, Bradford D.; Shea, Anne L.; Hofler, Alicia S.
1993-01-01
Transferable Output ASCII Data (TOAD) computer program (LAR-13755) implements format designed to facilitate transfer of data across communication networks and dissimilar host computer systems. Any data file conforming to TOAD format standard called TOAD file. TOAD Editor is interactive software tool for manipulating contents of TOAD files. Commonly used to extract filtered subsets of data for visualization of results of computation. Also offers such user-oriented features as on-line help, clear English error messages, startup file, macroinstructions defined by user, command history, user variables, UNDO features, and full complement of mathematical, statistical, and conversion functions. Companion program, TOAD Gateway (LAR-14484), converts data files from variety of other file formats to that of TOAD. TOAD Editor written in FORTRAN 77.
Possible Improvements to MCNP6 and its CEM/LAQGSM Event-Generators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mashnik, Stepan Georgievich
2015-08-04
This report is intended for the MCNP6 developers and sponsors of MCNP6. It presents a set of suggested possible future improvements to MCNP6 and to its CEM03.03 and LAQGSM03.03 event-generators. A few suggested modifications of MCNP6 are quite simple, aimed at avoiding possible problems with running MCNP6 on various computers; i.e., these changes are not expected to change or improve any results, but should make the use of MCNP6 easier, and such changes are expected to require limited man-power resources. On the other hand, several other suggested improvements require serious further development of nuclear reaction models and are expected to improve significantly the predictive power of MCNP6 for a number of nuclear reactions; but such developments require several years of work by real experts on nuclear reactions.
Elaborate SMART MCNP Modelling Using ANSYS and Its Applications
NASA Astrophysics Data System (ADS)
Song, Jaehoon; Surh, Han-bum; Kim, Seung-jin; Koo, Bonsueng
2017-09-01
An MCNP 3-dimensional model can be widely used to evaluate various design parameters such as a core design or shielding design. Conventionally, a simplified 3-dimensional MCNP model is applied to calculate these parameters because of the cumbersomeness of modelling by hand. ANSYS has a function for converting the CAD 'stp' format into the geometry part of an MCNP input. Using ANSYS and a 3-dimensional CAD file, a very detailed and sophisticated MCNP 3-dimensional model can be generated. The MCNP model is applied to evaluate the assembly weighting factor at the ex-core detector of SMART, and the result is compared with that from a simplified MCNP SMART model and with the assembly weighting factor calculated by DORT, which is a deterministic Sn code.
MCNP Version 6.2 Release Notes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Werner, Christopher John; Bull, Jeffrey S.; Solomon, C. J.
Monte Carlo N-Particle or MCNP® is a general-purpose Monte Carlo radiation-transport code designed to track many particle types over broad ranges of energies. This MCNP Version 6.2 follows the MCNP6.1.1 beta version and has been released in order to provide the radiation transport community with the latest feature developments and bug fixes for MCNP. Since the last release of MCNP, major work has been conducted to improve the code base, add features, and provide tools to facilitate ease of use of MCNP version 6.2 as well as the analysis of results. These release notes serve as a general guide for the new/improved physics, source, data, tallies, unstructured mesh, code enhancements and tools. For more detailed information on each of the topics, please refer to the appropriate references or the user manual, which can be found at http://mcnp.lanl.gov. This release of MCNP version 6.2 contains 39 new features in addition to 172 bug fixes and code enhancements. There are still some 33 known issues with which users should familiarize themselves (see Appendix).
Jim Thomas: A Collection of Memories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wong, Pak C.
Jim Thomas, a guest editor and a long-time associate editor of Information Visualization (IVS), died in Richland, WA, on August 6, 2010 due to complications from a brain tumor. His friends and colleagues from around the world have since expressed their sadness and paid tribute to a visionary scientist in multiple public forums. For those who didn't get the chance to know Jim, I share a collection of my own memories of Jim Thomas and memories from some of his colleagues.
The Perception of Multiple Images
ERIC Educational Resources Information Center
Goldstein, E. Bruce
1975-01-01
A discussion of visual field, foveal and peripheral vision, eye fixations, recognition and recall of pictures, memory for meaning of pictures, and the relation between speed of presentation and memory. (Editor)
Phrase Units as Determinants of Visual Processing in Music Reading
ERIC Educational Resources Information Center
Sloboda, John A.
1977-01-01
Keyboard musicians sight-read passages of music in which the amount of information about the presence of phrase units was systematically varied. Results suggest a clear analogy between the cognition of music and language, in that knowledge of abstract structure is of importance in the organization of immediate visual processing of text. (Editor/RK)
Automated variance reduction for MCNP using deterministic methods.
Sweezy, J; Brown, F; Booth, T; Chiaramonte, J; Preeg, B
2005-01-01
In order to reduce the user's time and the computer time needed to solve deep penetration problems, an automated variance reduction capability has been developed for the MCNP Monte Carlo transport code. This new variance reduction capability developed for MCNP5 employs the PARTISN multigroup discrete ordinates code to generate mesh-based weight windows. The technique of using deterministic methods to generate importance maps has been widely used to increase the efficiency of deep penetration Monte Carlo calculations. The application of this method in MCNP uses the existing mesh-based weight window feature to translate the MCNP geometry into geometry suitable for PARTISN. The adjoint flux, which is calculated with PARTISN, is used to generate mesh-based weight windows for MCNP. Additionally, the MCNP source energy spectrum can be biased based on the adjoint energy spectrum at the source location. This method can also use angle-dependent weight windows.
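The core of this deterministic-to-Monte-Carlo recipe can be sketched as follows. This is an illustration of the general idea only (target weights inversely proportional to the adjoint flux, normalized at the source), with hypothetical array shapes and normalization; the real MCNP5/PARTISN implementation handles energy groups and the actual weight-window file format.

    import numpy as np

    def weight_window_lower_bounds(adjoint_flux: np.ndarray,
                                   source_cell: tuple,
                                   window_ratio: float = 5.0) -> np.ndarray:
        """Mesh-based weight-window lower bounds from a deterministic adjoint flux.

        Sketch only: w_target ~ 1/phi_adjoint, scaled so a unit-weight source
        particle starts at the center of its window; window_ratio is the assumed
        upper/lower bound ratio."""
        w_target = 1.0 / np.where(adjoint_flux > 0, adjoint_flux, np.inf)
        w_target *= adjoint_flux[source_cell]                # unit weight at the source cell
        return 2.0 * w_target / (window_ratio + 1.0)         # lower bound for the given ratio

    phi_adj = np.random.rand(10, 10, 10) + 0.1               # stand-in adjoint flux mesh
    ww_low = weight_window_lower_bounds(phi_adj, source_cell=(0, 0, 0))

Energy biasing of the source, mentioned at the end of the abstract, follows the same logic applied to the adjoint energy spectrum at the source location.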
2014-01-01
…and 50 kT, to within 30% of the first-principles code (MCNP) for complicated cities and 10% for simpler cities. Report sections cover the use of MCNP for dose calculations, MCNP open-field absorbed dose calculations, and the MCNP urban model.
Comparisons between MCNP, EGS4 and experiment for clinical electron beams.
Jeraj, R; Keall, P J; Ostwald, P M
1999-03-01
Understanding the limitations of Monte Carlo codes is essential in order to avoid systematic errors in simulations, and to suggest further improvement of the codes. MCNP and EGS4, Monte Carlo codes commonly used in medical physics, were compared and evaluated against electron depth dose data and experimental backscatter results obtained using clinical radiotherapy beams. Different physical models and algorithms used in the codes give significantly different depth dose curves and electron backscattering factors. The default version of MCNP calculates electron depth dose curves which are too penetrating. The MCNP results agree better with experiment if the ITS-style energy-indexing algorithm is used. EGS4 underpredicts electron backscattering for high-Z materials. The results slightly improve if optimal PRESTA-I parameters are used. MCNP simulates backscattering well even for high-Z materials. To conclude the comparison, a timing study was performed. EGS4 is generally faster than MCNP and use of a large number of scoring voxels dramatically slows down the MCNP calculation. However, use of a large number of geometry voxels in MCNP only slightly affects the speed of the calculation.
IViPP: A Tool for Visualization in Particle Physics
NASA Astrophysics Data System (ADS)
Tran, Hieu; Skiba, Elizabeth; Baldwin, Doug
2011-10-01
Experiments and simulations in physics generate a lot of data; visualization is helpful to prepare that data for analysis. IViPP (Interactive Visualizations in Particle Physics) is an interactive computer program that visualizes results of particle physics simulations or experiments. IViPP can handle data from different simulators, such as SRIM or MCNP. It can display relevant geometry and measured scalar data; it can do simple selection from the visualized data. In order to be an effective visualization tool, IViPP must have a software architecture that can flexibly adapt to new data sources and display styles. It must be able to display complicated geometry and measured data with a high dynamic range. We therefore organize it in a highly modular structure, we develop libraries to describe geometry algorithmically, use rendering algorithms running on the powerful GPU to display 3-D geometry at interactive rates, and we represent scalar values in a visual form of scientific notation that shows both mantissa and exponent. This work was supported in part by the US Department of Energy through the Laboratory for Laser Energetics (LLE), with special thanks to Craig Sangster at LLE.
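The "visual form of scientific notation" mentioned above amounts to splitting each scalar into a mantissa and an exponent before mapping them to separate visual channels. A trivial sketch of that split (IViPP itself is not written in Python):

    import math

    def mantissa_exponent(value: float) -> tuple[float, int]:
        """Split a positive scalar into (mantissa in [1, 10), base-10 exponent)."""
        if value == 0.0:
            return 0.0, 0
        exponent = math.floor(math.log10(abs(value)))
        return value / 10 ** exponent, exponent

    # e.g. map the mantissa to glyph size and the exponent to color in a renderer
    print(mantissa_exponent(6.022e23))   # -> (6.022, 23)

Encoding the two parts separately is what lets a display preserve detail across the high dynamic range typical of particle-physics tallies.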
Review of heavy charged particle transport in MCNP6.2
NASA Astrophysics Data System (ADS)
Zieb, K.; Hughes, H. G.; James, M. R.; Xu, X. G.
2018-04-01
The release of version 6.2 of the MCNP6 radiation transport code is imminent. To complement the newest release, a summary of the heavy charged particle physics models used in the 1 MeV to 1 GeV energy regime is presented. Several changes have been introduced into the charged particle physics models since the merger of the MCNP5 and MCNPX codes into MCNP6. This paper discusses the default models used in MCNP6 for continuous energy loss, energy straggling, and angular scattering of heavy charged particles. Explanations of the physics models' theories are included as well.
2013-07-01
…also simulated in the models. Data was derived from calculations using the three-dimensional Monte Carlo radiation transport code MCNP (Monte Carlo N-Particle). MCNP is a general-purpose code designed to simulate neutron, photon…
Performance of MCNP4A on seven computing platforms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendricks, J.S.; Brockhoff, R.C.
1994-12-31
The performance of seven computer platforms has been evaluated with the MCNP4A Monte Carlo radiation transport code. For the first time we report timing results using MCNP4A and its new test set and libraries. Comparisons are made on platforms not available to us in previous MCNP timing studies. By using MCNP4A and its 325-problem test set, a widely-used and readily-available physics production code is used; the timing comparison is not limited to a single "typical" problem, demonstrating the problem dependence of timing results; the results are reproducible at the more than 100 installations around the world using MCNP; comparison of performance of other computer platforms to the ones tested in this study is possible because we present raw data rather than normalized results; and a measure of the increase in performance of computer hardware and software over the past two years is possible. The computer platforms reported are the Cray-YMP 8/64, IBM RS/6000-560, Sun Sparc10, Sun Sparc2, HP/9000-735, 4 processor 100 MHz Silicon Graphics ONYX, and Gateway 2000 model 4DX2-66V PC. In 1991 a timing study of MCNP4, the predecessor to MCNP4A, was conducted using ENDF/B-V cross-section libraries, which are export protected. The new study is based upon the new MCNP 25-problem test set which utilizes internationally available data. MCNP4A, its test problems and the test data library are available from the Radiation Shielding and Information Center in Oak Ridge, Tennessee, or from the NEA Data Bank in Saclay, France. Anyone with the same workstation and compiler can get the same test problem sets, the same library files, and the same MCNP4A code from RSIC or NEA and replicate our results. And, because we report raw data, comparison of the performance of other computer platforms and compilers can be made.
V&V of MCNP 6.1.1 Beta Against Intermediate and High-Energy Experimental Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mashnik, Stepan G
This report presents a set of validation and verification (V&V) results for MCNP 6.1.1 beta, calculated in parallel with MPI using its event generators at intermediate and high energies and compared against various experimental data. It also contains several examples of results using the models at energies below 150 MeV, down to 10 MeV, where data libraries are normally used. This report can be considered the fourth part of a set of MCNP6 Testing Primers, after its first (LA-UR-11-05129), second (LA-UR-11-05627), and third (LA-UR-26944) publications, but is devoted to V&V with the latest, 1.1 beta version of MCNP6. The MCNP6 test problems discussed here are presented in the /VALIDATION_CEM/ and /VALIDATION_LAQGSM/ subdirectories of the MCNP6/Testing/ directory. README files that contain short descriptions of every input file, the experiment, the quantity of interest that the experiment measures and its description in the MCNP6 output files, and the publication reference of that experiment are presented for every test problem. Templates for plotting the corresponding results with xmgrace as well as pdf files with figures representing the final results of our V&V efforts are presented. Several technical “bugs” in MCNP 6.1.1 beta were discovered during our current V&V of MCNP6 while running it in parallel with MPI using its event generators. These “bugs” are to be fixed in the following version of MCNP6. Our results show that MCNP 6.1.1 beta using its CEM03.03, LAQGSM03.03, Bertini, and INCL+ABLA event generators describes, as a rule, reasonably well different intermediate- and high-energy measured data. This primer isn’t meant to be read from cover to cover. Readers may skip some sections and go directly to any test problem in which they are interested.
ERIC Educational Resources Information Center
Jones, Clive
1972-01-01
A look at the latest package from a British managment training organization, which explains and demonstrates creative thinking techniques, including brainstorming. The package, designed for groups of twelve or more, consists of tapes, visuals, and associated exercises. (Editor/JB)
MCNP4A: Features and philosophy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendricks, J.S.
This paper describes MCNP, states its philosophy, introduces a number of new features becoming available with version MCNP4A, and answers a number of questions asked by participants in the workshop. MCNP is a general-purpose three-dimensional neutron, photon and electron transport code. Its philosophy is "Quality, Value and New Features." Quality is exemplified by new software quality assurance practices and a program of benchmarking against experiments. Value includes a strong emphasis on documentation and code portability. New features are the third priority. MCNP4A is now available at Los Alamos. New features in MCNP4A include enhanced statistical analysis, distributed processor multitasking, new photon libraries, ENDF/B-VI capabilities, X-Windows graphics, dynamic memory allocation, expanded criticality output, periodic boundaries, plotting of particle tracks via SABRINA, and many other improvements. 23 refs.
Communications Effects Server (CES) Model for Systems Engineering Research
2012-01-31
…interoperate with STK when running simulations. Among the GUI components, the Architect represents the main network design and visualization… Other elements named in the architecture include execution kernels; interfaces to third-party visualization, analysis, and text-editor tools; and HLA, DIS, and STK tool interfaces.
ERIC Educational Resources Information Center
2002
The Visual Communication Division of the proceedings contains the following 7 papers: "Photography Editors as Gatekeepers: Choosing Between Publishing or Self-Censoring Disturbing Images of 9-11" (Renee Martin Kratzer and Brian Kratzer); "Jane Campion's 'The Piano': The Female Gaze, the Speculum and the Chora within the…
Programming the Navier-Stokes computer: An abstract machine model and a visual editor
NASA Technical Reports Server (NTRS)
Middleton, David; Crockett, Tom; Tomboulian, Sherry
1988-01-01
The Navier-Stokes computer is a parallel computer designed to solve Computational Fluid Dynamics problems. Each processor contains several floating point units which can be configured under program control to implement a vector pipeline with several inputs and outputs. Since the development of an effective compiler for this computer appears to be very difficult, machine level programming seems necessary and support tools for this process have been studied. These support tools are organized into a graphical program editor. A programming process is described by which appropriate computations may be efficiently implemented on the Navier-Stokes computer. The graphical editor would support this programming process, verifying various programmer choices for correctness and deducing values such as pipeline delays and network configurations. Step by step details are provided and demonstrated with two example programs.
NASA Astrophysics Data System (ADS)
Craig, Paul; Kennedy, Jessie
2008-01-01
An increasingly common approach being taken by taxonomists to define the relationships between taxa in alternative hierarchical classifications is to use a set-based notation which states the relationship between two taxa from alternative classifications. Textual recording of these relationships is cumbersome and difficult for taxonomists to manage. While text-based GUI tools that ease the process are beginning to appear, they have several limitations. Interactive visual tools offer greater potential to allow taxonomists to explore the taxa in these hierarchies and specify such relationships. This paper describes the Concept Relationship Editor, an interactive visualisation tool designed to support the assertion of relationships between taxonomic classifications. The tool operates using an interactive space-filling adjacency layout which allows users to expand multiple lists of taxa with common parents so they can explore and assert relationships between two classifications.
ShelXle: a Qt graphical user interface for SHELXL.
Hübschle, Christian B; Sheldrick, George M; Dittrich, Birger
2011-12-01
ShelXle is a graphical user interface for SHELXL [Sheldrick, G. M. (2008). Acta Cryst. A64, 112-122], currently the most widely used program for small-molecule structure refinement. It combines an editor with syntax highlighting for the SHELXL-associated .ins (input) and .res (output) files with an interactive graphical display for visualization of a three-dimensional structure including the electron density (F(o)) and difference density (F(o)-F(c)) maps. Special features of ShelXle include intuitive atom (re-)naming, a strongly coupled editor, structure visualization in various mono and stereo modes, and a novel way of displaying disorder extending over special positions. ShelXle is completely compatible with all features of SHELXL and is written entirely in C++ using the Qt4 and FFTW libraries. It is available at no cost for Windows, Linux and Mac-OS X and as source code.
MCNP6 Fission Multiplicity with FMULT Card
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilcox, Trevor; Fensin, Michael Lorne; Hendricks, John S.
With the merger of MCNPX and MCNP5 into MCNP6, MCNP6 now provides all the capabilities of both codes, allowing the user to access all the fission multiplicity data sets. Detailed in this paper are: (1) the new FMULT card capabilities for accessing these different data sets; and (2) benchmark calculations, as compared to experiment, detailing the results of selecting these separate data sets for thermal-neutron-induced fission of U-235.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise
Whisper is a statistical analysis package developed in 2014 to support nuclear criticality safety (NCS) validation [1-3]. It uses the sensitivity profile data for an application as computed by MCNP6 [4-6] along with covariance files [7,8] for the nuclear data to determine a baseline upper-subcritical-limit (USL) for the application. Whisper version 1.0 was first developed and used at LANL in 2014 [3]. During 2015-2016, Whisper was updated to version 1.1 [9] and is to be included with the upcoming release of MCNP6.2. This document describes the Whisper-1.1 package that will be included with the MCNP6.2 release during 2017. Specific details are provided on the computer systems supported, the software quality assurance (SQA) procedures, installation, and testing. This document does not address other important topics, such as the importance of sensitivity-uncertainty (SU) methods to NCS validation, the theory underlying SU methodology, tutorials on the usage of MCNP-Whisper, practical approaches to using SU methodology to support and extend traditional validation, etc. There are over 50 documents included with Whisper-1.1 and available in the MCNP Reference Collection on the MCNP website (mcnp.lanl.gov) that address all of those topics and more. In this document, however, a complete bibliography of relevant MCNP-Whisper references is provided.
Performance upgrades to the MCNP6 burnup capability for large scale depletion calculations
Fensin, M. L.; Galloway, J. D.; James, M. R.
2015-04-11
The first MCNP-based inline Monte Carlo depletion capability was officially released from the Radiation Safety Information and Computational Center as MCNPX 2.6.0. With the merger of MCNPX and MCNP5, MCNP6 combined the capability of both simulation tools, as well as providing new advanced technology, in a single radiation transport code. The new MCNP6 depletion capability was first showcased at the International Congress for Advancements in Nuclear Power Plants (ICAPP) meeting in 2012. At that conference the new capabilities addressed included the combined distributive and shared memory parallel architecture for the burnup capability, improved memory management, physics enhancements, and new predictability as compared to the H.B. Robinson Benchmark. At Los Alamos National Laboratory, a special-purpose cluster named “tebow” was constructed so as to maximize available RAM per CPU, as well as leveraging swap space with solid state hard drives, to allow larger scale depletion calculations (allowing for significantly more burnable regions than previously examined). As the MCNP6 burnup capability was scaled to larger numbers of burnable regions, a noticeable slowdown was realized. This paper details two specific computational performance strategies for improving calculation speedup: (1) retrieving cross sections during transport; and (2) tallying mechanisms specific to burnup in MCNP. To combat this slowdown, new performance upgrades were developed and integrated into MCNP6 1.2.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-08-01
Version 00 COG LibMaker contains various utilities to convert common data formats into a format usable by the COG - Multi-particle Monte Carlo Code System package (C00777MNYCP01). Utilities included:
ACEtoCOG - ACE formatted neutron data: currently ENDFB7R0.BNL, ENDFB7R1.BNL, JEFF3.1, JEFF3.1.1, JEFF3.1.2, MCNP.50c, MCNP.51c, MCNP.55c, MCNP.66c, and MCNP.70c.
ACEUtoCOG - ACEU formatted photonuclear data: currently PN.MCNP.30c and PN.MCNP.70u.
ACTLtoCOG - Creates a COG library from ENDL formatted activation data.
EDDLtoCOG - Creates a COG library from ENDL formatted LLNL deuteron data.
ENDLtoCOG - Creates a COG library from ENDL formatted LLNL neutron data.
EPDLtoCOG - Creates a COG library from ENDL formatted LLNL photon data.
LEX - Creates a COG dictionary file.
SAB.ACEtoCOG - Creates a COG library from ACE formatted S(a,b) data.
SABtoCOG - Creates a COG library from ENDF6 formatted S(a,b) data.
URRtoCOG - Creates a COG library from ACE formatted probability table data.
This package also includes library checking and bit swapping capability.
Cai, Yao; Hu, Huasi; Lu, Shuangying; Jia, Qinggang
2018-05-01
To minimize the size and weight of a vehicle-mounted accelerator-driven D-T neutron source and protect workers from unnecessary irradiation after the equipment shutdown, a method to optimize radiation shielding material aiming at compactness, light weight, and low activation for fast neutrons was developed. The method employed a genetic algorithm, combining the MCNP and ORIGEN codes. A series of composite shielding material samples were obtained by the method step by step. The volume and weight needed to build a shield (assumed to be a coaxial tapered cylinder) were adopted to compare the performance of the materials visually and conveniently. The results showed that the optimized materials have excellent performance in comparison with the conventional materials. The "MCNP6-ACT" method and the "rigorous two steps" (R2S) method were used to verify the activation grade of the shield irradiated by D-T neutrons. The types of radionuclide, the energy spectrum of the corresponding decay gamma source, and the variation in decay gamma dose rate were also computed. Copyright © 2018 Elsevier Ltd. All rights reserved.
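The optimization loop described above can be pictured with a minimal genetic-algorithm sketch. The code below is illustrative only, assuming a candidate shield is represented by normalized mass fractions of its constituents; the fitness function evaluate_dose_and_activation is a hypothetical stand-in for the coupled MCNP/ORIGEN evaluation used by the authors.

    # Minimal sketch of a genetic-algorithm loop for composite-shield optimization.
    # evaluate_dose_and_activation is a hypothetical placeholder; in the paper the
    # figures of merit come from full MCNP transport and ORIGEN activation runs.
    import random

    N_COMPONENTS = 4          # e.g. mass fractions of candidate shield constituents
    POP_SIZE, GENERATIONS = 20, 50

    def random_candidate():
        x = [random.random() for _ in range(N_COMPONENTS)]
        s = sum(x)
        return [xi / s for xi in x]              # normalize mass fractions to 1

    def evaluate_dose_and_activation(fractions):
        """Placeholder fitness: lower is better (stand-in for dose + weight + activation)."""
        # A real implementation would write an MCNP input, run the transport
        # calculation, feed the flux to ORIGEN, and score dose rate, shield
        # mass/volume, and residual activation.
        return sum((f - 1.0 / N_COMPONENTS) ** 2 for f in fractions)

    def crossover(a, b):
        cut = random.randrange(1, N_COMPONENTS)
        child = a[:cut] + b[cut:]
        s = sum(child)
        return [c / s for c in child]

    def mutate(x, rate=0.1):
        y = [max(1e-6, xi + random.uniform(-rate, rate)) for xi in x]
        s = sum(y)
        return [yi / s for yi in y]

    population = [random_candidate() for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        population.sort(key=evaluate_dose_and_activation)
        parents = population[: POP_SIZE // 2]    # elitist selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(POP_SIZE - len(parents))]
        population = parents + children

    best = min(population, key=evaluate_dose_and_activation)
    print("best composition (mass fractions):", best)

In practice the fitness evaluation dominates the run time, since each candidate requires a full transport and activation calculation.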
MCNP HPGe detector benchmark with previously validated Cyltran model.
Hau, I D; Russ, W R; Bronson, F
2009-05-01
An exact copy of the detector model generated for Cyltran was reproduced as an MCNP input file, and the detection efficiency was calculated using the same methodology as in previous experimental measurements and simulations of a 280 cm(3) HPGe detector. Below 1000 keV the MCNP data correlated with the Cyltran results within 0.5%, while above this energy the difference between MCNP and Cyltran increased to about 6% at 4800 keV, depending on the electron cut-off energy.
Delta-ray Production in MCNP 6.2.0
NASA Astrophysics Data System (ADS)
Anderson, C.; McKinney, G.; Tutt, J.; James, M.
Secondary electrons in the form of delta-rays, also referred to as knock-on electrons, have been a feature of MCNP for electron and positron transport for over 20 years. While MCNP6 now includes transport for a suite of heavy-ions and charged particles from its integration with MCNPX, the production of delta-rays was still limited to electron and positron transport. In the newest release of MCNP6, version 6.2.0, delta-ray production has now been extended for all energetic charged particles. The basis of this production is the analytical formulation from Rossi and ICRU Report 37. This paper discusses the MCNP6 heavy charged-particle implementation and provides production results for several benchmark/test problems.
AN ASSESSMENT OF MCNP WEIGHT WINDOWS
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. S. HENDRICKS; C. N. CULBERTSON
2000-01-01
The weight window variance reduction method in the general-purpose Monte Carlo N-Particle radiation transport code MCNP has recently been rewritten. In particular, it is now possible to generate weight window importance functions on a superimposed mesh, eliminating the need to subdivide geometries for variance reduction purposes. Our assessment addresses the following questions: (1) Does the new MCNP4C treatment utilize weight windows as well as the former MCNP4B treatment? (2) Does the new MCNP4C weight window generator generate importance functions as well as MCNP4B? (3) How do superimposed mesh weight windows compare to cell-based weight windows? (4) What are the shortcomings of the new MCNP4C weight window generator? Our assessment was carried out with five neutron and photon shielding problems chosen for their demanding variance reduction requirements. The problems were an oil well logging problem, the Oak Ridge fusion shielding benchmark problem, a photon skyshine problem, an air-over-ground problem, and a sample problem for variance reduction.
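For readers unfamiliar with the method being assessed, the sketch below illustrates the basic weight-window game (split above the window, Russian roulette below it) in schematic Python. It is not MCNP's implementation; the window bounds, survival ratio, and splitting cap are illustrative values, whereas MCNP derives its bounds per cell or per superimposed mesh element from the generated importance function.

    # Sketch of the weight-window game applied to one particle: split if its
    # weight is above the window, play Russian roulette if below. Bounds are
    # illustrative only.
    import random

    def apply_weight_window(weight, w_low, survival_ratio=3.0, max_split=5):
        """Return the list of weights of the particle(s) that continue transport."""
        w_high = w_low * survival_ratio        # upper window bound
        w_survive = 0.5 * (w_low + w_high)     # weight assigned to roulette survivors

        if weight > w_high:                    # too heavy: split
            n = min(max_split, int(weight / w_high) + 1)
            return [weight / n] * n
        if weight < w_low:                     # too light: Russian roulette
            p_survive = weight / w_survive
            return [w_survive] if random.random() < p_survive else []
        return [weight]                        # inside the window: unchanged

    # Example: a particle entering an important region with low weight,
    # and one entering with high weight.
    print(apply_weight_window(0.01, w_low=0.05))
    print(apply_weight_window(2.0, w_low=0.05))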
Benchmarking comparison and validation of MCNP photon interaction data
NASA Astrophysics Data System (ADS)
Colling, Bethany; Kodeli, I.; Lilley, S.; Packer, L. W.
2017-09-01
The objective of the research was to test available photoatomic data libraries for fusion relevant applications, comparing against experimental and computational neutronics benchmarks. Photon flux and heating were compared using the photon interaction data libraries (mcplib 04p, 05t, 84p and 12p). Suitable benchmark experiments (iron and water) were selected from the SINBAD database and analysed to compare experimental values with MCNP calculations using mcplib 04p, 84p and 12p. In both the computational and experimental comparisons, the majority of results with the 04p, 84p and 12p photon data libraries were within 1σ of the mean MCNP statistical uncertainty. Larger differences were observed when comparing computational results with the 05t test photon library. The Doppler broadening sampling bug in MCNP-5 is shown to be corrected for fusion relevant problems through use of the 84p photon data library. The recommended libraries for fusion neutronics are 84p (or 04p) with MCNP6 and 84p if using MCNP-5.
MCNP capabilities for nuclear well logging calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forster, R.A.; Little, R.C.; Briesmeister, J.F.
The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. This paper discusses how the general-purpose continuous-energy Monte Carlo code MCNP (Monte Carlo Neutron Photon), part of the LARTCS, provides a computational predictive capability for many applications of interest to the nuclear well logging community. The generalized three-dimensional geometry of MCNP is well suited for borehole-tool models. SABRINA, another component of the LARTCS, is a graphics code that can be used to interactively create a complex MCNP geometry. Users can define many source and tally characteristics with standard MCNP features. The time-dependent capability of the code is essential when modeling pulsed sources. Problems with neutrons, photons, and electrons as either single particle or coupled particles can be calculated with MCNP. The physics of neutron and photon transport and interactions is modeled in detail using the latest available cross-section data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burke, Timothy P.; Martz, Roger L.; Kiedrowski, Brian C.
New unstructured mesh capabilities in MCNP6 (developmental version during summer 2012) show potential for conducting multi-physics analyses by coupling MCNP to a finite element solver such as Abaqus/CAE [2]. Before these new capabilities can be utilized, the ability of MCNP to accurately estimate eigenvalues and pin powers using an unstructured mesh must first be verified. Previous work to verify the unstructured mesh capabilities in MCNP was accomplished using the Godiva sphere [1], and this work attempts to build on that. To accomplish this, a criticality benchmark and a fuel assembly benchmark were used for calculations in MCNP using both the Constructive Solid Geometry (CSG) native to MCNP and the unstructured mesh geometry generated using Abaqus/CAE. The Big Ten criticality benchmark [3] was modeled due to its geometry being similar to that of a reactor fuel pin. The C5G7 3-D Mixed Oxide (MOX) Fuel Assembly Benchmark [4] was modeled to test the unstructured mesh capabilities on a reactor-type problem.
Factors to Consider When Designing Television Pictorials
ERIC Educational Resources Information Center
Trohanis, Pascal; Du Monceau, Michael
1971-01-01
The authors have developed a framework for improving the visual communication element of television. After warning that seeing is not enough to ensure learning, they discuss the five pre-production components which research indicates should be considered when designing television pictorials. (Editor)
Status Report on the MCNP 2020 Initiative
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.; Rising, Michael Evan
2017-10-02
The discussion below provides a status report on the MCNP 2020 initiative. It includes discussion of the history of MCNP 2020, accomplishments during 2013-17, priorities for near-term development, other related efforts, a brief summary, and a list of references for the plans and work accomplished.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bull, Jeffrey S.
This presentation describes how to build MCNP 6.2. MCNP® 6.2 can be compiled on Macs, PCs, and most Linux systems. It can also be built for parallel execution using both OpenMP and Message Passing Interface (MPI) methods. MCNP6 requires Fortran, C, and C++ compilers to build the code.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zieb, Kristofer James Ekhart; Hughes, Henry Grady III; Xu, X. George
The release of version 6.2 of the MCNP6 radiation transport code is imminent. To complement the newest release, a summary of the heavy charged particle physics models used in the 1 MeV to 1 GeV energy regime is presented. Several changes have been introduced into the charged particle physics models since the merger of the MCNP5 and MCNPX codes into MCNP6. This article discusses the default models used in MCNP6 for continuous energy loss, energy straggling, and angular scattering of heavy charged particles. Explanations of the physics models' theories are included as well.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chapman, Bryan Scott; MacQuigg, Michael Robert; Wysong, Andrew Russell
In this document, the code MCNP is validated with ENDF/B-VII.1 cross section data under the purview of ANSI/ANS-8.24-2007, for use with uranium systems. MCNP is a computer code based on Monte Carlo transport methods. While MCNP has wide-ranging capability in nuclear transport simulation, this validation is limited to the functionality related to neutron transport and the calculation of criticality parameters such as keff.
Delta-ray Production in MCNP 6.2.0
Anderson, Casey Alan; McKinney, Gregg Walter; Tutt, James Robert; ...
2017-10-26
Secondary electrons in the form of delta-rays, also referred to as knock-on electrons, have been a feature of MCNP for electron and positron transport for over 20 years. While MCNP6 now includes transport for a suite of heavy-ions and charged particles from its integration with MCNPX, the production of delta-rays was still limited to electron and positron transport. In the newest release of MCNP6, version 6.2.0, delta-ray production has now been extended for all energetic charged particles. The basis of this production is the analytical formulation from Rossi and ICRU Report 37. As a result, this paper discusses the MCNP6 heavy charged-particle implementation and provides production results for several benchmark/test problems.
MCNP6 Simulation of Light and Medium Nuclei Fragmentation at Intermediate Energies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mashnik, Stepan Georgievich; Kerby, Leslie Marie
2015-05-22
MCNP6, the latest and most advanced LANL Monte Carlo transport code, representing a merger of MCNP5 and MCNPX, is actually much more than the sum of those two computer codes; MCNP6 is available to the public via RSICC at Oak Ridge, TN, USA. In the present work, MCNP6 was validated and verified (V&V) against different experimental data on intermediate-energy fragmentation reactions, and results by several other codes, using mainly the latest modifications of the Cascade-Exciton Model (CEM) and of the Los Alamos version of the Quark-Gluon String Model (LAQGSM) event generators CEM03.03 and LAQGSM03.03. It was found that MCNP6 using CEM03.03 and LAQGSM03.03 describes well fragmentation reactions induced on light and medium target nuclei by protons and light nuclei of energies around 1 GeV/nucleon and below, and can serve as a reliable simulation tool for different applications, like cosmic-ray-induced single event upsets (SEUs), radiation protection, and cancer therapy with proton and ion beams, to name just a few. Future improvements of the predicting capabilities of MCNP6 for such reactions are possible, and are discussed in this work.
Modeling Sodium Iodide Detector Response Using Parametric Equations
2013-03-22
MCNP particle current and pulse height tally functions were used to quantify backscattered photons as a function of material thickness and energy. Source-detector-scattering medium arrangements were modeled in MCNP using the pulse height tally functions, integrated over a 70 keV - 360 keV energy range.
MCNP-model for the OAEP Thai Research Reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallmeier, F.X.; Tang, J.S.; Primm, R.T. III
An MCNP input was prepared for the Thai Research Reactor, making extensive use of the MCNP geometry's lattice feature that allows a flexible and easy rearrangement of the core components and the adjustment of the control elements. The geometry was checked for overdefined or undefined zones by two-dimensional plots of cuts through the core configuration with the MCNP geometry plotting capabilities, and by a three-dimensional view of the core configuration with the SABRINA code. Cross sections were defined for a hypothetical core of 67 standard fuel elements and 38 low-enriched uranium fuel elements--all filled with fresh fuel. Three test calculations were performed with the MCNP4B code to obtain the multiplication factor for the cases with control elements fully inserted, fully withdrawn, and at a working position.
Memory Effects in Visual Spatial Information Processing.
ERIC Educational Resources Information Center
Fishbein, Harold D.
1978-01-01
Eight, ten, and twelve year old children were tested on a novel procedure involving the successive presentation of standard and comparison stimuli. Two hypotheses were evaluated: one dealing with memory effects, and the other with children's pretesting of choice responses in spatial information processing. (Editor/RK)
NASA Astrophysics Data System (ADS)
Tayama, Ryuichi; Wakasugi, Kenichi; Kawanaka, Ikunori; Kadota, Yoshinobu; Murakami, Yasuhiro
We measured the skyshine dose from turbine buildings at Shimane Nuclear Power Station Unit 1 (NS-1) and Unit 2 (NS-2), and then compared it with the dose calculated with the Monte Carlo transport code MCNP5. The skyshine dose values calculated with the MCNP5 code agreed with the experimental data within a factor of 2.8, when the roof of the turbine building was precisely modeled. We concluded that our MCNP5 calculation was valid for BWR turbine skyshine dose evaluation.
NASA Astrophysics Data System (ADS)
Zhang, Lei; Jia, Mingchun; Gong, Junjun; Xia, Wenming
2017-08-01
The linear attenuation coefficient, mass attenuation coefficient and mean free path of various Lead-Boron Polyethylene (PbBPE) samples, which can be used as photon shielding materials in marine reactors, have been simulated using the Monte Carlo N-Particle (MCNP)-5 code. The MCNP simulation results are in good agreement with the XCOM values and the reported experimental data for Cesium-137 and Cobalt-60 sources. Thus, this method based on MCNP can be used to simulate the photon attenuation characteristics of various types of PbBPE materials.
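The three quantities compared in the study are related by the standard narrow-beam attenuation law; the relations below are the textbook definitions (not taken from the paper) and are shown only for reference, with w_i denoting the mass fraction of constituent i.

    \begin{align}
      I(x) &= I_{0}\, e^{-\mu x}, \\
      \left(\frac{\mu}{\rho}\right)_{\mathrm{mix}} &= \sum_{i} w_{i} \left(\frac{\mu}{\rho}\right)_{i}, \\
      \lambda_{\mathrm{mfp}} &= \frac{1}{\mu}.
    \end{align}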
Utopia documents: linking scholarly literature with research data
Attwood, T. K.; Kell, D. B.; McDermott, P.; Marsh, J.; Pettifer, S. R.; Thorne, D.
2010-01-01
Motivation: In recent years, the gulf between the mass of accumulating research data and the massive literature describing and analyzing those data has widened. The need for intelligent tools to bridge this gap, to rescue the knowledge being systematically isolated in literature and data silos, is now widely acknowledged. Results: To this end, we have developed Utopia Documents, a novel PDF reader that semantically integrates visualization and data-analysis tools with published research articles. In a successful pilot with editors of the Biochemical Journal (BJ), the system has been used to transform static document features into objects that can be linked, annotated, visualized and analyzed interactively (http://www.biochemj.org/bj/424/3/). Utopia Documents is now used routinely by BJ editors to mark up article content prior to publication. Recent additions include integration of various text-mining and biodatabase plugins, demonstrating the system's ability to seamlessly integrate on-line content with PDF articles. Availability: http://getutopia.com Contact: teresa.k.attwood@manchester.ac.uk PMID:20823323
NASA Astrophysics Data System (ADS)
Jung, Seongmoon; Sung, Wonmo; Lee, Jaegi; Ye, Sung-Joon
2018-01-01
Emerging radiological applications of gold nanoparticles demand low-energy electron/photon transport calculations including details of the atomic relaxation process. Recently, MCNP® version 6.1 (MCNP6.1) has been released with extended cross-sections for low-energy electrons/photons, subshell photoelectric cross-sections, and more detailed atomic relaxation data than the previous versions. Despite this new feature, the atomic relaxation process of MCNP6.1 has not yet been fully tested with its new physics library (eprdata12), which is based on the Evaluated Atomic Data Library (EADL). In this study, MCNP6.1 was compared with GATEv7.2, PENELOPE2014, and EGSnrc, which have often been used to simulate low-energy atomic relaxation processes. The simulations were performed to acquire both photon and electron spectra produced by interactions of 15 keV electrons or photons with a 10-nm-thick gold nano-slab. The photon-induced fluorescence X-rays from MCNP6.1 agreed fairly well with those from GATEv7.2 and PENELOPE2014, while the electron-induced fluorescence X-rays of the four codes showed more or less discrepancies. Close agreement was observed in the photon-induced Auger electrons simulated by MCNP6.1 and GATEv7.2. The recent release of MCNP6.1 with eprdata12 can be used to simulate the photon-induced atomic relaxation.
Thermal Neutron Point Source Imaging using a Rotating Modulation Collimator (RMC)
2010-03-01
The rotating modulation collimator (RMC) was simulated in MCNP, recording the neutrons passed through the masks at each rotation angle. An MCNP-generated modulation profile for cadmium was produced using a multi-energetic neutron source, with values reported per energy bin.
MCNP (Monte Carlo Neutron Photon) capabilities for nuclear well logging calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forster, R.A.; Little, R.C.; Briesmeister, J.F.
The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. The general-purpose continuous-energy Monte Carlo code MCNP (Monte Carlo Neutron Photon), part of the LARTCS, provides a computational predictive capability for many applications of interest to the nuclear well logging community. The generalized three-dimensional geometry of MCNP is well suited for borehole-tool models. SABRINA, another component of the LARTCS, is a graphics code that can be used to interactively create a complex MCNP geometry. Users can define many source and tally characteristics with standard MCNP features. The time-dependent capability of the code is essential when modeling pulsed sources. Problems with neutrons, photons, and electrons as either single particle or coupled particles can be calculated with MCNP. The physics of neutron and photon transport and interactions is modeled in detail using the latest available cross-section data. A rich collection of variance reduction features can greatly increase the efficiency of a calculation. MCNP is written in FORTRAN 77 and has been run on a variety of computer systems from scientific workstations to supercomputers. The next production version of MCNP will include features such as continuous-energy electron transport and a multitasking option. Areas of ongoing research of interest to the well logging community include angle biasing, adaptive Monte Carlo, improved discrete ordinates capabilities, and discrete ordinates/Monte Carlo hybrid development. Los Alamos has requested approval by the Department of Energy to create a Radiation Transport Computational Facility under their User Facility Program to increase external interactions with industry, universities, and other government organizations. 21 refs.
Comparison of CdZnTe neutron detector models using MCNP6 and Geant4
NASA Astrophysics Data System (ADS)
Wilson, Emma; Anderson, Mike; Prendergasty, David; Cheneler, David
2018-01-01
The production of accurate detector models is of high importance in the development and use of detectors. Initially, MCNP and Geant were developed to specialise in neutral particle models and accelerator models, respectively; there is now a greater overlap of the capabilities of both, and it is therefore useful to produce comparative models to evaluate detector characteristics. In a collaboration between Lancaster University, UK, and Innovative Physics Ltd., UK, models have been developed in both MCNP6 and Geant4 of Cadmium Zinc Telluride (CdZnTe) detectors developed by Innovative Physics Ltd. Herein, a comparison is made of the relative strengths of MCNP6 and Geant4 for modelling neutron flux and secondary γ-ray emission. Given the increasing overlap of the modelling capabilities of MCNP6 and Geant4, it is worthwhile to comment on differences in results for simulations which have similarities in terms of geometries and source configurations.
A visual programming environment for the Navier-Stokes computer
NASA Technical Reports Server (NTRS)
Tomboulian, Sherryl; Crockett, Thomas W.; Middleton, David
1988-01-01
The Navier-Stokes computer is a high-performance, reconfigurable, pipelined machine designed to solve large computational fluid dynamics problems. Due to the complexity of the architecture, development of effective, high-level language compilers for the system appears to be a very difficult task. Consequently, a visual programming methodology has been developed which allows users to program the system at an architectural level by constructing diagrams of the pipeline configuration. These schematic program representations can then be checked for validity and automatically translated into machine code. The visual environment is illustrated by using a prototype graphical editor to program an example problem.
Visualization of JPEG Metadata
NASA Astrophysics Data System (ADS)
Malik Mohamad, Kamaruddin; Deris, Mustafa Mat
There is much more information embedded in a JPEG image than just the graphics. Visualization of its metadata would benefit digital forensic investigators by letting them view embedded data, including corrupted images where no graphics can be displayed, in order to assist in evidence collection for cases such as child pornography or steganography. There are already available tools such as metadata readers, editors and extraction tools, but these mostly focus on visualizing attribute information of JPEG Exif. However, none have visualized metadata by consolidating a markers summary, header structure, Huffman table and quantization table in a single program. In this paper, metadata visualization is done by developing a program that is able to summarize all existing markers, header structure, Huffman table and quantization table in a JPEG. The result shows that visualization of metadata helps in viewing the hidden information within JPEG files more easily.
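As a rough illustration of the marker-summary portion of such a tool, the Python sketch below walks a JPEG byte stream and reports each recognized marker and its segment length. The marker codes are from the JPEG standard; the parsing strategy and the stopping point at SOS are simplifications, not the implementation described in the paper.

    # Minimal sketch of a JPEG marker summary: walk the file and report each
    # recognized marker and its segment length. Standard marker codes (SOI, DQT,
    # DHT, SOF0, SOS, EOI, ...) come from the JPEG specification.
    import struct

    MARKER_NAMES = {
        0xFFD8: "SOI", 0xFFDB: "DQT", 0xFFC4: "DHT", 0xFFC0: "SOF0",
        0xFFC2: "SOF2", 0xFFDA: "SOS", 0xFFD9: "EOI", 0xFFE0: "APP0", 0xFFE1: "APP1",
    }

    def summarize_markers(path):
        with open(path, "rb") as f:
            data = f.read()
        pos, summary = 0, []
        while pos + 1 < len(data):
            if data[pos] != 0xFF:
                pos += 1
                continue
            marker = struct.unpack(">H", data[pos:pos + 2])[0]
            name = MARKER_NAMES.get(marker)
            if name in ("SOI", "EOI"):           # stand-alone markers, no length field
                summary.append((hex(marker), name, 0))
                pos += 2
            elif name is not None:
                length = struct.unpack(">H", data[pos + 2:pos + 4])[0]
                summary.append((hex(marker), name, length))
                if name == "SOS":                # entropy-coded data follows; stop here
                    break
                pos += 2 + length                # length field counts itself plus payload
            else:
                pos += 1
        return summary

    # for marker, name, length in summarize_markers("example.jpg"):
    #     print(f"{marker}  {name:5s}  segment length {length}")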
The MCNP-DSP code for calculations of time and frequency analysis parameters for subcritical systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valentine, T.E.; Mihalczo, J.T.
1995-12-31
This paper describes a modified version of the MCNP code, the MCNP-DSP. Variance reduction features were disabled to have strictly analog particle tracking in order to follow fluctuating processes more accurately. Some of the neutron and photon physics routines were modified to better represent the production of particles. Other modifications are discussed.
Lecture Notes on Criticality Safety Validation Using MCNP & Whisper
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise
Training classes for nuclear criticality safety, MCNP documentation. The need for, and problems surrounding, validation of computer codes and data are considered first. Then some background for MCNP & Whisper is given--best practices for Monte Carlo criticality calculations, neutron spectra, S(α,β) thermal neutron scattering data, nuclear data sensitivities, covariance data, and correlation coefficients. Whisper is computational software designed to assist the nuclear criticality safety analyst with validation studies with the Monte Carlo radiation transport package MCNP. Whisper's methodology (benchmark selection – Ck's, weights; extreme value theory – bias, bias uncertainty; MOS for nuclear data uncertainty – GLLS) and usage are discussed.
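A very rough numerical sketch of the bookkeeping outlined in the parenthetical above is given below. All numbers and the weighting scheme are made up for illustration; the actual Whisper package derives the correlation coefficients (ck) from adjoint-weighted sensitivity profiles and covariance data, applies extreme value theory for the bias statistics, and computes the nuclear-data margin of subcriticality with a GLLS adjustment.

    # Illustrative sketch of Whisper-style bookkeeping: select benchmarks similar
    # to the application (ck above a cutoff), form a weighted bias and bias
    # uncertainty, and assemble a baseline USL with margins of subcriticality.
    # All values are made up; this is not the Whisper algorithm itself.
    import math

    def baseline_usl(benchmarks, ck_cutoff=0.8, mos_data=0.005, mos_software=0.005):
        """benchmarks: list of dicts with keys 'ck', 'calc_keff', 'exp_keff'."""
        selected = [b for b in benchmarks if b["ck"] >= ck_cutoff]
        if not selected:
            raise ValueError("no benchmarks similar enough to the application")
        weights = [b["ck"] for b in selected]
        biases = [b["calc_keff"] - b["exp_keff"] for b in selected]
        wsum = sum(weights)
        bias = sum(w * d for w, d in zip(weights, biases)) / wsum
        var = sum(w * (d - bias) ** 2 for w, d in zip(weights, biases)) / wsum
        bias_unc = math.sqrt(var)
        # no credit is taken for a positive (conservative) bias
        usl = 1.0 + min(bias, 0.0) - bias_unc - mos_data - mos_software
        return bias, bias_unc, usl

    bench = [
        {"ck": 0.95, "calc_keff": 0.9992, "exp_keff": 1.0000},
        {"ck": 0.90, "calc_keff": 1.0011, "exp_keff": 1.0000},
        {"ck": 0.85, "calc_keff": 0.9978, "exp_keff": 1.0000},
    ]
    print(baseline_usl(bench))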
Evaluation of RAPID for a UNF cask benchmark problem
NASA Astrophysics Data System (ADS)
Mascolino, Valerio; Haghighat, Alireza; Roskoff, Nathan J.
2017-09-01
This paper examines the accuracy and performance of the RAPID (Real-time Analysis for Particle transport and In-situ Detection) code system for the simulation of a used nuclear fuel (UNF) cask. RAPID is capable of determining eigenvalue, subcritical multiplication, and pin-wise, axially-dependent fission density throughout a UNF cask. We study the source convergence based on the analysis of the different parameters used in an eigenvalue calculation in the MCNP Monte Carlo code. For this study, we consider a single assembly surrounded by absorbing plates with reflective boundary conditions. Based on the best combination of eigenvalue parameters, a reference MCNP solution for the single assembly is obtained. RAPID results are in excellent agreement with the reference MCNP solutions, while requiring significantly less computation time (i.e., minutes vs. days). A similar set of eigenvalue parameters is used to obtain a reference MCNP solution for the whole UNF cask. Because of time limitations, the MCNP results near the cask boundaries have significant uncertainties. Except for these, the RAPID results are in excellent agreement with the MCNP predictions, and its computation time is significantly lower: 35 seconds on 1 core versus 9.5 days on 16 cores.
Bahreyni Toossi, M T; Moradi, H; Zare, H
2008-01-01
In this work, the general purpose Monte Carlo N-particle radiation transport computer code (MCNP-4C) was used for the simulation of X-ray spectra in diagnostic radiology. The electron's path in the target was followed until its energy was reduced to 10 keV. A user-friendly interface named 'diagnostic X-ray spectra by Monte Carlo simulation (DXRaySMCS)' was developed to facilitate the application of the MCNP-4C code for diagnostic radiology spectrum prediction. The program provides a user-friendly interface for: (i) modifying the MCNP input file, (ii) launching the MCNP program to simulate electron and photon transport and (iii) processing the MCNP output file to yield a summary of the results (relative photon number per energy bin). In this article, the development and characteristics of DXRaySMCS are outlined. As part of the validation process, output spectra for 46 diagnostic radiology system settings produced by DXRaySMCS were compared with the corresponding IPEM78 spectra. Generally, there is good agreement between the two sets of spectra. No statistically significant differences have been observed between the IPEM78 reported spectra and the simulated spectra generated in this study.
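The three interface steps (i)-(iii) amount to a thin wrapper around the transport code. The sketch below shows one way such a wrapper could look in Python; the executable name, command-line form, template placeholder, and spectrum file format are assumptions for illustration and are not taken from DXRaySMCS.

    # Sketch of a modify-run-process wrapper of the kind described above.
    # The "mcnp6" command line and the {TUBE_KVP} placeholder are illustrative
    # assumptions; adjust to the local installation and input template.
    import subprocess
    from pathlib import Path

    def write_input(template_path, kvp, out_path="xray.inp"):
        text = Path(template_path).read_text()
        Path(out_path).write_text(text.replace("{TUBE_KVP}", str(kvp)))
        return out_path

    def run_mcnp(inp):
        # Command-line form is illustrative only.
        subprocess.run(["mcnp6", f"inp={inp}", "outp=xray.out"], check=True)
        return "xray.out"

    def extract_spectrum(path):
        """Parse a simple two-column (energy, relative counts) text file.
        Extracting the tally blocks from a real MCNP output file is more
        involved and is left out of this sketch."""
        spectrum = []
        for line in Path(path).read_text().splitlines():
            parts = line.split()
            if len(parts) >= 2:
                try:
                    spectrum.append((float(parts[0]), float(parts[1])))
                except ValueError:
                    pass
        return spectrum

    # spectrum = extract_spectrum(run_mcnp(write_input("template.inp", kvp=80)))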
Sensitivity-Uncertainty Based Nuclear Criticality Safety Validation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
2016-09-20
These are slides from a seminar given to the University of New Mexico Nuclear Engineering Department. Whisper is a statistical analysis package developed to support nuclear criticality safety validation. It uses the sensitivity profile data for an application as computed by MCNP6 along with covariance files for the nuclear data to determine a baseline upper-subcritical-limit for the application. Whisper and its associated benchmark files are developed and maintained as part of MCNP6, and will be distributed with all future releases of MCNP6. Although sensitivity-uncertainty methods for NCS validation have been under development for 20 years, continuous-energy Monte Carlo codes such as MCNP could not determine the required adjoint-weighted tallies for sensitivity profiles. The recent introduction of the iterated fission probability method into MCNP led to the rapid development of sensitivity analysis capabilities for MCNP6 and the development of Whisper. Sensitivity-uncertainty based methods represent the future for NCS validation – making full use of today's computer power to codify past approaches based largely on expert judgment. Validation results are defensible, auditable, and repeatable as needed with different assumptions and process models. The new methods can supplement, support, and extend traditional validation approaches.
Enger, Shirin A; Munck af Rosenschöld, Per; Rezaei, Arash; Lundqvist, Hans
2006-02-01
GEANT4 is a Monte Carlo code originally implemented for high-energy physics applications and is well known for particle transport at high energies. The capacity of GEANT4 to simulate neutron transport in the thermal energy region is not equally well known. The aim of this article is to compare MCNP, a code commonly used in low energy neutron transport calculations, and GEANT4 with experimental results and select the suitable code for gadolinium neutron capture applications. To account for the thermal neutron scattering from chemically bound atoms [S(alpha,beta)] in biological materials, a comparison of thermal neutron fluence in a tissue-like poly(methylmethacrylate) phantom is made with MCNP4B, GEANT4 6.0 patch1, and measurements from the neutron capture therapy (NCT) facility at Studsvik, Sweden. The fluence measurements agreed with MCNP calculated results considering S(alpha,beta). The location of the thermal neutron peak calculated with MCNP without S(alpha,beta) and with GEANT4 is shifted by about 0.5 cm towards a shallower depth and is 25%-30% lower in amplitude. The dose distribution from the gadolinium neutron capture reaction is then simulated by MCNP and compared with measured data. The simulations made by MCNP agree well with experimental results. As long as thermal neutron scattering from chemically bound atoms is not included in GEANT4, it is not suitable for NCT applications.
MCNP output data analysis with ROOT (MODAR)
NASA Astrophysics Data System (ADS)
Carasco, C.
2010-12-01
MCNP Output Data Analysis with ROOT (MODAR) is a tool based on CERN's ROOT software. MODAR has been designed to handle time-energy data issued by MCNP simulations of neutron inspection devices using the associated particle technique. MODAR exploits ROOT's Graphical User Interface and functionalities to visualize and process MCNP simulation results in a fast and user-friendly way. MODAR allows one to take into account the detection system time resolution (which is not possible with MCNP) as well as detector energy response functions and counting statistics in a straightforward way. New version program summary. Program title: MODAR. Catalogue identifier: AEGA_v1_1. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGA_v1_1.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 150 927. No. of bytes in distributed program, including test data, etc.: 4 981 633. Distribution format: tar.gz. Programming language: C++. Computer: Most Unix workstations and PCs. Operating system: Most Unix systems, Linux and Windows, provided the ROOT package has been installed. Examples were tested under Suse Linux and Windows XP. RAM: Depends on the size of the MCNP output file. The example presented in the article, which involves three two-dimensional 139×740-bin histograms, allocates about 60 MB; this figure includes memory consumed by ROOT itself. Classification: 17.6. Catalogue identifier of previous version: AEGA_v1_0. Journal reference of previous version: Comput. Phys. Comm. 181 (2010) 1161. External routines: ROOT version 5.24.00 (http://root.cern.ch/drupal/). Does the new version supersede the previous version?: Yes. Nature of problem: The output of an MCNP simulation is an ASCII file. The data processing is usually performed by copying and pasting the relevant parts of the ASCII file into Microsoft Excel. Such an approach is satisfactory when the quantity of data is small but is not efficient when the size of the simulated data is large, for example when time-energy correlations are studied in detail such as in problems involving the associated particle technique. In addition, since the finite time resolution of the simulated detector cannot be modeled with MCNP, systems in which time-energy correlation is crucial cannot be described in a satisfactory way. Finally, realistic particle energy deposit in detectors is calculated with MCNP in a two-step process involving type-5 then type-8 tallies. In the first step, the photon flux energy spectrum associated with a time region is selected and serves as a source energy distribution for the second step. Thus, several files must be manipulated before getting the result, which can be time consuming if one needs to study several time regions or different detector performances. In the same way, modeling counting statistics obtained in a limited acquisition time requires several steps and can also be time consuming. Solution method: In order to overcome the previous limitations, the MODAR C++ code has been written to make use of CERN's ROOT data analysis software. MCNP output data are read from the MCNP output file with dedicated routines. Two-dimensional histograms are filled and can be handled efficiently within the ROOT framework. To keep a user-friendly analysis tool, all processing and data display can be done by means of the ROOT Graphical User Interface.
Specific routines have been written to include detector finite time resolution and energy response functions as well as counting statistics in a straightforward way. Reasons for new version: For applications involving the Associated Particle Technique, a large number of gamma rays are produced by fast neutron interactions. To study the energy spectra, it is useful to identify the gamma-ray energy peaks in a straightforward way. Therefore, the possibility to show gamma rays corresponding to specific reactions has been added in MODAR. Summary of revisions: It is possible to use a gamma-ray database to better identify gamma-ray peaks, with their first and second escape peaks, in the energy spectra. Histograms can be scaled by the number of source particles to evaluate the number of counts expected without statistical uncertainties. Additional comments: The possibility of adding tallies has also been incorporated in MODAR in order to describe systems in which the signal from several detectors can be summed. Moreover, MODAR can be adapted to handle other problems involving two-dimensional data. Running time: The CPU time needed to smear a two-dimensional histogram depends on the size of the histogram. In the presented example, the time-energy smearing of one of the 139×740 two-dimensional histograms takes 3 minutes with a DELL computer equipped with an INTEL Core 2.
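The time-resolution correction that MODAR adds on top of MCNP can be illustrated with a small NumPy sketch: each energy row of a two-dimensional time-energy histogram is convolved with a Gaussian along the time axis. The bin counts, resolution value, and NumPy implementation below are illustrative only; MODAR itself performs this within the ROOT framework.

    # Sketch of applying a finite time resolution to a 2-D time-energy histogram
    # by convolving each energy row with a Gaussian along the time axis.
    import numpy as np

    def smear_time_axis(hist2d, time_bin_width_ns, fwhm_ns):
        """hist2d: array of shape (n_energy_bins, n_time_bins)."""
        sigma_bins = (fwhm_ns / 2.355) / time_bin_width_ns   # FWHM -> sigma, in bins
        half_width = int(4 * sigma_bins) + 1
        t = np.arange(-half_width, half_width + 1)
        kernel = np.exp(-0.5 * (t / sigma_bins) ** 2)
        kernel /= kernel.sum()                               # preserve total counts
        return np.apply_along_axis(
            lambda row: np.convolve(row, kernel, mode="same"), axis=1, arr=hist2d)

    # Example: 139 energy bins x 740 time bins, 1 ns bins, 2 ns FWHM resolution
    hist = np.random.poisson(5.0, size=(139, 740)).astype(float)
    smeared = smear_time_axis(hist, time_bin_width_ns=1.0, fwhm_ns=2.0)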
Visual-Motor Test Performance: Race and Achievement Variables.
ERIC Educational Resources Information Center
Fuller, Gerald B.; Friedrich, Douglas
1979-01-01
Rural Black and White children of variant academic achievement were tested on the Minnesota Percepto-Diagnostic Test, which consists of six gestalt designs for the subject to copy. Analyses resulted only in a significant achievement effect; when intellectual level was statistically controlled, race was not a significant variable. (Editor/SJL)
Media Studies: Texts and Supplements.
ERIC Educational Resources Information Center
Curriculum Review, 1979
1979-01-01
The 24 reviews in this article include textbooks on journalism and media studies; multimedia kits on advertising, TV news, reporting, and the "grammar" of media; resources on making and interpreting films in the classroom; supplements on writing for both print and nonprint media; and professional references on improving visual literacy. (Editor)
Southern Identity in "Southern Living" Magazine
ERIC Educational Resources Information Center
Lauder, Tracy
2012-01-01
A fantasy-theme analysis of the editors' letters in "Southern Living" magazine shows an editorial vision of valuing the past and showcasing unique regional qualities. In addition, a content analysis of the visual representation of race in the magazine's formative years and recent past validates that inhabitants of the region were portrayed…
Monte Carlo simulations of medical imaging modalities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Estes, G.P.
Because continuous-energy Monte Carlo radiation transport calculations can be nearly exact simulations of physical reality (within data limitations, geometric approximations, transport algorithms, etc.), it follows that one should be able to closely approximate the results of many experiments from first-principles computations. This line of reasoning has led to various MCNP studies that involve simulations of medical imaging modalities and other visualization methods such as radiography, Anger camera, computerized tomography (CT) scans, and SABRINA particle track visualization. It is the intent of this paper to summarize some of these imaging simulations in the hope of stimulating further work, especially as computer power increases. Improved interpretation and prediction of medical images should ultimately lead to enhanced medical treatments. It is also reasonable to assume that such computations could be used to design new or more effective imaging instruments.
Biographer: web-based editing and rendering of SBGN compliant biochemical networks.
Krause, Falko; Schulz, Marvin; Ripkens, Ben; Flöttmann, Max; Krantz, Marcus; Klipp, Edda; Handorf, Thomas
2013-06-01
The rapid accumulation of knowledge in the field of Systems Biology during the past years requires advanced, but simple-to-use, methods for the visualization of information in a structured and easily comprehensible manner. We have developed biographer, a web-based renderer and editor for reaction networks, which can be integrated as a library into tools dealing with network-related information. Our software enables visualizations based on the emerging standard Systems Biology Graphical Notation. It is able to import networks encoded in various formats such as SBML, SBGN-ML and jSBGN, a custom lightweight exchange format. The core package is implemented in HTML5, CSS and JavaScript and can be used within any kind of web-based project. It features interactive graph-editing tools and automatic graph layout algorithms. In addition, we provide a standalone graph editor and a web server, which contains enhanced features like web services for the import and export of models and visualizations in different formats. The biographer tool can be used at and downloaded from the web page http://biographer.biologie.hu-berlin.de/. The different software packages, including a server-independent version as well as a web server for Windows and Linux based systems, are available at http://code.google.com/p/biographer/ under the open-source license LGPL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goorley, John T.
2012-06-25
We, the development teams for MCNP, NJOY, and parts of ENDF, would like to invite you to a proposed 3-day workshop on October 30-31 and November 1, 2012, to be held at Los Alamos National Laboratory. At this workshop, we will review new and developing missions that MCNP6 and the underlying nuclear data are being asked to address. LANL will also present its internal plans to address these missions and recent advances in these three capabilities, and we will be interested to hear your input on these topics. Additionally, we are interested in hearing from you about additional technical advances, missions, concerns, and other issues that we should be considering for both the short term (1-3 years) and the long term (4-6 years). What are the additional existing capabilities and methods that we should be investigating? The goal of the workshop is to refine priorities for MCNP6 transport methods, algorithms, physics, data and processing as they relate to the intersection of MCNP, NJOY and ENDF.
Comparison of EGS4 and MCNP Monte Carlo codes when calculating radiotherapy depth doses.
Love, P A; Lewis, D G; Al-Affan, I A; Smith, C W
1998-05-01
The Monte Carlo codes EGS4 and MCNP have been compared when calculating radiotherapy depth doses in water. The aims of the work were to study (i) the differences between calculated depth doses in water for a range of monoenergetic photon energies and (ii) the relative efficiency of the two codes for different electron transport energy cut-offs. The depth doses from the two codes agree with each other within the statistical uncertainties of the calculations (1-2%). The relative depth doses also agree with data tabulated in the British Journal of Radiology Supplement 25. A discrepancy in the dose build-up region may be attributed to the different electron transport algorithms used by EGS4 and MCNP. This discrepancy is considerably reduced when the improved electron transport routines are used in the latest (4B) version of MCNP. Timing calculations show that EGS4 is at least 50% faster than MCNP for the geometries used in the simulations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise
2017-01-26
Whisper is a statistical analysis package developed in 2014 to support nuclear criticality safety (NCS) validation [1-3]. It uses the sensitivity profile data for an application as computed by MCNP6 [4-6] along with covariance files [7,8] for the nuclear data to determine a baseline upper-subcritical-limit (USL) for the application. Whisper version 1.0 was first developed and used at LANL in 2014 [3]. During 2015-2016, Whisper was updated to version 1.1 and is to be included with the upcoming release of MCNP6.2. This document describes the user input and options for running whisper-1.1, including 2 perl utility scripts that simplify ordinary NCS work, whisper_mcnp.pl and whisper_usl.pl. For many detailed references on the theory, applications, nuclear data & covariances, SQA, verification-validation, adjoint-based methods for sensitivity-uncertainty analysis, and more – see the Whisper – NCS Validation section of the MCNP Reference Collection at mcnp.lanl.gov. There are currently over 50 Whisper reference documents available.
Abrefah, R G; Sogbadji, R B M; Ampomah-Amoako, E; Birikorang, S A; Odoi, H C; Nyarko, B J B
2011-01-01
The MCNP model for the Ghana Research Reactor-1 was redesigned to incorporate a boron carbide-shielded irradiation channel in one of the outer irradiation channels. Extensive investigations were made before arriving at the final design of only one boron carbide-covered outer irradiation channel, as all the other designs that were considered did not give desirable neutronic performance. The purpose of redesigning a new MCNP model with a boron carbide-shielded channel is to equip the Ghana Research Reactor-1 with the means of performing efficient epithermal neutron activation analysis. After the simulation, a comparison of the results from the original MCNP model for the Ghana Research Reactor-1 and the new redesigned model with the boron carbide-shielded channel was made. The final effective criticality of the original MCNP model for GHARR-1 was recorded as 1.00402, while that of the new boron carbide design was recorded as 1.00282. Also, a final prompt neutron lifetime of 1.5245 × 10(-4)s was recorded for the new boron carbide design, while a value of 1.5571 × 10(-7)s was recorded for the original MCNP design of GHARR-1. Copyright © 2010 Elsevier Ltd. All rights reserved.
Natto, S A; Lewis, D G; Ryde, S J
1998-01-01
The Monte Carlo computer code MCNP (version 4A) has been used to develop a personal computer-based model of the Swansea in vivo neutron activation analysis (IVNAA) system. The model included specification of the neutron source (252Cf), collimators, reflectors and shielding. The MCNP model was 'benchmarked' against fast neutron and thermal neutron fluence data obtained experimentally from the IVNAA system. The Swansea system allows two irradiation geometries using 'short' and 'long' collimators, which provide alternative dose rates for IVNAA. The data presented here relate to the short collimator, although results of similar accuracy were obtained using the long collimator. The fast neutron fluence was measured in air at a series of depths inside the collimator. The measurements agreed with the MCNP simulation within the statistical uncertainty (5-10%) of the calculations. The thermal neutron fluence was measured and calculated inside the cuboidal water phantom. The depth of maximum thermal fluence was 3.2 cm (measured) and 3.0 cm (calculated). The width of the 50% thermal fluence level across the phantom at its mid-depth was found to be the same by both MCNP and experiment. This benchmarking exercise has given us a high degree of confidence in MCNP as a tool for the design of IVNAA systems.
MCNP-REN - A Monte Carlo Tool for Neutron Detector Design Without Using the Point Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abhold, M.E.; Baker, M.C.
1999-07-25
The development of neutron detectors makes extensive use of the predictions of detector response through the use of Monte Carlo techniques in conjunction with the point reactor model. Unfortunately, the point reactor model fails to accurately predict detector response in common applications. For this reason, the general Monte Carlo N-Particle code (MCNP) was modified to simulate the pulse streams that would be generated by a neutron detector and normally analyzed by a shift register. This modified code, MCNP - Random Exponentially Distributed Neutron Source (MCNP-REN), along with the Time Analysis Program (TAP), predicts neutron detector response without using the point reactor model, making it unnecessary for the user to decide whether or not the assumptions of the point model are met for their application. MCNP-REN is capable of simulating standard neutron coincidence counting as well as neutron multiplicity counting. Measurements of MOX fresh fuel made using the Underwater Coincidence Counter (UWCC) as well as measurements of HEU reactor fuel using the active neutron Research Reactor Fuel Counter (RRFC) are compared with calculations. The method used in MCNP-REN is demonstrated to be fundamentally sound and shown to eliminate the need to use the point model for detector performance predictions.
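The shift-register analysis that MCNP-REN's pulse streams feed into can be sketched schematically: for every trigger pulse, count neighbours in a prompt (Reals+Accidentals) gate opened after a predelay and in an equally wide, long-delayed Accidentals gate, then difference the totals. The gate settings and the Python implementation below are illustrative, not the TAP algorithm.

    # Schematic shift-register coincidence counting on a list of detector pulse
    # times (as produced by an analog simulation). Gate settings are illustrative.
    import bisect

    def shift_register_counts(times_us, predelay=4.5, gate=64.0, long_delay=1024.0):
        times = sorted(times_us)
        rpa = acc = 0
        for t in times:
            # prompt (Reals + Accidentals) gate: (t + predelay, t + predelay + gate]
            lo = bisect.bisect_right(times, t + predelay)
            hi = bisect.bisect_right(times, t + predelay + gate)
            rpa += hi - lo
            # accidentals gate: same width, opened long after the trigger
            lo = bisect.bisect_right(times, t + long_delay)
            hi = bisect.bisect_right(times, t + long_delay + gate)
            acc += hi - lo
        reals = rpa - acc
        return rpa, acc, reals

    # Example with a short synthetic pulse train (times in microseconds)
    pulses = [0.0, 2.0, 3.1, 150.0, 151.2, 400.0, 1500.0, 1501.0, 1502.5]
    print(shift_register_counts(pulses))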
Zaker, Neda; Zehtabian, Mehdi; Sina, Sedigheh; Koontz, Craig; Meigooni, Ali S
2016-03-08
Monte Carlo simulations are widely used for calculation of the dosimetric parameters of brachytherapy sources. MCNP4C2, MCNP5, MCNPX, EGS4, EGSnrc, PTRAN, and GEANT4 are among the most commonly used codes in this field. Each of these codes utilizes a cross-sectional library for the purpose of simulating different elements and materials with complex chemical compositions. The accuracies of the final outcomes of these simulations are very sensitive to the accuracies of the cross-sectional libraries. Several investigators have shown that inaccuracies of some of the cross section files have led to errors in 125I and 103Pd parameters. The purpose of this study is to compare the dosimetric parameters of sample brachytherapy sources, calculated with three different versions of the MCNP code - MCNP4C, MCNP5, and MCNPX. In these simulations for each source type, the source and phantom geometries, as well as the number of the photons, were kept identical, thus eliminating the possible uncertainties. The results of these investigations indicate that for low-energy sources such as 125I and 103Pd there are discrepancies in gL(r) values. Discrepancies up to 21.7% and 28% are observed between MCNP4C and other codes at a distance of 6 cm for 103Pd and 10 cm for 125I from the source, respectively. However, for higher energy sources, the discrepancies in gL(r) values are less than 1.1% for 192Ir and less than 1.2% for 137Cs between the three codes.
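The radial dose function compared across the three MCNP versions is the TG-43 quantity; its standard definition (with reference distance r_0 = 1 cm and angle θ_0 = 90°) is reproduced below for reference and is not taken from the paper.

    \begin{equation}
      g_{L}(r) = \frac{\dot{D}(r,\theta_{0})\, G_{L}(r_{0},\theta_{0})}
                      {\dot{D}(r_{0},\theta_{0})\, G_{L}(r,\theta_{0})},
      \qquad
      G_{L}(r,\theta) = \frac{\beta}{L\, r \sin\theta},
    \end{equation}

where L is the active source length and β is the angle subtended by the source at the calculation point.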
2014-03-27
Verification and Validation of Monte Carlo N-Particle Code 6 (MCNP6) with Neutron Protection Factor Measurements of an Iron Box. Thesis, AFIT-ENP-14-M-05. Distribution Statement A: approved for public release; distribution unlimited.
2007-09-01
Monte Carlo (MCNP) simulation was used to evaluate the performance of the detector and to compare the performance with sodium iodide and germanium detectors (aluminum ~50% more efficient), and to estimate optimum shield dimensions for an HPXe-based nuclear explosion monitor. MCNP modeling was also used to calculate quantities for the detector using input activity levels as measured in routine NEM runs at Pacific Northwest National Laboratory (PNNL).
Modeling of Radioxenon Production and Release Pathways
2010-09-01
MCNP is utilized to model the neutron transport, while ORIGEN 2.2 is utilized to calculate the production and decay of fission products, activation products, and transuranics resulting from the calculated neutron flux profile. MONTEBURNS is a Perl script that couples MCNP and ORIGEN 2.2. A core enriched to 93.80% 239Pu was used, and MCNP was used to determine the thickness of soil or rock necessary to accurately model the attenuation.
Montgomery Point Lock and Dam, White River, Arkansas
2016-01-01
ERDC/CHL TR-16-1, Monitoring Completed Navigation Projects (MCNP) Program: Montgomery Point Lock and Dam, White River, Arkansas. Allen Hammack, Michael Winkler, and others; January 2016. Prepared under MCNP Work Unit: Montgomery Point Lock and Dam, White River, Arkansas.
Neutrons Flux Distributions of the Pu-Be Source and its Simulation by the MCNP-4B Code
NASA Astrophysics Data System (ADS)
Faghihi, F.; Mehdizadeh, S.; Hadad, K.
The neutron fluence rate of a low-intensity Pu-Be source is measured by Neutron Activation Analysis (NAA) of 197Au foils. Also, the neutron fluence rate distribution versus energy is calculated using the MCNP-4B code based on the ENDF/B-V library. The theoretical simulation, together with the experimental measurements, is a new experience for the Iranian group and builds confidence in the code for further research. In the theoretical investigation, an isotropic Pu-Be source with a cylindrical volume distribution is simulated and the relative neutron fluence rate versus energy is calculated using the MCNP-4B code. The fast and thermal neutron fluence rates obtained by the NAA method and by the MCNP code are compared.
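The fluence rate inferred from the activated gold foils follows the standard activation relation; the generic form below (with cadmium-ratio and self-shielding corrections omitted) is shown only as a reminder of the method and is not the authors' exact correction chain.

    \begin{equation}
      \phi = \frac{A}{N\,\sigma_{\mathrm{eff}}\,
              \left(1 - e^{-\lambda t_{\mathrm{irr}}}\right) e^{-\lambda t_{\mathrm{d}}}},
    \end{equation}

where A is the activity measured after a decay time t_d, N is the number of 197Au target nuclei, σ_eff is the effective activation cross section, λ is the 198Au decay constant, and t_irr is the irradiation time.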
Zaker, Neda; Sina, Sedigheh; Koontz, Craig; Meigooni, Ali S.
2016-01-01
Monte Carlo simulations are widely used for calculation of the dosimetric parameters of brachytherapy sources. MCNP4C2, MCNP5, MCNPX, EGS4, EGSnrc, PTRAN, and GEANT4 are among the most commonly used codes in this field. Each of these codes utilizes a cross‐sectional library for the purpose of simulating different elements and materials with complex chemical compositions. The accuracies of the final outcomes of these simulations are very sensitive to the accuracies of the cross‐sectional libraries. Several investigators have shown that inaccuracies of some of the cross section files have led to errors in 125I and 103Pd parameters. The purpose of this study is to compare the dosimetric parameters of sample brachytherapy sources, calculated with three different versions of the MCNP code — MCNP4C, MCNP5, and MCNPX. In these simulations for each source type, the source and phantom geometries, as well as the number of the photons, were kept identical, thus eliminating the possible uncertainties. The results of these investigations indicate that for low‐energy sources such as 125I and 103Pd there are discrepancies in gL(r) values. Discrepancies up to 21.7% and 28% are observed between MCNP4C and other codes at a distance of 6 cm for 103Pd and 10 cm for 125I from the source, respectively. However, for higher energy sources, the discrepancies in gL(r) values are less than 1.1% for 192Ir and less than 1.2% for 137Cs between the three codes. PACS number(s): 87.56.bg PMID:27074460
Comparison of scientific computing platforms for MCNP4A Monte Carlo calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendricks, J.S.; Brockhoff, R.C.
1994-04-01
The performance of seven computer platforms is evaluated with the widely used and internationally available MCNP4A Monte Carlo radiation transport code. All results are reproducible and are presented in such a way as to enable comparison with computer platforms not in the study. The authors observed that the HP/9000-735 workstation runs MCNP 50% faster than the Cray YMP 8/64. Compared with the Cray YMP 8/64, the IBM RS/6000-560 is 68% as fast, the Sun Sparc10 is 66% as fast, the Silicon Graphics ONYX is 90% as fast, the Gateway 2000 model 4DX2-66V personal computer is 27% as fast, and the Sun Sparc2 is 24% as fast. In addition to comparing the timing performance of the seven platforms, the authors observe that changes in compilers and software over the past 2 yr have resulted in only modest performance improvements, hardware improvements have enhanced performance by less than a factor of approximately 3, timing studies are very problem dependent, and MCNP4A runs about as fast as MCNP4.
Verification of MCNP simulation of neutron flux parameters at TRIGA MK II reactor of Malaysia.
Yavar, A R; Khalafi, H; Kasesaz, Y; Sarmani, S; Yahaya, R; Wood, A K; Khoo, K S
2012-10-01
A 3-D model of the 1 MW TRIGA Mark II research reactor was simulated. Neutron flux parameters were calculated using the MCNP-4C code and were compared with experimental results obtained by k(0)-INAA and the absolute method. The average values of φ(th), φ(epi), and φ(fast) by the MCNP code were (2.19±0.03)×10(12) cm(-2)s(-1), (1.26±0.02)×10(11) cm(-2)s(-1) and (3.33±0.02)×10(10) cm(-2)s(-1), respectively. These average values were consistent with the experimental results obtained by k(0)-INAA. The findings show a good agreement between the MCNP code results and the experimental results. Copyright © 2012 Elsevier Ltd. All rights reserved.
CREPT-MCNP code for efficiency calibration of HPGe detectors with the representative point method.
Saegusa, Jun
2008-01-01
The representative point method for the efficiency calibration of volume samples has been previously proposed. To facilitate implementation of the method, a calculation code named CREPT-MCNP has been developed. The code estimates the position of a representative point which is intrinsic to each shape of volume sample. Self-absorption correction factors are also given to correct the efficiencies measured at the representative point with a standard point source. Features of the CREPT-MCNP code are presented.
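The calibration implied by the abstract can be written compactly as a point-source efficiency at the representative point scaled by a self-absorption correction; the notation below is illustrative rather than taken from the CREPT-MCNP documentation.

    \begin{equation}
      \varepsilon_{\mathrm{volume}}(E) = C_{\mathrm{self}}(E)\,
      \varepsilon_{\mathrm{point}}\!\left(E;\, \mathbf{r}_{\mathrm{rep}}\right),
    \end{equation}

where r_rep is the representative point estimated by the code and C_self(E) is the energy-dependent self-absorption correction factor.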
2014-04-21
Dixon, a graduate student at the University of New Mexico, introduced us to MCNP; using what we learned from Dixon, we were able to produce curves with MCNP for incident electron energies from 10 to 100 keV in increments of 10 keV (see Figure 9). Since MCNP does take backscatter into consideration, comparisons were made on the vertical scales (energy or number of electrons deposited).
Simplification of an MCNP model designed for dose rate estimation
NASA Astrophysics Data System (ADS)
Laptev, Alexander; Perry, Robert
2017-09-01
A study was made to investigate the methods of building a simplified MCNP model for radiological dose estimation. The research was done using an example of a complicated glovebox with extra shielding. The paper presents several different calculations for neutron and photon dose evaluations where glovebox elements were consecutively excluded from the MCNP model. The analysis indicated that to obtain a fast and reasonable estimation of dose, the model should be realistic in details that are close to the tally. Other details may be omitted.
Photograph + Printed Word: A New Language for the Student Journalist.
ERIC Educational Resources Information Center
Magmer, James
This document examines the use of photography and the printed word to make visual statements in student publications. It is written for journalists who are writers and editors as well as for photojournalists and for student journalists interested in increasing the quality of the school newspaper, magazine, or yearbook. The role of the photographer…
Biographer: web-based editing and rendering of SBGN compliant biochemical networks
Krause, Falko; Schulz, Marvin; Ripkens, Ben; Flöttmann, Max; Krantz, Marcus; Klipp, Edda; Handorf, Thomas
2013-01-01
Motivation: The rapid accumulation of knowledge in the field of Systems Biology during the past years requires advanced, but simple-to-use, methods for the visualization of information in a structured and easily comprehensible manner. Results: We have developed biographer, a web-based renderer and editor for reaction networks, which can be integrated as a library into tools dealing with network-related information. Our software enables visualizations based on the emerging standard Systems Biology Graphical Notation. It is able to import networks encoded in various formats such as SBML, SBGN-ML and jSBGN, a custom lightweight exchange format. The core package is implemented in HTML5, CSS and JavaScript and can be used within any kind of web-based project. It features interactive graph-editing tools and automatic graph layout algorithms. In addition, we provide a standalone graph editor and a web server, which contains enhanced features like web services for the import and export of models and visualizations in different formats. Availability: The biographer tool can be used at and downloaded from the web page http://biographer.biologie.hu-berlin.de/. The different software packages, including a server-independent version as well as a web server for Windows and Linux based systems, are available at http://code.google.com/p/biographer/ under the open-source license LGPL. Contact: edda.klipp@biologie.hu-berlin.de or handorf@physik.hu-berlin.de PMID:23574737
Features of MCNP6 Relevant to Medical Radiation Physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hughes, H. Grady III; Goorley, John T.
2012-08-29
MCNP (Monte Carlo N-Particle) is a general-purpose Monte Carlo code for simulating the transport of neutrons, photons, electrons, positrons, and more recently other fundamental particles and heavy ions. Over many years MCNP has found a wide range of applications in many different fields, including medical radiation physics. In this presentation we will describe and illustrate a number of significant recently-developed features in the current version of the code, MCNP6, having particular utility for medical physics. Among these are major extensions of the ability to simulate large, complex geometries, improvement in memory requirements and speed for large lattices, introduction of mesh-based isotopic reaction tallies, advances in radiography simulation, expanded variance-reduction capabilities, especially for pulse-height tallies, and a large number of enhancements in photon/electron transport.
Rudresh, Shoorashetty Manohara; Ravi, Giriyapur Siddappa; Sunitha, Lakshminarayanappa; Hajira, Sadiya Noor; Kalaiarasan, Ellappan; Harish, Belgode Narasimha
2017-01-01
PURPOSE: Detection of carbapenemases among Gram-negative bacteria (GNB) is important for both clinicians and infection control practitioners. The Clinical and Laboratory Standards Institute recommends Carba NP (CNP) as a confirmatory test for carbapenemase production. The reagents required for the CNP test are costly, and hence the test cannot be performed on a routine basis. The present study evaluates modifications of the CNP test for rapid detection of carbapenemases among GNB. MATERIALS AND METHODS: The GNB were screened for carbapenemase production using the CNP, CarbAcineto NP (CANP), and modified CNP (mCNP) tests. A multiplex polymerase chain reaction (PCR) was performed on all the carbapenem-resistant bacteria for carbapenemase genes. The results of the three phenotypic tests were compared with PCR. RESULTS: A total of 765 Gram-negative bacteria were screened for carbapenem resistance. Carbapenem resistance was found in 144 GNB. The metallo-β-lactamases were the most common carbapenemases, followed by OXA-48-like enzymes. The CANP test was the most sensitive (80.6%) for carbapenemase detection. The mCNP test was 62.1% sensitive for detection of carbapenemases. The mCNP, CNP, and CANP tests were equally sensitive (95%) for detection of NDM enzymes among Enterobacteriaceae. The mCNP test had poor sensitivity for detection of OXA-48-like enzymes. CONCLUSION: The mCNP test was rapid, cost-effective, and easily adoptable on a routine basis. The early detection of carbapenemases using the mCNP test will help in preventing the spread of multidrug-resistant organisms in hospital settings. PMID:28966495
Chemozart: a web-based 3D molecular structure editor and visualizer platform.
Mohebifar, Mohamad; Sajadi, Fatemehsadat
2015-01-01
Chemozart is a 3D molecule editor and visualizer built on top of native web components. It offers an easy-to-access service, a user-friendly graphical interface and a modular design. It is a client-centric web application which communicates with the server via a representational state transfer style web service. Both the client-side and server-side applications are written in JavaScript. A combination of JavaScript and HTML is used to draw three-dimensional structures of molecules. With the help of WebGL, a three-dimensional visualization tool is provided. Using CSS3 and HTML5, a user-friendly interface is composed. More than 30 packages are used to compose this application, which gives it enough flexibility to be extended. Molecular structures can be drawn on all types of platforms, and the application is compatible with mobile devices. No installation is required in order to use this application, and it can be accessed through the internet. This application can be extended on both the server side and the client side by implementing modules in JavaScript. Molecular compounds are drawn on the HTML5 Canvas element using the WebGL context. Chemozart is a chemical platform which is powerful, flexible, and easy to access. It provides an online web-based tool used for chemical visualization along with result-oriented optimization for a cloud-based API (application programming interface). JavaScript libraries which allow the creation of web pages containing interactive three-dimensional molecular structures have also been made available. The application has been released under the Apache 2 License and is available from the project website https://chemozart.com.
TORT/MCNP coupling method for the calculation of neutron flux around a core of BWR.
Kurosawa, Masahiko
2005-01-01
For the analysis of BWR neutronics performance, accurate data are required for the neutron flux distribution over the in-reactor pressure vessel equipment, taking into account the detailed geometrical arrangement. The TORT code can calculate the neutron flux around a BWR core in a three-dimensional geometry model, but it has difficulties with fine geometrical modelling and demands huge computer resources. On the other hand, the MCNP code enables calculation of the neutron flux with a detailed geometry model, but requires a very long sampling time to obtain a sufficient number of particles. Therefore, a TORT/MCNP coupling method has been developed to eliminate these two problems. In this method, the TORT code calculates the angular flux distribution on the core surface and the MCNP code calculates the neutron spectrum at the points of interest using that flux distribution. The coupling method will be used in the same way as the DOT-DOMINO-MORSE code system. This TORT/MCNP coupling method was applied to calculate the neutron flux at points where induced radioactivity data were measured for 54Mn and 60Co, and the radioactivity calculations based on the neutron flux obtained from this method were compared with the measured data.
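As an illustration of the hand-off described in this abstract, the sketch below samples MCNP surface-source particles from a TORT-style angular flux table. The binning, array values and uniform within-bin sampling are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Illustrative TORT output: angular flux binned in energy and polar-angle cosine
# on the core surface (arbitrary units); shape = (n_energy, n_angle).
angular_flux = np.array([[4.0, 2.0, 1.0],
                         [8.0, 3.0, 0.5]])
e_edges = np.array([1e-3, 1.0, 20.0])      # MeV, energy bin edges (assumed)
mu_edges = np.array([0.0, 0.5, 0.8, 1.0])  # outward cosine bin edges (assumed)

# Build a sampling distribution over the (energy, angle) bins.
pdf = angular_flux / angular_flux.sum()
cdf = np.cumsum(pdf.ravel())

rng = np.random.default_rng(42)

def sample_source_particle():
    """Sample one surface-source particle (E, mu) for the MCNP stage."""
    k = np.searchsorted(cdf, rng.random())
    i, j = np.unravel_index(k, pdf.shape)
    energy = rng.uniform(e_edges[i], e_edges[i + 1])   # flat within the bin
    mu = rng.uniform(mu_edges[j], mu_edges[j + 1])
    return energy, mu

print([sample_source_particle() for _ in range(3)])
```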
Chen, A Y; Liu, Y-W H; Sheu, R J
2008-01-01
This study investigates the radiation shielding design of the treatment room for boron neutron capture therapy at the Tsing Hua Open-pool Reactor using the "TORT-coupled MCNP" method. With this method, the computational efficiency is improved significantly, by two to three orders of magnitude compared to the analog Monte Carlo MCNP calculation. This makes the calculation feasible using a single CPU in less than 1 day. Further optimization of the photon weight windows leads to an additional 50-75% improvement in the overall computational efficiency.
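The efficiency gains quoted above are conventionally expressed through the Monte Carlo figure of merit, FOM = 1/(R²T), where R is the relative error and T the computing time. A minimal sketch with made-up numbers:

```python
# Figure of merit FOM = 1 / (R^2 * T); the numbers below are illustrative only.
def fom(rel_error, cpu_minutes):
    return 1.0 / (rel_error ** 2 * cpu_minutes)

analog = fom(rel_error=0.30, cpu_minutes=1440)   # hypothetical analog MCNP run
coupled = fom(rel_error=0.03, cpu_minutes=720)   # hypothetical TORT-coupled run
print(f"efficiency gain ~ {coupled / analog:.0f}x")   # ~200x in this example
```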
Preliminary results of 3D dose calculations with MCNP-4B code from a SPECT image.
Rodríguez Gual, M; Lima, F F; Sospedra Alfonso, R; González González, J; Calderón Marín, C
2004-01-01
Interface software was developed to generate the input file needed to run the Monte Carlo MCNP-4B code from a medical image in Interfile format version 3.3. The software was tested using a spherical phantom of tomography slices with a known cumulated activity distribution in Interfile format, generated with the IMAGAMMA medical image processing system. The 3D dose calculation obtained with the Monte Carlo MCNP-4B code was compared with the voxel S factor method. The results show a relative error between the two methods of less than 1%.
Analysis of JSI TRIGA MARK II reactor physical parameters calculated with TRIPOLI and MCNP.
Henry, R; Tiselj, I; Snoj, L
2015-03-01
A new computational model of the JSI TRIGA Mark II research reactor was built for the TRIPOLI computer code and compared with the existing MCNP model. The same modelling assumptions were used in order to check the differences between the mathematical models of the two Monte Carlo codes. Differences between the TRIPOLI and MCNP predictions of keff were up to 100 pcm. Further validation was performed with analyses of the normalized reaction rates and computations of kinetic parameters for various core configurations. Copyright © 2014 Elsevier Ltd. All rights reserved.
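For readers unfamiliar with the unit, 100 pcm corresponds to a difference of 1×10⁻³ in keff; the short sketch below shows the conversion, with illustrative keff values rather than the paper's results.

```python
# Difference between two code predictions, expressed in pcm
# (per cent mille, 1 pcm = 1e-5). The k-eff values below are illustrative.
k_mcnp, k_tripoli = 1.00120, 1.00020

delta_k_pcm = (k_mcnp - k_tripoli) * 1e5                 # simple k difference
delta_rho_pcm = (1.0 / k_tripoli - 1.0 / k_mcnp) * 1e5   # reactivity difference
print(delta_k_pcm, round(delta_rho_pcm, 1))
```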
Gamma-ray spectroscopy measurements and simulations for uranium mining
NASA Astrophysics Data System (ADS)
Marchais, T.; Pérot, B.; Carasco, C.; Allinei, P.-G.; Chaussonnet, P.; Ma, J.-L.; Toubon, H.
2018-01-01
AREVA Mines and the Nuclear Measurement Laboratory of CEA Cadarache are collaborating to improve the sensitivity and precision of uranium concentration evaluation by means of gamma measurements. This paper reports gamma-ray spectra, recorded with a high-purity coaxial germanium detector, on standard cement blocks with increasing uranium content, and the corresponding MCNP simulations. The detailed MCNP model of the detector and experimental setup has been validated by calculation vs. experiment comparisons. An optimization of the detector MCNP model is presented in this paper, as well as a comparison of different nuclear data libraries to explain missing or exceeding peaks in the simulation. Energy shifts observed between the fluorescence X-rays produced by MCNP and atomic data are also investigated. The qualified numerical model will be used in further studies to develop new gamma spectroscopy approaches aiming at reducing acquisition times, especially for ore samples with low uranium content.
A CT and MRI scan to MCNP input conversion program.
Van Riper, Kenneth A
2005-01-01
We describe a new program to read a sequence of tomographic scans and prepare the geometry and material sections of an MCNP input file. Image processing techniques include contrast controls and mapping of grey scales to colour. The user interface provides several tools with which the user can associate a range of image intensities with an MCNP material. Materials are loaded from a library. A separate material assignment can be made to a pixel intensity or range of intensities when that intensity dominates the image boundaries; this material is assigned to all pixels with that intensity contiguous with the boundary. Material fractions are computed in a user-specified voxel grid overlaying the scans. New materials are defined by mixing the library materials using the fractions. The geometry can be written as an MCNP lattice or as individual cells. A combination algorithm can be used to join neighbouring cells with the same material.
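A minimal sketch of the two central steps described above, mapping pixel intensities to materials and computing material fractions on a coarser voxel grid, is given below; the thresholds, material names and tiny test image are illustrative assumptions, and the writing of actual MCNP lattice or cell cards is omitted.

```python
import numpy as np

# Illustrative 4x4 slice of Hounsfield-like intensities (not real scan data).
image = np.array([[-1000, -1000,   40,   45],
                  [-1000,  -990,   50,  400],
                  [  -980,    30,   60,  900],
                  [    20,    35,  800, 1200]])

# User-defined intensity ranges -> material labels (illustrative library).
ranges = [(-1100, -800, "air"), (-800, 200, "soft tissue"), (200, 2000, "bone")]

def material_of(value):
    for lo, hi, name in ranges:
        if lo <= value < hi:
            return name
    return "undefined"

labels = np.vectorize(material_of)(image)

# Material fractions in 2x2 voxels overlaying the image.
for i in range(0, 4, 2):
    for j in range(0, 4, 2):
        block = labels[i:i + 2, j:j + 2].ravel()
        names, counts = np.unique(block, return_counts=True)
        fractions = dict(zip(names, counts / block.size))
        print((i // 2, j // 2), fractions)
```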
NASA Astrophysics Data System (ADS)
Elbashir, B. O.; Dong, M. G.; Sayyed, M. I.; Issa, Shams A. M.; Matori, K. A.; Zaid, M. H. M.
2018-06-01
The mass attenuation coefficients (μ/ρ), effective atomic numbers (Zeff) and electron densities (Ne) of some amino acids measured experimentally by other researchers have been calculated using MCNP5 simulations in the energy range 0.122-1.330 MeV. The simulated values of μ/ρ, Zeff and Ne were compared with the previous experimental work for the amino acid samples, and good agreement was noted. Moreover, the values of the mean free path (MFP) for the samples were calculated using the MCNP5 program and compared with the theoretical results obtained by XCOM. The investigation of the μ/ρ, Zeff, Ne and MFP values of the amino acids using MCNP5 simulations at various photon energies, compared with the XCOM values and the previous experimental data, revealed that the MCNP5 code provides accurate photon interaction parameters for amino acids.
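The mean free path follows directly from the mass attenuation coefficient, MFP = 1/((μ/ρ)·ρ); a short sketch with illustrative values (not the paper's data):

```python
# Mean free path from the mass attenuation coefficient:
# MFP = 1 / mu = 1 / ((mu/rho) * rho). Values below are illustrative only.
mu_over_rho = 0.0857   # cm^2/g, assumed value for a light material near 0.662 MeV
density = 1.3          # g/cm^3, assumed sample density

linear_mu = mu_over_rho * density     # cm^-1
mfp = 1.0 / linear_mu                 # cm
print(f"MFP = {mfp:.2f} cm")
```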
NASA Astrophysics Data System (ADS)
Esfandiari, M.; Shirmardi, S. P.; Medhat, M. E.
2014-06-01
In this study, element analysis and the mass attenuation coefficient for matrices of gold, bronze and water with various impurities and concentrations of heavy metals (Cu, Mn, Pb and Zn) are evaluated and calculated with the MCNP simulation code for photons emitted from Barium-133, Americium-241 and other sources with energies between 1 and 100 keV. The MCNP data are compared with the experimental data and with results simulated with the WinXCom code by Medhat. The results show that the bronze and gold matrices are in good agreement with the other methods for energies above 40 and 60 keV, respectively. However, for the water matrices with various impurities, there is good agreement among the three methods (MCNP, WinXCom and experiment) at both low and high energies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liegey, Lauren Rene; Wilcox, Trevor; Mckinney, Gregg Walter
2015-08-07
My internship program was the Domestic Nuclear Detection Office Summer Internship Program. I worked at Los Alamos National Laboratory with Trevor A. Wilcox and Gregg W. McKinney in the NEN-5 group. My project title was “MCNP Physical Model Interoperability & Validation”. The goal of my project was to write a program to predict the solar modulation parameter for dates in the future and then implement it into MCNP6. This update to MCNP6 can be used to calculate the background more precisely, which is an important factor in being able to detect Special Nuclear Material. We will share our work in a published American Nuclear Society (ANS) paper, an ANS presentation, and a LANL student poster session. Through this project, I gained skills in programming, computing, and using MCNP. I also gained experience that will help me decide on a career or perhaps obtain employment in the future.
Verification of Plutonium Content in PuBe Sources Using MCNP® 6.2.0 Beta with TENDL 2012 Libraries
Lockhart, Madeline Louise; McMath, Garrett Earl
2017-10-26
Although the production of PuBe neutron sources has been discontinued, hundreds of sources with unknown or inaccurately declared plutonium content are in existence around the world. Institutions have undertaken the task of assaying these sources, measuring, and calculating the isotopic composition, plutonium content, and neutron yield. The nominal plutonium content, based on the neutron yield per gram of pure 239Pu, has been shown to be highly inaccurate. New methods of measuring the plutonium content allow a more accurate estimate of the true Pu content, but these measurements need verification. Using the TENDL 2012 nuclear data libraries, MCNP6 has the capability to simulate the (α, n) interactions in a PuBe source. Theoretically, if the source is modeled according to the plutonium content, isotopic composition, and other source characteristics, the calculated neutron yield in MCNP can be compared to the experimental yield, offering an indication of the accuracy of the declared plutonium content. In this study, three sets of PuBe sources from various backgrounds were modeled in MCNP6 1.2 Beta, according to the source specifications dictated by the individuals who assayed the source. Verification of the source parameters with MCNP6 also serves as a means to test the alpha transport capabilities of MCNP6 1.2 Beta with TENDL 2012 alpha transport libraries. Finally, good agreement in the comparison would indicate the accuracy of the source parameters in addition to demonstrating MCNP's capabilities in simulating (α, n) interactions.
A Patch to MCNP5 for Multiplication Inference: Description and User Guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Solomon, Jr., Clell J.
2014-05-05
A patch to MCNP5 has been written to allow generation of multiple neutrons from a spontaneous-fission event and generate list-mode output. This report documents the implementation and usage of this patch.
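The core of such a patch is sampling a neutron multiplicity for each spontaneous-fission source event and emitting that many neutrons into the list-mode stream. The sketch below uses a purely illustrative multiplicity distribution, not the evaluated nuclear data the patch itself draws on.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative spontaneous-fission neutron multiplicity distribution P(nu).
# These probabilities are made up for demonstration; a real calculation would
# use evaluated multiplicity data for the specific nuclide.
nu = np.array([0, 1, 2, 3, 4, 5])
p_nu = np.array([0.05, 0.22, 0.34, 0.26, 0.10, 0.03])
p_nu = p_nu / p_nu.sum()

def spontaneous_fission_event(time):
    """Return a list-mode style record: one entry per emitted neutron."""
    n = rng.choice(nu, p=p_nu)
    return [(time, k) for k in range(n)]   # (event time, neutron index)

events = [spontaneous_fission_event(t) for t in (0.0, 1.3e-6, 2.7e-6)]
print(events)
```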
Harnessing the web information ecosystem with wiki-based visualization dashboards.
McKeon, Matt
2009-01-01
We describe the design and deployment of Dashiki, a public website where users may collaboratively build visualization dashboards through a combination of a wiki-like syntax and interactive editors. Our goals are to extend existing research on social data analysis into presentation and organization of data from multiple sources, explore new metaphors for these activities, and participate more fully in the web's information ecology by providing tighter integration with real-time data. To support these goals, our design includes novel and low-barrier mechanisms for editing and layout of dashboard pages and visualizations, connection to data sources, and coordinating interaction between visualizations. In addition to describing these technologies, we provide a preliminary report on the public launch of a prototype based on this design, including a description of the activities of our users derived from observation and interviews.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pinilla, Maria Isabel
This report seeks to study and benchmark code predictions against experimental data; determine parameters to match MCNP-simulated detector response functions to experimental stilbene measurements; add stilbene processing capabilities to DRiFT; and improve NEUANCE detector array modeling and analysis using new MCNP6 and DRiFT features.
Accelerating Pseudo-Random Number Generator for MCNP on GPU
NASA Astrophysics Data System (ADS)
Gong, Chunye; Liu, Jie; Chi, Lihua; Hu, Qingfeng; Deng, Li; Gong, Zhenghu
2010-09-01
Pseudo-random number generators (PRNGs) are used intensively in many stochastic algorithms in particle simulation, artificial neural networks and other scientific computations. The PRNG in the Monte Carlo N-Particle Transport Code (MCNP) requires a long period, high quality, flexible jump-ahead and high speed. In this paper, we implement such a PRNG for MCNP on NVIDIA's GTX200 Graphics Processing Units (GPU) using the CUDA programming model. Results show that speedups of 3.80 to 8.10 times are achieved compared with 4- to 6-core CPUs, and more than 679.18 million double-precision random numbers can be generated per second on the GPU.
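MCNP's generator is a linear congruential generator (LCG) with a fast jump-ahead so that each particle history starts at a known point in the sequence. Below is a minimal Python sketch of an LCG with O(log n) skip-ahead; the multiplier shown is the value commonly quoted for the MCNP5 default generator, but the constants should be treated as illustrative here rather than authoritative.

```python
M = 1 << 63                      # modulus 2**63
G = 9219741426499971445          # multiplier (commonly quoted MCNP5 default; assumed)
C = 1                            # increment

def lcg_next(seed):
    """Advance the generator one step: x -> (G*x + C) mod 2**63."""
    return (G * seed + C) % M

def lcg_skip(seed, n):
    """Jump ahead n steps in O(log n) by squaring the affine map x -> a*x + b."""
    a, b = 1, 0          # accumulated map, starts as identity
    h, f = G, C          # the one-step map
    while n > 0:
        if n & 1:
            a, b = (a * h) % M, (b * h + f) % M
        h, f = (h * h) % M, (f * (h + 1)) % M
        n >>= 1
    return (a * seed + b) % M

seed = 1
stepped = seed
for _ in range(1000):
    stepped = lcg_next(stepped)
assert lcg_skip(seed, 1000) == stepped   # skip-ahead matches stepping one by one
print(stepped)
```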
Implementation of a tree algorithm in MCNP code for nuclear well logging applications.
Li, Fusheng; Han, Xiaogang
2012-07-01
The goal of this paper is to develop some modeling capabilities that are missing in the current MCNP code. Those missing capabilities can greatly help with certain nuclear tool designs, such as a nuclear lithology/mineralogy spectroscopy tool. The new capabilities developed in this paper include the following: a zone tally, a neutron interaction tally, a gamma-ray index tally and an enhanced pulse-height tally. The patched MCNP code can also be used to compute the neutron slowing-down length and the thermal neutron diffusion length. Copyright © 2011 Elsevier Ltd. All rights reserved.
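The slowing-down and diffusion lengths mentioned at the end of the abstract are usually extracted from tallied mean-square distances via the standard age-diffusion relations, stated below as general background rather than as the specific tallies the patch implements:

```latex
L_s^2 \;=\; \frac{\langle r_s^2 \rangle}{6},
\qquad
L_d^2 \;=\; \frac{\langle r_d^2 \rangle}{6},
```

where ⟨r_s²⟩ is the mean-square distance from the source point to the site of thermalization and ⟨r_d²⟩ the mean-square distance from thermalization to absorption.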
An Electron/Photon/Relaxation Data Library for MCNP6
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hughes, III, H. Grady
The capabilities of the MCNP6 Monte Carlo code in simulation of electron transport, photon transport, and atomic relaxation have recently been significantly expanded. The enhancements include not only the extension of existing data and methods to lower energies, but also the introduction of new categories of data and methods. Support of these new capabilities has required major additions to and redesign of the associated data tables. In this paper we present the first complete documentation of the contents and format of the new electron-photon-relaxation data library now available with the initial production release of MCNP6.
SABRINA - an interactive geometry modeler for MCNP
DOE Office of Scientific and Technical Information (OSTI.GOV)
West, J.T.; Murphy, J.
One of the most difficult tasks when analyzing a complex three-dimensional system with Monte Carlo is geometry model development. SABRINA attempts to make the modeling process more user-friendly and less of an obstacle. It accepts both combinatorial solid bodies and MCNP surfaces and produces MCNP cells. The model development process in SABRINA is highly interactive and gives the user immediate feedback on errors. Users can view their geometry from arbitrary perspectives while the model is under development and interactively find and correct modeling errors. An example of a SABRINA display is shown. It represents a complex three-dimensional shape.
Calculated organ doses for Mayak production association central hall using ICRP and MCNP.
Choe, Dong-Ok; Shelkey, Brenda N; Wilde, Justin L; Walk, Heidi A; Slaughter, David M
2003-03-01
As part of an ongoing dose reconstruction project, equivalent organ dose rates from photons and neutrons were estimated using the energy spectra measured in the central hall above the graphite reactor core located in the Russian Mayak Production Association facility. Reconstruction of the work environment was necessary due to the lack of personal dosimeter data for neutrons in the time period prior to 1987. A typical worker scenario for the central hall was developed for the Monte Carlo Neutron Photon-4B (MCNP) code. The resultant equivalent dose rates for neutrons and photons were compared with the equivalent dose rates derived from calculations using the conversion coefficients in the International Commission on Radiological Protection Publications 51 and 74 in order to validate the model scenario for this Russian facility. The MCNP results were in good agreement with the results of the ICRP publications indicating the modeling scenario was consistent with actual work conditions given the spectra provided. The MCNP code will allow for additional orientations to accurately reflect source locations.
NASA Astrophysics Data System (ADS)
Hartling, K.; Ciungu, B.; Li, G.; Bentoumi, G.; Sur, B.
2018-05-01
Monte Carlo codes such as MCNP and Geant4 rely on a combination of physics models and evaluated nuclear data files (ENDF) to simulate the transport of neutrons through various materials and geometries. The grid representation used to represent the final-state scattering energies and angles associated with neutron scattering interactions can significantly affect the predictions of these codes. In particular, the default thermal scattering libraries used by MCNP6.1 and Geant4.10.3 do not accurately reproduce the ENDF/B-VII.1 model in simulations of the double-differential cross section for thermal neutrons interacting with hydrogen nuclei in a thin layer of water. However, agreement between model and simulation can be achieved within the statistical error by re-processing ENDF/B-VII.I thermal scattering libraries with the NJOY code. The structure of the thermal scattering libraries and sampling algorithms in MCNP and Geant4 are also reviewed.
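For background, the double-differential thermal scattering cross section that both codes reconstruct from the tabulated S(α, β) has the standard ENDF form (quoted from standard references, not from this paper):

```latex
\frac{d^2\sigma}{d\Omega\, dE'} \;=\; \frac{\sigma_b}{4\pi k_B T}\,
\sqrt{\frac{E'}{E}}\; S(\alpha,\beta),
\qquad
\alpha = \frac{E + E' - 2\mu\sqrt{EE'}}{A\,k_B T},
\qquad
\beta = \frac{E' - E}{k_B T},
```

where E and E' are the incident and outgoing neutron energies, μ the scattering cosine, σ_b the bound-atom scattering cross section, A the ratio of scatterer to neutron mass, and T the temperature. The grid on which α, β and the outgoing energies are tabulated is precisely what differs between the default libraries and the NJOY-reprocessed ones.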
Adjoint-Based Uncertainty Quantification with MCNP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seifried, Jeffrey E.
2011-09-01
This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
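The adjoint-based propagation referred to here is the usual first-order "sandwich" rule; written in generic notation (not necessarily that of the report):

```latex
\left(\frac{\Delta k}{k}\right)^{2} \;=\; \mathbf{S}^{\mathsf{T}}\, \mathbf{C}_\sigma\, \mathbf{S},
\qquad
S_i \;=\; \frac{\sigma_i}{k}\,\frac{\partial k}{\partial \sigma_i},
```

where S collects the relative sensitivities of keff to the nuclear data and C_σ is the relative covariance matrix of those data.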
Duggan, Dennis M
2004-12-01
Improved cross-sections in a new version of the Monte-Carlo N-particle (MCNP) code may eliminate discrepancies between radial dose functions (as defined by American Association of Physicists in Medicine Task Group 43) derived from Monte-Carlo simulations of low-energy photon-emitting brachytherapy sources and those from measurements on the same sources with thermoluminescent dosimeters. This is demonstrated for two 125I brachytherapy seed models, the Implant Sciences Model ISC3500 (I-Plant) and the Amersham Health Model 6711, by simulating their radial dose functions with two versions of MCNP, 4c2 and 5.
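For reference, the radial dose function in question is the standard AAPM TG-43 quantity:

```latex
g_L(r) \;=\; \frac{\dot{D}(r,\theta_0)\, G_L(r_0,\theta_0)}{\dot{D}(r_0,\theta_0)\, G_L(r,\theta_0)},
```

with Ḋ the dose rate in water, G_L the line-source geometry function, r₀ = 1 cm and θ₀ = π/2. This is the general TG-43 definition, not something specific to the two seed models studied here.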
Avogadro: an advanced semantic chemical editor, visualization, and analysis platform.
Hanwell, Marcus D; Curtis, Donald E; Lonie, David C; Vandermeersch, Tim; Zurek, Eva; Hutchison, Geoffrey R
2012-08-13
The Avogadro project has developed an advanced molecule editor and visualizer designed for cross-platform use in computational chemistry, molecular modeling, bioinformatics, materials science, and related areas. It offers flexible, high quality rendering, and a powerful plugin architecture. Typical uses include building molecular structures, formatting input files, and analyzing output of a wide variety of computational chemistry packages. By using the CML file format as its native document type, Avogadro seeks to enhance the semantic accessibility of chemical data types. The work presented here details the Avogadro library, which is a framework providing a code library and application programming interface (API) with three-dimensional visualization capabilities; and has direct applications to research and education in the fields of chemistry, physics, materials science, and biology. The Avogadro application provides a rich graphical interface using dynamically loaded plugins through the library itself. The application and library can each be extended by implementing a plugin module in C++ or Python to explore different visualization techniques, build/manipulate molecular structures, and interact with other programs. We describe some example extensions, one which uses a genetic algorithm to find stable crystal structures, and one which interfaces with the PackMol program to create packed, solvated structures for molecular dynamics simulations. The 1.0 release series of Avogadro is the main focus of the results discussed here. Avogadro offers a semantic chemical builder and platform for visualization and analysis. For users, it offers an easy-to-use builder, integrated support for downloading from common databases such as PubChem and the Protein Data Bank, extracting chemical data from a wide variety of formats, including computational chemistry output, and native, semantic support for the CML file format. For developers, it can be easily extended via a powerful plugin mechanism to support new features in organic chemistry, inorganic complexes, drug design, materials, biomolecules, and simulations. Avogadro is freely available under an open-source license from http://avogadro.openmolecules.net.
Cai, Zhongli; Pignol, Jean-Philippe; Chan, Conrad; Reilly, Raymond M
2010-03-01
Our objective was to compare Monte Carlo N-particle (MCNP) self- and cross-doses from (111)In to the nucleus of breast cancer cells with doses calculated by reported analytic methods (Goddu et al. and Farragi et al.). A further objective was to determine whether the MCNP-predicted surviving fraction (SF) of breast cancer cells exposed in vitro to (111)In-labeled diethylenetriaminepentaacetic acid human epidermal growth factor ((111)In-DTPA-hEGF) could accurately predict the experimentally determined values. MCNP was used to simulate the transport of electrons emitted by (111)In from the cell surface, cytoplasm, or nucleus. The doses to the nucleus per decay (S values) were calculated for single cells, closely packed monolayer cells, or cell clusters. The cell and nucleus dimensions of 6 breast cancer cell lines were measured, and cell line-specific S values were calculated. For self-doses, MCNP S values of nucleus to nucleus agreed very well with those of Goddu et al. (ratio of S values using analytic methods vs. MCNP = 0.962-0.995) and Faraggi et al. (ratio = 1.011-1.024). MCNP S values of cytoplasm and cell surface to nucleus compared fairly well with the reported values (ratio = 0.662-1.534 for Goddu et al.; 0.944-1.129 for Faraggi et al.). For cross doses, the S values to the nucleus were independent of (111)In subcellular distribution but increased with cluster size. S values for monolayer cells were significantly different from those of single cells and cell clusters. The MCNP-predicted SF for monolayer MDA-MB-468, MDA-MB-231, and MCF-7 cells agreed with the experimental data (relative error of 3.1%, -1.0%, and 1.7%). The single-cell and cell cluster models were less accurate in predicting the SF. For MDA-MB-468 cells, relative error was 8.1% using the single-cell model and -54% to -67% using the cell cluster model. Individual cell-line dimensions had large effects on S values and were needed to estimate doses and SF accurately. MCNP simulation compared well with the reported analytic methods in the calculation of subcellular S values for single cells and cell clusters. Application of a monolayer model was most accurate in predicting the SF of breast cancer cells exposed in vitro to (111)In-DTPA-hEGF.
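The self- and cross-dose bookkeeping in this comparison follows the MIRD cellular formalism; in generic form, the dose to the nucleus N of cell i is

```latex
D(\mathrm{N}_i) \;=\; \tilde{A}_{\mathrm{N}}\,S(\mathrm{N}\!\leftarrow\!\mathrm{N})
\;+\; \tilde{A}_{\mathrm{Cy}}\,S(\mathrm{N}\!\leftarrow\!\mathrm{Cy})
\;+\; \tilde{A}_{\mathrm{CS}}\,S(\mathrm{N}\!\leftarrow\!\mathrm{CS})
\;+\; \sum_{j\neq i}\tilde{A}_{j}\,S(\mathrm{N}_i\!\leftarrow\!\mathrm{cell}_j),
```

where Ã denotes cumulated activity in the nucleus (N), cytoplasm (Cy) or cell surface (CS) and the last term is the cross-dose from surrounding cells. The compartment list here is the standard one and is given for orientation, not quoted from the paper.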
Vision Science and Adaptive Optics, The State of the Field
Marcos, Susana; Werner, John S.; Burns, Stephen A; Merigan, William H.; Artal, Pablo; Atchison, David A.; Hampson, Karen M.; Legras, Richard; Lundstrom, Linda; Yoon, Geungyoung; Carroll, Joseph; Choi, Stacey S.; Doble, Nathan; Dubis, Adam M.; Dubra, Alfredo; Elsner, Ann; Jonnal, Ravi; Miller, Donald T.; Paques, Michel; Smithson, Hannah E.; Young, Laura K.; Zhang, Yuhua; Campbell, Melanie; Hunter, Jennifer; Metha, Andrew; Palczewska, Grazyna; Schallek, Jesse; Sincich, Lawrence C.
2017-01-01
Adaptive optics is a relatively new field, yet it is spreading rapidly and allows new questions to be asked about how the visual system is organized. The editors of this feature issue have posed a series of questions to scientists involved in using adaptive optics in vision science. The questions are focused on three main areas. In the first we investigate the use of adaptive optics for psychophysical measurements of visual system function and for improving the optics of the eye. In the second, we look at the applications and impact of adaptive optics on retinal imaging and its promise for basic and applied research. In the third, we explore how adaptive optics is being used to improve our understanding of the neurophysiology of the visual system. PMID:28212982
Atrioventricular junction (AVJ) motion tracking: a software tool with ITK/VTK/Qt.
Pengdong Xiao; Shuang Leng; Xiaodan Zhao; Hua Zou; Ru San Tan; Wong, Philip; Liang Zhong
2016-08-01
The quantitative measurement of atrioventricular junction (AVJ) motion is an important index of ventricular function over one cardiac cycle, including systole and diastole. In this paper, a software tool that can perform AVJ motion tracking from cardiovascular magnetic resonance (CMR) images is presented, built using the Insight Segmentation and Registration Toolkit (ITK), the Visualization Toolkit (VTK) and Qt. The software tool is written in C++ using the Visual Studio Community 2013 integrated development environment (IDE), which contains both an editor and a Microsoft compiler. The software package has been successfully implemented. From the software engineering practice, it is concluded that ITK, VTK, and Qt are very handy software systems for implementing automatic image analysis functions for CMR images, such as the quantitative measurement of motion by visual tracking.
Case Studies of Navigation Channel and Port Sedimentation
2011-01-14
Engineering Supplement to Limited Reevaluation Report, Vol. II (1995) 1/14/2011 18 MCNP Hypotheses Why is channel shoaling more than anticipated? (1...activities have modified shoaling patterns and magnitudes (trawling, dredging and disposal, subsidence/fluid withdrawal). MCNP Hypotheses Why is channel
LinkEHR-Ed: a multi-reference model archetype editor based on formal semantics.
Maldonado, José A; Moner, David; Boscá, Diego; Fernández-Breis, Jesualdo T; Angulo, Carlos; Robles, Montserrat
2009-08-01
To develop a powerful archetype editing framework capable of handling multiple reference models and oriented towards the semantic description and standardization of legacy data. The main prerequisite for implementing tools providing enhanced support for archetypes is the clear specification of archetype semantics. We propose a formalization of the definition section of archetypes based on types over tree-structured data. It covers the specialization of archetypes, the relationship between reference models and archetypes and conformance of data instances to archetypes. LinkEHR-Ed, a visual archetype editor based on the former formalization with advanced processing capabilities that supports multiple reference models, the editing and semantic validation of archetypes, the specification of mappings to data sources, and the automatic generation of data transformation scripts, is developed. LinkEHR-Ed is a useful tool for building, processing and validating archetypes based on any reference model.
Gamma irradiator dose mapping simulation using the MCNP code and benchmarking with dosimetry.
Sohrabpour, M; Hassanzadeh, M; Shahriari, M; Sharifzadeh, M
2002-10-01
The Monte Carlo transport code MCNP has been applied to simulating the dose rate distribution in the IR-136 gamma irradiator system. Isodose curves, cumulative dose values, and system design data such as throughputs, over-dose ratios, and efficiencies have been simulated as functions of product density. The simulated isodose curves and cumulative dose values were compared with dosimetry values obtained using polymethyl methacrylate, Fricke, ethanol-chlorobenzene, and potassium dichromate dosimeters. The produced system design data were also found to agree quite favorably with the system manufacturer's data. MCNP has thus been found to be an effective transport code for handling various dose mapping exercises for gamma irradiators.
Preliminary Experiments with a Triple-Layer Phoswich Detector for Radioxenon Detection
2008-09-01
Figure 7b; with a significant attenuation which was predicted by our MCNP modeling (Farsoni et al., 2007). The 81 keV peak in the NaI spectrum has a...analysis technique and confirmed our previous MCNP modeling. Our future work includes use of commercially available radioxenon gas (133Xe) to test
Neutronics Analyses of the Minimum Original HEU TREAT Core
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kontogeorgakos, D.; Connaway, H.; Yesilyurt, G.
2014-04-01
This work was performed to support the feasibility study on the potential conversion of the Transient Reactor Test Facility (TREAT) at Idaho National Laboratory from the use of high-enriched uranium (HEU) fuel to the use of low-enriched uranium (LEU) fuel. The analyses were performed by the GTRI Reactor Conversion staff at the Argonne National Laboratory (ANL). The objective of this study was to validate the MCNP model of the TREAT reactor with the well-documented measurements which were taken during the start-up and early operation of TREAT. Furthermore, the effect of carbon graphitization was also addressed. The graphitization level was assumed to be 100% (ANL/GTRI/TM-13/4). For this purpose, a set of experiments was chosen to validate the TREAT MCNP model, involving the approach to criticality procedure, in-core neutron flux measurements with foils, and isothermal temperature coefficient and temperature distribution measurements. The results of this study extended the knowledge base for the TREAT MCNP calculations and established the credibility of the MCNP model to be used in the core conversion feasibility analysis.
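One of the start-up measurements used for validation, the approach-to-criticality procedure, extrapolates the inverse multiplication 1/M to zero as fuel is added. The sketch below illustrates that extrapolation; the loading steps and count rates are made up for demonstration and are not TREAT data.

```python
import numpy as np

# Illustrative approach-to-critical data: fuel elements loaded vs. detector
# count rate (counts/s). These numbers are invented for demonstration.
elements = np.array([40, 80, 120, 160, 200])
count_rate = np.array([50.0, 66.0, 95.0, 170.0, 520.0])

inv_m = count_rate[0] / count_rate        # inverse multiplication, 1/M
# Linear extrapolation of the last few points to 1/M = 0.
slope, intercept = np.polyfit(elements[-3:], inv_m[-3:], 1)
critical_loading = -intercept / slope
print(f"predicted critical loading ~ {critical_loading:.0f} elements")
```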
Simulation of the GCR spectrum in the Mars curiosity rover's RAD detector using MCNP6
NASA Astrophysics Data System (ADS)
Ratliff, Hunter N.; Smith, Michael B. R.; Heilbronn, Lawrence
2017-08-01
The paper presents results from MCNP6 simulations of galactic cosmic ray (GCR) propagation down through the Martian atmosphere to the surface and comparison with RAD measurements made there. This effort is part of a collaborative modeling workshop for space radiation hosted by Southwest Research Institute (SwRI). All modeling teams were tasked with simulating the galactic cosmic ray (GCR) spectrum through the Martian atmosphere and the Radiation Assessment Detector (RAD) on-board the Curiosity rover. The detector had two separate particle acceptance angles, 4π and 30 ° off zenith. All ions with Z = 1 through Z = 28 were tracked in both scenarios while some additional secondary particles were only tracked in the 4π cases. The MCNP6 4π absorbed dose rate was 307.3 ± 1.3 μGy/day while RAD measured 233 μGy/day. Using the ICRP-60 dose equivalent conversion factors built into MCNP6, the simulated 4π dose equivalent rate was found to be 473.1 ± 2.4 μSv/day while RAD reported 710 μSv/day.
An investigation of voxel geometries for MCNP-based radiation dose calculations.
Zhang, Juying; Bednarz, Bryan; Xu, X George
2006-11-01
Voxelized geometries such as those obtained from medical images are increasingly used in Monte Carlo calculations of absorbed dose. One useful application of calculated absorbed dose is the determination of fluence-to-dose conversion factors for different organs. However, confusion still exists about how such a geometry is defined and how the energy deposition is best computed, especially with the popular MCNP5 code. This study investigated two different types of geometries in the MCNP5 code, cell and lattice definitions. A 10 cm x 10 cm x 10 cm test phantom, which contained an embedded 2 cm x 2 cm x 2 cm target at its center, was considered. A planar source emitting parallel photons was also considered in the study. The results revealed that MCNP5 does not calculate the total target volume for multi-voxel geometries. Therefore, for tallies that involve the total target volume, the user must divide the result by the total number of voxels to obtain a correct dose. Also, using planar source areas greater than the phantom size results in the same fluence-to-dose conversion factor.
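The practical consequence of the lattice finding above is a one-line post-processing correction, shown below with illustrative numbers: divide a tally that spans the multi-voxel target by the number of voxels.

```python
# Illustrative correction for a lattice tally that MCNP5 normalized as if the
# target were a single cell. The numbers are made up for demonstration.
raw_tally = 3.2e-5          # e.g. MeV/g per source particle, summed over the target
n_voxels = 8                # 2 cm cube target built from 1 cm lattice voxels (assumed)

corrected = raw_tally / n_voxels
print(f"corrected tally = {corrected:.3e}")
```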
NASA Astrophysics Data System (ADS)
Marashdeh, Mohammad W.; Al-Hamarneh, Ibrahim F.; Abdel Munem, Eid M.; Tajuddin, A. A.; Ariffin, Alawiah; Al-Omari, Saleh
Rhizophora spp. wood has the potential to serve as a solid water or tissue equivalent phantom for photon and electron beam dosimetry. In this study, the effective atomic number (Zeff) and effective electron density (Neff) of raw wood and binderless Rhizophora spp. particleboards in four different particle sizes were determined in the 10-60 keV energy region. The mass attenuation coefficients used in the calculations were obtained using the Monte Carlo N-Particle (MCNP5) simulation code. The MCNP5 calculations of the attenuation parameters for the Rhizophora spp. samples were plotted graphically against photon energy and discussed in terms of their relative differences compared with those of water and breast tissue. Moreover, the validity of the MCNP5 code was examined by comparing the calculated attenuation parameters with the theoretical values obtained by the XCOM program based on the mixture rule. The results indicated that the MCNP5 process can be followed to determine the attenuation of gamma rays with several photon energies in other materials.
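The XCOM comparison rests on the mixture rule for the mass attenuation coefficient, with the effective atomic number obtained from the ratio of atomic to electronic cross sections; the standard forms are reproduced below for reference (not quoted from the paper):

```latex
\left(\frac{\mu}{\rho}\right)_{\mathrm{mix}} \;=\; \sum_i w_i \left(\frac{\mu}{\rho}\right)_i,
\qquad
Z_{\mathrm{eff}} \;=\; \frac{\sigma_a}{\sigma_e}
\;=\; \frac{\sum_i f_i A_i (\mu/\rho)_i}{\sum_j f_j \dfrac{A_j}{Z_j} (\mu/\rho)_j},
```

where w_i are weight fractions, f_i fractional abundances by number of atoms, and A_i, Z_i the atomic mass and atomic number of element i.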
PathVisio 3: an extendable pathway analysis toolbox.
Kutmon, Martina; van Iersel, Martijn P; Bohler, Anwesha; Kelder, Thomas; Nunes, Nuno; Pico, Alexander R; Evelo, Chris T
2015-02-01
PathVisio is a commonly used pathway editor, visualization and analysis software. Biological pathways have been used by biologists for many years to describe the detailed steps in biological processes. Those powerful, visual representations help researchers to better understand, share and discuss knowledge. Since the first publication of PathVisio in 2008, the original paper was cited more than 170 times and PathVisio was used in many different biological studies. As an online editor PathVisio is also integrated in the community curated pathway database WikiPathways. Here we present the third version of PathVisio with the newest additions and improvements of the application. The core features of PathVisio are pathway drawing, advanced data visualization and pathway statistics. Additionally, PathVisio 3 introduces a new, powerful extension system that allows other developers to contribute additional functionality in the form of plugins without changing the core application. PathVisio can be downloaded from http://www.pathvisio.org and in 2014 PathVisio 3 was downloaded over 5,500 times. There are already more than 15 plugins available in the central plugin repository. PathVisio is a freely available, open-source tool published under the Apache 2.0 license (http://www.apache.org/licenses/LICENSE-2.0). It is implemented in Java and thus runs on all major operating systems. The code repository is available at http://svn.bigcat.unimaas.nl/pathvisio. The support mailing list for users is available on https://groups.google.com/forum/#!forum/wikipathways-discuss and for developers on https://groups.google.com/forum/#!forum/wikipathways-devel.
Source terms, shielding calculations and soil activation for a medical cyclotron.
Konheiser, J; Naumann, B; Ferrari, A; Brachem, C; Müller, S E
2016-12-01
Calculations of the shielding and estimates of soil activation for a medical cyclotron are presented in this work. Based on the neutron source term from the 18O(p,n)18F reaction produced by a 28 MeV proton beam, neutron and gamma dose rates outside the building were estimated with the Monte Carlo code MCNP6 (Goorley et al 2012 Nucl. Technol. 180 298-315). The neutron source term was calculated with the MCNP6 code and the FLUKA (Ferrari et al 2005 INFN/TC_05/11, SLAC-R-773) code, as well as from data supplied by the manufacturer. The MCNP and FLUKA calculations yielded comparable results, while the neutron yield obtained using the manufacturer-supplied information is about a factor of 5 smaller. The difference is attributed to missing channels in the manufacturer-supplied neutron source term, which considers only the 18O(p,n)18F reaction, whereas the MCNP and FLUKA calculations include additional neutron reaction channels. The soil activation calculation was performed using the FLUKA code. The estimated dose rate based on the MCNP6 calculations in the public area is about 0.035 µSv h^-1 and thus significantly below the reference value of 0.5 µSv h^-1 (2011 Strahlenschutzverordnung, 9. Auflage vom 01.11.2011, Bundesanzeiger Verlag). After 5 years of continuous beam operation and a subsequent decay time of 30 d, the activity concentration of the soil is about 0.34 Bq g^-1.
Shahbazi-Gahrouei, Daryoush; Ayat, Saba
2012-01-01
Radioiodine therapy is an effective method for treating thyroid carcinoma, but it has some effects on normal tissues; hence, dosimetry of vital organs is important for weighing the risks and benefits of this method. The aim of this study is to calculate the absorbed doses of important organs by Monte Carlo N-Particle (MCNP) simulation and to compare the results of different dosimetry methods using a paired t-test. To calculate the absorbed dose of the thyroid, sternum, and cervical vertebra using the MCNP code, the *F8 tally was used. Organs were simulated using a neck phantom and the Medical Internal Radiation Dosimetry (MIRD) method. Finally, the results of MCNP, MIRD, and thermoluminescent dosimeter (TLD) measurements were compared using SPSS software. The absorbed dose obtained by Monte Carlo simulation for 100, 150, and 175 mCi of administered 131I was found to be 388.0, 427.9, and 444.8 cGy for the thyroid, 208.7, 230.1, and 239.3 cGy for the sternum, and 272.1, 299.9, and 312.1 cGy for the cervical vertebra. The paired t-test results were 0.24 for the comparison of TLD dosimetry and MIRD calculation, 0.80 for MCNP simulation and MIRD, and 0.19 for TLD and MCNP. The results showed no significant differences among the three methods of Monte Carlo simulation, MIRD calculation and direct experimental dosimetry using TLD. PMID:23717806
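The statistical comparison above is a paired t-test; a minimal SciPy sketch of the same kind of test is shown below. All dose values are illustrative placeholders, not the paper's data, so the printed output is illustrative only.

```python
from scipy import stats

# Illustrative paired organ-dose values (cGy) for two dosimetry methods;
# placeholders, not measurements from the paper.
dose_method_a = [390.0, 215.0, 270.0]
dose_method_b = [388.0, 209.0, 272.0]

t_stat, p_value = stats.ttest_rel(dose_method_a, dose_method_b)
print(f"t = {t_stat:.2f}, p = {p_value:.2f}")   # p > 0.05 -> no significant difference
```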
Mashnik, Stepan Georgievich; Kerby, Leslie Marie; Gudima, Konstantin K.; ...
2017-03-23
We extend the cascade-exciton model (CEM), and the Los Alamos version of the quark-gluon string model (LAQGSM), event generators of the Monte Carlo N-particle transport code version 6 (MCNP6), to describe production of energetic light fragments (LF) heavier than 4He from various nuclear reactions induced by particles and nuclei at energies up to about 1 TeV/nucleon. In these models, energetic LF can be produced via Fermi breakup, preequilibrium emission, and coalescence of cascade particles. Initially, we study several variations of the Fermi breakup model and choose the best option for these models. Then, we extend the modified exciton model (MEM) used by these codes to account for a possibility of multiple emission of up to 66 types of particles and LF (up to 28Mg) at the preequilibrium stage of reactions. Then, we expand the coalescence model to allow coalescence of LF from nucleons emitted at the intranuclear cascade stage of reactions and from lighter clusters, up to fragments with mass numbers A ≤ 7, in the case of CEM, and A ≤ 12, in the case of LAQGSM. Next, we modify MCNP6 to allow calculating and outputting spectra of LF and heavier products with arbitrary mass and charge numbers. The improved version of CEM is implemented into MCNP6. Lastly, we test the improved versions of CEM, LAQGSM, and MCNP6 on a variety of measured nuclear reactions. The modified codes give an improved description of energetic LF from particle- and nucleus-induced reactions, showing good agreement with a variety of available experimental data. They have an improved predictive power compared to the previous versions and can be used as reliable tools in simulating applications involving such types of reactions.
Bohm, Tim D; DeLuca, Paul M; DeWerd, Larry A
2003-04-01
Permanent implantation of low energy (20-40 keV) photon emitting radioactive seeds to treat prostate cancer is an important treatment option for patients. In order to produce accurate implant brachytherapy treatment plans, the dosimetry of a single source must be well characterized. Monte Carlo based transport calculations can be used for source characterization, but must have up to date cross section libraries to produce accurate dosimetry results. This work benchmarks the MCNP code and its photon cross section library for low energy photon brachytherapy applications. In particular, we calculate the emitted photon spectrum, air kerma, depth dose in water, and radial dose function for both 125I and 103Pd based seeds and compare to other published results. Our results show that MCNP's cross section library differs from recent data primarily in the photoelectric cross section for low energies and low atomic number materials. In water, differences as large as 10% in the photoelectric cross section and 6% in the total cross section occur at 125I and 103Pd photon energies. This leads to differences in the dose rate constant of 3% and 5%, and differences as large as 18% and 20% in the radial dose function for the 125I and 103Pd based seeds, respectively. Using a partially updated photon library, calculations of the dose rate constant and radial dose function agree with other published results. Further, the use of the updated photon library allows us to verify air kerma and depth dose in water calculations performed using MCNP's perturbation feature to simulate updated cross sections. We conclude that in order to most effectively use MCNP for low energy photon brachytherapy applications, we must update its cross section library. Following this update, the MCNP code system will be a very effective tool for low energy photon brachytherapy dosimetry applications.
Chibani, Omar; Li, X Allen
2002-05-01
Three Monte Carlo photon/electron transport codes (GEPTS, EGSnrc, and MCNP) are benchmarked against dose measurements in homogeneous (both low- and high-Z) media as well as at interfaces. A brief overview of the physical models used by each code for photon and electron (positron) transport is given. Absolute calorimetric dose measurements for 0.5 and 1 MeV electron beams incident on homogeneous and multilayer media are compared with the predictions of the three codes. Comparison with dose measurements in two-layer media exposed to a 60Co gamma source is also performed. In addition, comparisons between the codes (including the EGS4 code) are done for (a) 0.05 to 10 MeV electron beams and positron point sources in lead, (b) high-energy photons (10 and 20 MeV) irradiating a multilayer phantom (water/steel/air), and (c) simulation of a 90Sr/90Y brachytherapy source. Good agreement is observed between the calorimetric electron dose measurements and the predictions of GEPTS and EGSnrc in both homogeneous and multilayer media. MCNP outputs are found to be dependent on the energy-indexing method (Default/ITS style). This dependence is significant in homogeneous media as well as at interfaces. MCNP(ITS) fits the experimental data more closely than MCNP(DEF), except for the case of Be. At low energy (0.05 and 0.1 MeV), MCNP(ITS) dose distributions in lead show higher maxima in comparison with GEPTS and EGSnrc. EGS4 produces electron-dose distributions that are too penetrating in high-Z media, especially at low energy (<0.1 MeV). For positrons, differences between GEPTS and EGSnrc are observed in lead because GEPTS distinguishes positrons from electrons in both its elastic multiple scattering and bremsstrahlung emission models. For the 60Co source, quite good agreement between calculations and measurements is observed with regard to the experimental uncertainty. For the other cases (10 and 20 MeV photon sources and the 90Sr/90Y beta source), good agreement is found between the three codes. In conclusion, differences between GEPTS and EGSnrc results are found to be very small for almost all media and energies studied. MCNP results depend significantly on the electron energy-indexing method.
Individualized Human CAD Models: Anthropometric Morphing and Body Tissue Layering
2014-07-31
Part Flow Chart of the Interaction among VBA Macros, Excel® Spreadsheet, and SolidWorks Front View of the Male and Female Soldier CAD Model...yellow highlighting. The spreadsheet is linked to the CAD model by macros created with the Visual Basic for Applications (VBA) editor in Microsoft Excel...basically three working parts to the anthropometric morphing that are all interconnected (VBA macros, Excel spreadsheet, and SolidWorks). The flow
Real-Time 3D Sonar Modeling And Visualization
1998-06-01
looking back towards Manta sonar beam, Manta plus sonar from 1000m off track. 185 NUWC sponsor Erik Chaum Principal investigator Don Brutzman...USN Sonar Officer LT Kevin Byrne USN Intelligence Officer CPT Russell Storms USA Erik Chaum works in NUWC Code 22. He supervised the design and...McGhee, Bob, "The Phoenix Autonomous Underwater Vehicle," chapter 13, AI-Based Mobile Robots, editors David Kortenkamp, Pete Bonasso and Robin Murphy
Arthroscopic Preparation of the Posterior and Posteroinferior Glenoid Labrum
2007-11-01
cient traction to easily visualize and work in this area of the joint (Figure I). Glenohumeral arthroscopy is initiated from a standard posterior...flexion, and 10 lbs of traction. the diagnostic glenohumeral arthroscopy is completed, a mid-glenoid (anteroinferior) portal is made just...lorrhaphy. Arthroscopy. 2006; 22:1138 e1-5. Section Editor: Steven F. Harwin, MD NOVEMBER 2007 | Volume 30 • Number 11 3. Difelice GS, Williams RJ III
Vision science and adaptive optics, the state of the field.
Marcos, Susana; Werner, John S; Burns, Stephen A; Merigan, William H; Artal, Pablo; Atchison, David A; Hampson, Karen M; Legras, Richard; Lundstrom, Linda; Yoon, Geungyoung; Carroll, Joseph; Choi, Stacey S; Doble, Nathan; Dubis, Adam M; Dubra, Alfredo; Elsner, Ann; Jonnal, Ravi; Miller, Donald T; Paques, Michel; Smithson, Hannah E; Young, Laura K; Zhang, Yuhua; Campbell, Melanie; Hunter, Jennifer; Metha, Andrew; Palczewska, Grazyna; Schallek, Jesse; Sincich, Lawrence C
2017-03-01
Adaptive optics is a relatively new field, yet it is spreading rapidly and allows new questions to be asked about how the visual system is organized. The editors of this feature issue have posed a series of questions to scientists involved in using adaptive optics in vision science. The questions are focused on three main areas. In the first, we investigate the use of adaptive optics for psychophysical measurements of visual system function and for improving the optics of the eye. In the second, we look at the applications and impact of adaptive optics on retinal imaging and its promise for basic and applied research. In the third, we explore how adaptive optics is being used to improve our understanding of the neurophysiology of the visual system. Copyright © 2017 Elsevier Ltd. All rights reserved.
Culbertson, C N; Wangerin, K; Ghandourah, E; Jevremovic, T
2005-08-01
The goal of this study was to evaluate the COG Monte Carlo radiation transport code, developed and tested by Lawrence Livermore National Laboratory, for neutron capture therapy related modeling. A boron neutron capture therapy model was analyzed comparing COG calculational results to results from the widely used MCNP4B (Monte Carlo N-Particle) transport code. The approach for computing neutron fluence rate and each dose component relevant in boron neutron capture therapy is described, and calculated values are shown in detail. The differences between the COG and MCNP predictions are qualified and quantified. The differences are generally small and suggest that the COG code can be applied for BNCT research related problems.
Modification and benchmarking of MCNP for low-energy tungsten spectra.
Mercier, J R; Kopp, D T; McDavid, W D; Dove, S B; Lancaster, J L; Tucker, D M
2000-12-01
The MCNP Monte Carlo radiation transport code was modified for diagnostic medical physics applications. In particular, the modified code was thoroughly benchmarked for the production of polychromatic tungsten x-ray spectra in the 30-150 kV range. Validating the modified code for coupled electron-photon transport with benchmark spectra was supplemented with independent electron-only and photon-only transport benchmarks. Major revisions to the code included the proper treatment of characteristic K x-ray production and scoring, new impact ionization cross sections, and new bremsstrahlung cross sections. Minor revisions included updated photon cross sections, electron-electron bremsstrahlung production, and K x-ray yield. The modified MCNP code is benchmarked to electron backscatter factors, x-ray spectra production, and primary and scatter photon transport.
NASA Astrophysics Data System (ADS)
Tiyapun, K.; Chimtin, M.; Munsorn, S.; Somchit, S.
2015-05-01
The objective of this work is to demonstrate the method for validating the prediction of the calculation methods for neutron flux distribution in the irradiation tubes of the TRIGA research reactor (TRR-1/M1) using the MCNP computer code model. The reaction rates used in the experiment include the 27Al(n, α)24Na and 197Au(n, γ)198Au reactions. Aluminium (99.9 wt%) and gold (0.1 wt%) foils and gold foils covered with cadmium were irradiated in 9 locations in the core, referred to as CT, C8, C12, F3, F12, F22, F29, G5, and G33. The experimental results were compared to calculations performed using MCNP with a detailed geometrical model of the reactor core. The experimental and calculated normalized reaction rates in the reactor core are in good agreement for both reactions, showing that the material and geometrical properties of the reactor core are modelled very well. The results indicated that the difference between the experimental measurements and the calculation of the reactor core using the MCNP geometrical model was below 10%. In conclusion, the MCNP computational model used to calculate the neutron flux and reaction rate distribution in the reactor core can be used for other reactor core parameters, including neutron spectrum calculations, dose rate calculations, power peaking factor calculations and optimization of research reactor utilization in the future, with confidence in the accuracy and reliability of the calculations.
NASA Astrophysics Data System (ADS)
Lin, Yi-Chun; Liu, Yuan-Hao; Nievaart, Sander; Chen, Yen-Fu; Wu, Shu-Wei; Chou, Wen-Tsae; Jiang, Shiang-Huei
2011-10-01
High energy photon (over 10 MeV) and neutron beams adopted in radiobiology and radiotherapy always produce mixed neutron/gamma-ray fields. Mg(Ar) ionization chambers are commonly applied to determine the gamma-ray dose because of their neutron-insensitive characteristics. Nowadays, many perturbation corrections for accurate dose estimation and many treatment planning systems are based on the Monte Carlo technique. The Monte Carlo codes EGSnrc, FLUKA, GEANT4, MCNP5, and MCNPX were used to evaluate energy-dependent response functions of the Exradin M2 Mg(Ar) ionization chamber to a parallel photon beam with mono-energies from 20 keV to 20 MeV. For the sake of validation, measurements were carefully performed in well-defined (a) primary M-100 X-ray calibration field, (b) primary 60Co calibration beam, (c) 6-MV, and (d) 10-MV therapeutic beams in hospital. In the energy region below 100 keV, MCNP5 and MCNPX both had lower responses than the other codes. For energies above 1 MeV, the MCNP ITS-mode closely resembled the other three codes and the differences were within 5%. Compared to the measured currents, MCNP5 and MCNPX using ITS-mode had perfect agreement with the 60Co and 10-MV beams, but in the X-ray energy region the deviations reached 17%. This work provides better insight into the performance of different Monte Carlo codes in photon-electron transport calculations. For mixed-field dosimetry applications such as BNCT, MCNP with ITS-mode is recognized by this work as the most suitable tool.
CAD-Based Shielding Analysis for ITER Port Diagnostics
NASA Astrophysics Data System (ADS)
Serikov, Arkady; Fischer, Ulrich; Anthoine, David; Bertalot, Luciano; De Bock, Maartin; O'Connor, Richard; Juarez, Rafael; Krasilnikov, Vitaly
2017-09-01
Radiation shielding analysis conducted in support of design development of the contemporary diagnostic systems integrated inside the ITER ports relies on the use of CAD models. This paper presents the CAD-based MCNP Monte Carlo radiation transport and activation analyses for the Diagnostic Upper and Equatorial Port Plugs (UPP #3 and EPP #8, #17). The creation of the complicated 3D MCNP models of the diagnostic systems was substantially accelerated by application of the CAD-to-MCNP converter programs MCAM and McCad. High performance computing resources of the Helios supercomputer allowed the MCNP parallel transport calculations to be sped up with the MPI/OpenMP interface. The shielding solutions found could be universal, reducing port R&D costs. The shield block behind the Tritium and Deposit Monitor (TDM) optical box was added to study its influence on the Shut-Down Dose Rate (SDDR) in the Port Interspace (PI) of EPP#17. The influence of neutron streaming along the Lost Alpha Monitor (LAM) on the neutron energy spectra calculated in the Tangential Neutron Spectrometer (TNS) of EPP#8 was assessed. For the UPP#3 with Charge eXchange Recombination Spectroscopy (CXRS-core), excessive neutron streaming along the CXRS shutter was identified, which should be prevented in a further design iteration.
Criticality Calculations with MCNP6 - Practical Lectures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise
2016-11-29
These slides are used to teach MCNP (Monte Carlo N-Particle) usage to nuclear criticality safety analysts. The following are the lecture topics: course information, introduction, MCNP basics, criticality calculations, advanced geometry, tallies, adjoint-weighted tallies and sensitivities, physics and nuclear data, parameter studies, NCS validation I, NCS validation II, NCS validation III, case study 1 - solution tanks, case study 2 - fuel vault, case study 3 - B&W core, case study 4 - simple TRIGA, case study 5 - fissile mat. vault, criticality accident alarm systems. After completion of this course, you should be able to: Develop an input model for MCNP; Describe how cross section data impact Monte Carlo and deterministic codes; Describe the importance of validation of computer codes and how it is accomplished; Describe the methodology supporting Monte Carlo codes and deterministic codes; Describe pitfalls of Monte Carlo calculations; Discuss the strengths and weaknesses of Monte Carlo and Discrete Ordinates codes; The diffusion theory model is not strictly valid for treating fissile systems in which neutron absorption, voids, and/or material boundaries are present. In the context of these limitations, identify a fissile system for which a diffusion theory solution would be adequate.
Using Machine Learning to Predict MCNP Bias
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grechanuk, Pavel Aleksandrovi
For many real-world applications in radiation transport where simulations are compared to experimental measurements, like in nuclear criticality safety, the bias (simulated - experimental keff) in the calculation is an extremely important quantity used for code validation. The objective of this project is to accurately predict the bias of MCNP6 [1] criticality calculations using machine learning (ML) algorithms, with the intention of creating a tool that can complement the current nuclear criticality safety methods. In the latest release of MCNP6, the Whisper tool is available for criticality safety analysts and includes a large catalogue of experimental benchmarks, sensitivity profiles, and nuclear data covariance matrices. This data, coming from 1100+ benchmark cases, is used in this study of ML algorithms for criticality safety bias predictions.
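A minimal sketch of the kind of regression workflow this describes, assuming hypothetical feature and label arrays (energy-integrated sensitivity coefficients as features, simulated-minus-experimental keff as the label); it is not the Whisper data format or the study's actual pipeline:

    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import cross_val_score

    # Placeholder data: ~1100 benchmark cases, 50 sensitivity-derived features.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1100, 50))              # hypothetical sensitivity features
    y = rng.normal(scale=0.005, size=1100)       # hypothetical bias in keff

    model = GradientBoostingRegressor()
    scores = cross_val_score(model, X, y, cv=5, scoring="neg_mean_absolute_error")
    print("cross-validated mean absolute error on the bias:", -scores.mean())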
On the effect of updated MCNP photon cross section data on the simulated response of the HPA TLD.
Eakins, Jonathan
2009-02-01
The relative response of the new Health Protection Agency thermoluminescence dosimeter (TLD) has been calculated for Narrow Series X-ray distribution and (137)Cs photon sources using the Monte Carlo code MCNP5, and the results compared with those obtained during its design stage using the predecessor code, MCNP4c2. The results agreed at intermediate energies (approximately 0.1 MeV to (137)Cs), but differed at low energies (<0.1 MeV) by up to approximately 10%. This disparity has been ascribed to differences in the default photon interaction data used by the two codes, and derives ultimately from the effect on absorbed dose of the recent updates to the photoelectric cross sections. The sources of these data have been reviewed.
Benchmark study for charge deposition by high energy electrons in thick slabs
NASA Technical Reports Server (NTRS)
Jun, I.
2002-01-01
The charge deposition profiles created when high-energy (1, 10, and 100 MeV) electrons impinge on a thick slab of elemental aluminum, copper, and tungsten are presented in this paper. The charge deposition profiles were computed using existing representative Monte Carlo codes: TIGER3.0 (1D module of ITS3.0) and MCNP version 4B. The results showed that TIGER3.0 and MCNP4B agree very well (within 20% of each other) in the majority of the problem geometry. The TIGER results were considered to be accurate based on previous studies. Thus, it was demonstrated that MCNP, with its powerful geometry capability and flexible source and tally options, could be used in calculations of electron charging in high energy electron-rich space radiation environments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, William R.; Lee, John C.; Baxter, Alan
Information and measured data from the initial Fort St. Vrain (FSV) high temperature gas reactor core are used to develop a benchmark configuration to validate computational methods for analysis of a full-core, commercial HTR configuration. Large uncertainties in the geometry and composition data for the FSV fuel and core are identified, including: (1) the relative numbers of fuel particles for the four particle types, (2) the distribution of fuel kernel diameters for the four particle types, (3) the Th:U ratio in the initial FSV core, and (4) the buffer thickness for the fissile and fertile particles. Sensitivity studies were performed to assess each of these uncertainties. A number of methods were developed to assist in these studies, including: (1) the automation of MCNP5 input files for FSV using Python scripts, (2) a simple method to verify isotopic loadings in MCNP5 input files, (3) an automated procedure to conduct a coupled MCNP5-RELAP5 analysis for a full-core FSV configuration with thermal-hydraulic feedback, and (4) a methodology for sampling kernel diameters from arbitrary power law and Gaussian PDFs that preserved fuel loading and packing factor constraints. A reference FSV fuel configuration was developed based on having a single diameter kernel for each of the four particle types, preserving known uranium and thorium loadings and packing factor (58%). Three fuel models were developed, based on representing the fuel as a mixture of kernels with two diameters, four diameters, or a continuous range of diameters. The fuel particles were put into a fuel compact using either a lattice-based approach or a stochastic packing methodology from RPI, and simulated with MCNP5. The results of the sensitivity studies indicated that the uncertainties in the relative numbers and sizes of fissile and fertile kernels were not important, nor were the distributions of kernel diameters within their diameter ranges. The uncertainty in the Th:U ratio in the initial FSV core was found to be important in a crude study. The uncertainty in the TRISO buffer thickness was estimated to be unimportant but the study was not conclusive. FSV fuel compacts and a regular FSV fuel element were analyzed with MCNP5 and compared with predictions using a modified version of HELIOS that is capable of analyzing TRISO fuel configurations. The HELIOS analyses were performed by SSP. The eigenvalue discrepancies between HELIOS and MCNP5 are currently on the order of 1% but these are still being evaluated. Full-core FSV configurations were developed for two initial critical configurations - a cold, clean critical loading and a critical configuration at 70% power. MCNP5 predictions are compared to experimental data and the results are mixed. Analyses were also done for the pulsed neutron experiments that were conducted by GA for the initial FSV core. MCNP5 was used to model these experiments and reasonable agreement with measured results has been observed.
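As an illustration of item (4) above, a hedged Python sketch of sampling kernel diameters from a truncated Gaussian while preserving a target total kernel volume (and hence fuel loading and packing fraction); the diameter range, mean, width, and target volume are assumed placeholders, not FSV specifications:

    import numpy as np

    def sample_kernels(target_volume_mm3, mean_um=350.0, sigma_um=30.0,
                       d_min_um=250.0, d_max_um=450.0, seed=1):
        """Draw kernel diameters (micrometres) until the accumulated kernel
        volume reaches the target, rejecting draws outside the allowed range."""
        rng = np.random.default_rng(seed)
        diameters, total_volume = [], 0.0
        while total_volume < target_volume_mm3:
            d = rng.normal(mean_um, sigma_um)
            if d_min_um <= d <= d_max_um:                         # truncate the Gaussian
                diameters.append(d)
                total_volume += (np.pi / 6.0) * (d * 1e-3) ** 3   # um -> mm
        return np.array(diameters), total_volume

    diams, vol = sample_kernels(target_volume_mm3=5.0)
    print(len(diams), "kernels, total kernel volume =", round(vol, 3), "mm^3")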
Considerations of MCNP Monte Carlo code to be used as a radiotherapy treatment planning tool.
Juste, B; Miro, R; Gallardo, S; Verdu, G; Santos, A
2005-01-01
The present work has simulated the photon and electron transport in a Theratron 780® (MDS Nordion) 60Co radiotherapy unit, using the Monte Carlo transport code MCNP (Monte Carlo N-Particle). This project mainly explains the different methodologies carried out to speed up calculations in order to apply this code efficiently in radiotherapy treatment planning.
Daures, J; Gouriou, J; Bordy, J M
2011-03-01
This work has been performed within the frame of the European Union ORAMED project (Optimisation of RAdiation protection for MEDical staff). The main goal of the project is to improve standards of protection for medical staff for procedures resulting in potentially high exposures and to develop methodologies for better assessing and reducing exposures to medical staff. The Work Package WP2 is involved in the development of practical eye-lens dosimetry in interventional radiology. This study is complementary to the part of the ENEA report concerning the MCNP-4C calculations of the conversion factors related to the operational quantity H(p)(3). In this study, a set of energy- and angular-dependent conversion coefficients (H(p)(3)/K(a)), in the newly proposed square cylindrical phantom made of ICRU tissue, has been calculated with the Monte Carlo codes PENELOPE and MCNP5. The H(p)(3) values have been determined in terms of absorbed dose, according to the definition of this quantity, and also with the kerma approximation as formerly reported in ICRU reports. At low photon energies (up to 1 MeV), the results obtained with the two methods are consistent. Nevertheless, large differences appear at higher energies. This is mainly due to the lack of electronic equilibrium, especially for small-angle incidence. The values of the conversion coefficients obtained with the MCNP-4C code and published by ENEA agree well with the kerma-approximation calculations obtained with PENELOPE. We also performed the same calculations with the code MCNP5 with two types of tallies: F6 for the kerma approximation and *F8 for estimating the absorbed dose, which is deposited by secondary electrons. PENELOPE and MCNP5 results agree for the kerma approximation and for the absorbed dose calculation of H(p)(3) and prove that, for photon energies larger than 1 MeV, the transport of the secondary electrons has to be taken into account.
XML-Based Visual Specification of Multidisciplinary Applications
NASA Technical Reports Server (NTRS)
Al-Theneyan, Ahmed; Jakatdar, Amol; Mehrotra, Piyush; Zubair, Mohammad
2001-01-01
The advancements in the Internet and Web technologies have fueled a growing interest in developing a web-based distributed computing environment. We have designed and developed Arcade, a web-based environment for designing, executing, monitoring, and controlling distributed heterogeneous applications, which is easy to use and access, portable, and provides support through all phases of the application development and execution. A major focus of the environment is the specification of heterogeneous, multidisciplinary applications. In this paper we focus on the visual and script-based specification interface of Arcade. The web/browser-based visual interface is designed to be intuitive to use and can also be used for visual monitoring during execution. The script specification is based on XML to: (1) make it portable across different frameworks, and (2) make the development of our tools easier by using the existing freely available XML parsers and editors. There is a one-to-one correspondence between the visual and script-based interfaces allowing users to go back and forth between the two. To support this we have developed translators that translate a script-based specification to a visual-based specification, and vice-versa. These translators are integrated with our tools and are transparent to users.
Radiation shielding quality assurance
NASA Astrophysics Data System (ADS)
Um, Dallsun
For radiation shielding quality assurance, the validity and reliability of the neutron transport code MCNP, which is now one of the most widely used radiation shielding analysis codes, were checked against a large set of benchmark experiments. As a practical example, the following work was performed in this thesis. An integral neutron transport experiment to measure the effect of neutron streaming in iron and void was performed with the Dog-Legged Void Assembly at Knolls Atomic Power Laboratory in 1991. Neutron flux was measured at six different places with methane detectors and a BF-3 detector. The main purpose of the measurements was to provide a benchmark against which various neutron transport calculation tools could be compared. Those data were used to verify the Monte Carlo neutron and photon transport code MCNP with a model of the assembly. Experimental and calculated results were compared in two ways: as the total integrated value of the neutron flux over the energy range from 10 keV to 2 MeV and as the neutron spectrum over that energy range. Both results agree within the statistical error of +/-20%. MCNP results were also compared with those of TORT, a three-dimensional discrete ordinates code developed by Oak Ridge National Laboratory. MCNP results are superior to the TORT results at all detector places except one. This shows that MCNP is a very powerful tool for the analysis of neutron transport through iron and air and, further, that it could be used as a powerful tool for radiation shielding analysis. As one application of the analysis of variance (ANOVA) to neutron and gamma transport problems, uncertainties in the calculated values of critical K were evaluated using ANOVA on the statistical data.
Benchmark of PENELOPE code for low-energy photon transport: dose comparisons with MCNP4 and EGS4.
Ye, Sung-Joon; Brezovich, Ivan A; Pareek, Prem; Naqvi, Shahid A
2004-02-07
The expanding clinical use of low-energy photon emitting 125I and 103Pd seeds in recent years has led to renewed interest in their dosimetric properties. Numerous papers pointed out that higher accuracy could be obtained in Monte Carlo simulations by utilizing newer libraries for the low-energy photon cross-sections, such as XCOM and EPDL97. The recently developed PENELOPE 2001 Monte Carlo code is user friendly and incorporates photon cross-section data from the EPDL97. The code has been verified for clinical dosimetry of high-energy electron and photon beams, but has not yet been tested at low energies. In the present work, we have benchmarked the PENELOPE code for 10-150 keV photons. We computed radial dose distributions from 0 to 10 cm in water at photon energies of 10-150 keV using both PENELOPE and MCNP4C with either DLC-146 or DLC-200 cross-section libraries, assuming a point source located at the centre of a 30 cm diameter and 20 cm length cylinder. Throughout the energy range of simulated photons (except for 10 keV), PENELOPE agreed within statistical uncertainties (at worst +/- 5%) with MCNP/DLC-146 in the entire region of 1-10 cm and with published EGS4 data up to 5 cm. The dose at 1 cm (or dose rate constant) of PENELOPE agreed with MCNP/DLC-146 and EGS4 data within approximately +/- 2% in the range of 20-150 keV, while MCNP/DLC-200 produced values up to 9% lower in the range of 20-100 keV than PENELOPE or the other codes. However, the differences among the four datasets became negligible above 100 keV.
Monte Carlo N-particle simulation of neutron-based sterilisation of anthrax contamination
Liu, B; Xu, J; Liu, T; Ouyang, X
2012-01-01
Objective To simulate the neutron-based sterilisation of anthrax contamination by Monte Carlo N-particle (MCNP) 4C code. Methods Neutrons are elementary particles that have no charge. They are 20 times more effective than electrons or γ-rays in killing anthrax spores on surfaces and inside closed containers. Neutrons emitted from a 252Cf neutron source are in the 100 keV to 2 MeV energy range. A 2.5 MeV D–D neutron generator can create neutrons at up to 10^13 n s^-1 with current technology. All these enable an effective and low-cost method of killing anthrax spores. Results There is no effect on neutron energy deposition on the anthrax sample when using a reflector that is thicker than its saturation thickness. Among all three reflecting materials tested in the MCNP simulation, paraffin is the best because it has the thinnest saturation thickness and is easy to machine. The MCNP radiation dose and fluence simulation calculation also showed that the MCNP-simulated neutron fluence that is needed to kill the anthrax spores agrees with previous analytical estimations very well. Conclusion The MCNP simulation indicates that a 10 min neutron irradiation from a 0.5 g 252Cf neutron source or a 1 min neutron irradiation from a 2.5 MeV D–D neutron generator may kill all anthrax spores in a sample. This is a promising result because a 2.5 MeV D–D neutron generator output >10^13 n s^-1 should be attainable in the near future. This indicates that we could use a D–D neutron generator to sterilise anthrax contamination within several seconds. PMID:22573293
Development and Validation of a Monte Carlo Simulation Tool for Multi-Pinhole SPECT
Mok, Greta S. P.; Du, Yong; Wang, Yuchuan; Frey, Eric C.; Tsui, Benjamin M. W.
2011-01-01
Purpose In this work, we developed and validated a Monte Carlo simulation (MCS) tool for investigation and evaluation of multi-pinhole (MPH) SPECT imaging. Procedures This tool was based on a combination of the SimSET and MCNP codes. Photon attenuation and scatter in the object, as well as penetration and scatter through the collimator detector, are modeled in this tool. It allows accurate and efficient simulation of MPH SPECT with focused pinhole apertures and user-specified photon energy, aperture material, and imaging geometry. The MCS method was validated by comparing the point response function (PRF), detection efficiency (DE), and image profiles obtained from point sources and phantom experiments. A prototype single-pinhole collimator and focused four- and five-pinhole collimators fitted on a small animal imager were used for the experimental validations. We have also compared computational speed among various simulation tools for MPH SPECT, including SimSET-MCNP, MCNP, SimSET-GATE, and GATE for simulating projections of a hot sphere phantom. Results We found good agreement between the MCS and experimental results for PRF, DE, and image profiles, indicating the validity of the simulation method. The relative computational speeds for SimSET-MCNP, MCNP, SimSET-GATE, and GATE are 1: 2.73: 3.54: 7.34, respectively, for 120-view simulations. We also demonstrated the application of this MCS tool in small animal imaging by generating a set of low-noise MPH projection data of a 3D digital mouse whole body phantom. Conclusions The new method is useful for studying MPH collimator designs, data acquisition protocols, image reconstructions, and compensation techniques. It also has great potential to be applied for modeling the collimator-detector response with penetration and scatter effects for MPH in the quantitative reconstruction method. PMID:19779896
Monte Carlo N-particle simulation of neutron-based sterilisation of anthrax contamination.
Liu, B; Xu, J; Liu, T; Ouyang, X
2012-10-01
To simulate the neutron-based sterilisation of anthrax contamination by Monte Carlo N-particle (MCNP) 4C code. Neutrons are elementary particles that have no charge. They are 20 times more effective than electrons or γ-rays in killing anthrax spores on surfaces and inside closed containers. Neutrons emitted from a (252)Cf neutron source are in the 100 keV to 2 MeV energy range. A 2.5 MeV D-D neutron generator can create neutrons at up to 10(13) n s(-1) with current technology. All these enable an effective and low-cost method of killing anthrax spores. There is no effect on neutron energy deposition on the anthrax sample when using a reflector that is thicker than its saturation thickness. Among all three reflecting materials tested in the MCNP simulation, paraffin is the best because it has the thinnest saturation thickness and is easy to machine. The MCNP radiation dose and fluence simulation calculation also showed that the MCNP-simulated neutron fluence that is needed to kill the anthrax spores agrees with previous analytical estimations very well. The MCNP simulation indicates that a 10 min neutron irradiation from a 0.5 g (252)Cf neutron source or a 1 min neutron irradiation from a 2.5 MeV D-D neutron generator may kill all anthrax spores in a sample. This is a promising result because a 2.5 MeV D-D neutron generator output >10(13) n s(-1) should be attainable in the near future. This indicates that we could use a D-D neutron generator to sterilise anthrax contamination within several seconds.
3D visualization of molecular structures in the MOGADOC database
NASA Astrophysics Data System (ADS)
Vogt, Natalja; Popov, Evgeny; Rudert, Rainer; Kramer, Rüdiger; Vogt, Jürgen
2010-08-01
The MOGADOC database (Molecular Gas-Phase Documentation) is a powerful tool to retrieve information about compounds which have been studied in the gas-phase by electron diffraction, microwave spectroscopy and molecular radio astronomy. Presently the database contains over 34,500 bibliographic references (from the beginning of each method) for about 10,000 inorganic, organic and organometallic compounds and structural data (bond lengths, bond angles, dihedral angles, etc.) for about 7800 compounds. Most of the implemented molecular structures are given in a three-dimensional (3D) presentation. To create or edit and visualize the 3D images of molecules, new tools (special editor and Java-based 3D applet) were developed. Molecular structures in internal coordinates were converted to those in Cartesian coordinates.
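A hedged Python sketch of the internal-to-Cartesian conversion mentioned above (a standard Z-matrix construction); the hydrogen-peroxide-like example values are rough illustrations, not MOGADOC data, and the dihedral sign convention may differ from that used by the database tools:

    import numpy as np

    def unit(v):
        return v / np.linalg.norm(v)

    def zmatrix_to_cartesian(zmat):
        """zmat rows: (bond_ref, r, angle_ref, theta_deg, dihedral_ref, phi_deg),
        with 0-based references to previously placed atoms and None where absent."""
        coords = []
        for i, (b, r, a, theta, d, phi) in enumerate(zmat):
            if i == 0:
                coords.append(np.zeros(3))                           # first atom at origin
            elif i == 1:
                coords.append(coords[b] + np.array([r, 0.0, 0.0]))   # second atom along x
            elif i == 2:
                u = unit(coords[a] - coords[b])
                perp = np.cross(u, [0.0, 0.0, 1.0])
                if np.linalg.norm(perp) < 1e-8:
                    perp = np.cross(u, [0.0, 1.0, 0.0])
                t = np.radians(theta)
                coords.append(coords[b] + r * (np.cos(t) * u + np.sin(t) * unit(perp)))
            else:
                t, p = np.radians(theta), np.radians(phi)
                B, A, D = coords[b], coords[a], coords[d]
                bc = unit(B - A)                                     # bond-frame axis
                n = unit(np.cross(A - D, bc))                        # normal to D-A-B plane
                m = np.cross(n, bc)
                direction = -np.cos(t) * bc + np.sin(t) * (np.cos(p) * m + np.sin(p) * n)
                coords.append(B + r * direction)
        return np.array(coords)

    # Rough hydrogen-peroxide-like geometry: O, O, H, H (bond lengths in angstroms).
    zmat = [
        (None, None, None, None, None, None),
        (0, 1.45, None, None, None, None),
        (0, 0.97, 1, 100.0, None, None),
        (1, 0.97, 0, 100.0, 2, 115.0),
    ]
    print(zmatrix_to_cartesian(zmat).round(3))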
Using the MCNP Taylor series perturbation feature (efficiently) for shielding problems
NASA Astrophysics Data System (ADS)
Favorite, Jeffrey
2017-09-01
The Taylor series or differential operator perturbation method, implemented in MCNP and invoked using the PERT card, can be used for efficient parameter studies in shielding problems. This paper shows how only two PERT cards are needed to generate an entire parameter study, including statistical uncertainty estimates (an additional three PERT cards can be used to give exact statistical uncertainties). One realistic example problem involves a detailed helium-3 neutron detector model and its efficiency as a function of the density of its high-density polyethylene moderator. The MCNP differential operator perturbation capability is extremely accurate for this problem. A second problem involves the density of the polyethylene reflector of the BeRP ball and is an example of first-order sensitivity analysis using the PERT capability. A third problem is an analytic verification of the PERT capability.
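A small post-processing sketch of the idea, under the assumption that one PERT card returns the first-order tally change and a second returns the second-order change for a reference density perturbation; the numerical values are placeholders, not results from the paper, and the exact PERT card syntax should be taken from the MCNP manual:

    def taylor_response(r0, dr1_ref, dr2_ref, delta, delta_ref):
        """r0: unperturbed tally; dr1_ref, dr2_ref: first- and second-order tally
        changes for the reference perturbation delta_ref; delta: the perturbation
        of interest. The first-order term scales linearly and the second-order
        term quadratically, so two PERT results cover the whole parameter sweep."""
        s = delta / delta_ref
        return r0 + dr1_ref * s + dr2_ref * s**2

    r0, dr1, dr2, dref = 1.00e-4, -3.0e-6, 2.0e-7, 0.05   # placeholders (dref in g/cm^3)
    for d in (0.01, 0.02, 0.05, 0.10):
        print(f"density change {d:+.2f} g/cm^3 -> tally ~ {taylor_response(r0, dr1, dr2, d, dref):.4e}")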
NASA Technical Reports Server (NTRS)
1995-01-01
The Interactive Data Language (IDL), developed by Research Systems, Inc., is a tool for scientists to investigate their data without having to write a custom program for each study. IDL is based on the Mariners Mars spectral Editor (MMED) developed for studies from NASA's Mars spacecraft flights. The company has also developed Environment for Visualizing Images (ENVI), an image processing system for easily analyzing remotely sensed data written in IDL. The Visible Human CD, another Research Systems product, is the first complete digital reference of photographic images for exploring human anatomy.
Prompt Radiation Protection Factors
2018-02-01
dimensional Monte-Carlo radiation transport code MCNP (Monte Carlo N-Particle) and the evaluation of the protection factors (ratio of dose in the open to...radiation was performed using the three dimensional Monte-Carlo radiation transport code MCNP (Monte Carlo N-Particle) and the evaluation of the protection...by detonation of a nuclear device have placed renewed emphasis on evaluation of the consequences in case of such an event. The Defense Threat
An investigation of MCNP6.1 beryllium oxide S(α, β) cross sections
Sartor, Raymond F.; Glazener, Natasha N.
2016-03-08
In MCNP6.1, materials are constructed by identifying the constituent isotopes (or elements in a few cases) individually. This list selects the corresponding microscopic cross sections, which are treated with the free-gas model and combined to create the material macroscopic cross sections. Furthermore, the free-gas model, and hence the corresponding material macroscopic cross sections, assumes that interactions between atoms do not affect the nuclear cross sections.
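A simple numerical illustration of the macroscopic cross-section step described above, Sigma = sum_i N_i * sigma_i; the one-group microscopic cross sections are placeholders, not evaluated beryllium or oxygen data:

    AVOGADRO = 6.02214076e23                     # atoms/mol

    rho_beo, molar_mass_beo = 3.01, 25.01        # g/cm^3 and g/mol for BeO (approximate)
    n_molecule = rho_beo * AVOGADRO / molar_mass_beo * 1e-24   # molecules per barn-cm

    # Hypothetical one-group microscopic cross sections in barns (one atom of each per molecule):
    sigma_barns = {"Be-9": 6.1, "O-16": 3.8}

    sigma_macro = sum(n_molecule * s for s in sigma_barns.values())   # 1/cm
    print(f"macroscopic cross section ~ {sigma_macro:.3f} cm^-1")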
Determination of neutron flux distribution in an Am-Be irradiator using the MCNP.
Shtejer-Diaz, K; Zamboni, C B; Zahn, G S; Zevallos-Chávez, J Y
2003-10-01
A neutron irradiator has been assembled at IPEN facilities to perform qualitative-quantitative analysis of many materials using thermal and fast neutrons outside the nuclear reactor premises. To establish the prototype specifications, the neutron flux distribution and the absorbed dose rates were calculated using the MCNP computer code. These theoretical predictions then allow one to discuss the optimum irradiator design and its performance.
SABRINA: an interactive three-dimensional geometry-modeling program for MCNP
DOE Office of Scientific and Technical Information (OSTI.GOV)
West, J.T. III
SABRINA is a fully interactive three-dimensional geometry-modeling program for MCNP, a Los Alamos Monte Carlo code for neutron and photon transport. In SABRINA, a user constructs either body geometry or surface geometry models and debugs spatial descriptions for the resulting objects. This enhanced capability significantly reduces effort in constructing and debugging complicated three-dimensional geometry models for Monte Carlo analysis. 2 refs., 33 figs.
Simulation of the GCR spectrum in the Mars Curiosity rover's RAD detector using MCNP6.
Ratliff, Hunter N; Smith, Michael B R; Heilbronn, Lawrence
2017-08-01
The paper presents results from MCNP6 simulations of galactic cosmic ray (GCR) propagation down through the Martian atmosphere to the surface and comparison with RAD measurements made there. This effort is part of a collaborative modeling workshop for space radiation hosted by Southwest Research Institute (SwRI). All modeling teams were tasked with simulating the galactic cosmic ray (GCR) spectrum through the Martian atmosphere and the Radiation Assessment Detector (RAD) on-board the Curiosity rover. The detector had two separate particle acceptance angles, 4π and 30 ° off zenith. All ions with Z = 1 through Z = 28 were tracked in both scenarios while some additional secondary particles were only tracked in the 4π cases. The MCNP6 4π absorbed dose rate was 307.3 ± 1.3 µGy/day while RAD measured 233 µGy/day. Using the ICRP-60 dose equivalent conversion factors built into MCNP6, the simulated 4π dose equivalent rate was found to be 473.1 ± 2.4 µSv/day while RAD reported 710 µSv/day. Copyright © 2017 The Committee on Space Research (COSPAR). Published by Elsevier Ltd. All rights reserved.
Characterization of Filters Loaded With Reactor Strontium Carbonate - 13203
DOE Office of Scientific and Technical Information (OSTI.GOV)
Josephson, Walter S.; Steen, Franciska H.
A collection of three highly radioactive filters containing reactor strontium carbonate was being prepared for disposal. All three filters were approximately characterized at the time of manufacture by gravimetric methods. The first filter had been partially emptied, and the quantity of residual activity was uncertain. Dose rate to activity modeling using the Monte Carlo N-Particle (MCNP) code was selected to confirm the gravimetric characterization of the full filters, and to fully characterize the partially emptied filter. Although dose rate to activity modeling using MCNP is a common technique, it is not often used for Bremsstrahlung-dominant materials such as reactor strontium. As a result, different MCNP modeling options were compared to determine the optimum approach. This comparison indicated that the accuracy of the results was heavily dependent on the MCNP modeling details and the location of the dose rate measurement point. The optimum model utilized a photon spectrum generated by the Oak Ridge Isotope Generation and Depletion (ORIGEN) code and dose rates measured at 30 cm. Results from the optimum model agreed with the gravimetric estimates within 15%. It was demonstrated that dose rate to activity modeling can be successful for Bremsstrahlung-dominant radioactive materials. However, the degree of success is heavily dependent on the choice of modeling techniques. (authors)
Evaluation of the Pool Critical Assembly Benchmark with Explicitly-Modeled Geometry using MCNP6
Kulesza, Joel A.; Martz, Roger Lee
2017-03-01
Despite being one of the most widely used benchmarks for qualifying light water reactor (LWR) radiation transport methods and data, no benchmark calculation of the Oak Ridge National Laboratory (ORNL) Pool Critical Assembly (PCA) pressure vessel wall benchmark facility (PVWBF) using MCNP6 with explicitly modeled core geometry exists. As such, this paper provides results for such an analysis. First, a criticality calculation is used to construct the fixed source term. Next, ADVANTG-generated variance reduction parameters are used within the final MCNP6 fixed source calculations. These calculations provide unadjusted dosimetry results using three sets of dosimetry reaction cross sections of varying ages (those packaged with MCNP6, from the IRDF-2002 multi-group library, and from the ACE-formatted IRDFF v1.05 library). These results are then compared to two different sets of measured reaction rates. The comparison agrees in an overall sense within 2% and on a specific reaction- and dosimetry-location basis within 5%. Except for the neptunium dosimetry, the individual foil raw calculation-to-experiment comparisons usually agree within 10% but are typically greater than unity. Finally, in the course of developing these calculations, geometry that has previously not been completely specified is provided herein for the convenience of future analysts.
Validation of MCNP: SPERT-D and BORAX-V fuel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crawford, C.; Palmer, B.
1992-11-01
This report discusses critical experiments involving SPERT-D{sup 1,2} fuel elements and BORAX-V{sup 3-8} fuel which have been modeled, with calculations performed using MCNP. MCNP is a Monte Carlo based transport code. For this study, continuous-energy nuclear data from the ENDF/B-V cross section library were used. The SPERT-D experiments consisted of various arrays of fuel elements moderated and reflected with either water or a uranyl nitrate solution. Some SPERT-D experiments used cadmium as a fixed neutron poison, while others were poisoned with various concentrations of boron in the moderating/reflecting solution. The BORAX-V experiments were arrays of either boiling fuel rod assemblies or superheater assemblies; both types of arrays were moderated and reflected with water. In one boiling fuel experiment, two fuel rods were replaced with borated stainless steel poison rods.
Absorbed fractions in a voxel-based phantom calculated with the MCNP-4B code.
Yoriyaz, H; dos Santos, A; Stabin, M G; Cabezas, R
2000-07-01
A new approach for calculating internal dose estimates was developed through the use of a more realistic computational model of the human body. The present technique shows the capability to build a patient-specific phantom with tomography data (a voxel-based phantom) for the simulation of radiation transport and energy deposition using Monte Carlo methods such as in the MCNP-4B code. MCNP-4B absorbed fractions for photons in the mathematical phantom of Snyder et al. agreed well with reference values. Results obtained through radiation transport simulation in the voxel-based phantom, in general, agreed well with reference values. Considerable discrepancies, however, were found in some cases due to two major causes: differences in the organ masses between the phantoms and the occurrence of organ overlap in the voxel-based phantom, which is not considered in the mathematical phantom.
2015-01-01
Mitochondria-targeting peptides have garnered immense interest as potential chemotherapeutics in recent years. However, there is a clear need to develop strategies to overcome the critical limitations of peptides, such as poor solubility and the lack of target specificity, which impede their clinical applications. To this end, we report magnetic core–shell nanoparticle (MCNP)-mediated delivery of a mitochondria-targeting pro-apoptotic amphipathic tail-anchoring peptide (ATAP) to malignant brain and metastatic breast cancer cells. Conjugation of ATAP to the MCNPs significantly enhanced the chemotherapeutic efficacy of ATAP, while the presence of targeting ligands afforded selective delivery to cancer cells. Induction of MCNP-mediated hyperthermia further potentiated the efficacy of ATAP. In summary, a combination of MCNP-mediated ATAP delivery and subsequent hyperthermia resulted in an enhanced effect on mitochondrial dysfunction, thus resulting in increased cancer cell apoptosis. PMID:25133971
MCNP simulation to optimise in-pile and shielding parts of the Portuguese SANS instrument.
Gonçalves, I F; Salgado, J; Falcão, A; Margaça, F M A; Carvalho, F G
2005-01-01
A Small Angle Neutron Scattering instrument is being installed at one end of the tangential beam tube of the Portuguese Research Reactor. The instrument is fed using a neutron scatterer positioned in the middle of the beam tube. The scatterer consists of circulating H2O contained in a hollow disc of Al. The in-pile shielding components and the shielding installed around the neutron selector have been the object of an MCNP simulation study. The quantities calculated were the neutron and gamma-ray fluxes in different positions, the energy deposited in the material by the neutron and gamma-ray fields, the material activation resulting from the neutron field and radiation doses at the exit wall of the shutter and around the shielding. The MCNP results are presented and compared with results of an analytical approach and with experimental data collected after installation.
Wangerin, K; Culbertson, C N; Jevremovic, T
2005-08-01
The goal of this study was to evaluate the COG Monte Carlo radiation transport code, developed and tested by Lawrence Livermore National Laboratory, for gadolinium neutron capture therapy (GdNCT) related modeling. The validity of the COG NCT model has been established, and here the calculation was extended to analyze the effect of various gadolinium concentrations on the dose distribution and cell-kill effect of the GdNCT modality and to determine the optimum therapeutic conditions for treating brain cancers. The computational results were compared with the widely used MCNP code. The differences between the COG and MCNP predictions were generally small and suggest that the COG code can be applied to similar research problems in NCT. Results from this study also showed that a concentration of 100 ppm gadolinium in the tumor was most beneficial when using an epithermal neutron beam.
He, Zilong; Zhang, Huangkai; Gao, Shenghan; Lercher, Martin J; Chen, Wei-Hua; Hu, Songnian
2016-07-08
Evolview is an online visualization and management tool for customized and annotated phylogenetic trees. It allows users to visualize phylogenetic trees in various formats, customize the trees through built-in functions and user-supplied datasets and export the customization results to publication-ready figures. Its 'dataset system' contains not only the data to be visualized on the tree, but also 'modifiers' that control various aspects of the graphical annotation. Evolview is a single-page application (like Gmail); its carefully designed interface allows users to upload, visualize, manipulate and manage trees and datasets all in a single webpage. Developments since the last public release include a modern dataset editor with keyword highlighting functionality, seven newly added types of annotation datasets, collaboration support that allows users to share their trees and datasets and various improvements of the web interface and performance. In addition, we included eleven new 'Demo' trees to demonstrate the basic functionalities of Evolview, and five new 'Showcase' trees inspired by publications to showcase the power of Evolview in producing publication-ready figures. Evolview is freely available at: http://www.evolgenius.info/evolview/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
Pedrami, Farnoush; Asenso, Pamela; Devi, Sachin
2016-08-25
Objective. To identify trends in pharmacy education during last two decades using text mining. Methods. Articles published in the American Journal of Pharmaceutical Education (AJPE) in the past two decades were compiled in a database. Custom text analytics software was written using Visual Basic programming language in the Visual Basic for Applications (VBA) editor of Excel 2007. Frequency of words appearing in article titles was calculated using the custom VBA software. Data were analyzed to identify the emerging trends in pharmacy education. Results. Three educational trends emerged: active learning, interprofessional, and cultural competency. Conclusion. The text analytics program successfully identified trends in article topics and may be a useful compass to predict the future course of pharmacy education.
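An illustrative Python analogue of the title word-frequency approach described above (the authors used a custom VBA program in Excel); the example titles and stop-word list are invented, not AJPE data:

    from collections import Counter
    import re

    titles = [
        "Active Learning Strategies in a Pharmacotherapy Course",
        "Interprofessional Education in Community Pharmacy Practice",
        "Assessing Cultural Competency in Pharmacy Students",
    ]
    stopwords = {"a", "in", "of", "the", "and"}

    words = (w for t in titles for w in re.findall(r"[a-z]+", t.lower()) if w not in stopwords)
    for word, count in Counter(words).most_common(10):
        print(f"{word}: {count}")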
MCNP/X TRANSPORT IN THE TABULAR REGIME
DOE Office of Scientific and Technical Information (OSTI.GOV)
HUGHES, H. GRADY
2007-01-08
The authors review the transport capabilities of the MCNP and MCNPX Monte Carlo codes in the energy regimes in which tabular transport data are available. Giving special attention to neutron tables, they emphasize the measures taken to improve the treatment of a variety of difficult aspects of the transport problem, including unresolved resonances, thermal issues, and the availability of suitable cross-section sets. They also briefly touch on the current situation in regard to photon, electron, and proton transport tables.
MCNP modelling of scintillation-detector gamma-ray spectra from natural radionuclides.
Hendriks, P H G M; Maucec, M; de Meijer, R J
2002-09-01
Gamma-ray spectra of natural radionuclides are simulated for a BGO detector in a borehole geometry using the Monte Carlo code MCNP. All gamma-ray emissions of the decay of 40K and the series of 232Th and 238U are used to describe the source. A procedure is proposed which excludes the time-consuming electron tracking in less relevant areas of the geometry. The simulated gamma-ray spectra are benchmarked against laboratory data.
An Upgrade of the Aeroheating Software "MINIVER"
NASA Technical Reports Server (NTRS)
Louderback, Pierce M.
2013-01-01
Many software packages assist engineers with performing flight vehicle analysis, but some of these packages have gone many years without updates or significant improvements to their workflows. One such software, known as MINIVER, is a powerful yet lightweight tool that is used for aeroheating analyses. However, it is an aging program that has not seen major improvements within the past decade. As part of a collaborative effort with Florida Institute of Technology, MINIVER has received a major user interface overhaul, a change in program language, and will be continually receiving updates to improve its capabilities. The user interface update includes a migration from a command-line interface to that of a graphical user interface supported in the Windows operating system. The organizational structure of the preprocessor has been transformed to clearly defined categories to provide ease of data entry. Helpful tools have been incorporated, including the ability to copy sections of cases as well as a generalized importer which aids in bulk data entry. A visual trajectory editor has been included, as well as a CAD Editor which allows the user to input simplified geometries in order to generate MINIVER cases in bulk. To demonstrate its continued effectiveness, a case involving the JAXA OREX flight vehicle will be included, providing comparisons to captured flight data as well as other computational solutions. The most recent upgrade effort incorporated the use of the CAD Editor, and current efforts are investigating methods to link MINIVER projects with SINDA/Fluint and Thermal Desktop.
AGU Publications Volunteers Feted At Elegant Editors' Evening
NASA Astrophysics Data System (ADS)
Panning, Jeanette
2013-01-01
The 2012 Fall Meeting Editors' Evening, held at the City Club of San Francisco, was hosted by the Publications Committee and is the premier social event for editors and associate editors attending the Fall Meeting. The evening commenced with a welcome from Carol Finn, incoming AGU president, in which she expressed her thanks to the editors and associate editors for volunteering their time to benefit AGU.
Monte Carlo MCNP-4B-based absorbed dose distribution estimates for patient-specific dosimetry.
Yoriyaz, H; Stabin, M G; dos Santos, A
2001-04-01
This study was intended to verify the capability of the Monte Carlo MCNP-4B code to evaluate spatial dose distribution based on information gathered from CT or SPECT. A new three-dimensional (3D) dose calculation approach for internal emitter use in radioimmunotherapy (RIT) was developed using the Monte Carlo MCNP-4B code as the photon and electron transport engine. It was shown that the MCNP-4B computer code can be used with voxel-based anatomic and physiologic data to provide 3D dose distributions. This study showed that the MCNP-4B code can be used to develop a treatment planning system that will provide such information in a timely manner, if dose reporting is suitably optimized. If each organ is divided into small regions where the average energy deposition is calculated with a typical volume of 0.4 cm(3), regional dose distributions can be provided with reasonable central processing unit times (on the order of 12-24 h on a 200-MHz personal computer or modest workstation). Further efforts to provide semiautomated region identification (segmentation) and improvement of marrow dose calculations are needed to supply a complete system for RIT. It is envisioned that all such efforts will continue to develop and that internal dose calculations may soon be brought to a similar level of accuracy, detail, and robustness as is commonly expected in external dose treatment planning. For this study we developed a code with a user-friendly interface that works on several nuclear medicine imaging platforms and provides timely patient-specific dose information to the physician and medical physicist. Future therapy with internal emitters should use a 3D dose calculation approach, which represents a significant advance over dose information provided by the standard geometric phantoms used for more than 20 y (which permit reporting of only average organ doses for certain standardized individuals).
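A hedged sketch of the regional dose-reporting idea above: averaging a voxel-wise energy-deposition array over coarser blocks standing in for the ~0.4 cm(3) regions; the array is random placeholder data, not MCNP-4B output, and the block size is an assumption:

    import numpy as np

    rng = np.random.default_rng(0)
    edep = rng.random((64, 64, 64))              # placeholder voxel energy deposition
    block = 4                                    # voxels per region edge (assumed)

    nx, ny, nz = (s // block for s in edep.shape)
    regions = edep[:nx*block, :ny*block, :nz*block].reshape(nx, block, ny, block, nz, block)
    regional_mean = regions.mean(axis=(1, 3, 5)) # average energy deposition per region
    print(regional_mean.shape, float(regional_mean.max()))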
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burns, T.D. Jr.
1996-05-01
The Monte Carlo Model System (MCMS) for the Washington State University (WSU) Radiation Center provides a means through which core criticality and power distributions can be calculated, as well as providing a method for the neutron and photon transport necessary for BNCT epithermal neutron beam design. The computational code used in this Model System is MCNP4A. The geometric capability of this Monte Carlo code allows the WSU system to be modeled very accurately. A working knowledge of the MCNP4A neutron transport code increases the flexibility of the Model System and is recommended; however, the eigenvalue/power density problems can be run with little direct knowledge of MCNP4A. Neutron and photon particle transport require more experience with the MCNP4A code. The Model System consists of two coupled subsystems: the Core Analysis and Source Plane Generator Model (CASP), and the BeamPort Shell Particle Transport Model (BSPT). The CASP Model incorporates the S(alpha, beta) thermal treatment, and is run as a criticality problem yielding the system eigenvalue (keff), the core power distribution, and an implicit surface source for subsequent particle transport in the BSPT Model. The BSPT Model uses the source plane generated by a CASP run to transport particles through the thermal column beamport. The user can create filter arrangements in the beamport and then calculate characteristics necessary for assessing the BNCT potential of the given filter arrangement. Examples of the characteristics to be calculated are: neutron fluxes, neutron currents, fast neutron KERMAs and gamma KERMAs. The MCMS is a useful tool for the WSU system. Those unfamiliar with the MCNP4A code can use the MCMS transparently for core analysis, while more experienced users will find the particle transport capabilities very powerful for BNCT filter design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Seung Jun; Buechler, Cynthia Eileen
The current study aims to predict the steady state power of a generic solution vessel and to develop a corresponding heat transfer coefficient correlation for a Moly99 production facility by conducting a fully coupled multi-physics simulation. A prediction of steady state power for the current application is inherently interconnected between the thermal hydraulic characteristics (i.e. multiphase computational fluid dynamics solved by ANSYS-Fluent 17.2) and the corresponding neutronic behavior (i.e. particle transport solved by MCNP6.2) in the solution vessel. Thus, the development of a coupling methodology is vital to understanding the system behavior for a variety of system designs and postulated operating scenarios. In this study, we report on the k-effective (keff) calculation for the baseline solution vessel configuration with a selected solution concentration using MCNP K-code modeling. The associated correlations of thermal properties (e.g. density, viscosity, thermal conductivity, specific heat) at the selected solution concentration are developed based on existing experimental measurements in the open literature. The numerical coupling methodology between multiphase CFD and MCNP is successfully demonstrated, and the detailed coupling procedure is documented. In addition, improved coupling methods capturing realistic physics in the solution vessel thermal-neutronic dynamics are proposed and tested further (i.e. dynamic height adjustment, mull-cell approach). As a key outcome of the current study, a multi-physics coupling methodology between MCFD and MCNP is demonstrated and tested for four different operating conditions. These operating conditions are determined based on the neutron source strength at a fixed geometry condition. The steady state powers for the generic solution vessel at various operating conditions are reported, and a generalized correlation of the heat transfer coefficient for the current application is discussed. The assessment of the multi-physics methodology and preliminary results from the various coupled calculations (power prediction and heat transfer coefficient) can be further utilized for system code validation and generic solution vessel design improvement.
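The coupling strategy summarized above amounts to a fixed-point (Picard) iteration between a neutronics solve and a thermal-hydraulics solve. The following Python sketch is illustrative only: toy_neutronics() and toy_cfd() are invented stand-ins for the MCNP6.2 and ANSYS-Fluent steps, and all numbers in it are arbitrary, not values from the study.

    # Illustrative Picard-style coupling loop; the two "solvers" below are toy
    # stand-ins for the MCNP (neutronics) and CFD (thermal-hydraulics) steps.
    def toy_neutronics(source, temperature):
        # Toy negative temperature feedback on the fission power (W).
        return source * 1.0e3 / (1.0 + 1.0e-3 * (temperature - 300.0))

    def toy_cfd(power):
        # Toy heat balance: solution temperature (K) rises with deposited power.
        return 300.0 + 5.0e-3 * power

    def coupled_steady_state(source, tol=1.0e-6, max_iter=100):
        temperature, power_old = 300.0, None
        for _ in range(max_iter):
            power = toy_neutronics(source, temperature)   # "MCNP" step
            temperature = toy_cfd(power)                  # "CFD" step
            if power_old is not None and abs(power - power_old) <= tol * power:
                return power, temperature                 # converged steady state
            power_old = power
        raise RuntimeError("coupled iteration did not converge")

    print(coupled_steady_state(1.0))

In the real workflow each step would be a full code execution with data exchanged through files or scripts, but the convergence logic is the same.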
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adam, D; Bednarz, B
Purpose: The proton boron fusion reaction describes the creation of three alpha particles as the result of the interaction of a proton incident upon a 11B target. Theoretically, the proton boron fusion reaction is a desirable reaction for radiation therapy applications in that, with an appropriate boron delivery agent, it could potentially combine the localized dose delivery that protons exhibit (Bragg peak) with the local deposition of high-LET alpha particles in cancerous sites. Previous efforts have shown significant dose enhancement using the proton boron fusion reaction; the overarching purpose of this work is to validate previous Monte Carlo results for the proton boron fusion reaction. Methods: The proton boron fusion reaction, 11B(p, 3α), is investigated using MCNP6 to assess its viability for potential use in radiation therapy. Simple simulations of a proton pencil beam incident upon both a water phantom and a water phantom with an axial region containing 100 ppm boron were modeled using MCNP6 in order to determine the extent of the impact boron had upon the calculated energy deposition. Results: The maximum dose increase calculated was 0.026% for the incident 250 MeV proton beam scenario. The MCNP simulations performed demonstrated that the proton boron fusion reaction rate at clinically relevant boron concentrations was too small to have any measurable impact on the absorbed dose. Conclusion: For all MCNP6 simulations conducted, the increase in absorbed dose in a simple water phantom due to the 11B(p, 3α) reaction was found to be inconsequential. In addition, it was determined that there are no good evaluations of the 11B(p, 3α) reaction for use in MCNPX/6, and further work should be conducted on cross section evaluations in order to definitively evaluate the feasibility of the proton boron fusion reaction for radiation therapy applications.
Calculation of self–shielding factor for neutron activation experiments using GEANT4 and MCNP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Romero–Barrientos, Jaime, E-mail: jaromero@ing.uchile.cl; Universidad de Chile, DFI, Facultad de Ciencias Físicas Y Matemáticas, Avenida Blanco Encalada 2008, Santiago; Molina, F.
2016-07-07
The neutron self–shielding factor G as a function of the neutron energy was obtained for 14 pure metallic samples in 1000 isolethargic energy bins from 1·10⁻⁵ eV to 2·10⁷ eV using Monte Carlo simulations in GEANT4 and MCNP6. The comparison of these two Monte Carlo codes shows small differences in the final self–shielding factor mostly due to the different cross section databases that each program uses.
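For reference, "isolethargic" bins are bins of equal width in lethargy, i.e. equally spaced in the logarithm of energy. A minimal Python sketch of the binning quoted above (only the bin count and energy limits are taken from the abstract) is:

    import numpy as np

    # 1000 isolethargic (log-uniform) energy bins from 1e-5 eV to 2e7 eV.
    e_min, e_max, n_bins = 1.0e-5, 2.0e7, 1000
    edges = np.logspace(np.log10(e_min), np.log10(e_max), n_bins + 1)  # eV
    lethargy_width = np.log(edges[1:] / edges[:-1])   # constant across bins
    print(edges[:3], lethargy_width[0])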
MCNP simulations of material exposure experiments (u)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Temple, Brian A
2010-12-08
Simulations of proposed material exposure experiments were performed using MCNP6. The experiments will expose ampules containing different materials of interest to radiation to observe the chemical breakdown of the materials. Simulations were performed to map out dose in the materials as a function of distance from the source, dose variation between materials, dose variation due to ampule orientation, and dose variation due to different source energies. This write-up is an overview of the simulations and will provide guidance on how to use the data in the spreadsheet.
BMC Ecology image competition: the winning images
2013-01-01
BMC Ecology announces the winning entries in its inaugural Ecology Image Competition, open to anyone affiliated with a research institute. The competition, which received more than 200 entries from international researchers at all career levels and a wide variety of scientific disciplines, was looking for striking visual interpretations of ecological processes. In this Editorial, our academic Section Editors and guest judge Dr Yan Wong explain what they found most appealing about their chosen winning entries, and highlight a few of the outstanding images that didn’t quite make it to the top prize. PMID:23517630
BMC Ecology image competition: the winning images.
Harold, Simon; Wong, Yan; Baguette, Michel; Bonsall, Michael B; Clobert, Jean; Royle, Nick J; Settele, Josef
2013-03-22
BMC Ecology announces the winning entries in its inaugural Ecology Image Competition, open to anyone affiliated with a research institute. The competition, which received more than 200 entries from international researchers at all career levels and a wide variety of scientific disciplines, was looking for striking visual interpretations of ecological processes. In this Editorial, our academic Section Editors and guest judge Dr Yan Wong explain what they found most appealing about their chosen winning entries, and highlight a few of the outstanding images that didn't quite make it to the top prize.
Total reaction cross sections in CEM and MCNP6 at intermediate energies
Kerby, Leslie M.; Mashnik, Stepan G.
2015-05-14
Accurate total reaction cross section models are important to achieving reliable predictions from spallation and transport codes. The latest version of the Cascade Exciton Model (CEM) as incorporated in the code CEM03.03, and the Monte Carlo N-Particle transport code (MCNP6), both developed at Los Alamos National Laboratory (LANL), each use such cross sections. Having accurate total reaction cross section models in the intermediate energy region (50 MeV to 5 GeV) is very important for different applications, including analysis of space environments, use in medical physics, and accelerator design, to name just a few. The current inverse cross sections used in the preequilibrium and evaporation stages of CEM are based on the Dostrovsky et al. model, published in 1959. Better cross section models are now available. Implementing better cross section models in CEM and MCNP6 should yield improved predictions for particle spectra and total production cross sections, among other results.
Richard, Joshua; Galloway, Jack; Fensin, Michael; ...
2015-04-04
A novel object-oriented modular mapping methodology for externally coupled neutronics–thermal hydraulics multiphysics simulations was developed. The Simulator using MCNP with Integrated Thermal-Hydraulics for Exploratory Reactor Studies (SMITHERS) code performs on-the-fly mapping of material-wise power distribution tallies implemented by MCNP-based neutron transport/depletion solvers for use in estimating coolant temperature and density distributions with a separate thermal-hydraulic solver. The key development of SMITHERS is that it reconstructs the hierarchical geometry structure of the material-wise power generation tallies from the depletion solver automatically, with only a modicum of additional information required from the user. In addition, it performs the basis mapping from the combinatorial geometry of the depletion solver to the required geometry of the thermal-hydraulic solver in a generalizable manner, such that it can transparently accommodate varying levels of thermal-hydraulic solver geometric fidelity, from the nodal geometry of multi-channel analysis solvers to the pin-cell level of discretization for sub-channel analysis solvers.
MCNP modelling of the wall effects observed in tissue-equivalent proportional counters.
Hoff, J L; Townsend, L W
2002-01-01
Tissue-equivalent proportional counters (TEPCs) utilise tissue-equivalent materials to depict homogeneous microscopic volumes of human tissue. Although both the walls and gas simulate the same medium, they respond to radiation differently. Density differences between the two materials cause distortions, or wall effects, in measurements, with the most dominant effect caused by delta rays. This study uses a Monte Carlo transport code, MCNP, to simulate the transport of secondary electrons within a TEPC. The Rudd model, a singly differential cross section with no dependence on electron direction, is used to describe the energy spectrum obtained by the impact of two iron beams on water. Based on the models used in this study, a wall-less TEPC had a higher lineal energy (keV.micron-1) as a function of impact parameter than a solid-wall TEPC for the iron beams under consideration. An important conclusion of this study is that MCNP has the ability to model the wall effects observed in TEPCs.
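For context, lineal energy is the standard microdosimetric quantity (a general definition, not specific to this study):

\[
  y = \frac{\varepsilon}{\bar{\ell}},
\]

where \varepsilon is the energy imparted to the sensitive volume in a single event and \bar{\ell} is the mean chord length of that volume (for a sphere of diameter d, \bar{\ell} = 2d/3). The wall effect arises because delta rays generated in the dense wall can alter \varepsilon relative to the wall-less case.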
Conflicts of interest of editors of medical journals
Minhajuddin, Abu
2018-01-01
Background Almost all medical journals now require authors to publicly disclose conflicts of interest (COI). The same standard and scrutiny are rarely employed for the editors of the journals, although COI may affect editorial decisions. Methods We conducted a retrospective observational study to determine the prevalence and magnitude of financial relationships among editors of 60 influential US medical journals (10 each for internal medicine and five subspecialties: cardiology, gastroenterology, neurology, dermatology and allergy & immunology). The Open Payments database was reviewed to determine the percentage of physician editors receiving payments and the nature and amount of these payments. Findings 703 unique physician editors were included in our analysis. 320/703 (46%) received 8659 general payments totaling $8,120,562. The median number of payments per editor was 9 (IQR 3–26) and the median amount per payment was $91 (IQR $21–441). The median total payment received by each editor in one year was $4,364 (IQR $319–23,143). 152 (48%) editors received payments of more than $5,000 in a year, a threshold considered significant by the National Institutes of Health. COI policies for editors were available for 34/60 (57%) journals, but only 7/34 (21%) publicly reported the disclosures and only 2 (3%) reported the dollar amount received. Interpretation A significant number of editors of internal medicine and subspecialty medical journals have financial COI, and very few are publicly disclosed. Specialty journal editors have more COI compared to general medicine journal editors. Current policies for disclosing COI for editors are inconsistent and do not comply with the recommended standards. PMID:29775468
New Editors Appointed for Sections of Journal of Geophysical Research
NASA Astrophysics Data System (ADS)
2009-04-01
New editors have been appointed for the Atmospheres, Biogeosciences, and Oceans sections of the Journal of Geophysical Research (JGR). Joost de Gouw (NOAA, Boulder, Colo.) and Renyi Zhang (Texas A&M, College Station) are filling the vacancies of retiring Atmospheres section editors John Austin and Jose Fuentes. De Gouw and Zhang join the continuing editors Steven Ghan and Yinon Rudich. Sara Pryor (Indiana University, Bloomington) is joining the Atmospheres section editorial board as an associate editor now; she will transition to editor in January 2010.
An analysis of MCNP cross-sections and tally methods for low-energy photon emitters.
Demarco, John J; Wallace, Robert E; Boedeker, Kirsten
2002-04-21
Monte Carlo calculations are frequently used to analyse a variety of radiological science applications using low-energy (10-1000 keV) photon sources. This study seeks to create a low-energy benchmark for the MCNP Monte Carlo code by simulating the absolute dose rate in water and the air-kerma rate for monoenergetic point sources with energies between 10 keV and 1 MeV. The analysis compares four cross-section datasets as well as the tally method for collision kerma versus absorbed dose. The total photon attenuation coefficient cross-section for low atomic number elements has changed significantly as cross-section data have changed between 1967 and 1989. Differences of up to 10% are observed in the photoelectric cross-section for water at 30 keV between the standard MCNP cross-section dataset (DLC-200) and the most recent XCOM/NIST tabulation. At 30 keV, the absolute dose rate in water at 1.0 cm from the source increases by 7.8% after replacing the DLC-200 photoelectric cross-sections for water with those from the XCOM/NIST tabulation. The differences in the absolute dose rate are analysed when calculated with either the MCNP absorbed dose tally or the collision kerma tally. Significant differences between the collision kerma tally and the absorbed dose tally can occur when using the DLC-200 attenuation coefficients in conjunction with a modern tabulation of mass energy-absorption coefficients.
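The two tally methods compared above correspond to two standard dosimetric quantities (textbook relations, not equations from the paper): the collision kerma

\[
  K_{\mathrm{col}} = \int \Phi_E \, E \left(\frac{\mu_{\mathrm{en}}}{\rho}\right)_E dE,
\]

with \Phi_E the photon fluence spectrum and (\mu_{\mathrm{en}}/\rho)_E the mass energy-absorption coefficient, and the absorbed dose scored from the energy deposited by charged particles. The two agree only where charged-particle equilibrium holds, which is one reason the tallies can differ close to a low-energy point source.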
MC2-3 / DIF3D Analysis for the ZPPR-15 Doppler and Sodium Void Worth Measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Micheal A.; Lell, Richard M.; Lee, Changho
This manuscript covers validation efforts for our deterministic codes at Argonne National Laboratory. The experimental results come from the ZPPR-15 work in 1985-1986, which was focused on the accuracy of physics data for the integral fast reactor concept. Results for six loadings are studied in this document, with a focus on Doppler sample worths and sodium void worths. The ZPPR-15 loadings are modeled using the MC2-3/DIF3D codes developed and maintained at ANL and the MCNP code from LANL. The deterministic models are generated by processing the as-built geometry information, i.e. the MCNP input, and producing MC2-3 cross section generation instructions and a drawer-homogenized equivalence problem. The Doppler reactivity worth measurements are small heated samples which insert very small amounts of reactivity into the system (< 2 pcm). The results generated by the MC2-3/DIF3D codes were excellent for ZPPR-15A and ZPPR-15B and good for ZPPR-15D, compared to the MCNP solutions. In all cases, notable improvements were made over the analysis techniques applied to the same problems in 1987. The sodium void worths from MC2-3/DIF3D were quite good at 37.5 pcm, while the MCNP result was 33 pcm and the measured result was 31.5 pcm. Copyright © 2015 by the American Nuclear Society. All rights reserved.
Journal Editors Celebrated at Editors' Evening
NASA Astrophysics Data System (ADS)
Panning, Jeanette
2014-02-01
At the Fall Meeting, the premier social event for AGU's many journal editors is the annual Editors' Evening, an opportunity for members to celebrate and to recognize the efforts of retiring editors. At the event, AGU president Carol Finn welcomed all those in attendance and thanked them for volunteering their time for the benefit of AGU and the wider research community.
Examining Editor-Author Ethics: Real-World Scenarios from Interviews with Three Journal Editors
ERIC Educational Resources Information Center
Amare, Nicole; Manning, Alan
2009-01-01
Those who submit manuscripts to academic journals may benefit from a better understanding of how editors weigh ethics in their interactions with authors. In an attempt to ascertain and to understand editors' ethics, we interviewed 3 current academic journal editors of technical and/or business communication journals. We asked them about the…
In Internet-Based Visualization System Study about Breakthrough Applet Security Restrictions
NASA Astrophysics Data System (ADS)
Chen, Jie; Huang, Yan
In implementing an Internet-based visualization system for protein molecules, the system must allow users to view molecular structures stored on the local computer; that is, the client must be able to generate the three-dimensional graphics from a PDB file on the client machine. This requires Applet access to local files, which raises the question of Applet security restrictions. This paper covers two approaches: 1. Use the signature tools, key management tools and Policy Editor provided by the JDK to digitally sign and authenticate the Java Applet, thereby relaxing certain security restrictions in the browser. 2. Use a Servlet as a proxy for indirect data access, working around the sandbox-model restrictions that the Java Virtual Machine traditionally places on Applets. Both approaches can overcome the Applet's security restrictions, but each has its own strengths.
Blindness and the age of enlightenment: Diderot's letter on the blind.
Margo, Curtis E; Harman, Lynn E; Smith, Don B
2013-01-01
Several months after anonymously publishing an essay in 1749 with the title "Letter on the Blind for the Use of Those Who Can See," the chief editor of the French Encyclopédie was arrested and taken to the prison fortress of Vincennes just east of Paris, France. The correctly assumed author, Denis Diderot, was 35 years old and had not yet left his imprint on the Age of Enlightenment. His letter, which recounted the life of Nicolas Saunderson, a blind mathematician, was intended to advance secular empiricism and disparage the religiously tinged rationalism put forward by Rene Descartes. The letter's discussion of sensory perception in men born blind dismissed the supposed primacy of visual imagery in abstract thinking. The essay did little to resolve any philosophical controversy, but it marked a turning point in Western attitudes toward visual disability.
Computer vision for RGB-D sensors: Kinect and its applications.
Shao, Ling; Han, Jungong; Xu, Dong; Shotton, Jamie
2013-10-01
With the Kinect sensor, high-resolution depth and visual (RGB) sensing has become available for widespread use as an off-the-shelf technology. This special issue is specifically dedicated to new algorithms and/or new applications based on the Kinect (or similar RGB-D) sensors. In total, we received over ninety submissions from more than twenty countries around the world. The submissions cover a wide range of areas including object and scene classification, 3-D pose estimation, visual tracking, data fusion, human action/activity recognition, 3-D reconstruction, mobile robotics, and so on. After two rounds of review by at least two (mostly three) expert reviewers for each paper, the Guest Editors selected twelve high-quality papers to be included in this highly popular special issue. The papers that comprise this issue are briefly summarized.
NASA Astrophysics Data System (ADS)
Lin, Yi-Chun; Huang, Tseng-Te; Liu, Yuan-Hao; Chen, Wei-Lin; Chen, Yen-Fu; Wu, Shu-Wei; Nievaart, Sander; Jiang, Shiang-Huei
2015-06-01
The paired ionization chambers (ICs) technique is commonly employed to determine neutron and photon doses in radiology or radiotherapy neutron beams, where the neutron dose shows very strong dependence on the accuracy of the accompanying high-energy photon dose. During the dose derivation, it is an important issue to evaluate the photon and electron response functions of two commercially available ionization chambers, denoted as TE(TE) and Mg(Ar), used in our reactor-based epithermal neutron beam. Nowadays, most perturbation corrections for accurate dose determination and many treatment planning systems are based on the Monte Carlo technique. We used the general-purpose Monte Carlo codes MCNP5, EGSnrc, FLUKA and GEANT4 for benchmark verifications among them, together with carefully measured values, for a precise estimation of the chamber current from the absorbed dose rate of the cavity gas. Also, energy-dependent response functions of the two chambers were calculated in a parallel beam with mono-energies from 20 keV to 20 MeV for photons and electrons, using both the optimal simple spherical and the detailed IC models. The measurements were performed in the well-defined (a) four primary M-80, M-100, M-120 and M-150 X-ray calibration fields, (b) primary 60Co calibration beam, (c) 6 MV and 10 MV photon and (d) 6 MeV and 18 MeV electron LINAC beams in hospital, and (e) BNCT clinical trials neutron beam. For the TE(TE) chamber, all codes were almost identical over the whole photon energy range. For the Mg(Ar) chamber, MCNP5 showed a lower response than the other codes for photon energies below 0.1 MeV and a similar response above 0.2 MeV (agreement within 5% in the simple spherical model). With increasing electron energy, the response difference between MCNP5 and the other codes became larger in both chambers. Compared with the measured currents, MCNP5 agreed with the measurement data within 5% for the 60Co, 6 MV, 10 MV, 6 MeV and 18 MeV LINAC beams. For the Mg(Ar) chamber, however, the deviations reached 7.8-16.5% for the X-ray beams below 120 kVp. In this study we were especially interested in BNCT doses, where the low-energy photon contribution is small; the MCNP model is recognized as the most suitable for simulating the broadly distributed photon-electron and neutron energy responses of the paired ICs. MCNP also provides the best prediction of BNCT source adjustment from the detector's neutron and photon responses.
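For background, the paired-chamber technique referred to above is conventionally written as a pair of reading equations (generic textbook form, not reproduced from this paper):

\[
  Q_{\mathrm{TE}} = k_{\mathrm{TE}} D_n + h_{\mathrm{TE}} D_\gamma, \qquad
  Q_{\mathrm{Mg}} = k_{\mathrm{Mg}} D_n + h_{\mathrm{Mg}} D_\gamma,
\]

where the Q are the two chamber readings, k and h are the neutron and photon sensitivities of each chamber, and the pair is solved simultaneously for D_n and D_\gamma. This is why the neutron dose is so sensitive to the accuracy of the photon and electron response functions computed in the study.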
Neutron Spectrum Measurements from Irradiations at NCERC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jackman, Kevin Richard; Mosby, Michelle A.; Bredeweg, Todd Allen
2015-04-15
Several irradiations have been conducted on assemblies (COMET/ZEUS and Flattop) at the National Criticality Experiments Research Center (NCERC) located at the Nevada National Security Site (NNSS). Configurations of the assemblies and irradiated materials changed between experiments. Different metallic foils were analyzed using the activation method with gamma-ray spectrometry to characterize the neutron spectra. Results of MCNP calculations are shown. It was concluded that the MCNP-simulated spectra agree with the experimental measurements, with the caveats that some data are limited by statistics at low energies and some activation foils have low activities.
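For reference, foil activation measurements of this kind are usually interpreted through the standard activation relation (generic form, assumed here rather than quoted from the report):

\[
  A(t_{\mathrm{irr}}) = N \left[\int \sigma(E)\,\phi(E)\,dE\right] \left(1 - e^{-\lambda t_{\mathrm{irr}}}\right),
\]

where N is the number of target atoms in the foil, \sigma(E) the activation cross section, \phi(E) the neutron flux spectrum, \lambda the decay constant of the product, and t_{\mathrm{irr}} the irradiation time. Comparing measured activities with activities predicted from MCNP-simulated spectra is the basis for the agreement statement above.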
Juste, B; Miro, R; Gallardo, S; Santos, A; Verdu, G
2006-01-01
The present work simulated the photon and electron transport in a Theratron 780 (MDS Nordion) (60)Co radiotherapy unit using the Monte Carlo transport code MCNP (Monte Carlo N-Particle), version 5. With a view to the computational efficiency required for practical radiotherapy treatment planning, this work focuses mainly on the analysis of the dose results and on the computing time required by the different tallies applied in the model to speed up the calculations.
Forbes, Jessica L.; Kim, Regina E. Y.; Paulsen, Jane S.; Johnson, Hans J.
2016-01-01
The creation of high-quality medical imaging reference atlas datasets with consistent dense anatomical region labels is a challenging task. Reference atlases have many uses in medical image applications and are essential components of atlas-based segmentation tools commonly used for producing personalized anatomical measurements for individual subjects. The process of manual identification of anatomical regions by experts is regarded as a so-called gold standard; however, it is usually impractical because of the labor-intensive costs. Further, as the number of regions of interest increases, these manually created atlases often contain many small inconsistently labeled or disconnected regions that need to be identified and corrected. This project proposes an efficient process to drastically reduce the time necessary for manual revision in order to improve atlas label quality. We introduce the LabelAtlasEditor tool, a SimpleITK-based open-source label atlas correction tool distributed within the image visualization software 3D Slicer. LabelAtlasEditor incorporates several 3D Slicer widgets into one consistent interface and provides label-specific correction tools, allowing for rapid identification, navigation, and modification of the small, disconnected erroneous labels within an atlas. The technical details for the implementation and performance of LabelAtlasEditor are demonstrated using an application of improving a set of 20 Huntington's Disease-specific multi-modal brain atlases. Additionally, we present the advantages and limitations of automatic atlas correction. After the correction of atlas inconsistencies and small, disconnected regions, the number of unidentified voxels for each dataset was reduced on average by 68.48%. PMID:27536233
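As an illustration of the kind of label cleanup described above, the short Python sketch below uses SimpleITK (the toolkit LabelAtlasEditor and 3D Slicer build on) to list small disconnected fragments of one label. It is a generic example, not LabelAtlasEditor code; the file name, label id and size threshold are placeholders.

    import SimpleITK as sitk

    # Find small disconnected fragments of a single label in an atlas label map.
    # "atlas_labels.nii.gz", label 42 and the 50-voxel cutoff are placeholders.
    labels = sitk.ReadImage("atlas_labels.nii.gz")
    mask = sitk.BinaryThreshold(labels, lowerThreshold=42, upperThreshold=42)
    components = sitk.ConnectedComponent(mask)      # split mask into pieces
    stats = sitk.LabelShapeStatisticsImageFilter()
    stats.Execute(components)
    small = [l for l in stats.GetLabels() if stats.GetNumberOfPixels(l) < 50]
    print("fragments needing review:", small)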
Test Plans and Procedures for the Baseline SAF for BDS-D Sites (ModSAF). Volume 2
1993-12-20
Test step excerpt: Repeat steps 44120 through 44200 to task the orange platoon to Move on the route labeled "..."; expected result: the unit operations editor will no longer appear in the Editor Area. (ADST/WDL/TR-93-W003271, Volume 2 of 2, Ver 1.0, p. 64)
Strategic Studies Quarterly. Volume 2, Number 3, Fall 2008
2008-01-01
Front matter lists the editorial staff: L. Tawanda Eaves, Managing Editor; Betty R. Littlejohn, Editorial Assistant; Jerry L. Gantt, Content Editor; Sherry Terrell, Editorial Assistant; Steven C. Garst, ... The issue also cites Lt Col Sebastian M. Convertino II, CDR Lou Anne DeMattei, and Lt Col Tammy Knierim, Flying and Fighting in...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taneja, S; Bartol, L; Culberson, W
2015-06-15
Purpose: The calibration of radiation protection instrumentation, including ionization chambers, scintillators, and Geiger-Mueller (GM) counters used as survey meters, is often done using 137Cs irradiators. During calibration, irradiators use a combination of attenuators of various thicknesses to modulate the beam to a known air-kerma rate. The variations in energy spectra resulting from these attenuators are not accounted for and may play a role in the energy-dependent response of survey meters. This study uses an experimentally validated irradiator geometry modeled in the MCNP5 transport code to characterize the effects of attenuation on the energy spectrum. Methods: A Hopewell Designs G-10 137Cs irradiator with lead attenuators of thicknesses 0.635, 1.22, 2.22, and 4.32 cm was used in this study. The irradiator geometry was modeled in MCNP5 and validated by comparing measured and simulated percent depth dose (PDD) and cross-field profiles. Variations in the MCNP5-simulated spectra with increasing amounts of attenuation were characterized using the relative intensity of the 662 keV peak and the average energy. Results: Simulated and measured PDDs and profiles agreed within the associated uncertainty. The relative intensity of the 662 keV peak for simulated spectra, normalized to the intensity of the unattenuated spectrum, ranged from 0.16% to 100% depending on attenuation thickness. The average energy of the simulated spectra ranged from 582 keV with no attenuation to 653 keV with 5.54 cm of attenuation. Statistical uncertainty for the MCNP5 simulations ranged from 0.11% to 3.69%. Conclusion: This study successfully used MCNP5 to validate a 137Cs irradiator geometry and characterize variations in energy spectra between different amounts of attenuation. Variations in the average energy of up to 12% were determined through simulations, and future work will aim to determine the effects of these differences on survey meter response and calibration.
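The two spectrum descriptors used above have the usual definitions (stated here for reference): the average energy is the fluence-weighted mean,

\[
  \bar{E} = \frac{\int E\,\phi(E)\,dE}{\int \phi(E)\,dE},
\]

and the relative 662 keV peak intensity is the ratio of the unscattered-peak fluence with a given attenuator stack to that with no attenuator.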
MCNP5 CALCULATIONS REPLICATING ARH-600 NITRATE DATA
DOE Office of Scientific and Technical Information (OSTI.GOV)
FINFROCK SH
This report serves to extend the previous document, 'MCNP Calculations Replicating ARH-600 Data', by replicating the nitrate curves found in ARH-600. This report includes the MCNP models used, the calculated critical dimension for each analyzed parameter set, and the resulting data libraries for use with the CritView code. As with the ARH-600 data, this report is not meant to replace the analysis of the fissile systems by qualified criticality personnel. The MCNP data is presented without accounting for the statistical uncertainty (although this is typically less than 0.001) or bias and, as such, the application of a reasonable safety margin is required. The data that follows pertains to uranyl nitrate and plutonium nitrate spheres, infinite cylinders, and infinite slabs of varying isotopic composition, reflector thickness, and molarity. Each of the cases was modeled in MCNP (version 5.1.40), using the ENDF/B-VI cross section set. Given a molarity, isotopic composition, and reflector thickness, the fissile concentration and diameter (or thicknesses in the case of the slab geometries) were varied. The diameter for which k-effective equals 1.00 for a given concentration could then be calculated and graphed. These graphs are included in this report. The pages that follow describe the regions modeled, formulas for calculating the various parameters, a list of cross-sections used in the calculations, a description of the automation routine and data, and finally the data output. The data of most interest are the critical dimensions of the various systems analyzed. These are presented graphically, and in table format, in Appendix B. Appendix C provides a text listing of the same data in a format that is compatible with the CritView code. Appendices D and E provide listings of example Template files and MCNP input files (these are discussed further in Section 4). Appendix F is a complete listing of all of the output data (i.e., all of the analyzed dimensions and the resulting keff values).
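The critical-dimension search described above (finding the diameter at which k-effective equals 1.00 for a fixed concentration) can be sketched as a root search on a keff(diameter) curve. The Python routine below is illustrative only; toy_keff() is an invented one-group stand-in for an actual MCNP criticality calculation, and its constants are arbitrary.

    import math

    # Illustrative bisection for the critical diameter (keff = 1.00).
    def toy_keff(diameter_cm):
        k_inf, migration_area = 1.6, 30.0                 # hypothetical values
        buckling = (math.pi / (diameter_cm / 2.0)) ** 2   # bare-sphere buckling
        return k_inf / (1.0 + migration_area * buckling)

    def critical_diameter(lo=5.0, hi=200.0, tol=1.0e-3):
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if toy_keff(mid) < 1.0:
                lo = mid        # subcritical: need a larger diameter
            else:
                hi = mid        # critical or above: shrink the diameter
        return 0.5 * (lo + hi)

    print(critical_diameter())   # toy answer, not an ARH-600 value

In the report's workflow each toy_keff() evaluation would be an MCNP run, with concentration, reflector thickness and molarity held fixed as additional parameters.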
Intrinsic Radiation Source Generation with the ISC Package: Data Comparisons and Benchmarking
DOE Office of Scientific and Technical Information (OSTI.GOV)
Solomon, Clell J. Jr.
The characterization of radioactive emissions from unstable isotopes (intrinsic radiation) is necessary for shielding and radiological-dose calculations for radioactive materials. While most radiation transport codes, e.g., MCNP [X-5 Monte Carlo Team, 2003], provide the capability to input user-prescribed source definitions, such as radioactive emissions, they do not provide the capability to calculate the correct radioactive-source definition given the material compositions. Special modifications to MCNP have been developed in the past to allow the user to specify an intrinsic source, but these modifications have not been implemented into the primary source base [Estes et al., 1988]. To facilitate the description of the intrinsic radiation source from a material with a specific composition, the Intrinsic Source Constructor library (LIBISC) and MCNP Intrinsic Source Constructor (MISC) utility have been written. The combination of LIBISC and MISC will herein be referred to as the ISC package. LIBISC is a statically linkable C++ library that provides the necessary functionality to construct the intrinsic-radiation source generated by a material. Furthermore, LIBISC provides the ability to use different particle-emission databases, radioactive-decay databases, and natural-abundance databases, allowing the user flexibility in the specification of the source if one database is preferred over others. LIBISC also provides functionality for aging materials and producing a thick-target bremsstrahlung photon source approximation from the electron emissions. The MISC utility links to LIBISC and facilitates the description of intrinsic-radiation sources in a format directly usable with the MCNP transport code. Through a series of input keywords and arguments the MISC user can specify the material, age the material if desired, and produce a source description of the radioactive emissions from the material in an MCNP-readable format. Further details on using the MISC utility can be obtained from the user guide [Solomon, 2012]. The remainder of this report presents a discussion of the databases available to LIBISC and MISC, a discussion of the models employed by LIBISC, a comparison of the thick-target bremsstrahlung model employed, a benchmark comparison to plutonium and depleted-uranium spheres, and a comparison of the available particle-emission databases.
Monte Carlo Techniques for Nuclear Systems - Theory Lectures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
These are lecture notes for a Monte Carlo class given at the University of New Mexico. The following topics are covered: course information; nuclear eng. review & MC; random numbers and sampling; computational geometry; collision physics; tallies and statistics; eigenvalue calculations I; eigenvalue calculations II; eigenvalue calculations III; variance reduction; parallel Monte Carlo; parameter studies; fission matrix and higher eigenmodes; Doppler broadening; Monte Carlo depletion; HTGR modeling; coupled MC and T/H calculations; fission energy deposition. Solving particle transport problems with the Monte Carlo method is simple - just simulate the particle behavior. The devil is in the details, however. These lectures provide a balanced approach to the theory and practice of Monte Carlo simulation codes. The first lectures provide an overview of Monte Carlo simulation methods, covering the transport equation, random sampling, computational geometry, collision physics, and statistics. The next lectures focus on the state-of-the-art in Monte Carlo criticality simulations, covering the theory of eigenvalue calculations, convergence analysis, dominance ratio calculations, bias in keff and tallies, bias in uncertainties, a case study of a realistic calculation, and Wielandt acceleration techniques. The remaining lectures cover advanced topics, including HTGR modeling and stochastic geometry, temperature dependence, fission energy deposition, depletion calculations, parallel calculations, and parameter studies. This portion of the class focuses on using MCNP to perform criticality calculations for reactor physics and criticality safety applications. It is an intermediate-level class, intended for those with at least some familiarity with MCNP. Class examples provide hands-on experience at running the code, plotting both geometry and results, and understanding the code output. The class includes lectures and hands-on computer use for a variety of Monte Carlo calculations. Beginning MCNP users are encouraged to review LA-UR-09-00380, "Criticality Calculations with MCNP: A Primer (3rd Edition)" (available at http://mcnp.lanl.gov under "Reference Collection") prior to the class. No Monte Carlo class can be complete without having students write their own simple Monte Carlo routines for basic random sampling, use of the random number generator, and simplified particle transport simulation.
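In the spirit of the class exercise mentioned above (random sampling plus a simplified transport simulation), a minimal, generic Python example is given below; it is a teaching sketch with arbitrary numbers, not an MCNP calculation.

    import math
    import random

    # Sample exponential flight distances through a purely absorbing slab and
    # tally the transmission probability (analytic answer: exp(-thickness)).
    def transmission(thickness_mfp=2.0, histories=100_000, seed=1):
        rng = random.Random(seed)
        transmitted = 0
        for _ in range(histories):
            distance = -math.log(1.0 - rng.random())   # mean-free-path units
            if distance > thickness_mfp:
                transmitted += 1
        return transmitted / histories

    print(transmission())   # about exp(-2) = 0.135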
Editorial independence at medical journals owned by professional associations: a survey of editors.
Davis, Ronald M; Müllner, Marcus
2002-10-01
The purpose of this study was to assess the degree of editorial independence at a sample of medical journals and the relationship between the journals and their owners. We surveyed the editors of 33 medical journals owned by not-for-profit organizations ("associations"), including 10 journals represented on the International Committee of Medical Journal Editors (nine of which are general medical journals) and a random sample of 23 specialist journals with high impact factors that are indexed by the Institute for Scientific Information. The main outcome measures were the authority to hire, fire, and oversee the work of the editor; the editor's tenure and financial compensation; control of the journal's budget; publication of material about the association; and the editor's perceptions about editorial independence and pressure over editorial content. Of the 33 editors, 23 (70%) reported having complete editorial freedom, and the remainder reported a high level of freedom (a score of > or = 8, where 10 equals complete editorial freedom and 1 equals no editorial freedom). Nevertheless, a substantial minority of editors reported having received at least some pressure in recent years over editorial content from the association's leadership (42%), senior staff (30%), or rank-and-file members (39%). The association's board of directors has the authority to hire (48%) or fire (55%) the editor for about half of the journals, and the editor reports to the board for 10 journals (30%). Twenty-three editors (70%) are appointed for a specific term (median term = 5 years). Three-fifths of the journals have no control over their profit, and the majority of journals use the association's legal counsel and/or media relations staff. Stronger safeguards are needed to give editors protection against pressure over editorial content, including written guarantees of editorial freedom and governance structures that support those guarantees. Strong safeguards are also needed because editors may have less freedom than they believe (especially if they have not yet tested their freedom in an area of controversy).
Reflections on 35 years with Applied Optics: outgoing editorial.
Mait, Joseph N
2014-10-20
Applied Optics' Editor-in-Chief, Joseph N. Mait, reflects on his experience as a reader, author, reviewer and eventual editor of the journal. Dr. Mait also introduces the incoming Editor-in-Chief, Ronald G. Driggers, and acknowledges outgoing Division Editor T.-C. Poon.
KEGGParser: parsing and editing KEGG pathway maps in Matlab.
Arakelyan, Arsen; Nersisyan, Lilit
2013-02-15
The KEGG pathway database is a collection of manually drawn pathway maps accompanied by KGML-format files intended for use in automatic analysis. KGML files, however, do not contain the information required for complete reproduction of all the events indicated in the static image of a pathway map. Several parsers and editors of KEGG pathways exist for processing KGML files. We introduce KEGGParser, a MATLAB-based tool for KEGG pathway parsing, semiautomatic fixing, editing, visualization and analysis in the MATLAB environment. It also works with Scilab. The source code is available at http://www.mathworks.com/matlabcentral/fileexchange/37561.
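KGML files are plain XML, so the parsing step that KEGGParser automates can be illustrated with a few lines of generic Python. This is not KEGGParser code; the element and attribute names follow the public KGML schema, and the file name is a placeholder.

    import xml.etree.ElementTree as ET

    # Generic KGML parsing sketch; "hsa04010.xml" is a placeholder file name.
    pathway = ET.parse("hsa04010.xml").getroot()          # <pathway> element
    nodes = {e.get("id"): e.get("name") for e in pathway.findall("entry")}
    edges = [(r.get("entry1"), r.get("entry2"), r.get("type"))
             for r in pathway.findall("relation")]
    print(len(nodes), "entries,", len(edges), "relations")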
2012-06-01
Department editor contacts include brianmccue@alum.mit.edu; Letters to the Editor, John Willis, Augustine Consulting, Inc., jwillis@aciedge.com; and Modeling and Simulation, James N. Bexfield, FS, OSD... concepts that are now being applied to modern analytical thinking. The tutorials are free to MORS members and $75 for the day for nonmembers. The... Overview of Agent-based Modeling and Simulation and Complex Adaptive Systems; Visual Data Analysis; Analyzing Combat Identification; Guidelines for...
On the development of radiation tolerant surveillance camera from consumer-grade components
NASA Astrophysics Data System (ADS)
Klemen, Ambrožič; Luka, Snoj; Lars, Öhlin; Jan, Gunnarsson; Niklas, Barringer
2017-09-01
This paper gives an overview of the process of designing a radiation-tolerant surveillance camera from consumer-grade components and commercially available particle shielding materials. This involves use of the Monte Carlo particle transport code MCNP6 and ENDF/B-VII.0 nuclear data libraries, as well as testing the physical electrical systems against γ radiation, using JSI TRIGA mk. II fuel elements as γ-ray sources. A new aluminum irradiation facility, 20 cm × 20 cm × 30 cm, with an electrical power and signal wire guide-tube to the reactor platform, was designed, constructed and used for irradiation of large electronic and optical component assemblies with activated fuel elements. Electronic components to be used in the camera were tested against γ radiation independently to determine their radiation tolerance. Several camera designs were proposed and simulated using MCNP to determine incident particle and dose attenuation factors. Data obtained from the measurements and MCNP simulations will be used to finalize the design of three surveillance camera models with different radiation tolerances.
Sheu, R J; Sheu, R D; Jiang, S H; Kao, C H
2005-01-01
Full-scale Monte Carlo simulations of the cyclotron room of the Buddhist Tzu Chi General Hospital were carried out to improve the original, inadequate maze design. Variance reduction techniques are indispensable in this study to facilitate the simulations needed to test a variety of shielding modification configurations. The TORT/MCNP manual coupling approach based on the Consistent Adjoint Driven Importance Sampling (CADIS) methodology has been used throughout this study. CADIS applies source and transport biasing in a consistent manner. With this method, the computational efficiency was increased significantly, by more than two orders of magnitude, and the statistical convergence was also improved compared to the unbiased Monte Carlo run. This paper describes the shielding problem encountered, the procedure for coupling the TORT and MCNP codes to accelerate the calculations, and the calculation results for the original and improved shielding designs. In order to verify the calculation results and seek additional accelerations, sensitivity studies on the space-dependent and energy-dependent parameters were also conducted.
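For reference, the CADIS prescription mentioned above ties both the biased source and the weight windows to a deterministic adjoint (importance) function; in its usual textbook form (not reproduced from this paper),

\[
  \hat{q}(\vec{r},E) = \frac{\phi^{\dagger}(\vec{r},E)\,q(\vec{r},E)}{R},
  \qquad R = \int \phi^{\dagger} q \; dV\,dE,
  \qquad \bar{w}(\vec{r},E) = \frac{R}{\phi^{\dagger}(\vec{r},E)},
\]

where q is the true source, \phi^{\dagger} the adjoint flux from the deterministic (here TORT) calculation, R the estimated detector response, and \bar{w} the weight-window target. Because the source biasing and the weight windows derive from the same adjoint, particles are born with weights already inside their windows, which is the consistency the method's name refers to.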
Chiavassa, S; Lemosquet, A; Aubineau-Lanièce, I; de Carlan, L; Clairand, I; Ferrer, L; Bardiès, M; Franck, D; Zankl, M
2005-01-01
This paper aims at comparing dosimetric assessments performed with three Monte Carlo codes: EGS4, MCNP4c2 and MCNPX2.5e, using a realistic voxel phantom, namely the Zubal phantom, in two configurations of exposure. The first one deals with an external irradiation corresponding to the example of a radiological accident. The results are obtained using the EGS4 and the MCNP4c2 codes and expressed in terms of the mean absorbed dose (in Gy per source particle) for brain, lungs, liver and spleen. The second one deals with an internal exposure corresponding to the treatment of a medullary thyroid cancer by 131I-labelled radiopharmaceutical. The results are obtained by EGS4 and MCNPX2.5e and compared in terms of S-values (expressed in mGy per kBq and per hour) for liver, kidney, whole body and thyroid. The results of these two studies are presented and differences between the codes are analysed and discussed.
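The S-values compared above follow the standard MIRD definition (general formalism, not specific to this intercomparison):

\[
  S(r_T \leftarrow r_S) = \frac{1}{M_T} \sum_i E_i\, Y_i\, \phi_i(r_T \leftarrow r_S),
\]

where E_i and Y_i are the energy and yield of emission i, \phi_i is the fraction of that emitted energy absorbed in the target region r_T per decay in the source region r_S, and M_T is the target mass. The Monte Carlo codes differ essentially in how they compute the absorbed fractions \phi_i on the voxel phantom.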
A comparison between EGS4 and MCNP computer modeling of an in vivo X-ray fluorescence system.
Al-Ghorabie, F H; Natto, S S; Al-Lyhiani, S H
2001-03-01
The Monte Carlo computer codes EGS4 and MCNP were used to develop a theoretical model of a 180 degrees geometry in vivo X-ray fluorescence system for the measurement of platinum concentration in head and neck tumors. The model included specification of the photon source, collimators, phantoms and detector. Theoretical results were compared and evaluated against X-ray fluorescence data obtained experimentally from an existing system developed by the Swansea In Vivo Analysis and Cancer Research Group. The EGS4 results agreed well with the MCNP results. However, agreement between the measured spectral shape obtained using the experimental X-ray fluorescence system and the simulated spectral shape obtained using the two Monte Carlo codes was relatively poor. The main reason for the disagreement arises from the basic assumptions both codes make in their calculations. Both codes assume a "free" electron model for Compton interactions. This assumption underestimates the results and invalidates comparisons between predicted and experimental spectra.
Shahmohammadi Beni, Mehrdad; Ng, C Y P; Krstic, D; Nikezic, D; Yu, K N
2017-01-01
Radiotherapy is a common cancer treatment module, where a certain amount of dose will be delivered to the targeted organ. This is achieved usually by photons generated by linear accelerator units. However, radiation scattering within the patient's body and the surrounding environment will lead to dose dispersion to healthy tissues which are not targets of the primary radiation. Determination of the dispersed dose would be important for assessing the risk and biological consequences in different organs or tissues. In the present work, the concept of conversion coefficient (F) of the dispersed dose was developed, in which F = (Dd/Dt), where Dd was the dispersed dose in a non-targeted tissue and Dt is the absorbed dose in the targeted tissue. To quantify Dd and Dt, a comprehensive model was developed using the Monte Carlo N-Particle (MCNP) package to simulate the linear accelerator head, the human phantom, the treatment couch and the radiotherapy treatment room. The present work also demonstrated the feasibility and power of parallel computing through the use of the Message Passing Interface (MPI) version of MCNP5.
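As a purely numerical illustration of the conversion coefficient defined above (the values are invented for the example, not taken from the study): if a simulation gives Dt = 2 Gy in the targeted tissue and Dd = 10 mGy in an out-of-field organ, then

\[
  F = \frac{D_d}{D_t} = \frac{0.01\ \mathrm{Gy}}{2\ \mathrm{Gy}} = 5\times 10^{-3},
\]

and the dispersed dose to that organ for any prescribed target dose can be estimated by multiplying the prescription by F.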
Background-Source Cosmic-Photon Elevation Scaling and Cosmic-Neutron/Photon Date Scaling in MCNP6
Tutt, James Robert; Anderson, Casey Alan; McKinney, Gregg Walter
2017-10-26
Here, cosmic neutron and photon fluxes are known to scale exponentially with elevation. Consequently, cosmic neutron elevation scaling was implemented for use with the background-source option shortly after its introduction into MCNP6, whereby the neutron flux weight factor was adjusted by the elevation scaling factor when the user-specified elevation differed from the selected background.dat grid-point elevation. At the same time, an elevation scaling factor was suggested for the cosmic photon flux; however, cosmic photon elevation scaling is complicated by the fact that the photon background consists of two components: cosmic and terrestrial. Previous versions of the background.dat file did not provide any way to separate these components. With Rel. 4 of this file in 2015, two new columns were added that provide the energy grid and differential cosmic photon flux separately from the total photon flux. Here we show that the cosmic photon flux component can now be scaled independently and combined with the terrestrial component to form the total photon flux at a user-specified elevation in MCNP6.
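One common way to express the exponential elevation dependence referred to above (a generic parameterization; the constants are not taken from this report) is

\[
  \phi(h) = \phi(h_0)\,\exp\!\left(\frac{p(h_0) - p(h)}{\Lambda}\right),
\]

where p is the atmospheric depth (pressure) at elevation h and \Lambda an effective attenuation length. In this picture, the flux weight factor adjustment described above corresponds to the ratio \phi(h)/\phi(h_0) between the user-specified elevation and the background.dat grid-point elevation.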
NASA Astrophysics Data System (ADS)
Ortego, Pedro; Rodriguez, Alain; Töre, Candan; Compadre, José Luis de Diego; Quesada, Baltasar Rodriguez; Moreno, Raul Orive
2017-09-01
In order to increase the storage capacity of the East Spent Fuel Pool at the Cofrentes NPP, located in Valencia province, Spain, the existing stainless steel storage racks were replaced by a new design of compact borated stainless steel racks allowing a 65% increase in fuel storage capacity. Calculation of the activation of the used racks was successfully performed with the MCNP4B code. Additionally, the dose rate in contact with a row of racks in standing position and behind a wall of shielding material has been calculated using the MCNP4B code as well. These results allowed a preliminary definition of the bunker required for the storage of the racks. Recently the activity in the racks has been recalculated with the SEACAB system, which combines the mesh tally of the MCNP codes with the activation code ACAB, applying the rigorous two-step method (R2S) developed in-house, benchmarked with FNG irradiation experiments and usually applied in fusion calculations for the ITER project.
Gamma-ray dose from an overhead plume
McNaughton, Michael W.; Gillis, Jessica McDonnel; Ruedig, Elizabeth; ...
2017-05-01
Standard plume models can underestimate the gamma-ray dose when most of the radioactive material is above the heads of the receptors. Typically, a model is used to calculate the air concentration at the height of the receptor, and the dose is calculated by multiplying the air concentration by a concentration-to-dose conversion factor. Models indicate that if the plume is emitted from a stack during stable atmospheric conditions, the lower edges of the plume may not reach the ground, in which case both the ground-level concentration and the dose are usually reported as zero. However, in such cases, the dose from overhead gamma-emitting radionuclides may be substantial. Such underestimates could impact decision making in emergency situations. The Monte Carlo N-Particle code, MCNP, was used to calculate the overhead shine dose and to compare with standard plume models. At long distances and during unstable atmospheric conditions, the MCNP results agree with the standard models. As a result, at short distances, where many models calculate zero, the true dose (as modeled by MCNP) can be estimated with simple equations.
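The standard submersion estimate that the abstract contrasts with the MCNP shine calculation is simply the product (generic formula, not taken from the paper)

\[
  D = \chi \cdot \mathrm{DCF},
\]

where \chi is the ground-level air concentration from the plume model and DCF is the concentration-to-dose conversion factor. When \chi = 0 because the plume has not yet touched down, this product is zero even though the overhead gamma shine dose is not, which is the underestimate the study quantifies with MCNP.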
New Editors Appointed for Water Resources Research
NASA Astrophysics Data System (ADS)
2009-03-01
Praveen Kumar (University of Illinois at Urbana-Champaign), the newly appointed editor in chief of Water Resources Research (WRR), heads the new team of editors for the journal. The other editors are Tom Torgersen (University of Connecticut, Groton), who continues his editorship; Tissa Illangasekare (Colorado School of Mines, Golden); Graham Sander (Loughborough University, Loughborough, UK); and John Selker (Oregon State University, Corvallis). Hoshin Gupta (University of Arizona, Tucson) will join WRR at the end of 2009. The new editors will begin receiving submissions immediately. The incoming editorial board thanks outgoing editors Marc Parlange, Brian Berkowitz, Amilcare Porporato, and Scott Tyler, all of whom will assist during the transition.
ScyFlow: An Environment for the Visual Specification and Execution of Scientific Workflows
NASA Technical Reports Server (NTRS)
McCann, Karen M.; Yarrow, Maurice; DeVivo, Adrian; Mehrotra, Piyush
2004-01-01
With the advent of grid technologies, scientists and engineers are building more and more complex applications to utilize distributed grid resources. The core grid services provide a path for accessing and utilizing these resources in a secure and seamless fashion. However, what the scientists need is an environment that will allow them to specify their application runs at a high organizational level, and then support efficient execution across any given set or sets of resources. We have been designing and implementing ScyFlow, a dual-interface architecture (both GUI and API) that addresses this problem. The scientist/user specifies the application tasks along with the necessary control and data flow, and monitors and manages the execution of the resulting workflow across the distributed resources. In this paper, we utilize two scenarios to provide the details of the two modules of the project, the visual editor and the runtime workflow engine.
Standards opportunities around data-bearing Web pages.
Karger, David
2013-03-28
The evolving Web has seen ever-growing use of structured data, thanks to the way it enhances information authoring, querying, visualization and sharing. To date, however, most structured data authoring and management tools have been oriented towards programmers and Web developers. End users have been left behind, unable to leverage structured data for information management and communication as well as professionals. In this paper, I will argue that many of the benefits of structured data management can be provided to end users as well. I will describe an approach and tools that allow end users to define their own schemas (without knowing what a schema is), manage data and author (not program) interactive Web visualizations of that data using the Web tools with which they are already familiar, such as plain Web pages, blogs, wikis and WYSIWYG document editors. I will describe our experience deploying these tools and some lessons relevant to their future evolution.
New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe two new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems, simple_ace.pl and simple_ace_mg.pl.
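For context, the kind of closed-form result such analytic test problems exercise can be as simple as the one-group infinite-medium multiplication factor; a minimal sketch follows, with invented one-group cross sections (this illustrates the class of problems, and is not code from the report).

    def k_infinity(nu, sigma_f, sigma_c):
        """One-group infinite-medium multiplication factor:
        k_inf = nu * Sigma_f / Sigma_a, where Sigma_a = Sigma_f + Sigma_c."""
        return nu * sigma_f / (sigma_f + sigma_c)

    # Invented one-group macroscopic cross sections (1/cm)
    print(f"k_inf = {k_infinity(nu=2.5, sigma_f=0.05, sigma_c=0.07):.4f}")  # ~1.0417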
Calculation of the effective dose from natural radioactivity in soil using MCNP code.
Krstic, D; Nikezic, D
2010-01-01
The effective dose delivered by photons emitted from natural radioactivity in soil was calculated in this work. Calculations have been done for the most common natural radionuclides in soil: the (238)U and (232)Th series and (40)K. ORNL human phantoms and the Monte Carlo transport code MCNP-4B were employed to calculate the energy deposited in all organs. The effective dose was calculated according to ICRP 74 recommendations. Conversion factors of effective dose per air kerma were determined. Results obtained here were compared with those of other authors. Copyright 2009 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chapman, Bryan Scott; Gough, Sean T.
This report documents a validation of the MCNP6 Version 1.0 computer code on the high performance computing platform Moonlight, for operations at Los Alamos National Laboratory (LANL) that involve plutonium metals, oxides, and solutions. The validation is conducted using the ENDF/B-VII.1 continuous-energy cross-section library at room temperature. The results are for use by nuclear criticality safety personnel in performing analysis and evaluation of various facility activities involving plutonium materials.
Severe accident skyshine radiation analysis by MCNP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eurajoki, T.
1994-12-31
If a severe accident with considerable core damage occurs at a nuclear power plant whose containment top is markedly thinner than the walls, the radiation transported through the top and scattered in air may cause high dose rates in the power plant area. Noble gases and other fission products released to the containment act as sources. The dose rates caused by skyshine have been calculated with MCNP3A for the Loviisa nuclear power plant (a two-unit, 445-MW VVER) for the outside area and inside some buildings, taking into account the attenuation in the roofs of the buildings.
Hybrid Skyshine Calculations for Complex Neutron and Gamma-Ray Sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shultis, J. Kenneth
2000-10-15
A two-step hybrid method is described for computationally efficient estimation of neutron and gamma-ray skyshine doses far from a shielded source. First, the energy and angular dependence of radiation escaping into the atmosphere from a source containment is determined by a detailed transport model such as MCNP. Then, an effective point source with this energy and angular dependence is used in the integral line-beam method to transport the radiation through the atmosphere up to 2500 m from the source. An example spent-fuel storage cask is analyzed with this hybrid method and compared to detailed MCNP skyshine calculations.
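A schematic sketch of the second step of this hybrid approach follows, assuming the detailed transport calculation has already been collapsed to an effective point source binned in energy and emission angle, and that a line-beam response function is available; the functional form and all numbers below are placeholders, not the paper's fitted values.

    import numpy as np

    def lbrf(E_MeV, phi_rad, x_m):
        # Placeholder line-beam response (dose per source photon at ground range x).
        # A real analysis would use the fitted form from the skyshine literature,
        # which also depends on the emission angle phi (ignored in this placeholder).
        kappa, a, b = 1.0e-18, 1.1, 3.0e-3   # invented parameters
        return kappa * E_MeV * x_m**a * np.exp(-b * x_m)

    def skyshine_dose_rate(x_m, energies_MeV, angles_rad, emission_rates):
        # Fold the effective point-source emission (photons/s per energy-angle bin,
        # taken from the detailed transport calculation) with the response function.
        total = 0.0
        for i, E in enumerate(energies_MeV):
            for j, phi in enumerate(angles_rad):
                total += emission_rates[i, j] * lbrf(E, phi, x_m)
        return total

    energies = np.array([1.0, 2.0])            # MeV (placeholder bins)
    angles = np.radians([10.0, 45.0, 80.0])    # polar emission angles
    emission = np.full((2, 3), 1.0e10)         # photons/s per bin (placeholder)
    print(f"dose rate at 500 m: {skyshine_dose_rate(500.0, energies, angles, emission):.3e} (arb. units)")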
Parameter dependence of the MCNP electron transport in determining dose distributions.
Reynaert, N; Palmans, H; Thierens, H; Jeraj, R
2002-10-01
In this paper, a detailed study of the electron transport in MCNP is performed, separating the effects of the energy binning technique on the energy loss rate, the scattering angles, and the sub-step length as a function of energy. As this problem is already well known, in this paper we focus on the explanation as to why the default mode of MCNP can lead to large deviations. The resolution dependence was investigated as well. An error in the MCNP code in the energy binning technique in the default mode (DBCN 18 card = 0) was revealed, more specifically in the updating of cross sections when a sub-step is performed corresponding to a high energy loss. This updating error is not present in the ITS mode (DBCN 18 card = 1) and leads to a systematically lower dose deposition rate in the default mode. The effect is present for all energies studied (0.5-10 MeV) and depends on the geometrical resolution of the scoring regions and the energy grid resolution. The effect of the energy binning technique is of the same order as that of the updating error for energies below 2 MeV, and becomes less important for higher energies. For a 1 MeV point source surrounded by homogeneous water, the deviation of the default MCNP results at short distances reaches 9% and remains approximately the same for all energies. This effect could be corrected by removing the completion of an energy step each time an electron changes energy bins during a sub-step. Another solution consists of performing all calculations in the ITS mode. Another problem is the resolution dependence, which exists even in the ITS mode. The higher the resolution (the smaller the scoring regions), the faster the energy is deposited along the electron track. It is shown that this is caused by starting a new energy step when crossing a surface. The resolution effect should be investigated for every specific case when calculating dose distributions around beta sources. The resolution should not be higher than 0.85*(1-EFAC)*CSDA, where EFAC is the energy loss per energy step and CSDA the continuous-slowing-down-approximation range. This effect could also be removed by determining the cross sections for energy loss and multiple scattering at the average energy of an energy step and by sampling the cross sections for each sub-step. Overall, we conclude that MCNP cannot be used without caution due to possible errors in the electron transport. When care is taken, it is possible to obtain correct results that are in agreement with other Monte Carlo codes.
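A small helper illustrating the scoring-resolution guideline quoted above (region size no larger than 0.85*(1-EFAC)*CSDA range); the EFAC and CSDA values used in the example are approximate placeholders.

    def max_scoring_resolution(efac, csda_range_cm):
        """Suggested upper bound on scoring-region size: 0.85 * (1 - EFAC) * CSDA range."""
        return 0.85 * (1.0 - efac) * csda_range_cm

    # Example with EFAC of about 0.917 (roughly the MCNP default energy-step ratio)
    # and a CSDA range of about 0.44 cm for ~1 MeV electrons in water (approximate).
    print(f"max scoring cell size: {max_scoring_resolution(0.917, 0.44):.3f} cm")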
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shilling, J.
1984-02-01
FRED, the friendly editor, is a screen-based structured editor. This manual is intended to serve the needs of a wide range of users of the FRED text editor. Most users will find it sufficient to read the introductory material in section 2, supplemented with the full command set description in section 3. Advanced users may wish to change the keystroke sequences which invoke editor commands. Section 4 describes how to change key bindings and how to define command macros. Some users may need to modify a language description or create an entirely new language description for use with FRED. Section 5 describes the format of the language descriptions used by the editor, and describes how to construct a language grammar. Section 6 describes known portability problems of the FRED editor and should concern only system installation personnel. The editor points out syntax errors in the file being edited and does automatic pretty printing.
Wikipedia and Medicine: Quantifying Readership, Editors, and the Significance of Natural Language
West, Andrew G
2015-01-01
Background: Wikipedia is a collaboratively edited encyclopedia. One of the most popular websites on the Internet, it is known to be a frequently used source of health care information by both professionals and the lay public. Objective: This paper quantifies the production and consumption of Wikipedia’s medical content along 4 dimensions. First, we measured the amount of medical content in both articles and bytes and, second, the citations that supported that content. Third, we analyzed the medical readership against that of other health care websites, across Wikipedia’s natural language editions, and in relation to disease prevalence. Fourth, we surveyed the quantity/characteristics of Wikipedia’s medical contributors, including year-over-year participation trends and editor demographics. Methods: Using a well-defined categorization infrastructure, we identified medically pertinent English-language Wikipedia articles and links to their foreign language equivalents. With these, Wikipedia can be queried to produce metadata and full texts for entire article histories. Wikipedia also makes available hourly reports that aggregate reader traffic at per-article granularity. An online survey was used to determine the background of contributors. Standard mining and visualization techniques (eg, aggregation queries, cumulative distribution functions, and/or correlation metrics) were applied to each of these datasets. Analysis focused on year-end 2013, but historical data permitted some longitudinal analysis. Results: Wikipedia’s medical content (at the end of 2013) was made up of more than 155,000 articles and 1 billion bytes of text across more than 255 languages. This content was supported by more than 950,000 references. Content was viewed more than 4.88 billion times in 2013. This makes it one of the most viewed medical resources globally, if not the most viewed. The core editor community numbered less than 300 and declined over the past 5 years. Half of the members of this community were health care providers, and 85.5% (100/117) had a university education. Conclusions: Although Wikipedia has a considerable volume of multilingual medical content that is extensively read and well-referenced, the core group of editors that contribute and maintain that content is small and shrinking in size. PMID:25739399
Wikipedia and medicine: quantifying readership, editors, and the significance of natural language.
Heilman, James M; West, Andrew G
2015-03-04
Wikipedia is a collaboratively edited encyclopedia. One of the most popular websites on the Internet, it is known to be a frequently used source of health care information by both professionals and the lay public. This paper quantifies the production and consumption of Wikipedia's medical content along 4 dimensions. First, we measured the amount of medical content in both articles and bytes and, second, the citations that supported that content. Third, we analyzed the medical readership against that of other health care websites, across Wikipedia's natural language editions, and in relation to disease prevalence. Fourth, we surveyed the quantity/characteristics of Wikipedia's medical contributors, including year-over-year participation trends and editor demographics. Using a well-defined categorization infrastructure, we identified medically pertinent English-language Wikipedia articles and links to their foreign language equivalents. With these, Wikipedia can be queried to produce metadata and full texts for entire article histories. Wikipedia also makes available hourly reports that aggregate reader traffic at per-article granularity. An online survey was used to determine the background of contributors. Standard mining and visualization techniques (eg, aggregation queries, cumulative distribution functions, and/or correlation metrics) were applied to each of these datasets. Analysis focused on year-end 2013, but historical data permitted some longitudinal analysis. Wikipedia's medical content (at the end of 2013) was made up of more than 155,000 articles and 1 billion bytes of text across more than 255 languages. This content was supported by more than 950,000 references. Content was viewed more than 4.88 billion times in 2013. This makes it one of the most viewed medical resources globally, if not the most viewed. The core editor community numbered less than 300 and declined over the past 5 years. Half of the members of this community were health care providers, and 85.5% (100/117) had a university education. Although Wikipedia has a considerable volume of multilingual medical content that is extensively read and well-referenced, the core group of editors that contribute and maintain that content is small and shrinking in size.
EDT mode for JED -- An advanced Unix text editor
NASA Astrophysics Data System (ADS)
McIlwrath, B. K.; Page, C. G.
This note describes Starlink extended EDT emulation for the JED editor. It provides a Unix text editor which can utilise the advanced facilities of DEC VTn00, xterm and similar terminals. JED in this mode provides a reasonably good emulation of the VAX/VMS editor EDT in addition to many extra facilities.
Becoming an Online Editor: Perceived Roles and Responsibilities of Wikipedia Editors
ERIC Educational Resources Information Center
Littlejohn, Allison; Hood, Nina
2018-01-01
Introduction: We report on the experiences of a group of people as they become Wikipedia editors. We test Benkler's (2002) theory that commons-based production processes accelerate the creation of capital, asking what knowledge production processes people engage in as they become editors. The analysis positions the development of editing…
Emerging Perspectives on Editorial Ethics: An Interview with Chris Higgins
ERIC Educational Resources Information Center
Jackson, Liz
2017-01-01
Chris Higgins took on the roles of Editor of "Educational Theory," and Editor-in-Chief of the "Philosophy of Education Yearbook" published by the Philosophy of Education Society, in 2013, after having been an Associate Editor and Book Review Editor for "Educational Theory" for six years. Higgins worked closely with…
Authors and editors assort on gender and geography in high-rank ecological publications
Belou, Rebecca M.
2018-01-01
Peer-reviewed publication volume and caliber are widely recognized proxies for academic merit, and a strong publication record is essential for academic success and advancement. However, recent work suggests that publication productivity for particular author groups may also be determined in part by implicit biases lurking in the publication pipeline. Here, we explore patterns of gender, geography, and institutional rank among authors, editorial board members, and handling editors in high-impact ecological publications during 2015 and 2016. A higher proportion of lead authors had female first names (33.9%) than editorial board members (28.9%), and the proportion of female first names among handling editors was even lower (21.1%). Female editors disproportionately edited publications with female lead authors (40.3% of publications with female lead authors were handled by female editors, though female editors handled only 34.4% of all studied publications). Additionally, ecological authors and editors were overwhelmingly from countries in the G8, and high-ranking academic institutions accounted for a large portion of both the published work and its editorship. Editors and lead authors with female names were typically affiliated with higher-ranking institutions than their male peers. This description of author and editor features provides a baseline for benchmarking future trends in the ecological publishing culture. PMID:29420647
Authors and editors assort on gender and geography in high-rank ecological publications.
Manlove, Kezia R; Belou, Rebecca M
2018-01-01
Peer-reviewed publication volume and caliber are widely recognized proxies for academic merit, and a strong publication record is essential for academic success and advancement. However, recent work suggests that publication productivity for particular author groups may also be determined in part by implicit biases lurking in the publication pipeline. Here, we explore patterns of gender, geography, and institutional rank among authors, editorial board members, and handling editors in high-impact ecological publications during 2015 and 2016. A higher proportion of lead authors had female first names (33.9%) than editorial board members (28.9%), and the proportion of female first names among handling editors was even lower (21.1%). Female editors disproportionately edited publications with female lead authors (40.3% of publications with female lead authors were handled by female editors, though female editors handled only 34.4% of all studied publications). Additionally, ecological authors and editors were overwhelmingly from countries in the G8, and high-ranking academic institutions accounted for a large portion of both the published work and its editorship. Editors and lead authors with female names were typically affiliated with higher-ranking institutions than their male peers. This description of author and editor features provides a baseline for benchmarking future trends in the ecological publishing culture.
JSME: a free molecule editor in JavaScript.
Bienfait, Bruno; Ertl, Peter
2013-01-01
A molecule editor, i.e. a program facilitating graphical input and interactive editing of molecules, is an indispensable part of every cheminformatics or molecular processing system. Today, when a web browser has become the universal scientific user interface, a tool to edit molecules directly within the web browser is essential. One of the most popular tools for molecular structure input on the web is the JME applet. Since its release nearly 15 years ago, however, the web environment has changed and Java applets are facing increasing implementation hurdles due to their maintenance and support requirements, as well as security issues. This prompted us to update the JME editor and port it to a modern Internet programming language - JavaScript. The actual molecule editing Java code of the JME editor was translated into JavaScript with the help of the Google Web Toolkit compiler and a custom library that emulates a subset of the GUI features of the Java runtime environment. In this process, the editor was enhanced by additional functionalities including a substituent menu, copy/paste, drag and drop and undo/redo capabilities and an integrated help. In addition to desktop computers, the editor supports molecule editing on touch devices, including iPhone, iPad and Android phones and tablets. In analogy to JME, the new editor is named JSME. This new molecule editor is compact, easy to use and easy to incorporate into web pages. A free molecule editor written in JavaScript was developed and is released under the terms of a permissive BSD license. The editor is compatible with JME, has practically the same user interface as well as the web application programming interface. The JSME editor is available for download from the project web page http://peter-ertl.com/jsme/
Lung Dosimetry for Radioiodine Treatment Planning in the Case of Diffuse Lung Metastases
Song, Hong; He, Bin; Prideaux, Andrew; Du, Yong; Frey, Eric; Kasecamp, Wayne; Ladenson, Paul W.; Wahl, Richard L.; Sgouros, George
2010-01-01
The lungs are the most frequent sites of distant metastasis in differentiated thyroid carcinoma. Radioiodine treatment planning for these patients is usually performed following the Benua–Leeper method, which constrains the administered activity to 2.96 GBq (80 mCi) whole-body retention at 48 h after administration to prevent lung toxicity in the presence of iodine-avid lung metastases. This limit was derived from clinical experience, and a dosimetric analysis of lung and tumor absorbed dose would be useful to understand the implications of this limit on toxicity and tumor control. Because of highly nonuniform lung density and composition as well as the nonuniform activity distribution when the lungs contain tumor nodules, Monte Carlo dosimetry is required to estimate tumor and normal lung absorbed dose. Reassessment of this toxicity limit is also appropriate in light of the contemporary use of recombinant thyrotropin (thyroid-stimulating hormone) (rTSH) to prepare patients for radioiodine therapy. In this work we demonstrated the use of MCNP, a Monte Carlo electron and photon transport code, in a 3-dimensional (3D) imaging-based absorbed dose calculation for tumor and normal lungs. Methods: A pediatric thyroid cancer patient with diffuse lung metastases was administered 37 MBq of 131I after preparation with rTSH. SPECT/CT scans were performed over the chest at 27, 74, and 147 h after tracer administration. The time–activity curve for 131I in the lungs was derived from the whole-body planar imaging and compared with that obtained from the quantitative SPECT methods. Reconstructed and coregistered SPECT/CT images were converted into 3D density and activity probability maps suitable for MCNP4b input. Absorbed dose maps were calculated using electron and photon transport in MCNP4b. Administered activity was estimated on the basis of the maximum tolerated dose (MTD) of 27.25 Gy to the normal lungs. Computational efficiency of the MCNP4b code was studied with a simple segmentation approach. In addition, the Benua–Leeper method was used to estimate the recommended administered activity. The standard dosing plan was modified to account for the weight of this pediatric patient, where the 2.96-GBq (80 mCi) whole-body retention was scaled to 2.44 GBq (66 mCi) to give the same dose rate of 43.6 rad/h in the lungs at 48 h. Results: Using the MCNP4b code, both the spatial dose distribution and a dose–volume histogram were obtained for the lungs. An administered activity of 1.72 GBq (46.4 mCi) delivered the putative MTD of 27.25 Gy to the lungs with a tumor absorbed dose of 63.7 Gy. Directly applying the Benua–Leeper method, an administered activity of 3.89 GBq (105.0 mCi) was obtained, resulting in tumor and lung absorbed doses of 144.2 and 61.6 Gy, respectively, when the MCNP-based dosimetry was applied. The voxel-by-voxel calculation time of 4,642.3 h for photon transport was reduced to 16.8 h when the activity maps were segmented into 20 regions. Conclusion: MCNP4b-based, patient-specific 3D dosimetry is feasible and important in the dosimetry of thyroid cancer patients with avid lung metastases that exhibit prolonged retention in the lungs. PMID:17138741
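The dose-based rescaling described in the results can be reproduced from the quoted numbers with simple proportional arithmetic, assuming lung dose scales linearly with administered activity; the sketch below recovers the ~1.72 GBq value from the Benua-Leeper activity and its MCNP-computed lung dose.

    def activity_for_mtd(mtd_gy, ref_activity_gbq, ref_lung_dose_gy):
        """Rescale a reference administered activity so the normal-lung dose equals the MTD,
        assuming lung dose is proportional to administered activity."""
        return ref_activity_gbq * mtd_gy / ref_lung_dose_gy

    # Numbers quoted in the abstract: the Benua-Leeper plan (3.89 GBq) gives a 61.6-Gy lung
    # dose in the MCNP dosimetry; rescaling to the 27.25-Gy MTD recovers about 1.72 GBq.
    print(f"{activity_for_mtd(27.25, ref_activity_gbq=3.89, ref_lung_dose_gy=61.6):.2f} GBq")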
Preparing Students To Work on Newspaper Copy Desks: Are Educators Meeting Editors' Expectations?
ERIC Educational Resources Information Center
Auman, Ann E.; Cook, Betsy B.
A study surveyed two groups in the fall of 1994, journalism educators and newspaper editors. Educators completed a survey regarding the course content and skill areas emphasized in beginning level copy editing courses, while editors were asked to respond to questions regarding the skills they expect entry-level copy editors to have. Respondents…
Web-Based Media Contents Editor for UCC Websites
NASA Astrophysics Data System (ADS)
Kim, Seoksoo
The purpose of this research is to "design web-based media contents editor for establishing UCC(User Created Contents)-based websites." The web-based editor features user-oriented interfaces and increased convenience, significantly different from previous off-line editors. It allows users to edit media contents online and can be effectively used for online promotion activities of enterprises and organizations. In addition to development of the editor, the research aims to support the entry of enterprises and public agencies to the online market by combining the technology with various UCC items.
The MCNP6 Analytic Criticality Benchmark Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
2016-06-16
Analytical benchmarks provide an invaluable tool for verifying computer codes used to simulate neutron transport. Several collections of analytical benchmark problems [1-4] are used routinely in the verification of production Monte Carlo codes such as MCNP® [5,6]. Verification of a computer code is a necessary prerequisite to the more complex validation process. The verification process confirms that a code performs its intended functions correctly. The validation process involves determining the absolute accuracy of code results vs. nature. In typical validations, results are computed for a set of benchmark experiments using a particular methodology (code, cross-section data with uncertainties, and modeling) and compared to the measured results from the set of benchmark experiments. The validation process determines bias, bias uncertainty, and possibly additional margins. Verification is generally performed by the code developers, while validation is generally performed by code users for a particular application space. The VERIFICATION_KEFF suite of criticality problems [1,2] was originally a set of 75 criticality problems found in the literature for which exact analytical solutions are available. Even though the spatial and energy detail is necessarily limited in analytical benchmarks, typically to a few regions or energy groups, the exact solutions obtained can be used to verify that the basic algorithms, mathematics, and methods used in complex production codes perform correctly. The present work has focused on revisiting this benchmark suite. A thorough review of the problems resulted in discarding some of them as not suitable for MCNP benchmarking. For the remaining problems, many of them were reformulated to permit execution in either multigroup mode or in the normal continuous-energy mode for MCNP. Execution of the benchmarks in continuous-energy mode provides a significant advance to MCNP verification methods.
Background-Source Cosmic-Photon Elevation Scaling and Cosmic-Neutron/Photon Date Scaling in MCNP6
NASA Astrophysics Data System (ADS)
Tutt, J.; Anderson, C.; McKinney, G.
Cosmic neutron and photon fluxes are known to scale exponentially with elevation. Consequently, cosmic neutron elevation scaling was implemented for use with the background-source option shortly after its introduction into MCNP6, whereby the neutron flux weight factor was adjusted by the elevation scaling factor when the user-specified elevation differed from the selected background.dat grid-point elevation. At the same time, an elevation scaling factor was suggested for the cosmic photon flux, however, cosmic photon elevation scaling is complicated by the fact that the photon background consists of two components: cosmic and terrestrial. Previous versions of the background.dat file did not provide any way to separate these components. With Rel. 4 of this file in 2015, two new columns were added that provide the energy grid and differential cosmic photon flux separately from the total photon flux. Here we show that the cosmic photon flux component can now be scaled independently and combined with the terrestrial component to form the total photon flux at a user-specified elevation in MCNP6. Cosmic background fluxes also scale with the solar cycle due to solar modulation. This modulation has been shown to be nearly sinusoidal over time, with an inverse effect - increased modulation leads to a decrease in cosmic fluxes. This effect was initially included with the cosmic source option in MCNP6 and has now been extended for use with the background source option when: (1) the date is specified in the background.dat file, and (2) when the user specifies a date on the source definition card. A description of the cosmic-neutron/photon date scaling feature will be presented along with scaling results for past and future date extrapolations.
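A toy sketch of the two scalings described above: exponential scaling of the cosmic flux component with elevation, and a near-sinusoidal solar-cycle modulation with date. The scale height, modulation depth, cycle length, reference date and flux values are invented placeholders, not values taken from background.dat or MCNP6.

    import math

    def elevation_scaled_flux(flux_ref, elev_m, elev_ref_m, scale_height_m=1500.0):
        """Exponential elevation scaling of a cosmic flux component (placeholder scale height)."""
        return flux_ref * math.exp((elev_m - elev_ref_m) / scale_height_m)

    def solar_cycle_factor(year, year_of_max=2014.0, cycle_years=11.0, depth=0.2):
        """Near-sinusoidal solar-cycle modulation; increased modulation (solar maximum)
        decreases the cosmic flux, hence the minus sign. All parameters are placeholders."""
        phase = 2.0 * math.pi * (year - year_of_max) / cycle_years
        return 1.0 - depth * math.cos(phase)

    cosmic = elevation_scaled_flux(flux_ref=0.006, elev_m=2200.0, elev_ref_m=1000.0)
    terrestrial = 0.004                      # unscaled terrestrial component (placeholder)
    total = cosmic * solar_cycle_factor(2017.5) + terrestrial
    print(f"total photon flux (arbitrary units): {total:.4f}")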
Monte Carlo N Particle code - Dose distribution of clinical electron beams in inhomogeneous phantoms
Nedaie, H. A.; Mosleh-Shirazi, M. A.; Allahverdi, M.
2013-01-01
Electron dose distributions calculated using the currently available analytical methods can be associated with large uncertainties. The Monte Carlo method is the most accurate method for dose calculation in electron beams. Most of the clinical electron beam simulation studies have been performed using non-MCNP [Monte Carlo N Particle] codes. Given the differences between Monte Carlo codes, this work aims to evaluate the accuracy of MCNP4C-simulated electron dose distributions in a homogeneous phantom and around inhomogeneities. Different types of phantoms ranging in complexity were used; namely, a homogeneous water phantom and phantoms made of polymethyl methacrylate slabs containing different-sized, low- and high-density inserts of heterogeneous materials. Electron beams with 8 and 15 MeV nominal energy generated by an Elekta Synergy linear accelerator were investigated. Measurements were performed for a 10 cm × 10 cm applicator at a source-to-surface distance of 100 cm. Individual parts of the beam-defining system were introduced into the simulation one at a time in order to show their effect on depth doses. In contrast to the first scattering foil, the secondary scattering foil, X and Y jaws and applicator provide up to 5% of the dose. Agreement of 2%/2 mm between MCNP and measurements was found in the homogeneous phantom; in the presence of heterogeneities the differences were in the range of 1-3%, generally within 2% of the measurements for both energies in a "complex" phantom. A full-component simulation is necessary in order to obtain a realistic model of the beam. The MCNP4C results agree well with the measured electron dose distributions. PMID:23533162
Monte Carlo simulation of x-ray spectra in diagnostic radiology and mammography using MCNP4C
NASA Astrophysics Data System (ADS)
Ay, M. R.; Shahriari, M.; Sarkar, S.; Adib, M.; Zaidi, H.
2004-11-01
The general purpose Monte Carlo N-particle radiation transport computer code (MCNP4C) was used for the simulation of x-ray spectra in diagnostic radiology and mammography. The electrons were transported until they slow down and stop in the target. Both bremsstrahlung and characteristic x-ray production were considered in this work. We focus on the simulation of various target/filter combinations to investigate the effect of tube voltage, target material and filter thickness on x-ray spectra in the diagnostic radiology and mammography energy ranges. The simulated x-ray spectra were compared with experimental measurements and spectra calculated by IPEM report number 78. In addition, the anode heel effect and off-axis x-ray spectra were assessed for different anode angles and target materials and the results were compared with EGS4-based Monte Carlo simulations and measured data. Quantitative evaluation of the differences between our Monte Carlo simulated and comparison spectra was performed using Student's t-test statistical analysis. Generally, there is a good agreement between the simulated x-ray and comparison spectra, although there are systematic differences between the simulated and reference spectra especially in the K-characteristic x-ray intensity. Nevertheless, no statistically significant differences have been observed between IPEM spectra and the simulated spectra. It has been shown that the difference between MCNP simulated spectra and IPEM spectra in the low energy range is the result of the overestimation of characteristic photons following the normalization procedure. The transmission curves produced by MCNP4C show good agreement with the IPEM report, especially for tube voltages of 50 kV and 80 kV. The systematic discrepancy for higher tube voltages is the result of systematic differences between the corresponding spectra.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 29 Labor 3 2013-07-01 2013-07-01 false “News editor.” 793.8 Section 793.8 Labor Regulations... Exemption § 793.8 “News editor.” A news editor is an employee who gathers, edits and rewrites the news. He may also select and prepare news items for broadcast and present the news on the air. An employee who...
Code of Federal Regulations, 2011 CFR
2011-07-01
... 29 Labor 3 2011-07-01 2011-07-01 false “News editor.” 793.8 Section 793.8 Labor Regulations... Exemption § 793.8 “News editor.” A news editor is an employee who gathers, edits and rewrites the news. He may also select and prepare news items for broadcast and present the news on the air. An employee who...
Code of Federal Regulations, 2012 CFR
2012-07-01
... 29 Labor 3 2012-07-01 2012-07-01 false “News editor.” 793.8 Section 793.8 Labor Regulations... Exemption § 793.8 “News editor.” A news editor is an employee who gathers, edits and rewrites the news. He may also select and prepare news items for broadcast and present the news on the air. An employee who...
Code of Federal Regulations, 2010 CFR
2010-07-01
... 29 Labor 3 2010-07-01 2010-07-01 false “News editor.” 793.8 Section 793.8 Labor Regulations... Exemption § 793.8 “News editor.” A news editor is an employee who gathers, edits and rewrites the news. He may also select and prepare news items for broadcast and present the news on the air. An employee who...
Code of Federal Regulations, 2014 CFR
2014-07-01
... 29 Labor 3 2014-07-01 2014-07-01 false “News editor.” 793.8 Section 793.8 Labor Regulations... Exemption § 793.8 “News editor.” A news editor is an employee who gathers, edits and rewrites the news. He may also select and prepare news items for broadcast and present the news on the air. An employee who...
ERIC Educational Resources Information Center
Xun, Gong
2007-01-01
Against the backdrop of the grave academic crisis in China, editors have become the objects of wooing, favor-currying, connections-seeking, and collusions; they have been targeted for attacks, plots, extortions, and encroachments. Editing and publishing have become avenues for academic irregularities and academic corruption. Editors have the power…
Efficiency of whole-body counter for various body size calculated by MCNP5 software.
Krstic, D; Nikezic, D
2012-11-01
The efficiency of a whole-body counter for (137)Cs and (40)K was calculated using the MCNP5 code. ORNL phantoms of the human body of different sizes were used in a sitting position in front of a detector. The aim was to investigate the dependence of efficiency on body size (age) and on the detector position with respect to the body, and to estimate the accuracy of real measurements. The calculation work presented here relates to the NaI detector available in the Serbian Whole-body Counter facility at the Vinca Institute.
Shielding analysis of the Microtron MT-25 bunker using the MCNP-4C code and NCRP Report 51.
Casanova, A O; López, N; Gelen, A; Guevara, M V Manso; Díaz, O; Cimino, L; D'Alessandro, K; Melo, J C
2004-01-01
A cyclic electron accelerator, the Microtron MT-25, will be installed in Havana, Cuba. Electrons, neutrons and gamma radiation up to 25 MeV can be produced in the MT-25. A detailed shielding analysis for the bunker is carried out in two ways: with the NCRP-51 Report methodology and with the Monte Carlo method (MCNP-4C code). The wall and ceiling thicknesses are estimated with dose constraints of 0.5 and 20 mSv y(-1), respectively, and an area occupancy factor of 1/16. Both results are compared and a preliminary bunker design is shown. Copyright 2004 Oxford University Press
Dose mapping using MCNP code and experiment for SVST-Co-60/B irradiator in Vietnam.
Tran, Van Hung; Tran, Khac An
2010-06-01
Using the MCNP code and ethanol-chlorobenzene (ECB) dosimeters, simulations and measurements of the absorbed dose distribution in a tote-box of the Cobalt-60 irradiator SVST-Co60/B at VINAGAMMA have been performed. Based on the results, the Dose Uniformity Ratios (DUR), the positions and values of the minimum and maximum dose extremes in a tote-box, and the efficiency of the irradiator for different dummy densities have been obtained. The simulation and experimental results are in good agreement, and they are of value for the operation of the irradiator. Copyright 2010 Elsevier Ltd. All rights reserved.
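A small sketch of how a dose uniformity ratio and the dose extremes might be extracted from a simulated or measured dose map of a tote-box, taking the DUR as the ratio of maximum to minimum absorbed dose; the dose values are invented.

    import numpy as np

    def dose_uniformity_ratio(dose_map):
        """Return DUR = D_max / D_min over the dose map and the extreme positions."""
        d = np.asarray(dose_map, dtype=float)
        i_max = np.unravel_index(np.argmax(d), d.shape)
        i_min = np.unravel_index(np.argmin(d), d.shape)
        return d[i_max] / d[i_min], i_max, i_min

    # Invented 3 x 3 x 2 grid of absorbed doses (kGy) in a tote-box
    dose_map = [[[25.1, 24.0], [26.3, 25.5], [24.8, 23.9]],
                [[27.0, 26.1], [28.4, 27.2], [26.6, 25.7]],
                [[25.4, 24.3], [26.8, 25.9], [25.0, 24.1]]]
    dur, pos_max, pos_min = dose_uniformity_ratio(dose_map)
    print(f"DUR = {dur:.2f}, max at {pos_max}, min at {pos_min}")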
Radulović, Vladimir; Štancar, Žiga; Snoj, Luka; Trkov, Andrej
2014-02-01
The calculation of axial neutron flux distributions with the MCNP code at the JSI TRIGA Mark II reactor has been validated with experimental measurements of the (197)Au(n,γ)(198)Au reaction rate. The calculated absolute reaction rate values, scaled according to the reactor power and corrected for the flux redistribution effect, are in good agreement with the experimental results. The effect of different cross-section libraries on the calculations has been investigated and shown to be minor. Copyright © 2013 Elsevier Ltd. All rights reserved.
Covariance Data File Formats for Whisper-1.0 & Whisper-1.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.; Rising, Michael Evan
2017-01-09
Whisper is a statistical analysis package developed in 2014 to support nuclear criticality safety (NCS) validation. It uses the sensitivity profile data for an application as computed by MCNP6 along with covariance files for the nuclear data to determine a baseline upper-subcritical-limit (USL) for the application. Whisper version 1.0 was first developed and used at LANL in 2014. During 2015-2016, Whisper was updated to version 1.1 and is to be included with the upcoming release of MCNP6.2. This report describes the file formats used for the covariance data in both Whisper-1.0 and Whisper-1.1.
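For orientation, the nuclear-data contribution to k-eff uncertainty in this kind of analysis follows the familiar "sandwich rule", var(k)/k^2 = S^T C S, combining a sensitivity profile S with a relative covariance matrix C; the sketch below is a generic illustration with invented arrays, not Whisper's file handling or USL algorithm.

    import numpy as np

    def keff_data_uncertainty(sensitivities, rel_covariance):
        """Sandwich rule: relative standard deviation of k-eff from nuclear data,
        sqrt(S^T C S), for a relative sensitivity vector S and covariance matrix C."""
        S = np.asarray(sensitivities, dtype=float)
        C = np.asarray(rel_covariance, dtype=float)
        return float(np.sqrt(S @ C @ S))

    # Invented 3-group sensitivity profile and relative covariance matrix
    S = np.array([0.10, 0.25, 0.05])
    C = np.array([[4.0e-4, 1.0e-4, 0.0],
                  [1.0e-4, 9.0e-4, 2.0e-4],
                  [0.0,    2.0e-4, 1.6e-3]])
    print(f"relative k-eff uncertainty from data: {keff_data_uncertainty(S, C):.4%}")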
MCNP calculations for container inspection with tagged neutrons
NASA Astrophysics Data System (ADS)
Boghen, G.; Donzella, A.; Filippini, V.; Fontana, A.; Lunardon, M.; Moretto, S.; Pesente, S.; Zenoni, A.
2005-12-01
We are developing an innovative tagged neutrons inspection system (TNIS) for cargo containers: the system will allow us to assay the chemical composition of suspect objects, previously identified by a standard X-ray radiography. The operation of the system is being extensively simulated using the MCNP Monte Carlo code to study different inspection geometries, cargo loads and hidden threat materials. Preliminary simulations evaluating the signal and the signal-over-background ratio expected as a function of the system parameters are presented. The results for a selection of cases are briefly discussed and demonstrate that the system can operate successfully in different filling conditions.
Anisn-Dort Neutron-Gamma Flux Intercomparison Exercise for a Simple Testing Model
NASA Astrophysics Data System (ADS)
Boehmer, B.; Konheiser, J.; Borodkin, G.; Brodkin, E.; Egorov, A.; Kozhevnikov, A.; Zaritsky, S.; Manturov, G.; Voloschenko, A.
2003-06-01
The ability of transport codes ANISN, DORT, ROZ-6, MCNP and TRAMO, as well as nuclear data libraries BUGLE-96, ABBN-93, VITAMIN-B6 and ENDF/B-6 to deliver consistent gamma and neutron flux results was tested in the calculation of a one-dimensional cylindrical model consisting of a homogeneous core and an outer zone with a single material. Model variants with H2O, Fe, Cr and Ni in the outer zones were investigated. The results are compared with MCNP-ENDF/B-6 results. Discrepancies are discussed. The specified test model is proposed as a computational benchmark for testing calculation codes and data libraries.
Skyshine line-beam response functions for 20- to 100-MeV photons
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brockhoff, R.C.; Shultis, J.K.; Faw, R.E.
1996-06-01
The line-beam response function, needed for skyshine analyses based on the integral line-beam method, was evaluated with the MCNP Monte Carlo code for photon energies from 20 to 100 MeV and for source-to-detector distances out to 1,000 m. These results are compared with point-kernel results, and the effects of bremsstrahlung and positron transport in the air are found to be important in this energy range. The three-parameter empirical formula used in the integral line-beam skyshine method was fit to the MCNP results, and values of these parameters are reported for various source energies and angles.
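A schematic version of the fitting step mentioned above: a three-parameter response form is fitted to Monte Carlo dose points as a function of distance. The functional form kappa * x**a * exp(-b * x) and the synthetic "data" are stand-ins for the actual formula and MCNP results in the paper.

    import numpy as np
    from scipy.optimize import curve_fit

    def lbrf_model(x, kappa, a, b):
        # Placeholder three-parameter line-beam response form (arbitrary dose units).
        return kappa * x**a * np.exp(-b * x)

    # Synthetic "Monte Carlo" dose points vs source-to-detector distance (m),
    # with 5% statistical noise added.
    x = np.linspace(50.0, 1000.0, 20)
    rng = np.random.default_rng(0)
    y = lbrf_model(x, 2.0, 1.1, 4.0e-3) * (1.0 + 0.05 * rng.standard_normal(x.size))

    popt, pcov = curve_fit(lbrf_model, x, y, p0=[1.0, 1.0, 3.0e-3])
    print("fitted (kappa, a, b):", np.round(popt, 4))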
A Monte-Carlo Benchmark of TRIPOLI-4® and MCNP on ITER neutronics
NASA Astrophysics Data System (ADS)
Blanchet, David; Pénéliau, Yannick; Eschbach, Romain; Fontaine, Bruno; Cantone, Bruno; Ferlet, Marc; Gauthier, Eric; Guillon, Christophe; Letellier, Laurent; Proust, Maxime; Mota, Fernando; Palermo, Iole; Rios, Luis; Guern, Frédéric Le; Kocan, Martin; Reichle, Roger
2017-09-01
Radiation protection and shielding studies are often based on the extensive use of 3D Monte-Carlo neutron and photon transport simulations. The ITER Organization hence recommends the use of the MCNP-5 code (version 1.60), in association with the FENDL-2.1 neutron cross section data library, specifically dedicated to fusion applications. The MCNP reference model of the ITER tokamak, the 'C-lite', is being continuously developed and improved. This article proposes to develop an alternative model, equivalent to the 'C-lite', but for the Monte-Carlo code TRIPOLI-4®. A benchmark study is defined to test this new model. Since one of the most critical areas for ITER neutronics analysis concerns the assessment of radiation levels and Shutdown Dose Rates (SDDR) behind the Equatorial Port Plugs (EPP), the benchmark is conducted to compare the neutron flux through the EPP. This problem is quite challenging with regard to the complex geometry and considering the important neutron flux attenuation, ranging from 10^14 down to 10^8 n·cm^-2·s^-1. Such a code-to-code comparison provides independent validation of the Monte-Carlo simulations, improving the confidence in neutronic results.
Measurement and simulation of thermal neutron flux distribution in the RTP core
NASA Astrophysics Data System (ADS)
Rabir, Mohamad Hairie B.; Jalal Bayar, Abi Muttaqin B.; Hamzah, Na'im Syauqi B.; Mustafa, Muhammad Khairul Ariff B.; Karim, Julia Bt. Abdul; Zin, Muhammad Rawi B. Mohamed; Ismail, Yahya B.; Hussain, Mohd Huzair B.; Mat Husin, Mat Zin B.; Dan, Roslan B. Md; Ismail, Ahmad Razali B.; Husain, Nurfazila Bt.; Jalil Khan, Zareen Khan B. Abdul; Yakin, Shaiful Rizaide B. Mohd; Saad, Mohamad Fauzi B.; Masood, Zarina Bt.
2018-01-01
The in-core thermal neutron flux distribution was determined using measurement and simulation methods for the Malaysian PUSPATI TRIGA Reactor (RTP). In this work, online thermal neutron flux measurement using a Self Powered Neutron Detector (SPND) has been performed to verify and validate the computational methods for neutron flux calculation in RTP. The experimental results were used as a validation of the calculations performed with the Monte Carlo code MCNP. The detailed in-core neutron flux distributions were estimated using the MCNP mesh tally method. The neutron flux mapping obtained revealed the heterogeneous configuration of the core. Based on the measurement and simulation, the thermal flux profile peaked at the centre of the core and gradually decreased towards the outer side of the core. The results show relatively good agreement between calculation and measurement, with both showing the same radial thermal flux profile inside the core; the MCNP model overestimates the flux, with a maximum discrepancy of around 20% compared to the SPND measurements. As our model predicts the neutron flux distribution in the core well, it can be used for characterization of the full core, that is, neutron flux and spectrum calculations, dose rate calculations, reaction rate calculations, etc.
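A small helper for the kind of calculation-to-experiment comparison quoted above (mesh-tally fluxes against SPND readings, with a maximum discrepancy of around 20%); the flux values below are invented.

    import numpy as np

    def relative_discrepancy(calculated, measured):
        """Return (C - E)/E per position and the maximum absolute relative discrepancy."""
        c = np.asarray(calculated, dtype=float)
        e = np.asarray(measured, dtype=float)
        rel = (c - e) / e
        return rel, float(np.max(np.abs(rel)))

    # Invented thermal fluxes (n cm^-2 s^-1) at a few radial positions
    mcnp_flux = np.array([1.15e12, 9.6e11, 7.1e11, 4.2e11])
    spnd_flux = np.array([1.00e12, 8.8e11, 6.4e11, 3.5e11])
    rel, worst = relative_discrepancy(mcnp_flux, spnd_flux)
    print(np.round(rel, 3), f"max discrepancy = {worst:.1%}")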
Beam Characterization at the Neutron Radiography Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sarah Morgan; Jeffrey King
The quality of a neutron imaging beam directly impacts the quality of radiographic images produced using that beam. Fully characterizing a neutron beam, including determination of the beam’s effective length-to-diameter ratio, neutron flux profile, energy spectrum, image quality, and beam divergence, is vital for producing quality radiographic images. This project characterized the east neutron imaging beamline at the Idaho National Laboratory Neutron Radiography Reactor (NRAD). The experiments which measured the beam’s effective length-to-diameter ratio and image quality are based on American Society for Testing and Materials (ASTM) standards. An analysis of the image produced by a calibrated phantom measured the beam divergence. The energy spectrum measurements consist of a series of foil irradiations using a selection of activation foils, compared to the results produced by a Monte Carlo n-Particle (MCNP) model of the beamline. Improvement of the existing NRAD MCNP beamline model includes validation of the model’s energy spectrum and the development of enhanced image simulation methods. The image simulation methods predict the radiographic image of an object based on the foil reaction rate data obtained by placing a model of the object in front of the image plane in an MCNP beamline model.
The X6XS.0 cross section library for MCNP-4
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pruvost, N.L.; Seamon, R.E.; Rombaugh, C.T.
1991-06-01
This report documents the work done by X-6, HSE-6, and CTR Technical Services to produce a comprehensive working cross-section library for MCNP-4 suitable for SUN workstations and similar environments. The resulting library consists of a total of 436 files (one file for each ZAID). The library is 152 Megabytes in Type 1 format and 32 Megabytes in Type 2 format. Type 2 can be used when porting the library from one computer to another of the same make. Otherwise, Type 1 must be used to ensure portability between different computer systems. Instructions for installing the library and adding ZAIDs to it are included here. Also included is a description of the steps necessary to install and test version 4 of MCNP. To improve readability of this report, certain commands and filenames are given in uppercase letters. The actual command or filename on the SUN workstation, however, must be specified in lowercase letters. Any questions regarding the data contained in the library should be directed to X-6 and any questions regarding the installation of the library and the testing that was performed should be directed to HSE-6. 9 refs., 7 tabs.
NASA Astrophysics Data System (ADS)
Antoni, Rodolphe; Bourgois, Laurent
2017-12-01
In this work, the calculation of specific dose distributions in water is evaluated in MCNP6.1 with the regular condensed-history algorithm (the "detailed electron energy-loss straggling logic") and the newly proposed electron transport algorithm (the "single event algorithm"). The Dose Point Kernel (DPK) is calculated with monoenergetic electrons of 50, 100, 500, 1000 and 3000 keV for different scoring cell dimensions. A comparison between MCNP6 results and well-validated codes for electron dosimetry, i.e., EGSnrc or Penelope, is performed. When the detailed electron energy-loss straggling logic is used with the default setting (down to the cut-off energy of 1 keV), we infer that the depth of the dose peak increases with decreasing thickness of the scoring cell, largely due to combined step-size and boundary-crossing artifacts. This finding is less prominent for the 500 keV, 1 MeV and 3 MeV dose profiles. With an appropriate number of sub-steps (the ESTEP value in MCNP6), the dose-peak shift is almost completely absent for 50 keV and 100 keV electrons. However, the dose peak is more prominent compared to EGSnrc and the absorbed dose tends to be underestimated at greater depths, meaning that boundary-crossing artifacts are still occurring while step-size artifacts are greatly reduced. When the single-event mode is used for the whole transport, we observe good agreement between the reference and calculated profiles for 50 and 100 keV electrons. The remaining artifacts vanish entirely, showing a possible transport treatment for energies below a hundred keV that agrees with the reference for any scoring cell dimension, even though the single-event method was initially intended to support electron transport at energies below 1 keV. Conversely, results for 500 keV, 1 MeV and 3 MeV show a dramatic discrepancy with the reference curves. These poor results, and hence the current unreliability of the method, are partly due to inappropriate elastic cross-section treatment from the ENDF/B-VI.8 library in those energy ranges. Accordingly, special care has to be taken in the choice of settings when calculating electron dose distributions with MCNP6, in particular with regard to dosimetry or nuclear medicine applications.
NASA Astrophysics Data System (ADS)
Lodwick, Camille J.
This research utilized Monte Carlo N-Particle version 4C (MCNP4C) to simulate K X-ray fluorescence (K XRF) measurements of stable lead in bone. Simulations were performed to investigate the effects that overlying tissue thickness, bone-calcium content, and shape of the calibration standard have on detector response in XRF measurements at the human tibia. Additional simulations of a knee phantom considered uncertainty associated with rotation about the patella during XRF measurements. Simulations tallied the distribution of energy deposited in a high-purity germanium detector originating from collimated 88 keV 109Cd photons in backscatter geometry. Benchmark measurements were performed on simple and anthropometric XRF calibration phantoms of the human leg and knee developed at the University of Cincinnati with materials proven to exhibit radiological characteristics equivalent to human tissue and bone. Initial benchmark comparisons revealed that MCNP4C limits coherent scatter of photons to six inverse angstroms of momentum transfer, and a Modified MCNP4C was developed to circumvent the limitation. Subsequent benchmark measurements demonstrated that the Modified MCNP4C adequately models photon interactions associated with in vivo K XRF of lead in bone. Further simulations of a simple leg geometry possessing tissue thicknesses from 0 to 10 mm revealed that increasing overlying tissue thickness from 5 to 10 mm reduced predicted lead concentrations by an average of 1.15% per 1 mm increase in tissue thickness (p < 0.0001). An anthropometric leg phantom was mathematically defined in MCNP to more accurately reflect the human form. A simulated one percent increase in calcium content (by mass) of the anthropometric leg phantom's cortical bone was shown to significantly reduce the K XRF normalized ratio by 4.5% (p < 0.0001). Comparison of the simple and anthropometric calibration phantoms also suggested that cylindrical calibration standards can underestimate the lead content of a human leg by up to 4%. The patellar bone structure in which the fluorescent photons originate was found to vary dramatically with measurement angle. The relative contribution of lead signal from the patella declined from 65% to 27% when rotated 30°. However, rotation of the source-detector about the patella from 0 to 45° demonstrated no significant effect on the net K XRF response at the knee.
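As a rough illustration of how the quoted tissue-thickness effect (about a 1.15% reduction in predicted bone lead per additional mm over the 5-10 mm range) might be applied as a correction, the sketch below assumes a simple linear bias; both the linearity and the numbers are illustrative assumptions, not a procedure from the dissertation.

    def tissue_corrected_lead(measured_ug_per_g, tissue_mm, ref_tissue_mm=5.0, per_mm=0.0115):
        """Correct a K XRF bone-lead estimate for overlying tissue thicker than the
        calibration reference, assuming a linear ~1.15%-per-mm bias (illustrative only)."""
        bias = 1.0 - per_mm * (tissue_mm - ref_tissue_mm)
        return measured_ug_per_g / bias

    # Example: a reading of 20 ug Pb per g bone mineral made through 9 mm of overlying tissue
    print(f"corrected estimate: {tissue_corrected_lead(20.0, tissue_mm=9.0):.2f} ug/g")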
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Morgan C.
2000-07-01
The fundamental motivation for the research presented in this dissertation was the need to develop a more accurate prediction method for characterization of mixed radiation fields around medical electron accelerators (MEAs). Specifically, a model is developed for simulation of neutron and other particle production from photonuclear reactions and incorporated in the Monte Carlo N-Particle (MCNP) radiation transport code. This extension of the capability within the MCNP code provides for the more accurate assessment of the mixed radiation fields. The Nuclear Theory and Applications group of the Los Alamos National Laboratory has recently provided first-of-a-kind evaluated photonuclear data for a select group of isotopes. These data provide the reaction probabilities as functions of incident photon energy with angular and energy distribution information for all reaction products. The availability of these data is the cornerstone of the new methodology for state-of-the-art mutually coupled photon-neutron transport simulations. The dissertation includes details of the model development and implementation necessary to use the new photonuclear data within MCNP simulations. A new data format has been developed to include tabular photonuclear data. Data are processed from the Evaluated Nuclear Data Format (ENDF) to the new class "u" A Compact ENDF (ACE) format using a standalone processing code. MCNP modifications have been completed to enable Monte Carlo sampling of photonuclear reactions. Note that both neutron and gamma production are included in the present model. The new capability has been subjected to extensive verification and validation (V&V) testing. Verification testing has established the expected basic functionality. Two validation projects were undertaken. First, comparisons were made to benchmark data from literature. These calculations demonstrate the accuracy of the new data and transport routines to better than 25 percent. Second, the ability to calculate radiation dose due to the neutron environment around a MEA is shown. An uncertainty of a factor of three in the MEA calculations is shown to be due to uncertainties in the geometry modeling. It is believed that the methodology is sound and that good agreement between simulation and experiment has been demonstrated.
NASA Astrophysics Data System (ADS)
Poškus, A.
2016-09-01
This paper evaluates the accuracy of the single-event (SE) and condensed-history (CH) models of electron transport in MCNP6.1 when simulating characteristic Kα, total K (=Kα + Kβ) and Lα X-ray emission from thick targets bombarded by electrons with energies from 5 keV to 30 keV. It is shown that the MCNP6.1 implementation of the CH model for the K-shell impact ionization leads to underestimation of the K yield by 40% or more for the elements with atomic numbers Z < 15 and overestimation of the Kα yield by more than 40% for the elements with Z > 25. The Lα yields are underestimated by more than an order of magnitude in CH mode, because MCNP6.1 neglects X-ray emission caused by electron-impact ionization of L, M and higher shells in CH mode (the Lα yields calculated in CH mode reflect only X-ray fluorescence, which is mainly caused by photoelectric absorption of bremsstrahlung photons). The X-ray yields calculated by MCNP6.1 in SE mode (using ENDF/B-VII.1 library data) are more accurate: the differences between the calculated and experimental K yields are within the experimental uncertainties for the elements C, Al and Si, and the calculated Kα yields are typically underestimated by (20-30)% for the elements with Z > 25, whereas the Lα yields are underestimated by (60-70)% for the elements with Z > 49. It is also shown that agreement of the experimental X-ray yields with those calculated in SE mode is additionally improved by replacing the ENDF/B inner-shell electron-impact ionization cross sections with the set of cross sections obtained from the distorted-wave Born approximation (DWBA), which are also used in the PENELOPE code system. The latter replacement causes a decrease of the average relative difference of the experimental X-ray yields and the simulation results obtained in SE mode to approximately 10%, which is similar to the accuracy achieved with PENELOPE. This confirms that the DWBA inner-shell impact ionization cross sections are significantly more accurate than the corresponding ENDF/B cross sections when the energy of incident electrons is of the order of the binding energy.
NASA Astrophysics Data System (ADS)
Clayton, C. A.
The purpose of this document is to give new users advice on how to choose which editor to use on Unix machines. Under Unix the default editors are considered to be unfriendly and many users prefer to use other more sophisticated alternatives. However, many such alternatives exist; there is not one single editor that everyone finds acceptable and hence each user must decide for himself or herself which to adopt.
ERIC Educational Resources Information Center
English, John W.
This report chronicles the first Society of Magazine Editors' educators seminar, which was held in New York from May 13-17, 1974, and was attended by ten journalism faculty. The industry's concerns, as expressed through editors, are paper, printing, postage, people, and profit. The Magazine Publishers Association (MPA) seems mostly concerned with…
Views of Iranian medical journal editors on medical research publication.
Etemadi, Arash; Raiszadeh, Farbod; Alaeddini, Farshid; Azizi, Fereidoun
2004-01-01
Medical journal editors play an important role in optimizing research publication. This study evaluates the views of Iranian medical journal editors, and their knowledge of medical publication standards. In May 2001, 51 editors from all journals approved by the Ministry of Health were invited to participate, 27 of whom completed the study. A self-administered questionnaire, based on the Uniform Requirements for Manuscripts Submitted to Biomedical Journals (URMS), was used, which consisted of 28 questions in 9 subject fields. These fields included: peer review, conflicts of interest, authorship criteria, publication ethics, duplicate publication, mass media, advertising, competing manuscripts, and the Internet. The knowledge of the editors was assessed by a scoring system, with a range of -46 to +44 points. Twenty-three of the participants were editors-in-chief and 4 were managing editors. Their average age was 47.3 +/- 8.7 years and 25 were male. All journals were peer-reviewed, most having 2 or 3 reviewers for each manuscript. Of the journals, 92.6% accepted or rejected an article on the basis of the views of most reviewers, and 52% sometimes or always used a statistician as a reviewer. Most of the editors believed that writing the first draft and designing the study are authorship criteria, and most of them believed that these 2 are stated in the URMS. Seven journals (25.9%) never published advertisements. Among journals that sold advertisements, the most popular policy (85%) was the rejection of advertisements because they advertised harmful products. Out of 27 journals, 12 were accessible on the Internet, and 7 had independent websites. Of the editors, 81.5% thought that a website is useful for their journal. The average knowledge score of the editors was 6.5 +/- 7.5. None had a negative score, 33% scored zero, 45% obtained average scores and 22% obtained good scores. The results show that peer review is favored by all the editors studied, though it seems that journals do not follow clear-cut policies in this regard. Most of the editors agreed with the statements of the URMS to some extent, and generally most had average to high knowledge of the URMS.
Eaton, Kenneth A; Rex Holland, G; Giannobile, William V; Hancocks, Stephen; Robinson, Peter G; Lynch, Christopher D
2014-03-01
On March 20th 2013, a one-hour session for Editors, Associate Editors, Publishers and others with an interest in scientific publishing was held at the IADR International Session in Seattle. Organised by Kenneth Eaton and Christopher Lynch (Chair and Secretary, respectively, of the British Dental Editors Forum), the meeting sought to bring together leading international experts in dental publishing, as well as authors, reviewers and students engaged in research. The meeting was an overwhelming success, with more than 100 attendees. A panel involving four leading dental editors led a discussion on anticipated developments in publishing dental research with much involvement and contribution from audience members. This was the third such meeting held at the IADR for Editors, Associate Editors, Publishers and others with an interest in scientific publishing. A follow-up session will take place in Cape Town on 25 June 2014 as part of the annual IADR meeting. The transcript of the Seattle meeting is reproduced in this article. Where possible speakers are identified by name. At the first time of mention their role/position is also stated, thereafter only their name appears. We are grateful to Stephen Hancocks Ltd. for their generous sponsorship of this event. For those who were not able to attend the authors hope this article gives a flavour of the discussions and will encourage colleagues to attend future events. Involvement is open to Editors, Associate Editors, Publishers and others with an interest in scientific publishing. It is a very open group and all those with an interest will be welcome to join in. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Kim, Kyung Chun; Lee, Sang Joon
2011-06-01
The 14th International Symposium on Flow Visualization (ISFV14) was held in Daegu, Korea, on 21-24 June 2010. There were 304 participants from 17 countries. The state of the art in many aspects of flow visualization was presented and discussed, and a total of 243 papers from 19 countries were presented. Two special lectures and four invited lectures, 48 paper sessions and one poster session were held in five session rooms and in a lobby over four days. Among the paper sessions, those on 'biological flows', 'micro/nano fluidics', 'PIV/PTV' and 'compressible and sonic flows' received great attention from the participants of ISFV14. Special events included presentations of 'The Asanuma Award' and 'The Leonardo Da Vinci Award' to prominent contributors. Awards for photos and movies were given to three scientists for their excellence in flow visualizations. Sixteen papers were selected by the Scientific Committee of ISFV14. After the standard peer review process of this journal, six papers were finally accepted for publication. We wish to thank the editors of MST for making it possible to publish this special feature from ISFV14. We also thank the authors for their careful and insightful work and cooperation in the preparation of revised papers. It will be our pleasure if readers appreciate the hot topics in flow visualization research as a result of this special feature. We also hope that the progress in flow visualization will create new research fields. The 15th International Symposium on Flow Visualization will be held in Minsk, Belarus in 2012. We would like to express sincere thanks to the staff at IOP Publishing for their kind support.
Testing the Delayed Gamma Capability in MCNP6
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weldon, Robert A.; Fensin, Michael L.; McKinney, Gregg W.
The mission of the Domestic Nuclear Detection Office is to quickly and reliably detect unauthorized attempts to import or transport special nuclear material for use against the United States. Developing detection equipment to meet this objective requires accurate simulation of both the detectable signature and the detection mechanism. A delayed particle capability was initially added to MCNPX 2.6.A in 2005 to sample the radioactive fission product parents and emit decay particles resulting from the decay chain. To meet the objectives of detection scenario modeling, the capability was designed to sample a particular time for emitting a particular multiplicity at a particular energy. Because the sampling process of selecting both time and energy is interdependent, to linearize the time and emission sampling, atom densities are computed at several discrete time steps, and the time-integrated production is computed by multiplying the atom density by the decay constant and time step size to produce a cumulative distribution function for sampling the emission time, energy, and multiplicity. The delayed particle capability was initially given a time-bin structure to help reasonably reproduce, in a qualitative sense, a fission benchmark by Beddingfield, which examined the delayed gamma emission. This original benchmark was only qualitative and did not contain the magnitudes of the actual measured data but did contain a relative graphical representation of the spectra. A better benchmark with measured data was later provided by Hunt, Mozin, Reedy, Selpel, and Tobin at the Idaho Accelerator Center; however, because of the complexity of the benchmark setup, sizable systematic errors were expected in the modeling, and initial results compared to MCNPX 2.7.0 showed errors outside of statistical fluctuation. Presented in this paper is a more simplified approach to benchmarking, utilizing closed-form analytic solutions to the granddaughter equations for particular sets of decay systems. We examine five different decay chains (two-stage decay to stable) and show the predictability of the MCNP6 delayed gamma feature. Results do show that while the default delayed gamma calculations available in the MCNP6 1.0 release can give accurate results for some isotopes (e.g., 137Ba), the percent differences between the closed-form analytic solutions and the MCNP6 calculations were often >40% (28Mg, 28Al, 42K, 47Ca, 47Sc, 60Co). With the MCNP6 1.1 Beta release, the tenth entry on the DBCN card allows improved calculation to within <5% as compared to the closed-form analytic solutions for immediate parent emissions and transient equilibrium systems. While the tenth entry on the DBCN card for MCNP6 1.1 gives much better results for transient equilibrium systems and parent emissions in general, it does little to improve daughter emissions of secular equilibrium systems. Finally, hypotheses were presented as to why daughter emissions of secular equilibrium systems might be mispredicted in some cases and not in others.
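For a two-stage chain of the kind used in this benchmarking approach (parent to daughter to stable), the daughter population has a closed-form Bateman solution. The sketch below, with purely illustrative half-lives and gamma intensity, shows how such an analytic reference could be evaluated for comparison against a tallied delayed-gamma emission rate; it is not the MCNP6 or CINDER90 implementation.

```python
import numpy as np

def daughter_atoms(t, n_parent0, lam_parent, lam_daughter, n_daughter0=0.0):
    """Closed-form (Bateman) number of daughter atoms at time t for a
    parent -> daughter -> stable decay chain."""
    grow = n_parent0 * lam_parent / (lam_daughter - lam_parent) * (
        np.exp(-lam_parent * t) - np.exp(-lam_daughter * t))
    return grow + n_daughter0 * np.exp(-lam_daughter * t)

# Hypothetical numbers: delayed-gamma emission rate of one daughter line with
# branching intensity `intensity`, for comparison against a Monte Carlo tally.
lam_p = np.log(2) / 1200.0      # parent half-life 1200 s (illustrative)
lam_d = np.log(2) / 150.0       # daughter half-life 150 s (illustrative)
intensity = 0.85                # gammas per daughter decay (illustrative)
t = np.linspace(0.0, 3600.0, 7)
rate = intensity * lam_d * daughter_atoms(t, 1.0e10, lam_p, lam_d)
print(rate)
```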
NASA Astrophysics Data System (ADS)
Novoselov, Kostya S.; Pulizzi, Fabio
2018-06-01
Kostya S. Novoselov, professor of physics at the University of Manchester, UK, has been digging into the details of the life of an editor by asking Fabio Pulizzi, Chief Editor of Nature Nanotechnology, for some inside information about his work.
Visual exploration and analysis of human-robot interaction rules
NASA Astrophysics Data System (ADS)
Zhang, Hui; Boyles, Michael J.
2013-01-01
We present a novel interaction paradigm for the visual exploration, manipulation and analysis of human-robot interaction (HRI) rules; our development is implemented using a visual programming interface and exploits key techniques drawn from both information visualization and visual data mining to facilitate the interaction design and knowledge discovery process. HRI is often concerned with manipulations of multi-modal signals, events, and commands that form various kinds of interaction rules. Depicting, manipulating and sharing such design-level information is a compelling challenge. Furthermore, the closed loop between HRI programming and knowledge discovery from empirical data is a relatively long cycle. This, in turn, makes design-level verification nearly impossible to perform in an earlier phase. In our work, we exploit a drag-and-drop user interface and visual languages to support depicting responsive behaviors from social participants when they interact with their partners. For our principal test case of gaze-contingent HRI interfaces, this permits us to program and debug the robots' responsive behaviors through a graphical data-flow chart editor. We exploit additional program manipulation interfaces to provide still further improvement to our programming experience: by simulating the interaction dynamics between a human and a robot behavior model, we allow the researchers to generate, trace and study the perception-action dynamics with a social interaction simulation to verify and refine their designs. Finally, we extend our visual manipulation environment with a visual data-mining tool that allows the user to investigate interesting phenomena such as joint attention and sequential behavioral patterns from multiple multi-modal data streams. We have created instances of HRI interfaces to evaluate and refine our development paradigm. As far as we are aware, this paper reports the first program manipulation paradigm that integrates visual programming interfaces, information visualization, and visual data mining methods to facilitate designing, comprehending, and evaluating HRI interfaces.
Visual Acuity Reporting in Clinical Research Publications.
Tsou, Brittany C; Bressler, Neil M
2017-06-01
Visual acuity results in publications typically are reported in Snellen or non-Snellen formats or both. A study in 2011 suggested that many ophthalmologists do not understand non-Snellen formats, such as logarithm of the Minimum Angle of Resolution (logMAR) or Early Treatment Diabetic Retinopathy Study (ETDRS) letter scores. As a result, some journals, since at least 2013, have instructed authors to provide approximate Snellen equivalents next to non-Snellen visual acuity values. To evaluate how authors currently report visual acuity and whether they provide Snellen equivalents when their reports include non-Snellen formats. From November 21, 2016, through December 14, 2016, one reviewer evaluated visual acuity reporting among all articles published in 4 ophthalmology clinical journals from November 2015 through October 2016, including 3 of 4 journals that instructed authors to provide Snellen equivalents for visual acuity reported in non-Snellen formats. Frequency of formats of visual acuity reporting and frequency of providing Snellen equivalents when non-Snellen formats are given. The 4 journals reviewed had the second, fourth, fifth, and ninth highest impact factors for ophthalmology journals in 2015. Of 1881 articles reviewed, 807 (42.9%) provided a visual acuity measurement. Of these, 396 (49.1%) used only a Snellen format; 411 (50.9%) used a non-Snellen format. Among those using a non-Snellen format, 145 (35.3%) provided a Snellen equivalent while 266 (64.7%) provided only a non-Snellen format. More than half of all articles in 4 ophthalmology clinical journals fail to provide a Snellen equivalent when visual acuity is not in a Snellen format. Since many US ophthalmologists may not comprehend non-Snellen formats easily, these data suggest that editors and publishing staff should encourage authors to provide Snellen equivalents whenever visual acuity data are reported in a non-Snellen format to improve ease of understanding visual acuity measurements.
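For readers unfamiliar with the conversion, logMAR and Snellen notation are related by logMAR = -log10(Snellen fraction), so logMAR 0.3 corresponds approximately to Snellen 20/40. The short sketch below uses illustrative helper names (not taken from the study) to convert between the two formats.

```python
import math

def snellen_to_logmar(numerator: float, denominator: float) -> float:
    """logMAR = -log10(Snellen numerator / Snellen denominator)."""
    return -math.log10(numerator / denominator)

def logmar_to_snellen_denominator(logmar: float, numerator: int = 20) -> float:
    """Denominator of the approximate Snellen equivalent for a given logMAR value."""
    return numerator * 10 ** logmar

print(round(snellen_to_logmar(20, 40), 2))            # 0.3
print(round(logmar_to_snellen_denominator(0.3)))      # ~40, i.e. approx Snellen 20/40
```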
Plutonium Critical Mass Curve Comparison to Mass at Upper Subcritical Limit (USL) Using Whisper
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alwin, Jennifer Louise; Zhang, Ning
Whisper is computational software designed to assist the nuclear criticality safety analyst with validation studies with the MCNP® Monte Carlo radiation transport package. Standard approaches to validation rely on the selection of benchmarks based upon expert judgment. Whisper uses sensitivity/uncertainty (S/U) methods to select benchmarks relevant to a particular application or set of applications being analyzed. Using these benchmarks, Whisper computes a calculational margin. Whisper attempts to quantify the margin of subcriticality (MOS) from errors in software and uncertainties in nuclear data. The combination of the Whisper-derived calculational margin and MOS comprise the baseline upper subcritical limit (USL), to which an additional margin may be applied by the nuclear criticality safety analyst as appropriate to ensure subcriticality. A series of critical mass curves for plutonium, similar to those found in Figure 31 of LA-10860-MS, have been generated using MCNP6.1.1 and the iterative parameter study software, WORM_Solver. The baseline USL for each of the data points of the curves was then computed using Whisper 1.1. The USL was then used to determine the equivalent mass for the plutonium metal-water system. ANSI/ANS-8.1 states that it is acceptable to use handbook data, such as the data directly from LA-10860-MS, as it is already considered validated (Section 4.3.4: "Use of subcritical limit data provided in ANSI/ANS standards or accepted reference publications does not require further validation."). This paper attempts to take a novel approach to visualize traditional critical mass curves and allows comparison with the amount of mass for which keff is equal to the USL (calculational margin + margin of subcriticality). However, the intent is to plot the critical mass data along with the USL, not to suggest that already accepted handbook data should have new and more rigorous requirements for validation.
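The relationship described above is commonly written as shown below; the abstract does not restate the exact form used by Whisper, so this should be read as the conventional expression rather than a quotation of the tool's documentation.

```latex
% Conventional form (assumed, not quoted from the Whisper documentation):
% baseline USL from the calculational margin (CM) and margin of subcriticality (MOS),
% with any additional analyst-applied margin \Delta_a subtracted afterwards.
\mathrm{USL}_{\mathrm{baseline}} = 1.0 - \mathrm{CM} - \mathrm{MOS},
\qquad
\mathrm{USL}_{\mathrm{applied}} = \mathrm{USL}_{\mathrm{baseline}} - \Delta_{a}
```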
NASA Astrophysics Data System (ADS)
Kevin C. Burke, National Academy of Sciences/National Research Council (NAS/NRC), assumed responsibilities as Editor in Chief of the American Geophysical Union (AGU) journal Tectonics at the beginning of 1990, taking over from Raymond A. Price, Queen's University, Kingston, Ontario. Asger Berthelsen, University of Copenhagen, Denmark, continues as the European Editor, and Paul F. Hoffman, Geological Survey of Canada, assumes the task of North American Editor. Tectonics is a joint publication of AGU and the European Geophysical Society.
Emotions under discussion: gender, status and communication in online collaboration.
Iosub, Daniela; Laniado, David; Castillo, Carlos; Fuster Morell, Mayo; Kaltenbrunner, Andreas
2014-01-01
Despite the undisputed role of emotions in teamwork, not much is known about the make-up of emotions in online collaboration. Publicly available repositories of collaboration data, such as Wikipedia editor discussions, now enable the large-scale study of affect and dialogue in peer production. We investigate the established Wikipedia community and focus on how emotion and dialogue differ depending on the status, gender, and the communication network of the [Formula: see text] editors who have written at least 100 comments on the English Wikipedia's article talk pages. Emotions are quantified using a word-based approach comparing the results of two predefined lexicon-based methods: LIWC and SentiStrength. We find that administrators maintain a rather neutral, impersonal tone, while regular editors are more emotional and relationship-oriented, that is, they use language to form and maintain connections to other editors. A persistent gender difference is that female contributors communicate in a manner that promotes social affiliation and emotional connection more than male editors, irrespective of their status in the community. Female regular editors are the most relationship-oriented, whereas male administrators are the least relationship-focused. Finally, emotional and linguistic homophily is prevalent: editors tend to interact with other editors having similar emotional styles (e.g., editors expressing more anger connect more with one another). Emotional expression and linguistic style in online collaboration differ substantially depending on the contributors' gender and status, and on the communication network. This should be taken into account when analyzing collaborative success, and may prove insightful to communities facing gender gap and stagnation in contributor acquisition and participation levels.
Microbial properties database editor tutorial
USDA-ARS?s Scientific Manuscript database
A Microbial Properties Database Editor (MPDBE) has been developed to help consolidate microbial-relevant data to populate a microbial database and support a database editor by which an authorized user can modify physico-microbial properties related to microbial indicators and pathogens. Physical prop...
The Use of UAS for Rapid 3D Mapping in Geomatics Education
NASA Astrophysics Data System (ADS)
Teo, Tee-Ann; Tian-Yuan Shih, Peter; Yu, Sz-Cheng; Tsai, Fuan
2016-06-01
With the development of technology, UAS has become an advanced technology to support rapid mapping for disaster response. The aim of this study is to develop educational modules for UAS data processing in rapid 3D mapping. The designed modules for this study focus on UAV data processing with available freeware or trial software for educational purposes. The key modules include orientation modelling, 3D point cloud generation, image georeferencing and visualization. The orientation modelling module adopts VisualSFM to determine the projection matrix for each image station. In addition, approximate ground control points are measured from OpenStreetMap for absolute orientation. The second module uses SURE and the orientation files from the previous module for 3D point cloud generation. Then, ground point selection and digital terrain model generation can be achieved with LAStools. The third module stitches individual rectified images into a mosaic image using Microsoft ICE (Image Composite Editor). The last module visualizes and measures the generated dense point clouds in CloudCompare. These comprehensive UAS processing modules allow students to gain the skills to process and deliver UAS photogrammetric products in rapid 3D mapping. Moreover, they can also apply the photogrammetric products for analysis in practice.
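As an illustration of the absolute-orientation step (mapping free-network SfM coordinates onto approximate ground control coordinates), the sketch below estimates a least-squares similarity transform with NumPy; the point values are hypothetical and this code is not part of the published modules.

```python
import numpy as np

def similarity_transform(src, dst):
    """Least-squares similarity transform (scale, rotation, translation) mapping
    src points onto dst points, e.g. SfM model coordinates onto approximate
    ground control coordinates (Umeyama/Procrustes-style estimate)."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    A, B = src - mu_s, dst - mu_d
    U, S, Vt = np.linalg.svd(B.T @ A / len(src))          # cross-covariance SVD
    d = np.sign(np.linalg.det(U @ Vt))                     # guard against reflection
    D = np.diag([1.0] * (src.shape[1] - 1) + [d])
    R = U @ D @ Vt
    scale = np.trace(np.diag(S) @ D) / A.var(axis=0).sum()
    t = mu_d - scale * R @ mu_s
    return scale, R, t

# Hypothetical check: three model points and their approximate GCP coordinates.
model = [[0, 0, 0], [10, 0, 0], [0, 10, 2]]
gcp   = [[250000.0, 2700000.0, 100.0],
         [250008.0, 2700006.0, 100.5],
         [249994.0, 2700008.0, 102.0]]
s, R, t = similarity_transform(model, gcp)
print(s, t)
```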
Visualization tool for human-machine interface designers
NASA Astrophysics Data System (ADS)
Prevost, Michael P.; Banda, Carolyn P.
1991-06-01
As modern human-machine systems continue to grow in capabilities and complexity, system operators are faced with integrating and managing increased quantities of information. Since many information components are highly related to each other, optimizing the spatial and temporal aspects of presenting information to the operator has become a formidable task for the human-machine interface (HMI) designer. The authors describe a tool in an early stage of development, the Information Source Layout Editor (ISLE). This tool is to be used for information presentation design and analysis; it uses human factors guidelines to assist the HMI designer in the spatial layout of the information required by machine operators to perform their tasks effectively. These human factors guidelines address such areas as the functional and physical relatedness of information sources. By representing these relationships with metaphors such as spring tension, attractors, and repellers, the tool can help designers visualize the complex constraint space and interacting effects of moving displays to various alternate locations. The tool contains techniques for visualizing the relative 'goodness' of a configuration, as well as mechanisms such as optimization vectors to provide guidance toward a more optimal design. Also available is a rule-based design checker to determine compliance with selected human factors guidelines.
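The spring/attractor/repeller metaphor maps naturally onto a force-directed layout. The toy sketch below is not the ISLE implementation; the constants, element positions, and relatedness pairs are made up, but it shows the basic idea of related information sources pulling together while all sources push apart to avoid crowding.

```python
import numpy as np

def layout_displays(pos, related, k_attract=0.01, k_repel=500.0, steps=200):
    """Toy force-directed placement: functionally related display elements attract
    each other (spring-like), while all pairs repel to avoid crowding."""
    pos = np.asarray(pos, float).copy()
    n = len(pos)
    for _ in range(steps):
        force = np.zeros_like(pos)
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                d = pos[j] - pos[i]
                dist = np.linalg.norm(d) + 1e-6
                force[i] -= k_repel * d / dist**3            # repeller: inverse-square push
                if (i, j) in related or (j, i) in related:
                    force[i] += k_attract * d                # spring pull toward related source
        pos += force
    return pos

# Hypothetical: four information sources; 0-1 and 2-3 are functionally related.
initial = [[0, 0], [50, 5], [10, 40], [60, 45]]
print(layout_displays(initial, related={(0, 1), (2, 3)}))
```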
Burn, K W; Daffara, C; Gualdrini, G; Pierantoni, M; Ferrari, P
2007-01-01
The question of Monte Carlo simulation of radiation transport in voxel geometries is addressed. Patched versions of the MCNP and MCNPX codes are developed that are aimed at transporting radiation both in the standard geometry mode and in the voxel geometry treatment. The patched code reads an unformatted FORTRAN file derived from DICOM format data and uses special subroutines to handle voxel-to-voxel radiation transport. The various phases of the development of the methodology are discussed together with the new input options. Examples are given of the employment of the code in internal and external dosimetry, and comparisons with results from other groups are reported.
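A minimal sketch of the kind of preprocessing implied by DICOM-derived voxel input is shown below, assuming a CT series readable with pydicom; the Hounsfield-unit bins and material indexing are illustrative and are not the authors' patch or file format.

```python
import numpy as np
import pydicom  # assumed available; reads individual DICOM slices

def dicom_series_to_voxels(slice_paths, hu_bins=(-2000, -300, 200, 3000)):
    """Illustrative only: stack a CT series into a 3-D Hounsfield-unit array and
    map HU ranges to integer material indices (e.g. air / soft tissue / bone),
    which could then be written out for a voxel transport geometry."""
    slices = [pydicom.dcmread(p) for p in slice_paths]
    slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))   # order along z
    hu = np.stack([s.pixel_array * float(s.RescaleSlope) + float(s.RescaleIntercept)
                   for s in slices])
    materials = np.digitize(hu, hu_bins)                          # voxel-wise material index
    return hu, materials
```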
MCNP simulation of a Theratron 780 radiotherapy unit.
Miró, R; Soler, J; Gallardo, S; Campayo, J M; Díez, S; Verdú, G
2005-01-01
A Theratron 780 (MDS Nordion) 60Co radiotherapy unit has been simulated with the Monte Carlo code MCNP. The unit has been realistically modelled: the cylindrical source capsule and its housing, the rectangular collimator system, both the primary and secondary jaws and the air gaps between the components. Different collimator openings, ranging from 5 x 5 cm2 to 20 x 20 cm2 (narrow and broad beams) at a source-surface distance equal to 80 cm have been used during the study. In the present work, we have calculated spectra as a function of field size. A study of the variation of the electron contamination of the 60Co beam has also been performed.
Neutron and photon shielding benchmark calculations by MCNP on the LR-0 experimental facility.
Hordósy, G
2005-01-01
In the framework of the REDOS project, the space-energy distribution of the neutron and photon flux has been calculated over the pressure vessel simulator thickness of the LR-0 experimental reactor, Rez, Czech Republic. The results calculated by the Monte Carlo code MCNP4C are compared with the measurements performed in the Nuclear Research Institute, Rez. The spectra have been measured at the barrel, in front of, inside and behind the pressure vessel in different configurations. The neutron measurements were performed in the energy range 0.1-10 MeV. This work has been done within the 5th Framework Programme of the European Community 1998-2002.
The viability of ADVANTG deterministic method for synthetic radiography generation
NASA Astrophysics Data System (ADS)
Bingham, Andrew; Lee, Hyoung K.
2018-07-01
Fast simulation techniques to generate synthetic radiographic images of high resolution are helpful when new radiation imaging systems are designed. However, the standard stochastic approach requires lengthy run times, with poorer statistics at higher resolution. The viability of a deterministic approach to synthetic radiography image generation was therefore explored, with the aim of quantifying the computational time decrease over the stochastic method. ADVANTG was compared to MCNP in multiple scenarios, including a small radiography system prototype, to simulate high-resolution radiography images. By using the ADVANTG deterministic code to simulate radiography images, the computational time was found to decrease by a factor of 10 to 13 compared to the MCNP stochastic approach while retaining image quality.
Meet the Editors: JGR-Atmospheres
NASA Astrophysics Data System (ADS)
Kumar, Mohi
2006-04-01
Three scientists were newly appointed and one scientist was reappointed last year as editors of JGR-Atmospheres. The three new editors, John Austin, Jose D. Fuentes, and Ruth Lieberman, along with returning editor Colin O'Dowd, would like to see several changes made to the journal. ``JGR-Atmospheres is still regarded as the highest-quality atmospheric science journal, having perhaps one of the most stringent and rigorous review processes,'' said O'Dowd. ``However, there is still room for improvement.''
[The Chilean Association of Biomedical Journal Editors].
Reyes, H
2001-01-01
On September 29th, 2000, The Chilean Association of Biomedical Journal Editors was founded, sponsored by the "Comisión Nacional de Investigación Científica y Tecnológica (CONICYT)" (the Governmental Agency promoting and funding scientific research and technological development in Chile) and the "Sociedad Médica de Santiago" (Chilean Society of Internal Medicine). The Association adopted the goals of the World Association of Medical Editors (WAME) and therefore it will foster "cooperation and communication among Editors of Chilean biomedical journals; to improve editorial standards, to promote professionalism in medical editing through education, self-criticism and self-regulation; and to encourage research on the principles and practice of medical editing". Twenty nine journals covering a closely similar number of different biomedical sciences, medical specialties, veterinary, dentistry and nursing, became Founding Members of the Association. A Governing Board was elected: President: Humberto Reyes, M.D. (Editor, Revista Médica de Chile); Vice-President: Mariano del Sol, M.D. (Editor, Revista Chilena de Anatomía); Secretary: Anna María Prat (CONICYT); Councilors: Manuel Krauskopff, Ph.D. (Editor, Biological Research) and Maritza Rahal, M.D. (Editor, Revista de Otorrinolaringología y Cirugía de Cabeza y Cuello). The Association will organize a Symposium on Biomedical Journal Editing and will spread information stimulating Chilean biomedical journals to become indexed in international databases and in SciELO-Chile, the main Chilean scientific website (www.scielo.cl).
A scoping review of competencies for scientific editors of biomedical journals.
Galipeau, James; Barbour, Virginia; Baskin, Patricia; Bell-Syer, Sally; Cobey, Kelly; Cumpston, Miranda; Deeks, Jon; Garner, Paul; MacLehose, Harriet; Shamseer, Larissa; Straus, Sharon; Tugwell, Peter; Wager, Elizabeth; Winker, Margaret; Moher, David
2016-02-02
Biomedical journals are the main route for disseminating the results of health-related research. Despite this, their editors operate largely without formal training or certification. To our knowledge, no body of literature systematically identifying core competencies for scientific editors of biomedical journals exists. Therefore, we aimed to conduct a scoping review to determine what is known on the competency requirements for scientific editors of biomedical journals. We searched the MEDLINE®, Cochrane Library, Embase®, CINAHL, PsycINFO, and ERIC databases (from inception to November 2014) and conducted a grey literature search for research and non-research articles with competency-related statements (i.e. competencies, knowledge, skills, behaviors, and tasks) pertaining to the role of scientific editors of peer-reviewed health-related journals. We also conducted an environmental scan, searched the results of a previous environmental scan, and searched the websites of existing networks, major biomedical journal publishers, and organizations that offer resources for editors. A total of 225 full-text publications were included, 25 of which were research articles. We extracted a total of 1,566 statements possibly related to core competencies for scientific editors of biomedical journals from these publications. We then collated overlapping or duplicate statements which produced a list of 203 unique statements. Finally, we grouped these statements into seven emergent themes: (1) dealing with authors, (2) dealing with peer reviewers, (3) journal publishing, (4) journal promotion, (5) editing, (6) ethics and integrity, and (7) qualities and characteristics of editors. To our knowledge, this scoping review is the first attempt to systematically identify possible competencies of editors. Limitations are that (1) we may not have captured all aspects of a biomedical editor's work in our searches, (2) removing redundant and overlapping items may have led to the elimination of some nuances between items, (3) restricting to certain databases, and only French and English publications, may have excluded relevant publications, and (4) some statements may not necessarily be competencies. This scoping review is the first step of a program to develop a minimum set of core competencies for scientific editors of biomedical journals which will be followed by a training needs assessment, a Delphi exercise, and a consensus meeting.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Macumber, Daniel L; Horowitz, Scott G; Schott, Marjorie
Across most industries, desktop applications are being rapidly migrated to web applications for a variety of reasons. Web applications are inherently cross platform, mobile, and easier to distribute than desktop applications. Fueling this trend are a wide range of free, open source libraries and frameworks that make it incredibly easy to develop powerful web applications. The building energy modeling community is just beginning to pick up on these larger trends, with a small but growing number of building energy modeling applications starting on or moving to the web. This paper presents a new, open source, web-based geometry editor for Building Energy Modeling (BEM). The editor is written completely in JavaScript and runs in a modern web browser. The editor works on a custom JSON file format and is designed to be integrated into a variety of web and desktop applications. The web-based editor is available to use as a standalone web application at: https://nrel.github.io/openstudio-geometry-editor/. An example integration is demonstrated with the OpenStudio desktop application. Finally, the editor can be easily integrated with a wide range of possible building energy modeling web applications.
Editor's Choice Offered as a Service
NASA Astrophysics Data System (ADS)
Richman, Barbara T.
2010-06-01
Editor's Choice is now being offered as a service rather than on a subscription basis. As in the past, articles will be selected by collection editors with assistance from advisory panels. The selected articles will be listed on the AGU Web site (http://www.agu.org/pubs/journals/virtual/editors_choice/); these lists will be accessible to anyone. Those who are interested in reading the articles can access them through a personal or institutional subscription or can purchase them either individually or as part of a MultiChoice packet.
Pavlou, Andrew T.; Ji, Wei; Brown, Forrest B.
2016-01-23
Here, a proper treatment of thermal neutron scattering requires accounting for chemical binding through a scattering law S(α,β,T). Monte Carlo codes sample the secondary neutron energy and angle after a thermal scattering event from probability tables generated from S(α,β,T) tables at discrete temperatures, requiring a large amount of data for multiscale and multiphysics problems with detailed temperature gradients. We have previously developed a method to handle this temperature dependence on-the-fly during the Monte Carlo random walk using polynomial expansions in 1/T to directly sample the secondary energy and angle. In this paper, the on-the-fly method is implemented into MCNP6 and tested in both graphite-moderated and light water-moderated systems. The on-the-fly method is compared with the thermal ACE libraries that come standard with MCNP6, yielding good agreement with integral reactor quantities like the k-eigenvalue and differential quantities like single-scatter secondary energy and angle distributions. The simulation runtimes are comparable between the two methods (on the order of 5–15% difference for the problems tested), and the on-the-fly fit coefficients only require 5–15 MB of total data storage.
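A minimal sketch of the on-the-fly idea, assuming fit coefficients c_n are available for a fixed incident energy and sampled CDF value, is to evaluate the secondary energy as a polynomial in 1/T at the local material temperature; the coefficients below are purely illustrative and this is not the MCNP6 implementation.

```python
def secondary_energy_otf(coeffs, temperature_K):
    """Evaluate a fitted expansion in 1/T to recover a sampled secondary energy
    at an arbitrary material temperature. `coeffs` are hypothetical fit
    coefficients c_n such that E' ~ sum_n c_n * (1/T)**n for one
    (incident energy, sampled random number) pair."""
    inv_t = 1.0 / temperature_K
    return sum(c * inv_t**n for n, c in enumerate(coeffs))

# Hypothetical coefficients (eV, eV*K, eV*K^2) for one sampled point:
coeffs = [2.5e-2, 4.0, -3.0e2]
for T in (300.0, 600.0, 1000.0):
    print(T, secondary_energy_otf(coeffs, T))
```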
TRIPOLI-4® - MCNP5 ITER A-lite neutronic model benchmarking
NASA Astrophysics Data System (ADS)
Jaboulay, J.-C.; Cayla, P.-Y.; Fausser, C.; Lee, Y.-K.; Trama, J.-C.; Li-Puma, A.
2014-06-01
The aim of this paper is to present the capability of TRIPOLI-4®, the CEA Monte Carlo code, to model a large-scale fusion reactor with a complex neutron source and geometry. In the past, numerous benchmarks were conducted for TRIPOLI-4® assessment on fusion applications. Analyses of experiments (KANT, OKTAVIAN, FNG) and numerical benchmarks (between TRIPOLI-4® and MCNP5) on the HCLL DEMO2007 and ITER models were carried out successively. In this previous ITER benchmark, nevertheless, only the neutron wall loading was analyzed; its main purpose was to present the MCAM (the FDS Team CAD import tool) extension for TRIPOLI-4®. Starting from this work, a more extended benchmark has been performed on the estimation of the neutron flux, the nuclear heating in the shielding blankets and the tritium production rate in the European TBMs (HCLL and HCPB), and it is presented in this paper. The methodology to build the TRIPOLI-4® A-lite model is based on MCAM and the MCNP A-lite model (version 4.1). Simplified TBMs (from KIT) have been integrated in the equatorial port. Comparisons of neutron wall loading, flux, nuclear heating and tritium production rate show a good agreement between the two codes. Discrepancies are mainly within the Monte Carlo codes' statistical errors.
Sogbadji, R B M; Abrefah, R G; Nyarko, B J B; Akaho, E H K; Odoi, H C; Attakorah-Birinkorang, S
2014-08-01
The americium-beryllium neutron irradiation facility at the National Nuclear Research Institute (NNRI), Ghana, was re-designed with four 20 Ci sources using the Monte Carlo N-Particle (MCNP) code to investigate the maximum amount of flux that is produced by the combined sources. The results were compared with a single-source Am-Be irradiation facility. The main objective was to enable us to harness the maximum amount of flux for the optimization of neutron activation analysis and to enable smaller-sized samples to be irradiated. Using MCNP for the design construction and neutronic performance calculation, it was found that the single-source Am-Be design produced a thermal neutron flux of (1.8±0.0007)×10(6) n/cm(2)s and the four-source Am-Be design produced a thermal neutron flux of (5.4±0.0007)×10(6) n/cm(2)s, which is a 3.5-fold increase compared to the single-source Am-Be design. The effective multiplication factor, k(eff), of the single-source and the four-source Am-Be designs was found to be 0.00115±0.0008 and 0.00143±0.0008, respectively. Copyright © 2014 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adigun, Babatunde John; Fensin, Michael Lorne; Galloway, Jack D.
Our burnup study examined the effect of a predicted critical control rod position on the nuclide predictability of several axial and radial locations within a 4×4 graphite-moderated gas-cooled reactor fuel cluster geometry. To achieve this, a control rod position estimator (CRPE) tool was developed within the framework of the linkage code Monteburns between the transport code MCNP and the depletion code CINDER90, and four methodologies were proposed within the tool for maintaining criticality. Two of the proposed methods used an inverse multiplication approach - where the amount of fissile material in a set configuration is slowly altered until criticality is attained - in estimating the critical control rod position. Another method carried out several MCNP criticality calculations at different control rod positions, then used a linear fit to estimate the critical rod position. The final method used a second-order polynomial fit of several MCNP criticality calculations at different control rod positions to estimate the critical rod position. The results showed that consistency in the prediction of power densities as well as uranium and plutonium isotopics was mutual among the methods within the CRPE tool that predicted the critical position consistently well. Finally, while the CRPE tool is currently limited to manipulating a single control rod, future work could be geared toward implementing additional criticality search methodologies along with additional features.
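The polynomial-fit strategy can be sketched as fitting keff against rod position and solving for the position at which the fit crosses the target keff; the example below uses hypothetical (position, keff) pairs and is not the CRPE tool itself.

```python
import numpy as np

def estimate_critical_position(positions, keff_values, target=1.0):
    """Fit keff(z) with a second-order polynomial from a few criticality
    calculations at different control-rod positions and return the position
    at which the fit crosses the target keff (None if no root lies in range)."""
    c2, c1, c0 = np.polyfit(positions, keff_values, 2)
    roots = np.roots([c2, c1, c0 - target])
    real = roots[np.isreal(roots)].real
    in_range = real[(real >= min(positions)) & (real <= max(positions))]
    return float(in_range[0]) if in_range.size else None

# Hypothetical data: rod insertion (cm) vs tallied keff.
print(estimate_critical_position([0.0, 25.0, 50.0, 75.0],
                                 [1.012, 1.004, 0.995, 0.984]))
```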
Banaee, Nooshin; Asgari, Sepideh; Nedaie, Hassan Ali
2018-07-01
The accuracy of penumbral measurements in radiotherapy is pivotal because dose planning computers require accurate data to adequately model the beams, which in turn are used to calculate patient dose distributions. Gamma knife is a non-invasive intracranial technique based on principles of the Leksell stereotactic system for open deep brain surgeries, invented and developed by Professor Lars Leksell. The aim of this study is to compare the penumbra widths of the Leksell Gamma Knife model C and Gamma ART 6000. Initially, the structures of both systems were simulated using the Monte Carlo MCNP6 code and, after validating the accuracy of the simulation, beam profiles of different collimators were plotted. MCNP6 beam profile calculations showed that the penumbra values of the Leksell Gamma Knife model C and Gamma ART 6000 for the 18, 14, 8 and 4 mm collimators are 9.7, 7.9, 4.3, 2.6 and 8.2, 6.9, 3.6, 2.4, respectively. The results of this study showed that since Gamma ART 6000 has a larger solid angle in comparison with Gamma Knife model C, it produces better beam profile penumbras than Gamma Knife model C in the direct plane. Copyright © 2017 Elsevier Ltd. All rights reserved.
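Assuming the common 80%-20% definition of penumbra width (the paper's exact convention is not restated in the abstract, so treat this as an assumption), a penumbra could be extracted from a calculated profile as in the sketch below, using hypothetical profile samples.

```python
import numpy as np

def penumbra_width(positions_mm, dose, hi=0.8, lo=0.2):
    """Distance between the 80% and 20% levels of a normalized beam profile on
    the positive-x side (a common penumbra definition, assumed here)."""
    d = np.asarray(dose, float) / np.max(dose)
    x = np.asarray(positions_mm, float)
    side = x >= 0                          # use one side of the profile
    xs, ds = x[side], d[side]
    order = np.argsort(ds)                 # dose falls off with distance; sort for interp
    x_hi = np.interp(hi, ds[order], xs[order])
    x_lo = np.interp(lo, ds[order], xs[order])
    return abs(x_lo - x_hi)

# Hypothetical profile samples (position in mm, relative dose):
x = [-12, -8, -6, -4, -2, 0, 2, 4, 6, 8, 12]
d = [0.02, 0.10, 0.45, 0.90, 0.99, 1.00, 0.99, 0.90, 0.45, 0.10, 0.02]
print(round(penumbra_width(x, d), 2))      # ~3.0 mm for this made-up profile
```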
NASA Astrophysics Data System (ADS)
Lapins, Janis; Guilliard, Nicole; Bernnat, Wolfgang; Buck, Arnulf
2017-09-01
During heavy ion irradiation therapy the patient has to be located exactly at the right position to make sure that the Bragg peak occurs in the tumour. The patient has to be moved in the range of millimetres to scan the ill tissue. For that reason a special table was developed which allows exact positioning. The electronic control can be located outside the surgery. However, this has some disadvantages for the construction. To keep the system compact it would be much more convenient to put the electronic control inside the surgery. As many high-energy secondary particles are produced during the therapy, causing a high dose in the room, it is important to find positions with low dose rates. Therefore, investigations are needed to determine where the electronic devices should be located to receive a minimum of radiation and to help prevent the failure of sensitive devices. The dose rate was calculated for carbon ions with different initial energies and for protons over the entire therapy room with Monte Carlo particle tracking using MCNP6. The types of secondary particles were identified and the dose rate for a thin silicon layer and an electronic mixture material was determined. In addition, the shielding effect of several selected material layers was calculated using MCNP6.
Rule-based support system for multiple UMLS semantic type assignments
Geller, James; He, Zhe; Perl, Yehoshua; Morrey, C. Paul; Xu, Julia
2012-01-01
Background When new concepts are inserted into the UMLS, they are assigned one or several semantic types from the UMLS Semantic Network by the UMLS editors. However, not every combination of semantic types is permissible. It was observed that many concepts with rare combinations of semantic types have erroneous semantic type assignments or prohibited combinations of semantic types. The correction of such errors is resource-intensive. Objective We design a computational system to inform UMLS editors as to whether a specific combination of two, three, four, or five semantic types is permissible or prohibited or questionable. Methods We identify a set of inclusion and exclusion instructions in the UMLS Semantic Network documentation and derive corresponding rule-categories as well as rule-categories from the UMLS concept content. We then design an algorithm adviseEditor based on these rule-categories. The algorithm specifies rules for how an editor should proceed when considering a tuple (pair, triple, quadruple, quintuple) of semantic types to be assigned to a concept. Results Eight rule-categories were identified. A Web-based system was developed to implement the adviseEditor algorithm, which returns for an input combination of semantic types whether it is permitted, prohibited or (in a few cases) requires more research. The numbers of semantic type pairs assigned to each rule-category are reported. Interesting examples for each rule-category are illustrated. Cases of semantic type assignments that contradict rules are listed, including recently introduced ones. Conclusion The adviseEditor system implements explicit and implicit knowledge available in the UMLS in a system that informs UMLS editors about the permissibility of a desired combination of semantic types. Using adviseEditor might help accelerate the work of the UMLS editors and prevent erroneous semantic type assignments. PMID:23041716
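A minimal sketch of the kind of lookup adviseEditor performs is shown below; the rule-table entries are hypothetical and do not reproduce the published rule-categories.

```python
# Hypothetical rule table: combinations of UMLS semantic types marked as
# permitted or prohibited; anything unseen is flagged for editor review.
PERMITTED = {frozenset({"Disease or Syndrome", "Finding"})}
PROHIBITED = {frozenset({"Drug Delivery Device", "Plant"})}

def advise(semantic_types):
    """Return 'permitted', 'prohibited', or 'needs review' for a combination
    of semantic types proposed for a concept."""
    key = frozenset(semantic_types)
    if key in PERMITTED:
        return "permitted"
    if key in PROHIBITED:
        return "prohibited"
    return "needs review"      # unseen combination: defer to the editor / advisory rules

print(advise(["Finding", "Disease or Syndrome"]))     # permitted
print(advise(["Plant", "Drug Delivery Device"]))      # prohibited
```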
Technical editing and the effective communication of scientific results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pieper, G.W.; Picologlou, S.M.
1996-05-01
Communication of scientific results--whether for professional journals, poster sessions, oral presentations, or the popular press--is an essential part of any scientific investigation. The technical editor plays an important role in ensuring that scientists express their results correctly and effectively. Technical editing comprises far more than simple proofreading. The editor's tasks may range from restructuring whole paragraphs and suggesting improved graphical aids to writing abstracts and preparing first drafts of proposals. The technical editor works closely with scientists to present complex ideas to different audiences, including fellow scientists, funding agencies, and the general public. New computer technology has also involved the technical editor not only with on-line editing but also with preparing CD-ROMs and World Wide Web pages.
Alfonso, Fernando; Adamyan, Karlen; Artigou, Jean Yves; Aschermann, Michael; Boehm, Michael; Buendia, Alfonso; Chu, Pao Hsien; Cohen, Ariel; Cas, Livio Dei; Dilic, Mirza; Doubell, Anton; Echeverri, Dario; Enç, Nuray; Ferreira-González, Ignacio; Filipiak, Krzysztof J; Flammer, Andreas; Fleck, Eckart; Gatzov, Plamen; Ginghina, Carmen; Goncalves, Lino; Haouala, Habib; Hassanein, Mahmoud; Heusch, Gerd; Huber, Kurt; Hulín, Ivan; Ivanusa, Mario; Krittayaphong, Rungroj; Lau, Chu Pak; Marinskis, Germanas; Mach, François; Moreira, Luiz Felipe; Nieminen, Tuomo; Oukerraj, Latifa; Perings, Stefan; Pierard, Luc; Potpara, Tatjana; Reyes-Caorsi, Walter; Rim, Se Joong; Rødevand, Olaf; Saade, Georges; Sander, Mikael; Shlyakhto, Evgeny; Timuralp, Bilgin; Tousoulis, Dimitris; Ural, Dilek; Piek, J J; Varga, Albert; Lüscher, Thomas F
2017-06-01
The International Committee of Medical Journal Editors (ICMJE) provides recommendations to improve the editorial standards and scientific quality of biomedical journals. These recommendations range from uniform technical requirements to more complex and elusive editorial issues, including ethical aspects of the scientific process. Recently, registration of clinical trials, conflicts of interest disclosure, and new criteria for authorship, emphasizing the importance of responsibility and accountability, have been proposed. Last year, a new editorial initiative to foster the sharing of clinical trial data was launched. This review discusses this novel initiative with the aim of increasing awareness among readers, investigators, authors and editors belonging to the Editors' Network of the European Society of Cardiology.
An integrated pipeline to create and experience compelling scenarios in virtual reality
NASA Astrophysics Data System (ADS)
Springer, Jan P.; Neumann, Carsten; Reiners, Dirk; Cruz-Neira, Carolina
2011-03-01
One of the main barriers to creating and using compelling scenarios in virtual reality is the complexity and time-consuming effort of modeling, element integration, and the software development needed to properly display and interact with the content in the available systems. Still today, most virtual reality applications are tedious to create and they are hard-wired to the specific display and interaction system available to the developers when creating the application. Furthermore, it is not possible to alter the content or the dynamics of the content once the application has been created. We present our research on designing a software pipeline that enables the creation of compelling scenarios with a fair degree of visual and interaction complexity in a semi-automated way. Specifically, we are targeting drivable urban scenarios, ranging from large cities to sparsely populated rural areas, that incorporate both static components (e.g., houses, trees) and dynamic components (e.g., people, vehicles) as well as events, such as explosions or ambient noise. Our pipeline has four basic components. First, an environment designer, where users sketch the overall layout of the scenario, and an automated method constructs the 3D environment from the information in the sketch. Second, a scenario editor used for authoring the complete scenario: incorporating the dynamic elements and events, fine-tuning the automatically generated environment, defining the execution conditions of the scenario, and setting up any data gathering that may be necessary during the execution of the scenario. Third, a run-time environment for different virtual-reality systems that provides users with the interactive experience as designed with the designer and the editor. And fourth, a bi-directional monitoring system that allows for capturing and modifying information from the virtual environment. One of the interesting capabilities of our pipeline is that scenarios can be built and modified on-the-fly as they are being presented in the virtual-reality systems. Users can quickly prototype the basic scene using the designer and the editor on a control workstation. More elements can then be introduced into the scene from both the editor and the virtual-reality display. In this manner, users are able to gradually increase the complexity of the scenario with immediate feedback. The main use of this pipeline is the rapid development of scenarios for human-factors studies. However, it is applicable in a much more general context.
Impact of thorium based molten salt reactor on the closure of the nuclear fuel cycle
NASA Astrophysics Data System (ADS)
Jaradat, Safwan Qasim Mohammad
The molten salt reactor (MSR) is one of the six reactor concepts selected by the Generation IV International Forum (GIF). The liquid fluoride thorium reactor (LFTR) is an MSR concept based on the thorium fuel cycle. LFTR uses liquid fluoride salts as the nuclear fuel, with 232Th and 233U as the fertile and fissile materials, respectively. Fluoride salts of these nuclides are dissolved in a mixed carrier salt of lithium and beryllium (FLiBe). The objective of this research was to complete feasibility studies of a small commercial thermal LFTR. The focus was on neutronic calculations in order to prescribe core design parameters such as core size, fuel block pitch (p), fuel channel radius, fuel path, reflector thickness, fuel salt composition, and power. In order to achieve this objective, the applicability of the Monte Carlo N-Particle Transport Code (MCNP) to MSR modeling was verified. Then, a prescription for a conceptual small thermal LFTR was developed and the relevant calculations were performed using MCNP to determine the main neutronic parameters of the reactor core. The MCNP code was used to study the reactor physics characteristics of the FUJI-U3 reactor. The results were then compared with the results obtained for the original FUJI-U3 using the reactor physics code SRAC95 and the burnup analysis code ORIPHY2. The results were comparable with each other. Based on the results, MCNP was found to be a reliable code to model a small thermal LFTR and study all the related reactor physics characteristics. The results of this study were promising and successful in demonstrating a preliminary small commercial LFTR design. The outcome of using a small reactor core with a diameter/height of 280/260 cm that would operate for more than five years at a power level of 150 MWth was studied. The fuel system 7LiF - BeF2 - ThF4 - UF4 with (233U/232Th) = 2.01% was the candidate fuel for this reactor core.
Peer reviews and the role of a journal editor
USDA-ARS?s Scientific Manuscript database
Obtaining peer reviews for manuscripts submitted to scientific journals is becoming increasingly difficult. Changes to the system are necessary, and editors must cultivate and maintain a solid base of reviewers to help evaluate journal submissions. This article outlines some steps editors can and sh...
Training the Technical Editor.
ERIC Educational Resources Information Center
Cathcart, Margaret E.
The demand for skilled technical editors is growing as society places increasing emphasis on receiving accurate, concise, and complete technical data. Since many organizations do not have inhouse programs for training technical editors, a need exists to provide inexperienced people with basic editing skills. One organization has developed two…
Entering the 60th year of Acta Astronautica
NASA Astrophysics Data System (ADS)
Chang, Yi-Wei; Chern, Jeng-Shing; Marec, Jean-Pierre
2014-04-01
The Acta Astronautica journal was first published in 1955 as the official journal of the International Astronautical Federation (IAF) under the title Astronautica Acta. It is entering its 60th year in 2014. In 1962, Astronautica Acta became the official journal of the International Academy of Astronautics (IAA), established in 1960. A total of 18 volumes had been published from 1955 to 1973 under the leadership of three Editors-in-Chief: F. Hecht, Theodore von Karman, and Martin Summerfield. In 1974, A.K. Oppenheim became the new Editor-in-Chief and several changes were made, including the change of the title to Acta Astronautica (for grammatical correctness), a cover page change, and a format change. From 1974 to 2010, another three Editors-in-Chief led the journal, with 67 volumes published. They were A.K. Oppenheim, Jean-Pierre Marec, and Rupert Gerzer. The current Editor-in-Chief, Jeng-Shing Chern (Rock), took over the job in 2011. The total pages and articles published in 2012 are 3586 and 356, respectively. Currently, the Acta Astronautica Editorial Board consists of one Editor-in-Chief, 15 Co-Editors, one Managing Editor and one Honorary Editor-in-Chief (Jean-Pierre Marec). After 59 years, Acta Astronautica has become a well-known journal worldwide. Its current rank and impact factor are 7/63 and 0.701, respectively. This paper presents some of the details as well as new strategies and steps. In particular, support from the IAA Academicians is mandatory and most welcome.
Quantifying the effect of editor-author relations on manuscript handling times.
Sarigöl, Emre; Garcia, David; Scholtes, Ingo; Schweitzer, Frank
2017-01-01
In this article we study to what extent the academic peer review process is influenced by social relations between the authors of a manuscript and the editor handling the manuscript. Taking the open access journal PlosOne as a case study, our analysis is based on a data set of more than 100,000 articles published between 2007 and 2015. Using available data on handling editor, submission and acceptance time of manuscripts, we study the question whether co-authorship relations between authors and the handling editor affect the manuscript handling time , i.e. the time taken between the submission and acceptance of a manuscript. Our analysis reveals (1) that editors handle papers co-authored by previous collaborators significantly more often than expected at random, and (2) that such prior co-author relations are significantly related to faster manuscript handling. Addressing the question whether these shorter manuscript handling times can be explained by the quality of publications, we study the number of citations and downloads which accepted papers eventually accumulate. Moreover, we consider the influence of additional (social) factors, such as the editor's experience, the topical similarity between authors and editors, as well as reciprocal citation relations between authors and editors. Our findings show that, even when correcting for other factors like time, experience, and performance, prior co-authorship relations have a large and significant influence on manuscript handling times, speeding up the editorial decision on average by 19 days.
NASA Technical Reports Server (NTRS)
Wenzel, Elizabeth M.; Fisher, Scott S.; Stone, Philip K.; Foster, Scott H.
1991-01-01
The real time acoustic display capabilities are described which were developed for the Virtual Environment Workstation (VIEW) Project at NASA-Ames. The acoustic display is capable of generating localized acoustic cues in real time over headphones. An auditory symbology, a related collection of representational auditory 'objects' or 'icons', can be designed using ACE (Auditory Cue Editor), which links both discrete and continuously varying acoustic parameters with information or events in the display. During a given display scenario, the symbology can be dynamically coordinated in real time with 3-D visual objects, speech, and gestural displays. The types of displays feasible with the system range from simple warnings and alarms to the acoustic representation of multidimensional data or events.
How Non-Daily Editors Describe Status and Function of Editorial Pages.
ERIC Educational Resources Information Center
Hynds, Ernest C.; Martin, Charles H.
1979-01-01
Results of a survey of 359 editors of nondaily newspapers indicate that most nondaily editors see their editorials and editorial pages as important segments of their newspapers and believe they can use them to help influence readers, particularly on local issues. (Author/GT)
Letter to the editor of TAAP, in response to letter from Anders et al.
To the Editor, Toxicology and Applied Pharmacology: We would like to address the letter to the editor submitted by Anders et al. regarding the substantive issues raised regarding our paper "Evaluation of two different metabolic hypotheses for dichloromethane toxicity using physi...
Abbott, J Haxby
2016-08-01
In response to the growth of JOSPT, Editor-in-Chief J. Haxby Abbott introduces 3 new Associate Editors to the JOSPT Editorial Board, and announces the promotion of 1 outstanding Editorial Board member to an Editor role. J Orthop Sports Phys Ther 2016;46(8):610-612. doi:10.2519/jospt.2016.0111.
Professional Editing Strategies Used by Six Editors
ERIC Educational Resources Information Center
Bisaillon, Jocelyne
2007-01-01
Identifying the approach used by those revision experts par excellence--that is, professional editors--should enable researchers to better grasp the revision process. To further explore this hypothesis, the author conducted research among professional editors, six of whom she filmed as they engaged in their practice. An analysis of their work…
ERIC Educational Resources Information Center
Jeffers, Dennis W.
A study was undertaken of specialized magazine editors' perceptions of audience characteristics as well as the perceived role of their publications. Specifically, the study examines the relationship between the editors' perceptions of reader problem recognition, level of involvement, constraint recognition, and possession of reference criteria and…
Editorial and Broadcasting Careers.
ERIC Educational Resources Information Center
Broido, Arnold; And Others
1982-01-01
Describes the jobs of the music publisher and editor, music magazine and book editor, film music editor, and music critic. Educational requirements, job availability, and the advantages and disadvantages of each are discussed. A tear-out chart of ten music career areas, listing salaries and personal and educational qualifications, is included. (AM)
Generating the Field: The Role of Editors in Disciplinary Formation
ERIC Educational Resources Information Center
Selfe, Cynthia; Villanueva, Victor; Parks, Steve
2017-01-01
In the following conversation, conducted asynchronously through email, three current and former editors discuss the role of publishing in creating a disciplinary identity. Speaking from the academic (Villanueva), digital (Selfe), and community (Parks), and, often crossing these three categories, the editors discuss how the field has failed to…
STEVE -- User Guide and Reference Manual
NASA Astrophysics Data System (ADS)
Fish, Adrian
This document describes an extended version of the EVE editor that has been tailored to the general Starlink user's requirements. This extended editor is STarlink Eve or STEve, and this document (along with its introductory companion SUN/125) describes this editor and offers additional help, advice and tips on general EVE usage.
Publishing and Journalism Careers
ERIC Educational Resources Information Center
Reed, Alfred; And Others
1977-01-01
If you like to work with words and notational symbols--or with describing, selecting, managing, and distributing the words and music of other people--then journalism or publishing as a whole may be your bailiwick. Describes the positions of music editor, music publisher, magazine/book editor, music critic, and freelance music writer. (Editor/RK)
Alfonso, Fernando; Adamyan, Karlen; Artigou, Jean-Yves; Aschermann, Michael; Boehm, Michael; Buendia, Alfonso; Chu, Pao-Hsien; Cohen, Ariel; Cas, Livio Dei; Dilic, Mirza; Doubell, Anton; Echeverri, Dario; Enç, Nuray; Ferreira-González, Ignacio; Filipiak, Krzysztof J; Flammer, Andreas; Fleck, Eckart; Gatzov, Plamen; Ginghina, Carmen; Goncalves, Lino; Haouala, Habib; Hassanein, Mahmoud; Heusch, Gerd; Huber, Kurt; Hulín, Ivan; Ivanusa, Mario; Krittayaphong, Rungroj; Lau, Chu-Pak; Marinskis, Germanas; Mach, François; Moreira, Luiz Felipe; Nieminen, Tuomo; Oukerraj, Latifa; Perings, Stefan; Pierard, Luc; Potpara, Tatjana; Reyes-Caorsi, Walter; Rim, Se-Joong; Rødevand, Olaf; Saade, Georges; Sander, Mikael; Shlyakhto, Evgeny; Timuralp, Bilgin; Tousoulis, Dimitris; Ural, Dilek; Piek, J J; Varga, Albert; Lüscher, Thomas F
The International Committee of Medical Journal Editors (ICMJE) provides recommendations to improve the editorial standards and scientific quality of biomedical journals. These recommendations range from uniform technical requirements to more complex and elusive editorial issues, including ethical aspects of the scientific process. Recently, registration of clinical trials, conflicts of interest disclosure, and new criteria for authorship, emphasizing the importance of responsibility and accountability, have been proposed. Last year, a new editorial initiative to foster sharing of clinical trial data was launched. This review discusses this novel initiative with the aim of increasing awareness among readers, investigators, authors and editors belonging to the Editors' Network of the European Society of Cardiology. Copyright © 2017. Published by Masson Doyma México S.A.
ZED- A LINE EDITOR FOR THE DEC VAX
NASA Technical Reports Server (NTRS)
Scott, P. J.
1994-01-01
The ZED editor for the DEC VAX is a simple, yet powerful line editor for text, program source code, and non-binary data. Line editors can be superior to screen editors in some cases, such as executing complex multiple or conditional commands, or editing via slow modem lines. ZED excels in the area of text processing by using procedure files. For example, such procedures can reformat a file of addresses or remove all comment lines from a FORTRAN program. In addition to command files, ZED also features versatile search qualifiers, global changes, conditionals, on-line help, hexadecimal mode, space compression, looping, logical combinations of search strings, journaling, visible control characters, and automatic detabbing. The ZED editor was originally developed at Cambridge University and has been continuously enhanced since 1976. Users of the Cambridge implementation have devised such elaborate ZED procedures as chess games, calculators, and programs for evaluating Pi. This implementation of ZED strives to maintain the characteristics of the Cambridge editor. A complete ZED manual is included on the tape. ZED is written entirely in C for either batch or interactive execution on the DEC VAX under VMS 4.X and requires 80,896 bytes of memory. This program was released in 1988 and updated in 1989.
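As an informal illustration of the kind of batch, line-oriented edit mentioned above (removing all comment lines from a FORTRAN program), here is a minimal Python sketch; it mimics only the effect of such a ZED procedure and does not use ZED's command syntax, which is not given here.

# Minimal illustration of a line-editor-style batch edit similar to the ZED
# procedure mentioned in the abstract (removing comment lines from FORTRAN
# source). This is not ZED syntax; it only mimics the effect.

def strip_fortran_comments(lines):
    """Drop fixed-form FORTRAN comment lines (a 'C', 'c' or '*' in column 1)."""
    return [ln for ln in lines if ln[:1] not in ("C", "c", "*")]

if __name__ == "__main__":
    src = [
        "C     THIS IS A COMMENT",
        "      PROGRAM DEMO",
        "*     ANOTHER COMMENT",
        "      END",
    ]
    for line in strip_fortran_comments(src):
        print(line)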
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pantelias, M.; Volmert, B.; Caruso, S.
MCNP models of all Swiss Nuclear Power Plants have been developed by the National Cooperative for the Disposal of Radioactive Waste (Nagra), in collaboration with the utilities and ETH Zurich, for the 2011 decommissioning cost study. The estimation of the residual radionuclide inventories and corresponding activity levels of irradiated structures and components following NPP shut-down is of crucial importance for the planning of the dismantling process, the waste packaging concept and, consequently, for the estimation of the decommissioning costs. Based on NPP-specific data, the neutron transport simulations lead to the best available knowledge of the neutron spectra necessary for the ensuing activation calculations. In this paper, the modelling concept behind the MCNP models of the NPPs is outlined and the resulting flux distribution maps are presented. (authors)
EBR-II Static Neutronic Calculations by PHISICS / MCNP6 codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paolo Balestra; Carlo Parisi; Andrea Alfonsi
2016-02-01
The International Atomic Energy Agency (IAEA) launched a Coordinated Research Project (CRP) on the Shutdown Heat Removal Tests (SHRT) performed in the 1980s at the Experimental Breeder Reactor II (EBR-II), USA. The scope of the CRP is to improve and validate the simulation tools for the study and design of liquid-metal-cooled fast reactors. Training the next generation of fast reactor analysts is the other goal of the CRP. In this framework, a static neutronic model was developed using state-of-the-art neutron transport codes such as SCALE/PHISICS (deterministic solution) and MCNP6 (stochastic solution). A comparison between the two solutions is briefly illustrated in this summary.
ERIC Educational Resources Information Center
Kopacz, Marek S.; Bajka-Kopacz, Aleksandra
2012-01-01
Almost all teenage magazines invite readers to submit questions concerning relationships, published as letters to the editor, popularly called "advice columns," often containing explicit questions about sexuality. This study aims to examine, firstly, how themes related to sexual initiation are presented in letters to the editor published…
Digital Alteration of Photographs in Magazines: An Examination of the Ethics.
ERIC Educational Resources Information Center
Reaves, Shiela
A study examined magazine editors' views of some of the ethical considerations posed by digital alteration of photographs. Subjects, 12 consumer news and specialty magazine editors, were interviewed by telephone and asked a series of questions concerning the ethics of digitally manipulating photographs. Results indicated that magazine editors were…
"Clones," Codes, and Conflicts of Interest in Cartooning: Cartoonists and Editors Look at Ethics.
ERIC Educational Resources Information Center
Riffe, Daniel; And Others
A study examined differences between political cartoonists and op-ed page editors on both traditional ethical issues (such as conflicts of interest) and the special, style-related concerns of editorial cartoonists. Hypotheses proposed were that editors and cartoonists (1) would condemn "cloning" or copying, reflecting an ethical…
29 CFR 793.11 - Combination announcer, news editor and chief engineer.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 29 Labor 3 2012-07-01 2012-07-01 false Combination announcer, news editor and chief engineer. 793...)(9) OF THE FAIR LABOR STANDARDS ACT Requirements for Exemption § 793.11 Combination announcer, news... as a news editor. In such cases, the primary employment test under the section 13(b)(9) exemption...
29 CFR 793.11 - Combination announcer, news editor and chief engineer.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 29 Labor 3 2014-07-01 2014-07-01 false Combination announcer, news editor and chief engineer. 793...)(9) OF THE FAIR LABOR STANDARDS ACT Requirements for Exemption § 793.11 Combination announcer, news... as a news editor. In such cases, the primary employment test under the section 13(b)(9) exemption...
29 CFR 793.11 - Combination announcer, news editor and chief engineer.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 29 Labor 3 2013-07-01 2013-07-01 false Combination announcer, news editor and chief engineer. 793...)(9) OF THE FAIR LABOR STANDARDS ACT Requirements for Exemption § 793.11 Combination announcer, news... as a news editor. In such cases, the primary employment test under the section 13(b)(9) exemption...
The Introductory Psychology Textbook Market: Perceptions of Authors and Editors.
ERIC Educational Resources Information Center
Griggs, Richard A.; Jackson, Sherri L.
1989-01-01
Surveys psychology textbook authors and editors on their perceptions of the introductory psychology textbook market. Finds that the textbook market is divided into three levels according to quality, and that authors and editors are not familiar with most textbooks. Notes that the growth of used book companies has adversely affected the market.…
A Lesson for Instructors: Top 10 Copy-Editing Skills.
ERIC Educational Resources Information Center
Auman, Ann
1995-01-01
Presents results of a survey of 164 newspaper editors regarding which skills they believe are crucial for entry-level copy editors to know, and in which areas they see the most deficiencies. Notes that the skills identified reflect the changing duties of the copy editor and the increasing complexities of the job. (SR)
Newspaper Ethics and Managing Editors: The Evolution of APME's Code.
ERIC Educational Resources Information Center
De Mott, John
A review of the 42-year development of the professional code of ethics of the Associated Press Managing Editors (APME) demonstrates an effort to elevate newspaper ethical standards around the country. Following the example of the American Society of Newspaper Editors in establishing its "Canons of Journalism" in 1923, the APME formed a…
NASA Astrophysics Data System (ADS)
Audoly, Basile; Castañeda, Pedro Ponte; Kuhl, Ellen; Niordson, Christian; Sharma, Pradeep; Gao, Huajian
2016-02-01
After 12 years of distinguished service, Kaushik Bhattacharya has decided to step down as co-editor of the Journal of the Mechanics and Physics of Solids. A new editorial team, with Huajian Gao as editor and Basile Audoly, Pedro Ponte Castañeda, Ellen Kuhl, Christian Niordson and Pradeep Sharma as Associate Editors, will take over as of January 1, 2016.
Biosecurity and the review and publication of dual-use research of concern.
Patrone, Daniel; Resnik, David; Chin, Lisa
2012-09-01
Dual-use research of concern (DURC) is scientific research with significant potential for generating information that could be used to harm national security, the public health, or the environment. Editors responsible for journal policies and publication decisions play a vital role in ensuring that effective safeguards exist to cope with the risks of publishing scientific research with dual-use implications. We conducted an online survey of 127 chief editors of life science journals in 27 countries to examine their attitudes toward and experience with the review and publication of dual-use research of concern. Very few editors (11) had experience with biosecurity review, and no editor in our study reported having ever refused a submission on biosecurity grounds. Most respondents (74.8%) agreed that editors have a responsibility to consider biosecurity risks during the review process, but little consensus existed among editors on how to handle specific issues in the review and publication of research with potential dual-use implications. More work is needed to establish consensus on standards for the review and publication of dual-use research of concern in life science journals.
Designing Epigenome Editors: Considerations of Biochemical and Locus Specificities.
Sen, Dilara; Keung, Albert J
2018-01-01
The advent of locus-specific protein recruitment technologies has enabled a new class of studies in chromatin biology. Epigenome editors enable biochemical modifications of chromatin at almost any specific endogenous locus. Their locus specificity unlocks unique information including the functional roles of distinct modifications at specific genomic loci. Given the growing interest in using these tools for biological and translational studies, there are many specific design considerations depending on the scientific question or clinical need. Here we present and discuss important design considerations and challenges regarding the biochemical and locus specificities of epigenome editors. These include how to account for the complex biochemical diversity of chromatin; control for potential interdependency of epigenome editors and their resultant modifications; avoid sequestration effects; quantify the locus specificity of epigenome editors; and improve locus specificity by considering concentration, affinity, avidity, and sequestration effects.
Liu, Jessica J; Bell, Chaim M; Matelski, John J; Detsky, Allan S; Cram, Peter
2017-10-26
Objective To estimate financial payments from industry to US journal editors. Design Retrospective observational study. Setting 52 influential (high impact factor for their specialty) US medical journals from 26 specialties and US Open Payments database, 2014. Participants 713 editors at the associate level and above identified from each journal's online masthead. Main outcome measures All general payments (eg, personal income) and research related payments from pharmaceutical and medical device manufacturers to eligible physicians in 2014. Percentages of editors receiving payments and the magnitude of such payments were compared across journals and by specialty. Journal websites were also reviewed to determine if conflict of interest policies for editors were readily accessible. Results Of 713 eligible editors, 361 (50.6%) received some (>$0) general payments in 2014, and 139 (19.5%) received research payments. The median general payment was $11 (£8; €9) (interquartile range $0-2923) and the median research payment was $0 ($0-0). The mean general payment was $28 136 (SD $415 045), and the mean research payment was $37 963 (SD $175 239). The highest median general payments were received by journal editors from endocrinology ($7207, $0-85 816), cardiology ($2664, $0-12 912), gastroenterology ($696, $0-20 002), rheumatology ($515, $0-14 280), and urology ($480, $90-669). For high impact general medicine journals, median payments were $0 ($0-14). A review of the 52 journal websites revealed that editor conflict of interest policies were readily accessible (ie, within five minutes) for 17/52 (32.7%) of journals. Conclusions Industry payments to journal editors are common and often large, particularly for certain subspecialties. Journals should consider the potential impact of such payments on public trust in published research. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
NASA Technical Reports Server (NTRS)
Chimiak, Reine; Harris, Bernard; Williams, Phillip
2013-01-01
Basic Common Data Format (CDF) tools (e.g., cdfedit) provide no specific support for creating International Solar-Terrestrial Physics/Space Physics Data Facility (ISTP/SPDF) standard files. While it is possible for someone who is familiar with the ISTP/SPDF metadata guidelines to create compliant files using just the basic tools, the process is error-prone and unreasonable for someone without ISTP/SPDF expertise. The key problem is the lack of a tool with specific support for creating files that comply with the ISTP/SPDF guidelines; the basic CDF tools such as cdfedit and skeletoncdf have none. The SPDF ISTP CDF skeleton editor is a cross-platform, Java-based GUI editor program that allows someone with only a basic understanding of the ISTP/SPDF guidelines to easily create compliant files. The editor is a simple graphical user interface (GUI) application for creating and editing ISTP/SPDF guideline-compliant skeleton CDF files. The SPDF ISTP CDF skeleton editor consists of the following components: a Swing-based Java GUI program, a JavaHelp-based manual/tutorial, image/icon files, and an HTML Web page for distribution. The editor is available as a traditional Java desktop application as well as a Java Network Launching Protocol (JNLP) application. Once started, it functions like a typical Java GUI file editor application for creating/editing application-unique files.
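As a rough illustration of what an ISTP/SPDF-style skeleton amounts to (global attributes plus variables carrying required variable attributes), the sketch below creates a small skeleton file programmatically. It assumes the third-party spacepy.pycdf package (and the NASA CDF library it wraps) is available; the attribute names shown are a small, illustrative subset of the ISTP/SPDF guidelines, and the file and variable names are hypothetical.

# Sketch of an ISTP/SPDF-style skeleton CDF built with spacepy.pycdf (assumed
# to be installed along with the NASA CDF library). Attribute names are an
# illustrative subset of the guidelines, not a complete or authoritative list.
from spacepy import pycdf

with pycdf.CDF("example_skeleton.cdf", "") as cdf:   # '' creates a new, empty CDF
    # Illustrative ISTP-style global attributes
    cdf.attrs["Project"] = "ISTP>International Solar-Terrestrial Physics"
    cdf.attrs["Source_name"] = "EXAMPLE>Example mission"
    cdf.attrs["Data_type"] = "H0>High resolution"
    cdf.attrs["Descriptor"] = "INS>Example instrument"
    cdf.attrs["Logical_source"] = "example_h0_ins"

    # A data variable carrying a few of the required ISTP variable attributes
    cdf.new("Density", type=pycdf.const.CDF_REAL4)
    cdf["Density"].attrs["CATDESC"] = "Example plasma density"
    cdf["Density"].attrs["VAR_TYPE"] = "data"
    cdf["Density"].attrs["UNITS"] = "cm^-3"
    cdf["Density"].attrs["FILLVAL"] = -1.0e31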
[Honesty and good faith: two cornerstones in the ethics of biomedical publications].
Reyes, Humberto
2007-04-01
The editors of medical journals should take the steps necessary to assure readers that the contents of their publications are based on true data, that they are original, and that they fulfill the ethical rules of biomedical and clinical research, including its reporting. This editorial role has become increasingly difficult, since the pressure to publish scientific papers is progressively stimulated by the role that those papers play in curricula vitae when authors apply for university positions, academic promotions, research grants, and for their personal prestige. As a consequence, increasing instances of misconduct in scientific publications are detected. Some cases are noticed during the editorial process, mostly when peer reviewers identify redundant publications or plagiarism. Other cases are denounced after a manuscript has been published. It is the editor's duty to verify the misconduct, request an explanation from the authors and, if their answer is unsatisfactory, report the problem to the institutional authorities supporting the authors. The editors should denounce the situation in a forthcoming issue of the journal. Universities should enforce the teaching of the ethical rules that govern the reporting of scientific information. Revista Médica de Chile follows recommendations given by the International Committee of Medical Journal Editors, the World Association of Medical Editors and other groups, but honesty and good faith in all the actors involved in the process of biomedical publication (authors, reviewers, editors) remain the cornerstones of scientific good behavior.
Benchmarking of MCNP for calculating dose rates at an interim storage facility for nuclear waste.
Heuel-Fabianek, Burkhard; Hille, Ralf
2005-01-01
During the operation of research facilities at Research Centre Jülich, Germany, nuclear waste is stored in drums and other vessels in an interim storage building on-site, which has a concrete shielding at the side walls. Owing to the lack of a well-defined source, measured gamma spectra were unfolded to determine the photon flux on the surface of the containers. The dose rate simulation, including the effects of skyshine, using the Monte Carlo transport code MCNP is compared with the measured dosimetric data at some locations in the vicinity of the interim storage building. The MCNP data for direct radiation confirm the data calculated using a point-kernel method. However, a comparison of the modelled dose rates for direct radiation and skyshine with the measured data demonstrate the need for a more precise definition of the source. Both the measured and the modelled dose rates verified the fact that the legal limits (<1 mSv a(-1)) are met in the area outside the perimeter fence of the storage building to which members of the public have access. Using container surface data (gamma spectra) to define the source may be a useful tool for practical calculations and additionally for benchmarking of computer codes if the discussed critical aspects with respect to the source can be addressed adequately.
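The abstract compares MCNP results for direct radiation with a point-kernel method. As a reminder of what a point-kernel estimate involves, here is a minimal sketch; the source strength, attenuation data, buildup factor, and flux-to-dose factor are illustrative values only, since the paper's source terms were derived from unfolded gamma spectra and are not given here.

# Minimal point-kernel sketch for the direct gamma dose rate behind a concrete
# shield, with purely illustrative numbers (not the facility's actual data).
import math

def point_kernel_dose_rate(S, flux_to_dose, mu, t, r, buildup):
    """Dose rate from an isotropic point source.
    S            : source strength (photons/s)
    flux_to_dose : flux-to-dose conversion factor ((uSv/h)/(photons/cm^2/s))
    mu           : linear attenuation coefficient of the shield (1/cm)
    t            : shield thickness along the ray (cm)
    r            : source-detector distance (cm)
    buildup      : dose buildup factor for mu*t (dimensionless)
    """
    flux = S * buildup * math.exp(-mu * t) / (4.0 * math.pi * r ** 2)
    return flux * flux_to_dose

# Hypothetical case: 1e9 photons/s of ~1 MeV gammas, 50 cm concrete (mu ~ 0.15 /cm),
# 5 m distance, buildup factor ~ 20, flux-to-dose ~ 2e-3 (uSv/h)/(photons/cm^2/s).
print("%.3g uSv/h" % point_kernel_dose_rate(1e9, 2.0e-3, 0.15, 50.0, 500.0, 20.0))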
Zhang, Xiaomin; Xie, Xiangdong; Cheng, Jie; Ning, Jing; Yuan, Yong; Pan, Jie; Yang, Guoshan
2012-01-01
A set of conversion coefficients from kerma free-in-air to organ absorbed dose for external photon beams from 10 keV to 10 MeV is presented, based on a newly developed voxel mouse model, for the purpose of radiation effect evaluation. The voxel mouse model was developed from colour images of successive cryosections of a normal nude male mouse, in which 14 organs or tissues were segmented manually and filled with different colours, with each colour tagged by a specific ID number for implementation of the mouse model in the Monte Carlo N-Particle code (MCNP). Monte Carlo simulation with MCNP was carried out to obtain organ dose conversion coefficients for 22 external monoenergetic photon beams between 10 keV and 10 MeV under five different irradiation geometries (left lateral, right lateral, dorsal-ventral, ventral-dorsal, and isotropic). Organ dose conversion coefficients are presented in tables and compared with published data based on a rat model to investigate the effect of body size and weight on the organ dose. The calculated results and the comparison show that the organ dose conversion coefficients vary with photon energy in a similar way for most organs, except for the bone and skin, and that the organ dose is sensitive to body size and weight at photon energies below approximately 0.1 MeV.
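The conversion coefficients tabulated in such a study are simply organ absorbed dose per unit air kerma, D_organ = c(E, geometry) x K_air. The sketch below shows how they would be applied; the coefficient values are placeholders, not the paper's data.

# Sketch of applying kerma-to-organ-dose conversion coefficients of the kind
# tabulated in this work: D_organ = c(E, geometry) * K_air.
# The coefficient values below are hypothetical placeholders.
conversion_coeff = {            # Gy per Gy of air kerma (hypothetical values)
    ("liver", "dorsal-ventral", 0.662): 0.85,
    ("lung",  "isotropic",      0.662): 0.78,
}

def organ_dose(organ, geometry, energy_MeV, kerma_air_Gy):
    c = conversion_coeff[(organ, geometry, energy_MeV)]
    return c * kerma_air_Gy

print(organ_dose("liver", "dorsal-ventral", 0.662, 0.010), "Gy")  # for 10 mGy air kerma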
Adigun, Babatunde John; Fensin, Michael Lorne; Galloway, Jack D.; ...
2016-10-01
Our burnup study examined the effect of a predicted critical control rod position on the nuclide predictability of several axial and radial locations within a 4×4 graphite-moderated gas-cooled reactor fuel cluster geometry. To achieve this, a control rod position estimator (CRPE) tool was developed within the framework of the linkage code Monteburns between the transport code MCNP and the depletion code CINDER90, and four methodologies were proposed within the tool for maintaining criticality. Two of the proposed methods used an inverse multiplication approach - where the amount of fissile material in a set configuration is slowly altered until criticality is attained - in estimating the critical control rod position. Another method carried out several MCNP criticality calculations at different control rod positions, then used a linear fit to estimate the critical rod position. The final method used a second-order polynomial fit of several MCNP criticality calculations at different control rod positions to estimate the critical rod position. The results showed that consistency in the prediction of power densities as well as uranium and plutonium isotopics was mutual among the methods within the CRPE tool that predicted the critical position consistently well. Finally, while the CRPE tool is currently limited to manipulating a single control rod, future work could be geared toward implementing additional criticality search methodologies along with additional features.
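One of the CRPE methodologies described above fits keff values from several MCNP criticality runs at different rod positions with a second-order polynomial and solves for the position where keff = 1. The sketch below illustrates only that fitting step; the (position, keff) pairs are made up, and the real tool drives MCNP and CINDER90 through Monteburns rather than working from hard-coded values.

# Sketch of the polynomial-fit criticality search described for the CRPE tool:
# fit k_eff from a few MCNP criticality runs at different control-rod positions
# and solve for the position where k_eff = 1. The data below are hypothetical.
import numpy as np

positions = np.array([0.0, 25.0, 50.0, 75.0, 100.0])       # rod withdrawal (% out)
keff      = np.array([0.975, 0.988, 0.999, 1.008, 1.015])  # hypothetical MCNP results

a, b, c = np.polyfit(positions, keff, 2)        # k_eff(z) ~ a*z^2 + b*z + c
roots = np.roots([a, b, c - 1.0])               # solve k_eff(z) = 1
critical = [z.real for z in roots if abs(z.imag) < 1e-9 and 0.0 <= z.real <= 100.0]
print("estimated critical rod position: %.1f %% withdrawn" % critical[0])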
Cagnazzo, M; Borio di Tigliole, A; Böck, H; Villa, M
2018-05-01
The aim of this work was the measurement of the fission product activity distribution along the axial dimension of irradiated fuel elements (FEs) at the TRIGA Mark II research reactor of the Technische Universität (TU) Wien. The activity distribution was measured by means of a customized fuel gamma scanning device, which includes a vertical lifting system to move the fuel rod along its vertical axis. For each investigated FE, gamma spectra were measured along the vertical axis in steps of 1 cm in order to determine the axial distribution of the fission products. After the fuel elements underwent a relatively short cooling-down period, different fission products were detected. The activity concentration was determined by calibrating the gamma detector with a standard calibration source of known activity and by MCNP6 simulations for the evaluation of self-absorption and geometric effects. Given the specific TRIGA fuel composition, a correction procedure is developed and used in this work for the measurement of the fission product Zr-95. This measurement campaign is part of a more extended project aiming at modelling the TU Wien TRIGA reactor by means of different calculation codes (MCNP6, Serpent); the experimental results presented in this paper will subsequently be used to benchmark the models developed with these codes. Copyright © 2018 Elsevier Ltd. All rights reserved.
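As a rough illustration of how a net full-energy-peak count rate from such a scan is converted into an activity, the sketch below combines a calibrated efficiency, the gamma emission probability, and an MCNP-derived self-absorption/geometry factor; all numerical values, including the correction factor, are illustrative and not taken from this measurement campaign.

# Sketch of converting a net full-energy-peak count into an activity using a
# calibrated efficiency and an MCNP-derived self-absorption/geometry factor.
# All numerical values are illustrative only.
def activity_Bq(net_counts, live_time_s, eff, emission_prob, f_selfabs):
    """A = N / (t * eps * p_gamma * f), with f the self-absorption/geometry factor."""
    return net_counts / (live_time_s * eff * emission_prob * f_selfabs)

# Example: Zr-95 756.7 keV line, hypothetical measurement at one axial position
A = activity_Bq(net_counts=12500, live_time_s=600.0, eff=2.1e-4,
                emission_prob=0.544, f_selfabs=0.32)
print("Zr-95 activity at this axial position: %.3g Bq" % A)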
ERIC Educational Resources Information Center
Jolliffe, Lee
About 350,000 freelance magazine articles were purchased by magazine editors last year from the 22,000 freelancers and 225,000 would-be freelancers in the United States. A study examined the factors editors judge most important in selecting freelance magazine article proposals, using factor analysis and qualitative examination of persuasive…
Editorial Page Editors and Endorsements: Chain-owned vs. Independent Newspapers.
ERIC Educational Resources Information Center
St. Dizier, Byron
Questionnaires were sent to 114 of the 228 editorial page editors at newspapers in the United States with daily circulations greater than 50,000 for a study that compared (1) the editor-publisher relationship existing at chains to that found at independent papers, and (2) the 1984 presidential endorsements made by chains to those by independent…
Linguistic Prescriptivism in Letters to the Editor
ERIC Educational Resources Information Center
Lukac, Morana
2016-01-01
The public's concern with the fate of the standard language has been well documented in the history of the complaint tradition. The print media have for centuries featured letters to the editor on questions of language use. This study examines a corpus of 258 language-related letters to the editor published in the English-speaking print media. By…
Elmarakeby, Haitham; Arefiyan, Mostafa; Myers, Elijah; Li, Song; Grene, Ruth; Heath, Lenwood S
2017-12-01
The Beacon Editor is a cross-platform desktop application for the creation and modification of signal transduction pathways using the Systems Biology Graphical Notation Activity Flow (SBGN-AF) language. Prompted by biologists' requests for enhancements, the Beacon Editor includes numerous powerful features for pathway creation and presentation.
Letter from the Board of Directors of Astronomy & Astrophysics
NASA Astrophysics Data System (ADS)
Meynet, Georges
2005-07-01
1. New A&A memberships and scientific editorial structure for the Letters section At its meeting in Tartu, Estonia on 8 May 2004, the A&A Board of Directors decided to grant observer status on the Board to Brazil, Chile, and Portugal (Sandqvist 2004, A&A, 426, E15). Then on 6-7 May 2005, at its meeting in La Laguna, Spain, the Board of Directors admitted these three countries to full membership in A&A, starting 1 January 2006. The Letters Editor, Dr. P. Schneider, will complete his terms of service on 31 January 2006. A&A is indebted to him for his thoughtful and competent editing over the past several years. As a consequence of his departure, the Board has decided to restructure the manner in which the Letters will be handled as of 1 January 2006. The Associate Editor-in-Chief, Dr. M. Walmsley, will also become Editor-in-Chief for the Letters, and he will forward the Letters to the appropriate topical Associate Editor to organize the reviewing process. Likewise, the Editor-in-Chief, Dr. C. Bertout, will become the Associate Letters-Editor-in-Chief. This change will permit a more specialized treatment of Letters in the future and also allow Letters to benefit from language editing. Hence, after 1 January 2006, manuscripts for Letters should be submitted via the A&A Manuscript Management System (MMS) that is already in place for Main Journal submissions. Letters submitted before that will be handled by the current Letters Editor even after 1 January 2006. 2. New Associate Editor positions Considering both the increased workload on the Associate Editors due to the above change and the continuing specialization of sub-fields in astronomy, the Board decided to open two new positions for Associate Editors, one specialized in Cosmology with a particular interest in theoretical aspects and the other in Observational Stellar Physics. Applications are invited for these two new positions. The Associate Editors are expected to have a broad knowledge of astronomy and astrophysics and to have expertise in one of these two sub-fields. Candidates should have a strong record of published research in astronomy and astrophysics, should have experience as a referee and/or journal editor, and be prepared to commit the time needed to oversee the peer review of up to three hundred papers per year. Limited support for office equipment and secretarial help, as well as an annual indemnity, will be provided to the Associate Editors, and the initial term of appointment is three years. Applicants should submit a curriculum vitae, a list of publications, and a concise covering letter that summarizes the candidate's qualifications and the reasons for seeking an Associate Editor position. The likelihood of support from the home institute for the task should also be discussed in the application. Applications should preferably be e-mailed or sent/faxed to the Chairman of the Board of Directors: Dr. Georges Meynet, Geneva Observatory, 1290 Sauverny, Switzerland, email georges.meynet@obs.unige.ch, Fax:+41 22 37 92205. Applications received by 1 October 2005 will receive full consideration, while informal inquiries about the positions may be directed by e-mail to Georges Meynet. On behalf of the Board of Directors Georges Meynet
Galipeau, James; Cobey, Kelly D; Barbour, Virginia; Baskin, Patricia; Bell-Syer, Sally; Deeks, Jonathan; Garner, Paul; Shamseer, Larissa; Sharon, Straus; Tugwell, Peter; Winker, Margaret; Moher, David
2017-01-01
Background: Scientific editors (i.e., those who make decisions on the content and policies of a journal) have a central role in the editorial process at biomedical journals. However, very little is known about the training needs of these editors or what competencies are required to perform effectively in this role. Methods: We conducted a survey of perceptions and training needs among scientific editors from major editorial organizations around the world, followed by a modified Delphi process in which we invited the same scientific editors to rate the importance of competency-related statements obtained from a previous scoping review. Results: A total of 148 participants completed the survey of perceptions and training needs. At least 80% of participants agreed on six of the 38 skill and expertise-related statements presented to them as being important or very important to their role as scientific editors. At least 80% agreed on three of the 38 statements as necessary skills they perceived themselves as possessing (well or very well). The top five items on participants' list of top training needs were training in statistics, research methods, publication ethics, recruiting and dealing with peer reviewers, and indexing of journals. The three rounds of the Delphi were completed by 83, 83, and 73 participants, respectively, which ultimately produced a list of 23 "highly rated" competency-related statements and another 86 "included" items. Conclusion: Both the survey and the modified Delphi process will be critical for understanding knowledge and training gaps among scientific editors when designing curriculum around core competencies in the future.
2014-01-01
Background Wider adoption of reporting guidelines by veterinary journals could improve the quality of published veterinary research. The aims of this study were to assess the knowledge and views of veterinary Editors-in-Chief on reporting guidelines, identify the policies of their journals, and determine their information needs. Editors-in-Chief of 185 journals on the contact list for the International Association of Veterinary Editors (IAVE) were surveyed in April 2012 using an online questionnaire which contained both closed and open questions. Results The response rate was 36.8% (68/185). Thirty-six of 68 editors (52.9%) stated they knew what a reporting guideline was before receiving the questionnaire. Editors said they had found out about reporting guidelines primarily through articles in other journals, via the Internet and through their own journal. Twenty of 57 respondents (35.1%) said their journal referred to reporting guidelines in its instructions to authors. CONSORT, REFLECT, and ARRIVE were the most frequently cited. Forty-four of 68 respondents (68.2%) believed that reporting guidelines should be adopted by all refereed veterinary journals. Qualitative analysis of the open questions revealed that lack of knowledge, fear, resistance to change, and difficulty in implementation were perceived as barriers to the adoption of reporting guidelines by journals. Editors suggested that reporting guidelines be promoted through communication and education of the veterinary community, with roles for the IAVE and universities. Many respondents believed a consensus policy on guideline implementation was needed for veterinary journals. Conclusions Further communication and education about reporting guidelines for editors, authors and reviewers has the potential to increase their adoption by veterinary journals in the future. PMID:24410882
An increasing problem in publication ethics: Publication bias and editors' role in avoiding it.
Ekmekci, Perihan Elif
2017-06-01
Publication bias is defined as "the tendency on the parts of investigators, reviewers, and editors to submit or accept manuscripts for publication based on the direction or the strength of the study findings." Publication bias distorts the accumulated data in the literature, causes the overestimation of the potential benefits of an intervention, conceals its risks and adverse effects, and creates a barrier to assessing the clinical utility of drugs as well as evaluating the long-term safety of medical interventions. The World Medical Association, the International Committee of Medical Journal Editors, and the Committee on Publication Ethics have conferred responsibilities and ethical obligations on editors concerning the avoidance of publication bias. Despite the explicit statements in these international documents, the editors' role in and ability to avoid publication bias is still being discussed. Unquestionably, all parties involved in clinical research have the ultimate responsibility to sustain research integrity and the validity of accumulated general knowledge. Cooperation and commitment are required at every step of a clinical trial. However, this holistic approach does not exclude effective measures taken at the editors' level. The editors of major medical journals concluded that one precaution editors can take is to mandate registration of all clinical trials in a public repository as a precondition to submitting manuscripts to journals. Raising awareness of the value of publishing negative data for the scientific community and human health, and increasing the number of journals that are dedicated to publishing negative results or that set aside a section in their pages to do so, are positive steps editors can take to avoid publication bias.
Alfonso, Fernando; Adamyan, Karlen; Artigou, Jean-Yves; Aschermann, Michael; Boehm, Michael; Buendia, Alfonso; Chu, Pao-Hsien; Cohen, Ariel; Cas, Livio Dei; Dilic, Mirza; Doubell, Anton; Echeverri, Dario; Enç, Nuray; Ferreira-González, Ignacio; Filipiak, Krzysztof J; Flammer, Andreas; Fleck, Eckart; Gatzov, Plamen; Ginghina, Carmen; Goncalves, Lino; Haouala, Habib; Hassanein, Mahmoud; Heusch, Gerd; Huber, Kurt; Hulín, Ivan; Ivanusa, Mario; Krittayaphong, Rungroj; Lau, Chu-Pak; Marinskis, Germanas; Mach, François; Moreira, Luiz Felipe; Nieminen, Tuomo; Oukerraj, Latifa; Perings, Stefan; Pierard, Luc; Potpara, Tatjana; Reyes-Caorsi, Walter; Rim, Se-Joong; Rødevand, Olaf; Saade, Georges; Sander, Mikael; Shlyakhto, Evgeny; Timuralp, Bilgin; Tousoulis, Dimitris; Ural, Dilek; Piek, J J; Varga, Albert; Lüscher, Thomas F
2017-05-01
The International Committee of Medical Journal Editors (ICMJE) provides recommendations to improve the editorial standards and scientific quality of biomedical journals. These recommendations range from uniform technical requirements to more complex and elusive editorial issues, including ethical aspects of the scientific process. Recently, registration of clinical trials, conflicts of interest disclosure, and new criteria for authorship, emphasizing the importance of responsibility and accountability, have been proposed. Last year, a new editorial initiative to foster sharing of clinical trial data was launched. This review discusses this novel initiative with the aim of increasing awareness among readers, investigators, authors and editors belonging to the Editors' Network of the European Society of Cardiology.
Glujovsky, Demián; Villanueva, Eleana; Reveiz, Ludovic; Murasaki, Renato
2014-10-01
To evaluate the familiarity of the editors of journals indexed in the LILACS database with the guidelines for reporting and publishing research promoted by the EQUATOR Network (Enhancing the QUAlity and Transparency Of health Research), the journals' requirements for use of the guidelines, and the editors' opinions regarding the reasons for their low rate of use. LILACS editors were surveyed by e-mail about the guidelines and their availability at the EQUATOR website, and about the requirements for and difficulties in using them. Of 802 editors, 16.4% answered the survey. More than half said they were not aware of the guidelines (especially STROBE and PRISMA) and 30% were familiar with the EQUATOR Network. This first Latin American and Caribbean study on LILACS editors' familiarity with the guidelines revealed that more than half of them were familiar with neither the guidelines nor the EQUATOR Network.
Editorial highlighting and highly cited papers
NASA Astrophysics Data System (ADS)
Antonoyiannakis, Manolis
Editorial highlighting, the process whereby journal editors select, at the time of publication, a small subset of papers that are ostensibly of higher quality, importance or interest, is by now a widespread practice among major scientific journal publishers. Depending on the venue, and the extent to which editorial resources are invested in the process, highlighted papers appear as News & Views, Research Highlights, Perspectives, Editors' Choice, IOP Select, Editors' Summary, Spotlight on Optics, Editors' Picks, Viewpoints, Synopses, Editors' Suggestions, etc. Here, we look at the relation between highlighted papers and highly influential papers, which we define at two levels: having received enough citations to be among (i) the top few percent of their journal, and (ii) the top 1% of all physics papers. Using multiple linear regression and multilevel regression modeling we examine the parameters associated with highly influential papers. We briefly comment on cause and effect relationships between citedness and highlighting of papers.
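As a rough illustration of the kind of regression analysis described above, the sketch below fits an ordinary least-squares model of citation counts on a highlighting indicator plus a publication-year control; it assumes the pandas and statsmodels packages, and the column names and synthetic data are purely illustrative, not drawn from the study.

# Sketch of regressing citation impact on a highlighting indicator plus a
# control variable. Assumes pandas and statsmodels; data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "highlighted": rng.integers(0, 2, n),          # 1 if the paper was highlighted
    "pub_year":    rng.integers(2005, 2015, n),    # control for time since publication
})
# synthetic citation counts with a modest "highlighting" effect baked in
df["citations"] = (5 + 4 * df["highlighted"]
                   + 0.5 * (2015 - df["pub_year"])
                   + rng.normal(0, 3, n)).clip(0)

model = smf.ols("citations ~ highlighted + pub_year", data=df).fit()
print(model.params)   # coefficient on 'highlighted' estimates the association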
Sam, Brookhaven, and the Physical Review
NASA Astrophysics Data System (ADS)
Blume, Martin
2010-03-01
Sam Goudsmit came to Brookhaven National Laboratory in 1948, just after the first year of operation of the new institution, and after a year of his postwar appointment as Professor of Physics at Northwestern University. He was named an associate editor of the Physical Review at that time, under the then Managing Editor John T. Tate of the University of Minnesota. Tate had been Editor since 1926, and had presided over the growth of Physical Review to leadership of publication in the world of physics. Tate died in 1950, and after a search under an interim Editor Sam was, in 1951, named Managing Editor. In 1952 he became Chair of the Brookhaven Physics Department, founded Physical Review Letters, and served as department chair until 1960, when he stepped down but remained an Associate Chair. I will discuss my own interactions with Sam during this later period, when I learned of his many faceted talents and accomplishments.
Four pi calibration and modeling of a bare germanium detector in a cylindrical field source
NASA Astrophysics Data System (ADS)
Dewberry, R. A.; Young, J. E.
2012-05-01
In this paper we describe a 4π cylindrical field acquisition configuration surrounding a bare (unshielded, uncollimated) high purity germanium detector. We perform an efficiency calibration with a flexible planar source and model the configuration in the 4π cylindrical field. We then use exact calculus to model the flux on the cylindrical sides and end faces of the detector. We demonstrate that the model accurately represents the experimental detection efficiency compared to that of a point source and to Monte Carlo N-particle (MCNP) calculations of the flux. The model sums over the entire source surface area and the entire detector surface area including both faces and the detector's cylindrical sides. Agreement between the model and both experiment and the MCNP calculation is within 8%.
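As a rough illustration of the surface-integral flux modelling described above, the sketch below computes the uncollided flux at the axis midpoint of a cylindrical surface source (lateral wall only), both by a simple numerical integration and from the closed-form result of the same integral; the radius, height, and emission density are hypothetical and do not correspond to the paper's configuration.

# Sketch of a surface-integral flux model: uncollided flux at the axis midpoint
# of a cylindrical surface source (lateral wall only). Geometry and emission
# density are hypothetical.
import math
import numpy as np

S_A = 1.0e3          # surface emission density (photons / cm^2 / s), hypothetical
R, H = 10.0, 40.0    # cylinder radius and height (cm), hypothetical

# phi = integral of S_A dA / (4*pi*d^2) over the lateral wall, dA = R dtheta dz
z = np.linspace(-H / 2.0, H / 2.0, 20001)
integrand = S_A * R * 2.0 * math.pi / (4.0 * math.pi * (R ** 2 + z ** 2))
phi_numeric = float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(z)))

phi_exact = S_A * math.atan(H / (2.0 * R))   # closed form of the same integral

print("numeric: %.2f   exact: %.2f   (photons/cm^2/s)" % (phi_numeric, phi_exact))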
Shielding properties of 80TeO2-5TiO2-(15-x) WO3-xAnOm glasses using WinXCom and MCNP5 code
NASA Astrophysics Data System (ADS)
Dong, M. G.; El-Mallawany, R.; Sayyed, M. I.; Tekin, H. O.
2017-12-01
Gamma-ray shielding properties of 80TeO2-5TiO2-(15-x)WO3-xAnOm glasses, where AnOm is Nb2O5 = 0.01, 5, Nd2O3 = 3, 5 and Er2O3 = 5 mol%, have been investigated. The shielding parameters (mass attenuation coefficients, half-value layers, and the macroscopic effective removal cross section for fast neutrons) have been computed using the WinXCom program and the MCNP5 Monte Carlo code. In addition, exposure buildup factor values were calculated using the Geometric Progression (G-P) method. Variations of the shielding parameters with REO addition to the glasses and with photon energy are discussed.
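Two of the quantities discussed above can be illustrated with a short sketch: the half-value layer obtained from a mass attenuation coefficient, and the Geometric Progression (G-P) buildup factor formula. The sketch below assumes placeholder values for the density, mass attenuation coefficient, and G-P fitting parameters (b, c, a, Xk, d); they are not the values for these tellurite glasses.

# Half-value layer from a mass attenuation coefficient, and the G-P buildup
# factor formula. All numerical inputs below are placeholders.
import math

def half_value_layer_cm(mu_over_rho_cm2_g, density_g_cm3):
    """HVL = ln(2) / mu, with mu = (mu/rho) * rho."""
    return math.log(2.0) / (mu_over_rho_cm2_g * density_g_cm3)

def gp_buildup(b, c, a, Xk, d, x_mfp):
    """Exposure buildup factor B(E, x) from the G-P fitting formula (x in mean free paths)."""
    K = (c * x_mfp ** a
         + d * (math.tanh(x_mfp / Xk - 2.0) - math.tanh(-2.0)) / (1.0 - math.tanh(-2.0)))
    if abs(K - 1.0) < 1e-9:
        return 1.0 + (b - 1.0) * x_mfp
    return 1.0 + (b - 1.0) * (K ** x_mfp - 1.0) / (K - 1.0)

# Illustrative numbers only: mu/rho ~ 0.06 cm^2/g and density ~ 5.8 g/cm^3
print("HVL = %.2f cm" % half_value_layer_cm(0.06, 5.8))
print("B   = %.2f at 5 mfp" % gp_buildup(b=1.2, c=0.9, a=0.05, Xk=14.0, d=-0.05, x_mfp=5.0))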
Enhancements to the MCNP6 background source
McMath, Garrett E.; McKinney, Gregg W.
2015-10-19
The particle transport code MCNP has been used to produce a background radiation data file on a worldwide grid that can easily be sampled as a source in the code. Location-dependent cosmic showers were modeled by Monte Carlo methods to produce the resulting neutron and photon background flux at 2054 locations around Earth. An improved galactic-cosmic-ray feature was used to model the source term as well as data from multiple sources to model the transport environment through atmosphere, soil, and seawater. A new elevation scaling feature was also added to the code to increase the accuracy of the cosmic neutron background for user locations with off-grid elevations. Furthermore, benchmarking has shown the neutron integral flux values to be within experimental error.
Neutron flux and power in RTP core-15
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rabir, Mohamad Hairie, E-mail: m-hairie@nuclearmalaysia.gov.my; Zin, Muhammad Rawi Md; Usang, Mark Dennis
The PUSPATI TRIGA Reactor achieved initial criticality on June 28, 1982. The reactor is designed to effectively support various fields of basic nuclear research, manpower training, and production of radioisotopes. This paper describes the reactor parameter calculations for the PUSPATI TRIGA Reactor (RTP), focusing on the application of the developed 3D reactor model for criticality calculation and the analysis of the power and neutron flux distribution of the TRIGA core. The 3D continuous-energy Monte Carlo code MCNP was used to develop a versatile and accurate full model of the TRIGA reactor. The model represents in detail all important components of the core with literally no physical approximation. The consistency and accuracy of the developed RTP MCNP model were established by comparing calculations to the available experimental results and to TRIGLAV code calculations.
SUMCOR: Cascade summing correction for volumetric sources applying MCNP6.
Dias, M S; Semmler, R; Moreira, D S; de Menezes, M O; Barros, L F; Ribeiro, R V; Koskinas, M F
2018-04-01
The main features of the code SUMCOR, developed for cascade summing correction for volumetric sources, are described. MCNP6 is used to track histories starting from individual points inside the volumetric source, for each set of cascade transitions from the radionuclide. Total and full-energy-peak (FEP) efficiencies are calculated for all gamma-rays and X-rays involved in the cascade. The cascade summing correction is based on the matrix formalism developed by Semkow et al. (1990). Results are presented using the experimental data sent to the participants of two intercomparisons organized by the ICRM-GSWG and coordinated by Dr. Marie-Cristine Lépy from the Laboratoire National Henri Becquerel (LNE-LNHB), CEA, in 2008 and 2010, respectively, and are compared with those of the other participants in the intercomparisons. Copyright © 2017 Elsevier Ltd. All rights reserved.
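As a minimal illustration of the effect SUMCOR corrects for, the sketch below applies the simplest summing-out correction for a two-gamma cascade, in which the full-energy peak of one gamma is reduced by the probability that its coincident partner deposits any energy in the detector. SUMCOR itself uses the full matrix formalism of Semkow et al. with MCNP6-computed efficiencies; the efficiency and peak area below are hypothetical.

# Minimal two-gamma cascade summing-out correction: N_true = C * N_measured,
# with C = 1 / (1 - eps_total of the coincident gamma). Values are hypothetical.

def summing_out_correction(eps_total_other):
    """Multiplicative correction to the measured peak area."""
    return 1.0 / (1.0 - eps_total_other)

eps_total_g2 = 0.12        # total efficiency for the coincident gamma-2 (hypothetical)
measured_peak_area = 9500.0

corrected = measured_peak_area * summing_out_correction(eps_total_g2)
print("corrected gamma-1 peak area: %.0f counts (+%.1f %%)"
      % (corrected, 100.0 * (corrected / measured_peak_area - 1.0)))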
Thanh, Minh‐Tri Ho; Munro, John J.
2015-01-01
The Source Production & Equipment Co. (SPEC) model M-15 is a new Iridium-192 brachytherapy source model intended for use as a temporary high-dose-rate (HDR) brachytherapy source for the Nucletron microSelectron Classic afterloading system. The purpose of this study is to characterize this HDR source for clinical application by obtaining a complete set of Monte Carlo calculated dosimetric parameters for the M-15, as recommended by AAPM and ESTRO, for isotopes with average energies greater than 50 keV. This was accomplished by using the MCNP6 Monte Carlo code to simulate the resulting source dosimetry at various points within a pseudoinfinite water phantom. These dosimetric values were then converted into the AAPM and ESTRO dosimetry parameters, and the respective statistical uncertainty in each parameter was also calculated and presented. The M-15 source was modeled in an MCNP6 Monte Carlo environment using the physical source specifications provided by the manufacturer. Iridium-192 photons were uniformly generated inside the iridium core of the model M-15, with photon and secondary electron transport replicated using photoatomic cross-sectional tables supplied with MCNP6. Simulations were performed for both water and air/vacuum computer models, with a total of 4×10^9 source photon histories for each simulation and the in-air photon spectrum filtered to remove low-energy photons below δ = 10 keV. Dosimetric data, including D(r,θ), g_L(r), F(r,θ), Φ_an(r), and φ̄_an, and their statistical uncertainties were calculated from the output of an MCNP model consisting of an M-15 source placed at the center of a spherical water phantom of 100 cm diameter. The air kerma strength in free space, S_K, and the dose rate constant, Λ, were also computed from an MCNP model in which the M-15 Iridium-192 source was centered at the origin of an evacuated phantom and a critical volume containing air at STP was added 100 cm from the source center. The reference dose rate, Ḋ(r_0,θ_0) ≡ Ḋ(1 cm, π/2), is found to be 4.038 ± 0.064 cGy mCi^-1 h^-1. The air kerma strength, S_K, is reported to be 3.632 ± 0.086 cGy cm^2 mCi^-1 g^-1, and the dose rate constant, Λ, is calculated to be 1.112 ± 0.029 cGy h^-1 U^-1. The normalized dose rate, radial dose function, and anisotropy function, with their uncertainties, were computed and are presented in both tabular and graphical format in the report. A dosimetric study was performed of the new M-15 Iridium-192 HDR brachytherapy source using the MCNP6 radiation transport code. Dosimetric parameters, including the dose-rate constant, radial dose function, and anisotropy function, were calculated in accordance with the updated AAPM and ESTRO dosimetric parameters for brachytherapy sources of average energy greater than 50 keV. These data therefore may be applied toward the development of a treatment planning program and for clinical use of the source. PACS numbers: 87.56.bg, 87.53.Jw PMID:26103489
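For orientation, the parameters listed above enter the standard AAPM TG-43 dose calculation, Ḋ(r,θ) = S_K · Λ · [G_L(r,θ)/G_L(r_0,θ_0)] · g_L(r) · F(r,θ). The sketch below, a rough illustration rather than the paper's implementation, evaluates this expression with a line-source geometry function; the active length and the g_L/F values are hypothetical placeholders, and only the rounded per-mCi S_K and Λ values echo numbers quoted in the abstract.

# Sketch of a TG-43 dose-rate evaluation with a line-source geometry function.
# Active length, g_L and F values are placeholders, NOT the M-15 data.
import math

L_ACTIVE = 0.35   # assumed active source length in cm, for illustration only

def G_L(r, theta, L=L_ACTIVE):
    """TG-43 line-source geometry function (theta measured from the source axis)."""
    x, z = r * math.sin(theta), r * math.cos(theta)
    if abs(math.sin(theta)) < 1e-9:
        return 1.0 / (r * r - L * L / 4.0)
    beta = abs(math.atan2(L / 2.0 - z, x) - math.atan2(-L / 2.0 - z, x))
    return beta / (L * r * math.sin(theta))

def dose_rate(S_K, Lam, r, theta, g_L_r, F_r_theta):
    return S_K * Lam * (G_L(r, theta) / G_L(1.0, math.pi / 2.0)) * g_L_r * F_r_theta

# 10 mCi source, using the abstract's rounded per-mCi S_K and Lambda values;
# g_L(2 cm) = 0.98 and F(2 cm, 60 deg) = 0.97 are made-up placeholders.
S_K = 10.0 * 3.632
print("%.2f cGy/h" % dose_rate(S_K, 1.112, 2.0, math.radians(60.0), 0.98, 0.97))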
Visualization of nuclear particle trajectories in nuclear oil-well logging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Case, C.R.; Chiaramonte, J.M.
Nuclear oil-well logging measures specific properties of subsurface geological formations as a function of depth in the well. The knowledge gained is used to evaluate the hydrocarbon potential of the surrounding oil field. The measurements are made by lowering an instrument package into an oil well and slowly extracting it at a constant speed. During the extraction phase, neutrons or gamma rays are emitted from the tool, interact with the formation, and scatter back to the detectors located within the tool. Even though only a small percentage of the emitted particles ever reach the detectors, mathematical modeling has been very successful in the accurate prediction of these detector responses. The two dominant methods used to model these devices have been the two-dimensional discrete ordinates method and the three-dimensional Monte Carlo method; the latter has routinely been used to investigate the response characteristics of nuclear tools. A special Los Alamos National Laboratory version of their standard MCNP Monte Carlo code retains the details of each particle history for later viewing within SABRINA, a companion three-dimensional geometry modeling and debugging code.
Error Pattern Analysis Applied to Technical Writing: An Editor's Guide for Writers.
ERIC Educational Resources Information Center
Monagle, E. Brette
The use of error pattern analysis can reduce the time and money spent on editing and correcting manuscripts. What is required is noting, classifying, and keeping a frequency count of errors. First an editor should take a typical page of writing and circle each error. After the editor has done a sufficiently large number of pages to identify an…
ERIC Educational Resources Information Center
Farley, Peter C.
2017-01-01
Flowerdew and Dudley-Evans (2002) described a prototypical structure for decision letters based on a personal database of letters written by one editor for the journal "English for Specific Purposes." In this article, I analyse a publicly available corpus of 59 decision letters from 48 different editors of a wide range of scientific…
Letters to the Editor: Public Writing as a Response to Reading.
ERIC Educational Resources Information Center
Rinehammer, Nora
A study conducted by the copy editor of a small daily newspaper in Porter County, Indiana examines readers' motivations for writing letters to the editor. Analysis was based on letters that appeared in "The Vidette Messenger" September 16-30, 1992. Of 75 letters, 32 were responses to information published in the paper during the last 2…
Circadian Patterns of Wikipedia Editorial Activity: A Demographic Analysis
Yasseri, Taha; Sumi, Robert; Kertész, János
2012-01-01
Wikipedia (WP) as a collaborative, dynamical system of humans is an appropriate subject of social studies. Each single action of the members of this society, i.e., editors, is well recorded and accessible. Using the cumulative data of 34 Wikipedias in different languages, we try to characterize and find the universalities and differences in the temporal activity patterns of editors. Based on these data, we estimate the geographical distribution of editors for each WP across the globe. Furthermore, we clarify the differences among different groups of WPs, which originate in the variance of cultural and social features of the communities of editors. PMID:22272279
Cai, Zhongli; Kwon, Yongkyu Luke; Reilly, Raymond M
2017-02-01
64Cu emits positrons as well as β− particles and Auger and internal conversion electrons useful for radiotherapy. Our objective was to model the cellular dosimetry of 64Cu under different geometries commonly used to study the cytotoxic effects of 64Cu. Monte Carlo N-Particle (MCNP) was used to simulate the transport of all particles emitted by 64Cu from the cell surface (CS), cytoplasm (Cy), or nucleus (N) of a single cell; a monolayer in a well (radius = 0.32-1.74 cm); or a sphere (radius = 50-6,000 μm) of cells, to calculate S values. The radii of the cell and the N ranged from 5 to 12 μm and 2 to 11 μm, respectively. S values were obtained with MIRDcell for comparison. MCF7/HER2-18 cells were exposed in vitro to 64Cu-labeled trastuzumab. The subcellular distribution of 64Cu was measured by cell fractionation. The surviving fraction was determined in a clonogenic assay. The relative differences of MCNP versus MIRDcell self-dose S values (S_self) for 64Cu ranged from -0.2% to 3.6% for N to N (S_N←N), 2.3% to 8.6% for Cy to N (S_N←Cy), and -12.0% to 7.3% for CS to N (S_N←CS). The relative differences of MCNP versus MIRDcell cross-dose S values were 25.8%-30.6% for a monolayer and 30%-34% for a sphere, respectively. The ratios of S_N←N versus S_N←Cy and of S_N←Cy versus S_N←CS decreased with increasing ratio of the radius of the N versus the radius of the cell and with the size of the monolayer or sphere. The surviving fraction of MCF7/HER2-18 cells treated with 64Cu-labeled trastuzumab (0.016-0.368 MBq/μg, 67 nM) for 18 h versus the absorbed dose followed a linear survival curve with α = 0.51 ± 0.05 Gy⁻¹ and R² = 0.8838. This is significantly different from the linear-quadratic survival curve of MCF7/HER2-18 cells exposed to γ-rays. MCNP- and MIRDcell-calculated S values agreed well. 64Cu in the N increases the dose to the N in isolated single cells but has less effect in a cell monolayer or small cluster of cells simulating a micrometastasis, and little effect in a sphere analogous to a tumor xenograft, compared with 64Cu in the Cy or on the CS. The dose deposited by 64Cu is less effective for cell killing than γ-rays. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
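For readers unfamiliar with the cellular S-value formalism used above: the absorbed dose to a target region is the sum over source regions of the time-integrated (cumulated) activity in each source region multiplied by the corresponding S value, D(target) = Σ Ã(source)·S(target←source). A minimal illustration with hypothetical numbers (not values from the study):

    # MIRD-style cellular dose from S values: D(N) = sum_k A_tilde(k) * S(N <- k).
    # The S values and cumulated activities below are hypothetical placeholders,
    # not the MCNP or MIRDcell results of the study.
    cumulated_activity_bq_s = {"N": 1.2e4, "Cy": 3.5e4, "CS": 5.0e4}   # A_tilde per cell
    s_value_gy_per_bq_s = {"N": 4.0e-4, "Cy": 9.0e-5, "CS": 4.0e-5}    # S(N <- source)

    dose_to_nucleus = sum(cumulated_activity_bq_s[src] * s_value_gy_per_bq_s[src]
                          for src in cumulated_activity_bq_s)
    print(f"absorbed dose to nucleus: {dose_to_nucleus:.2f} Gy")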
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marck, Steven C. van der, E-mail: vandermarck@nrg.eu
Recent releases of three major world nuclear reaction data libraries, ENDF/B-VII.1, JENDL-4.0, and JEFF-3.1.1, have been tested extensively using benchmark calculations. The calculations were performed with the latest release of the continuous energy Monte Carlo neutronics code MCNP, i.e. MCNP6. Three types of benchmarks were used, viz. criticality safety benchmarks, (fusion) shielding benchmarks, and reference systems for which the effective delayed neutron fraction is reported. For criticality safety, more than 2000 benchmarks from the International Handbook of Criticality Safety Benchmark Experiments were used. Benchmarks from all categories were used, ranging from low-enriched uranium, compound fuel, thermal spectrum ones (LEU-COMP-THERM), to mixed uranium-plutonium, metallic fuel, fast spectrum ones (MIX-MET-FAST). For fusion shielding many benchmarks were based on IAEA specifications for the Oktavian experiments (for Al, Co, Cr, Cu, LiF, Mn, Mo, Si, Ti, W, Zr), Fusion Neutronics Source in Japan (for Be, C, N, O, Fe, Pb), and Pulsed Sphere experiments at Lawrence Livermore National Laboratory (for 6Li, 7Li, Be, C, N, O, Mg, Al, Ti, Fe, Pb, D2O, H2O, concrete, polyethylene and teflon). The new functionality in MCNP6 to calculate the effective delayed neutron fraction was tested by comparison with more than thirty measurements in widely varying systems. Among these were measurements in the Tank Critical Assembly (TCA in Japan) and IPEN/MB-01 (Brazil), both with a thermal spectrum, two cores in Masurca (France) and three cores in the Fast Critical Assembly (FCA, Japan), all with fast spectra. The performance of the three libraries, in combination with MCNP6, is shown to be good. The results for the LEU-COMP-THERM category are on average very close to the benchmark value. Also for most other categories the results are satisfactory. Deviations from the benchmark values do occur in certain benchmark series, or in isolated cases within benchmark series. Such instances can often be related to nuclear data for specific non-fissile elements, such as C, Fe, or Gd. Indications are that the intermediate and mixed spectrum cases are less well described. The results for the shielding benchmarks are generally good, with very similar results for the three libraries in the majority of cases. Nevertheless there are, in certain cases, strong deviations between calculated and benchmark values, such as for Co and Mg. Also, the results show discrepancies at certain energies or angles for e.g. C, N, O, Mo, and W. The functionality of MCNP6 to calculate the effective delayed neutron fraction yields very good results for all three libraries.
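A sketch of the kind of post-processing implied above: collecting calculated-over-experimental (C/E) ratios of k-eff for a set of criticality benchmarks and summarizing them per category. The benchmark names and k-eff values below are hypothetical placeholders, not results from the paper.

    # Summarize C/E ratios of k-eff for a benchmark category (hypothetical data).
    from statistics import mean, stdev

    benchmarks = {
        # name: (calculated k-eff, experimental/benchmark k-eff)
        "LEU-COMP-THERM-001-1": (0.9985, 0.9998),
        "LEU-COMP-THERM-002-1": (1.0004, 1.0000),
        "MIX-MET-FAST-001":     (0.9991, 1.0000),
    }

    ce = {name: c / e for name, (c, e) in benchmarks.items()}
    print("C/E ratios:", {k: round(v, 4) for k, v in ce.items()})
    print(f"mean C/E = {mean(ce.values()):.4f}, spread = {stdev(ce.values()):.4f}")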
Evaluation and Testing of the ADVANTG Code on SNM Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shaver, Mark W.; Casella, Andrew M.; Wittman, Richard S.
2013-09-24
Pacific Northwest National Laboratory (PNNL) has been tasked with evaluating the effectiveness of ORNL's new hybrid transport code, ADVANTG, on scenarios of interest to our NA-22 sponsor, specifically detection of diversion of special nuclear material (SNM). PNNL staff have determined that acquisition and installation of ADVANTG was relatively straightforward for a code in its phase of development, but probably not yet sufficient for mass distribution to the general user. PNNL staff also determined that, with little effort, ADVANTG generated weight windows that typically worked for the problems and generated results consistent with MCNP. With the slightly greater effort of choosing a finer mesh around detectors or sample reaction tally regions, the figure of merit (FOM) could be further improved in most cases. This does take some limited knowledge of deterministic transport methods. The FOM could also be increased by limiting the energy range for a tally to the energy region of greatest interest. It was then found that an MCNP run with the full energy range for the tally showed improved statistics in the region used for the ADVANTG run. The specific case of interest chosen by the sponsor is the CIPN project from Los Alamos National Laboratory (LANL), which is an active interrogation, non-destructive assay (NDA) technique to quantify the fissile content in a spent fuel assembly and is also sensitive to cases of material diversion. Unfortunately, weight windows for the CIPN problem cannot currently be properly generated with ADVANTG due to inadequate accommodations for source definition. ADVANTG requires that a fixed neutron source be defined within the problem and cannot account for neutron multiplication. As such, it is rendered useless in active interrogation scenarios. It is also interesting to note that this is a difficult problem to solve and that the automated weight window generator in MCNP actually slowed down the problem. Therefore, PNNL has determined that there is not an effective tool available for speeding up MCNP for problems such as the CIPN scenario. With regard to the benchmark scenarios, ADVANTG performed very well for most of the difficult, long-running, standard radiation detection scenarios. Specifically, run-time speedups were observed for spatially large scenarios, or those having significant shielding or scattering geometries. ADVANTG performed on par with existing codes for moderate-sized scenarios, or those with little to moderate shielding, or multiple paths to the detectors. ADVANTG ran slower than MCNP for very simple, spatially small cases with little to no shielding that run very quickly anyway. Lastly, ADVANTG could not solve problems that did not consist of fixed source-to-detector geometries. For example, it could not solve scenarios with multiple detectors or secondary particles, such as active interrogation, neutron induced gamma, or fission neutrons.
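The figure of merit (FOM) quoted above is the standard Monte Carlo efficiency measure FOM = 1/(R²·T), where R is the estimated relative error of a tally and T the computing time; speedup from variance reduction is usually expressed as a ratio of FOMs. A short illustration with hypothetical numbers:

    # FOM = 1 / (R^2 * T): compare an analog MCNP run against a run using
    # ADVANTG-generated weight windows. Numbers are illustrative only.
    def fom(rel_error: float, cpu_minutes: float) -> float:
        return 1.0 / (rel_error ** 2 * cpu_minutes)

    fom_analog = fom(rel_error=0.12, cpu_minutes=600.0)
    fom_advantg = fom(rel_error=0.03, cpu_minutes=240.0)
    print(f"FOM analog  = {fom_analog:.3f} 1/min")
    print(f"FOM ADVANTG = {fom_advantg:.3f} 1/min")
    print(f"speedup     = {fom_advantg / fom_analog:.1f}x")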
DNAAlignEditor: DNA alignment editor tool
Sanchez-Villeda, Hector; Schroeder, Steven; Flint-Garcia, Sherry; Guill, Katherine E; Yamasaki, Masanori; McMullen, Michael D
2008-01-01
Background With advances in DNA re-sequencing methods and Next-Generation parallel sequencing approaches, there has been a large increase in genomic efforts to define and analyze the sequence variability present among individuals within a species. For very polymorphic species such as maize, this has led to a need for intuitive, user-friendly software that aids the biologist, often with naïve programming capability, in tracking, editing, displaying, and exporting multiple individual sequence alignments. To fill this need we have developed a novel DNA alignment editor. Results We have generated a nucleotide sequence alignment editor (DNAAlignEditor) that provides an intuitive, user-friendly interface for manual editing of multiple sequence alignments, with functions for input, editing, and output of sequence alignments. The color-coding of nucleotide identity and the display of associated quality scores aid in the manual alignment editing process. DNAAlignEditor works as a client/server tool having two main components: a relational database that collects the processed alignments and a user interface connected to the database through universal data access connectivity drivers. DNAAlignEditor can be used either as a stand-alone application or as a network application with multiple users concurrently connected. Conclusion We anticipate that this software will be of general interest to biologists and population geneticists in editing DNA sequence alignments and analyzing natural sequence variation regardless of species, and will be particularly useful for manual alignment editing of sequences in species with high levels of polymorphism. PMID:18366684
Findings From the INANE Survey on Student Papers Submitted to Nursing Journals.
Kennedy, Maureen Shawn; Newland, Jamesetta A; Owens, Jacqueline K
Nursing students are often encouraged or required to submit scholarly work for consideration for publication but most manuscripts or course assignment papers do not meet journal standards and consume valuable resources from editors and peer reviewers. The International Academy of Nursing Editors (INANE) is a group of nurse editors and publishers dedicated to promoting best practices in publishing in the nursing literature. In August 2014, editors at INANE's annual meeting voiced frustrations over multiple queries, poorly written student papers, and lack of proper behavior in following through. This article describes the findings of a survey distributed to INANE members to seek feedback about submissions by students. Fifty-three (53) members responded to an online anonymous survey developed by the INANE Student Papers Work Group. Data were analyzed using descriptive statistics for Likert-type questions and content analysis of open-ended questions. Quantitative data revealed that most editors reported problems with student papers across all levels of graduate programs. Six themes emerged from the qualitative data: submissions fail to follow author guidelines; characteristics of student submissions; lack of professional behavior from students; lack of professional behavior from faculty; editor responses to student submissions; and faculty as mentors. These themes formed the basis for recommendations and strategies to improve student scholarly writing. Overall, editors endorsed supporting new scholars in the publication process but faculty engagement was integral to student success. Copyright © 2016 Elsevier Inc. All rights reserved.
Cantonwine, David E; Cordero, José F; Rivera-González, Luis O; Anzalota Del Toro, Liza V; Ferguson, Kelly K; Mukherjee, Bhramar; Calafat, Antonia M; Crespo, Noe; Jiménez-Vélez, Braulio; Padilla, Ingrid Y; Alshawabkeh, Akram N; Meeker, John D
2014-01-01
Phthalate contamination exists in the North Coast karst aquifer system in Puerto Rico. In light of potential health impacts associated with phthalate exposure, targeted action for elimination of exposure sources may be warranted, especially for sensitive populations such as pregnant women. However, information on exposure to phthalates from a variety of sources in Puerto Rico is lacking. The objective of this study was to determine concentrations and predictors of urinary phthalate biomarkers measured at multiple times during pregnancy among women living in the Northern karst area of Puerto Rico. We recruited 139 pregnant women in Northern Puerto Rico and collected urine samples and questionnaire data at three separate visits (18 ± 2 weeks, 22 ± 2 weeks, and 26 ± 2 weeks of gestation). Urine samples were analyzed for eleven phthalate metabolites: mono-2-ethylhexyl phthalate (MEHP), mono-2-ethyl-5-hydroxyhexyl phthalate, mono-2-ethyl-5-oxohexyl phthalate, mono-2-ethyl-5-carboxypentyl phthalate, mono-ethyl phthalate (MEP), mono-n-butyl phthalate, mono-benzyl phthalate, mono-isobutyl phthalate, mono-3-carboxypropyl phthalate (MCPP), mono carboxyisononyl phthalate (MCNP), and mono carboxyisooctyl phthalate (MCOP). Detectable concentrations of phthalate metabolites among pregnant women living in Puerto Rico were prevalent, and metabolite concentrations tended to be higher than or similar to those measured in women of reproductive age from the general US population. Intraclass correlation coefficients ranged from very weak (MCNP; 0.05) to moderate (MEP; 0.44) reproducibility among all phthalate metabolites. We observed significant or suggestive positive associations between urinary phthalate metabolite concentrations and water usage/storage habits (MEP, MCNP, MCOP), use of personal care products (MEP), and consumption of certain food items (MCPP, MCNP, and MCOP). To our knowledge this is the first study to report concentrations, temporal variability, and predictors of phthalate biomarkers among pregnant women in Puerto Rico. Preliminary results suggest several potentially important exposure sources to phthalates in this population, and future analysis from this ongoing prospective cohort will help to inform targeted approaches to reduce exposure. © 2013.
Confirmation of a realistic reactor model for BNCT dosimetry at the TRIGA Mainz
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ziegner, Markus, E-mail: Markus.Ziegner.fl@ait.ac.at; Schmitz, Tobias; Hampel, Gabriele
2014-11-01
Purpose: In order to build up a reliable dose monitoring system for boron neutron capture therapy (BNCT) applications at the TRIGA reactor in Mainz, a computer model for the entire reactor was established, simulating the radiation field by means of the Monte Carlo method. The impact of different source definition techniques was compared and the model was validated by experimental fluence and dose determinations. Methods: The depletion calculation code ORIGEN2 was used to compute the burn-up and relevant material composition of each burned fuel element from the day of first reactor operation to its current core. The material composition of the current core was used in a MCNP5 model of the initial core developed earlier. To perform calculations for the region outside the reactor core, the model was expanded to include the thermal column and compared with the previously established ATTILA model. Subsequently, the computational model is simplified in order to reduce the calculation time. Both simulation models are validated by experiments with different setups using alanine dosimetry and gold activation measurements with two different types of phantoms. Results: The MCNP5 simulated neutron spectrum and source strength are found to be in good agreement with the previous ATTILA model whereas the photon production is much lower. Both MCNP5 simulation models predict all experimental dose values with an accuracy of about 5%. The simulations reveal that a Teflon environment favorably reduces the gamma dose component as compared to a polymethyl methacrylate phantom. Conclusions: A computer model for BNCT dosimetry was established, allowing the prediction of dosimetric quantities without further calibration and within a reasonable computation time for clinical applications. The good agreement between the MCNP5 simulations and experiments demonstrates that the ATTILA model overestimates the gamma dose contribution. The detailed model can be used for the planning of structural modifications in the thermal column irradiation channel or the use of different irradiation sites than the thermal column, e.g., the beam tubes.
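The ORIGEN2-to-MCNP5 step described above amounts to turning a burned-fuel isotopic inventory into an MCNP material card. A simplified sketch of that conversion is given below; the nuclide inventory, the ZAID library suffix, and the card formatting are illustrative assumptions, not the actual TRIGA Mainz data or scripts.

    # Convert a (hypothetical) burned-fuel inventory in grams into atom fractions
    # and write them as an MCNP-style material card. The library suffix ".70c"
    # and the inventory values are placeholders.
    ATOMIC_MASS = {"92235": 235.04, "92238": 238.05, "94239": 239.05}   # g/mol
    inventory_grams = {"92235": 120.0, "92238": 9500.0, "94239": 35.0}  # per element

    moles = {za: g / ATOMIC_MASS[za] for za, g in inventory_grams.items()}
    total = sum(moles.values())
    lines = ["m101  $ burned fuel element (illustrative)"]
    for za, n in sorted(moles.items()):
        lines.append(f"      {za}.70c  {n / total:.6e}")
    print("\n".join(lines))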
Addressing Fission Product Validation in MCNP Burnup Credit Criticality Calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mueller, Don; Bowen, Douglas G; Marshall, William BJ J
2015-01-01
The US Nuclear Regulatory Commission (NRC) Division of Spent Fuel Storage and Transportation issued Interim Staff Guidance (ISG) 8, Revision 3 in September 2012. This ISG provides guidance for NRC staff members' review of burnup credit (BUC) analyses supporting transport and dry storage of pressurized water reactor spent nuclear fuel (SNF) in casks. The ISG includes guidance for addressing validation of criticality (keff) calculations crediting the presence of a limited set of fission products and minor actinides (FP&MAs). Based on previous work documented in NRC Regulatory Guide (NUREG) Contractor Report (CR)-7109, the ISG recommends that NRC staff members accept the use of either 1.5% or 3% of the FP&MA worth (in addition to the bias and bias uncertainty resulting from validation of keff calculations for the major actinides in SNF) to conservatively account for the bias and bias uncertainty associated with the specified unvalidated FP&MAs. The ISG recommends (1) use of 1.5% of the FP&MA worth if a modern version of SCALE and its nuclear data are used and (2) 3% of the FP&MA worth for well qualified, industry standard code systems other than SCALE with the Evaluated Nuclear Data File, Part B (ENDF/B)-V, ENDF/B-VI, or ENDF/B-VII cross-section libraries. The work presented in this paper provides a basis for extending the use of the 1.5% FP&MA worth bias to BUC criticality calculations performed using the Monte Carlo N-Particle (MCNP) code. The extended use of the 1.5% FP&MA worth bias is shown to be acceptable by comparison of FP&MA worths calculated using SCALE and MCNP with ENDF/B-V, -VI, and -VII-based nuclear data. The comparison supports use of the 1.5% FP&MA worth bias when the MCNP code is used for criticality calculations, provided that the cask design is similar to the hypothetical generic BUC-32 cask model and that the credited FP&MA worth is no more than 0.1 Δkeff (ISG-8, Rev. 3, Recommendation 4).
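The "1.5% of the FP&MA worth" recommendation above can be applied mechanically once two k-eff calculations are available: one crediting only the major actinides and one also crediting the fission products and minor actinides. A sketch with hypothetical k-eff values follows; it is not a reproduction of the NUREG/CR-7109 analysis.

    # Bias from unvalidated fission products and minor actinides (FP&MA), taken
    # as a fraction of their reactivity worth. k-eff values are hypothetical.
    k_major_actinides_only = 0.9410   # FP&MA removed from the model
    k_with_fp_ma           = 0.9120   # FP&MA credited

    fp_ma_worth = k_major_actinides_only - k_with_fp_ma     # delta-k
    bias_fraction = 0.015                                   # 1.5% (SCALE/MCNP case)
    fp_ma_bias = bias_fraction * fp_ma_worth

    print(f"FP&MA worth      = {fp_ma_worth:.4f} delta-k")
    print(f"FP&MA worth bias = {fp_ma_bias:.5f} delta-k (added to validation bias)")
    assert fp_ma_worth <= 0.1, "recommendation limited to credited worth <= 0.1 delta-k"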
DOE Office of Scientific and Technical Information (OSTI.GOV)
Franco, Manuel
The objective of this work was to characterize the neutron irradiation system consisting of americium-241 beryllium (241AmBe) neutron sources placed in a polyethylene shielding for use at the Sandia National Laboratories (SNL) Low Dose Rate Irradiation Facility (LDRIF). With a total activity of 0.3 TBq (9 Ci), the source consisted of three recycled 241AmBe sources of different activities that had been combined into a single source. The source in its polyethylene shielding will be used in neutron irradiation testing of components. The characterization of the source-shielding system was necessary to evaluate the radiation environment for future experiments. Characterization of the source was also necessary because the documentation for the three component sources and their relative alignment within the Special Form Capsule (SFC) was inadequate. The system consisting of the source and shielding was modeled using the Monte Carlo N-Particle transport code (MCNP). The model was validated by benchmarking it against measurements using multiple techniques. To characterize the radiation fields over the full spatial geometry of the irradiation system, it was necessary to use a number of instruments of varying sensitivities. First, computed photon radiography assisted in determining the orientation of the component sources. With the capsule properly oriented inside the shielding, the neutron spectra were measured using a variety of techniques. An N-probe Microspec and a neutron Bubble Dosimeter Spectrometer (BDS) set were used to characterize the neutron spectra/field in several locations. In the third technique, neutron foil activation was used to ascertain the neutron spectra. A high purity germanium (HPGe) detector was used to characterize the photon spectrum. The experimentally measured spectra and the MCNP results compared well. Once the MCNP model was validated to an adequate level of confidence, parametric analyses were performed on the model to optimize for potential experimental configurations and neutron spectra for component irradiation. The final products of this work are an MCNP model validated by measurements, an overall understanding of the neutron irradiation system including photon/neutron transport and effective dose rates throughout the system, and possible experimental configurations for future irradiation of components.
PhytoPath: an integrative resource for plant pathogen genomics.
Pedro, Helder; Maheswari, Uma; Urban, Martin; Irvine, Alistair George; Cuzick, Alayne; McDowall, Mark D; Staines, Daniel M; Kulesha, Eugene; Hammond-Kosack, Kim Elizabeth; Kersey, Paul Julian
2016-01-04
PhytoPath (www.phytopathdb.org) is a resource for genomic and phenotypic data from plant pathogen species that integrates phenotypic data for genes from PHI-base, an expertly curated catalog of genes with experimentally verified pathogenicity, with the Ensembl tools for data visualization and analysis. The resource is focused on fungal, protist (oomycete) and bacterial plant pathogens whose genomes have been sequenced and annotated. Genes with associated PHI-base data can be easily identified across all plant pathogen species using a BioMart-based query tool and visualized in their genomic context on the Ensembl genome browser. The PhytoPath resource contains data for 135 genomic sequences from 87 plant pathogen species, and 1364 genes curated for their role in pathogenicity and as targets for chemical intervention. Support for community annotation of gene models is provided using the WebApollo online gene editor, and we are working with interested communities to improve reference annotation for selected species. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
ERIC Educational Resources Information Center
Schierhorn, Ann B.; Endres, Kathleen L.
Editors of business and consumer magazines chosen by a random sample were asked in a mail survey what method they used in working with staff writers and free-lance writers. They were asked how they work with writers in the five stages of the writing process--idea, reporting, organizing, writing and rewriting. The first mailing to consumer…
ERIC Educational Resources Information Center
Albers, Craig A.; Floyd, Randy G.; Fuhrmann, Melanie J.; Martinez, Rebecca S.
2011-01-01
Two online surveys were completed by editors, associate editors, editorial board members, and members or fellows of the Division 16 of the American Psychological Association. These surveys targeted (a) the criteria for a manuscript to be published in school psychology journals, and (b) the components of the peer-review process that should be…
The WebACS - An Accessible Graphical Editor.
Parker, Stefan; Nussbaum, Gerhard; Pölzer, Stephan
2017-01-01
This paper is about the solution to accessibility problems encountered when implementing a graphical editor, a major challenge being the comprehension of the relationships between graphical components, which needs to be guaranteed for blind and vision-impaired users. In this concrete case, the HTML5 canvas and JavaScript were used. Accessibility was achieved by implementing a list view of elements, which also enhances the usability of the editor.
Applied Computational Electromagnetics Society Journal, Volume 9, Number 2
1994-07-01
Topics include input/output standardization; code or technique optimization and error minimization; and innovations in solution technique or in data input/output.
Joint Force Quarterly. Number 1, Summer 1993
1993-01-01
Joint Force Quarterly is a professional military journal (Editor-in-Chief: Alvin H. Bernstein; Executive Editor: Patrick M. Cronin) intended to promote understanding of the integrated employment of land, sea, air, space, and special operations forces. The journal focuses on joint doctrine, coalition warfare, and related topics.
R. E. (Ted) Munn — Founding editor; a mini-biography
NASA Astrophysics Data System (ADS)
Taylor, Peter; Thomas, Morley; Truhlar, Ed; Whelpdale, Doug
1996-02-01
Ted Munn founded Boundary-Layer Meteorology in 1970 and served as Editor for 75 volumes over a 25 year period. This short article briefly reviews Ted's scientific career with the Atmospheric Environment Service (of Canada), the International Institute for Applied Systems Analysis in Austria and with the Institute of Environmental Studies at the University of Toronto, and as editor of this journal.
STEVE -- a thinking person's screen editor
NASA Astrophysics Data System (ADS)
Fish, Adrian
STEve is an acronym for STarlink EVE and is an extended EDT-style EVE editor for use at Starlink nodes. The facility provides extra commands which are not part of standard EVE, and improves on one or two of the standard EVE commands. Help on all topics and keys is available from within the editor. The extensions and modifications present in STEve are particularly useful to Starlink users.
JGR special issue on Deep Earthquakes
NASA Astrophysics Data System (ADS)
The editor and associate editors of the Journal of Geophysical Research—Solid Earth and Planets invite the submission of manuscripts for a special issue on the topic “Deep- and Intermediate-Focus Earthquakes, Phase Transitions, and the Mechanics of Deep Subduction.”Manuscripts should be submitted to JGR Editor Gerald Schubert (Department of Earth and Space Sciences, University of California, Los Angeles, Los Angeles, CA 90024) before July 1, 1986, in accordance with the usual rules for manuscript submission. Submitted papers will undergo the normal JGR review procedure. For more information, contact either Schubert or the special guest associate editor, Cliff Frohlich (Institute for Geophysics, University of Texas at Austin, 4920 North IH-35, Austin, TX 78751; telephone: 512-451-6223).
Transferable Output ASCII Data (TOAD) editor version 1.0 user's guide
NASA Technical Reports Server (NTRS)
Bingel, Bradford D.; Shea, Anne L.; Hofler, Alicia S.
1991-01-01
The Transferable Output ASCII Data (TOAD) editor is an interactive software tool for manipulating the contents of TOAD files. The TOAD editor is specifically designed to work with tabular data. Selected subsets of data may be displayed to the user's screen, sorted, exchanged, duplicated, removed, replaced, inserted, or transferred to and from external files. It also offers a number of useful features including on-line help, macros, a command history, an 'undo' option, variables, and a full complement of mathematical functions and conversion factors. Written in ANSI FORTRAN 77 and completely self-contained, the TOAD editor is very portable and has already been installed on SUN, SGI/IRIS, and CONVEX hosts.
Core competencies for scientific editors of biomedical journals: consensus statement.
Moher, David; Galipeau, James; Alam, Sabina; Barbour, Virginia; Bartolomeos, Kidist; Baskin, Patricia; Bell-Syer, Sally; Cobey, Kelly D; Chan, Leighton; Clark, Jocalyn; Deeks, Jonathan; Flanagin, Annette; Garner, Paul; Glenny, Anne-Marie; Groves, Trish; Gurusamy, Kurinchi; Habibzadeh, Farrokh; Jewell-Thomas, Stefanie; Kelsall, Diane; Lapeña, José Florencio; MacLehose, Harriet; Marusic, Ana; McKenzie, Joanne E; Shah, Jay; Shamseer, Larissa; Straus, Sharon; Tugwell, Peter; Wager, Elizabeth; Winker, Margaret; Zhaori, Getu
2017-09-11
Scientific editors are responsible for deciding which articles to publish in their journals. However, we have not found documentation of their required knowledge, skills, and characteristics, or the existence of any formal core competencies for this role. We describe the development of a minimum set of core competencies for scientific editors of biomedical journals. The 14 key core competencies are divided into three major areas, and each competency has a list of associated elements or descriptions of more specific knowledge, skills, and characteristics that contribute to its fulfillment. We believe that these core competencies are a baseline of the knowledge, skills, and characteristics needed to perform competently the duties of a scientific editor at a biomedical journal.
Molecular structure input on the web.
Ertl, Peter
2010-02-02
A molecule editor, that is, a program for input and editing of molecules, is an indispensable part of every cheminformatics or molecular processing system. This review focuses on a special type of molecule editor, namely those that are used for molecule structure input on the web. Scientific computing is now moving more and more in the direction of web services and cloud computing, with servers scattered all around the Internet. Thus a web browser has become the universal scientific user interface, and a tool to edit molecules directly within the web browser is essential. The review covers the history of web-based structure input, starting with simple text entry boxes and early molecule editors based on clickable maps, before moving to the current situation dominated by Java applets. One typical example, the popular JME Molecule Editor, will be described in more detail. Modern Ajax server-side molecule editors are also presented. Finally, the possible future direction of web-based molecule editing, based on technologies like JavaScript and Flash, is discussed.
Fox, Peter T; Bullmore, Ed; Bandettini, Peter A; Lancaster, Jack L
2009-02-01
Editors of scientific journals are ethically bound to provide a fair and impartial peer-review process and to protect the rights of contributing authors to publish research results. If, however, a dispute arises among investigators regarding data ownership and the right to publish, the ethical responsibilities of journal editors become more complex. The editors of Human Brain Mapping recently had the unusual experience of learning of an ongoing dispute regarding data-access rights pertaining to a manuscript already accepted for publication. Herein the editors describe the nature of the dispute, the steps taken to explore and resolve the conflict, and discuss the ethical principles that govern such circumstances. Drawing on this experience and with the goal of avoiding future controversies, the editors have formulated a Data Rights Policy and a Data Rights Procedure for Human Brain Mapping. Human Brain Mapping adopts this policy effective immediately and respectfully suggests that other journals consider adopting this or similar policies.
Surgery in World War 2. Activities of Surgical Consultants. Volume 1
1962-01-01
Activities of Surgical Consultants, Volume I. Prepared and published under the direction of Lieutenant General Leonard D. Heaton, The Surgeon General, United States Army. Editor in Chief: Colonel John Boyd Coates, Jr., MC; Editor for Activities of Surgical Consultants: B. Noland Carter, M.D.
Remarks from a retiring Editor
NASA Astrophysics Data System (ADS)
Mansur, Louis K.
2015-10-01
At the end of 2015 I plan to step down as Chairman of Editors for the Journal of Nuclear Materials. I use the opportunity to express thoughts that have recurred to me but were muted in comparison with the day-to-day priorities of editorial work. The most important is that I hold the deepest gratitude for your enduring support: authors, reviewers, readers, the Advisory Editorial Board, and my fellow Editors.
1994-01-01
In the 13 years since it was first published, the "Uniform requirements for manuscripts submitted to biomedical journals" (the Vancouver style), developed by the International Committee of Medical Journal Editors, has been widely accepted by both authors and editors; over 400 journals have stated that they will consider manuscripts that conform to its requirements. This is the fourth edition of the "Uniform requirements." PMID:8287338
Development of PIMAL: Mathematical Phantom with Moving Arms and Legs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Akkurt, Hatice; Eckerman, Keith F.
2007-05-01
The computational model of the human anatomy (phantom) has gone through many revisions since its initial development in the 1970s. The computational phantom model currently used by the Nuclear Regulatory Commission (NRC) is based on a model published in 1974. Hence, the phantom model used by the NRC staff was missing some organs (e.g., neck, esophagus) and tissues. Further, the locations of some organs were inappropriate (e.g., thyroid). Moreover, all the computational phantoms were assumed to be in the vertical-upright position. However, many occupational radiation exposures occur with the worker in other positions. In the first phase of this work, updates to the computational phantom models were reviewed and a revised phantom model, which includes the updates for the relevant organs and compositions, was identified. This revised model was adopted as the starting point for this development work, and hence a series of radiation transport computations, using the Monte Carlo code MCNP5, was performed. The computational results were compared against values reported by the International Commission on Radiological Protection (ICRP) in Publication 74. For some of the organs (e.g., thyroid), there were discrepancies between the computed values and the results reported in ICRP-74. The reasons behind these discrepancies have been investigated and are discussed in this report. Additionally, sensitivity computations were performed to determine the sensitivity of the organ doses to certain parameters, including the composition and cross sections used in the simulations. To assess the dose for more realistic exposure configurations, the phantom model was revised to enable flexible positioning of the arms and legs. Furthermore, to reduce the user time for analyses, a graphical user interface (GUI) was developed. The GUI can be used to visualize the positioning of the arms and legs as the desired posture is achieved, to generate the input file, invoke the computations, and extract the organ dose values from the MCNP5 output file. In this report, the main features of the phantom model with moving arms and legs and the user interface are described.
NASA Astrophysics Data System (ADS)
Chaudhary, A.; DeMarle, D.; Burnett, B.; Harris, C.; Silva, W.; Osmari, D.; Geveci, B.; Silva, C.; Doutriaux, C.; Williams, D. N.
2013-12-01
The impact of climate change will resonate through a broad range of fields including public health, infrastructure, water resources, and many others. Long-term coordinated planning, funding, and action are required for climate change adaptation and mitigation. Unfortunately, widespread use of climate data (simulated and observed) in non-climate science communities is impeded by factors such as large data size, lack of adequate metadata, poor documentation, and lack of sufficient computational and visualization resources. We present ClimatePipes to address many of these challenges by creating an open source platform that provides state-of-the-art, user-friendly data access, analysis, and visualization for climate and other relevant geospatial datasets, making the climate data available to non-researchers, decision-makers, and other stakeholders. The overarching goals of ClimatePipes are to: enable users to explore real-world questions related to climate change; provide tools for data access, analysis, and visualization; and facilitate collaboration by enabling users to share datasets, workflows, and visualizations. ClimatePipes uses a web-based application platform for its widespread support on mainstream operating systems, ease-of-use, and inherent collaboration support. The front-end of ClimatePipes uses HTML5 (WebGL, Canvas2D, CSS3) to deliver state-of-the-art visualization and to provide a best-in-class user experience. The back-end of ClimatePipes is built around Python using the Visualization Toolkit (VTK, http://vtk.org), Climate Data Analysis Tools (CDAT, http://uv-cdat.llnl.gov), and other climate and geospatial data processing tools such as GDAL and PROJ4. The ClimatePipes web interface queries and accesses data from remote sources (such as ESGF); shown in the figure is a climate data layer from ESGF on top of a map data layer from OpenStreetMap. The ClimatePipes workflow editor provides flexibility and fine-grained control, and uses the VisTrails (http://www.vistrails.org) workflow engine in the backend.
Benchmark of Atucha-2 PHWR RELAP5-3D control rod model by Monte Carlo MCNP5 core calculation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pecchia, M.; D'Auria, F.; Mazzantini, O.
2012-07-01
Atucha-2 is a Siemens-designed PHWR reactor under construction in the Republic of Argentina. Its geometrical complexity and peculiarities require the adoption of advanced Monte Carlo codes for performing realistic neutronic simulations. Therefore core models of the Atucha-2 PHWR were developed using MCNP5. In this work a methodology was set up to collect the flux in the hexagonal mesh by which the Atucha-2 core is represented. The aim of this activity is to evaluate the effect of an obliquely inserted control rod on the neutron flux, in order to validate the RELAP5-3D©/NESTLE three-dimensional neutron kinetic coupled thermal-hydraulic model applied by GRNSPG/UNIPI for performing selected transients of Chapter 15 of the FSAR of Atucha-2. (authors)
Feasibility study for wax deposition imaging in oil pipelines by PGNAA technique.
Cheng, Can; Jia, Wenbao; Hei, Daqian; Wei, Zhiyong; Wang, Hongtao
2017-10-01
Wax deposition in pipelines is a crucial problem in the oil industry. A method based on the prompt gamma-ray neutron activation analysis technique was applied to reconstruct the image of wax deposition in oil pipelines. The 2.223 MeV hydrogen capture gamma rays were used to reconstruct the wax deposition image. To validate the method, both MCNP simulation and experiments were performed for wax deposited with a maximum thickness of 20 cm. The performance of the method was simulated using the MCNP code. The experiment was conducted with a 252Cf neutron source and a LaBr3:Ce detector. A good correspondence between the simulations and the experiments was observed. The results obtained indicate that the present approach is efficient for wax deposition imaging in oil pipelines. Copyright © 2017 Elsevier Ltd. All rights reserved.
MCNP simulation of the dose distribution in liver cancer treatment for BNC therapy
NASA Astrophysics Data System (ADS)
Krstic, Dragana; Jovanovic, Zoran; Markovic, Vladimir; Nikezic, Dragoslav; Urosevic, Vlade
2014-10-01
Boron Neutron Capture Therapy (BNCT) is based on selective uptake of boron in tumour tissue compared to the surrounding normal tissue. Infusion of compounds with boron is followed by irradiation with neutrons. Neutron capture on 10B, which gives rise to an alpha particle and a recoiling 7Li ion, enables the therapeutic dose to be delivered to tumour tissue while healthy tissue can be spared. Here, the therapeutic abilities of BNCT were studied for possible treatment of liver cancer using thermal and epithermal neutron beams. For neutron transport the MCNP software was used and doses in organs of interest in the ORNL phantom were evaluated. Phantom organs were filled with voxels in order to obtain depth-dose distributions in them. The results suggest that BNCT using an epithermal neutron beam could be applied for liver cancer treatment.
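The boron dose component that makes BNCT selective can be estimated, to first order, from the thermal neutron flux, the 10B concentration, the 10B(n,α)7Li thermal cross section, and the energy deposited locally by the charged reaction products. The sketch below uses textbook-level constants (thermal cross section roughly 3840 b; roughly 2.33 MeV deposited locally per capture on average) together with a hypothetical flux and boron loading; it is not part of, and does not reproduce, the study above.

    # First-order estimate of the 10B(n,alpha)7Li dose rate in boron-loaded tissue.
    # Constants are approximate textbook thermal values; the flux and boron
    # loading are hypothetical.
    AVOGADRO = 6.022e23
    SIGMA_B10 = 3840e-24          # cm^2, 10B(n,alpha) thermal cross section (~3840 b)
    E_LOCAL_J = 2.33 * 1.602e-13  # J, average energy deposited locally per capture

    phi_thermal = 1.0e9           # n/(cm^2 s), assumed thermal flux at the tumour
    b10_ppm = 30.0                # assumed 10B loading, micrograms per gram of tissue

    atoms_b10_per_gram = (b10_ppm * 1e-6 / 10.0) * AVOGADRO        # 10B mass ~ 10 g/mol
    captures_per_g_s = phi_thermal * atoms_b10_per_gram * SIGMA_B10
    dose_rate_gy_per_h = captures_per_g_s * E_LOCAL_J * 1000.0 * 3600.0  # J/g -> Gy/h
    print(f"boron dose rate ~ {dose_rate_gy_per_h:.1f} Gy/h")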
Benchmarking study of the MCNP code against cold critical experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sitaraman, S.
1991-01-01
The purpose of this study was to benchmark the widely used Monte Carlo code MCNP against a set of cold critical experiments with a view to using the code as a means of independently verifying the performance of faster but less accurate Monte Carlo and deterministic codes. The experiments simulated consisted of both fast and thermal criticals as well as fuel in a variety of chemical forms. A standard set of benchmark cold critical experiments was modeled. These included the two fast experiments, GODIVA and JEZEBEL, the TRX metallic uranium thermal experiments, the Babcock and Wilcox oxide and mixed oxide experiments, and the Oak Ridge National Laboratory (ORNL) and Pacific Northwest Laboratory (PNL) nitrate solution experiments. The principal case studied was a small critical experiment that was performed with boiling water reactor bundles.
Simulations of neutron transport at low energy: a comparison between GEANT and MCNP.
Colonna, N; Altieri, S
2002-06-01
The use of the simulation tool GEANT for neutron transport at energies below 20 MeV is discussed, in particular with regard to shielding and dose calculations. The reliability of the GEANT/MICAP package for neutron transport has been verified by comparing the results of simulations performed with this package over a wide energy range with the predictions of MCNP-4B, a code commonly used for neutron transport at low energy. A reasonable agreement between the results of the two codes is found for the neutron flux through a slab of material (iron and ordinary concrete), as well as for the dose released in soft tissue by neutrons. These results justify the use of the GEANT/MICAP code for neutron transport in a wide range of applications, including health physics problems.
Calculation of conversion coefficients for clinical photon spectra using the MCNP code.
Lima, M A F; Silva, A X; Crispim, V R
2004-01-01
In this work, the MCNP4B code has been employed to calculate conversion coefficients from air kerma to ambient dose equivalent, H*(10)/Ka, for monoenergetic photon energies from 10 keV to 50 MeV, assuming the kerma approximation. Also estimated are the H*(10)/Ka values for photon beams produced by linear accelerators, such as the Clinac-4 and Clinac-2500, after transmission through primary barriers of radiotherapy treatment rooms. The results for the conversion coefficients for monoenergetic photon energies, with statistical uncertainty <2%, are compared with those in ICRP Publication 74 and good agreement was obtained. The conversion coefficients calculated for real clinical spectra transmitted through concrete walls 1, 1.5 and 2 m thick are in the range of 1.06-1.12 Sv Gy(-1).
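The spectrum-averaged conversion coefficient for a transmitted beam, as computed in studies of this kind, is the air-kerma-weighted mean of the monoenergetic coefficients over the photon fluence spectrum. A schematic sketch follows; the spectrum, kerma-per-fluence values, and h*(10) coefficients used below are made-up placeholders, not ICRP 74 or Clinac data.

    # Spectrum-averaged H*(10)/Ka = sum(phi_i * ka_i * h_i) / sum(phi_i * ka_i),
    # i.e. the air-kerma-weighted mean of the monoenergetic coefficients.
    # All numerical values below are placeholders for illustration only.
    energies_mev  = [0.5, 1.0, 2.0, 4.0]      # bin-representative photon energies
    fluence       = [0.1, 0.3, 0.4, 0.2]      # relative fluence per bin (transmitted beam)
    kerma_per_flu = [2.0, 4.4, 7.5, 12.1]     # air kerma per unit fluence (arbitrary units)
    h_star_per_ka = [1.23, 1.17, 1.12, 1.08]  # monoenergetic H*(10)/Ka (Sv/Gy), placeholders

    num = sum(f * k * h for f, k, h in zip(fluence, kerma_per_flu, h_star_per_ka))
    den = sum(f * k for f, k in zip(fluence, kerma_per_flu))
    print(f"spectrum-averaged H*(10)/Ka = {num / den:.3f} Sv/Gy")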
NASA Astrophysics Data System (ADS)
Castanier, Eric; Paterne, Loic; Louis, Céline
2017-09-01
In nuclear engineering, one has to manage both time and precision. Especially in shielding design, one has to be accurate and efficient to reduce cost (shielding thickness optimization), and for this, 3D codes are used. In this paper, we want to see whether we can easily apply the CADIS method to the shielding design of small pipes which go through large concrete walls. We assess the impact of the weight windows (WW) generated by the 3D deterministic code ATTILA versus WW generated directly by MCNP (an iterative and manual process). The comparison is based on the quality of the convergence (estimated relative error (σ), variance of variance (VOV) and figure of merit (FOM)), on time (computer time plus modelling time) and on the effort required of the engineer.
Khattab, K; Sulieman, I
2009-04-01
The MCNP-4C code, based on the probabilistic approach, was used to model the 3D configuration of the core of the Syrian miniature neutron source reactor (MNSR). The continuous energy neutron cross sections from the ENDF/B-VI library were used to calculate the thermal and fast neutron fluxes in the inner and outer irradiation sites of the MNSR. The thermal fluxes in the MNSR inner irradiation sites were also measured experimentally by the multiple foil activation method (197Au(n,γ)198Au and 59Co(n,γ)60Co). The foils were irradiated simultaneously in each of the five MNSR inner irradiation sites to measure the thermal neutron flux and the epithermal index in each site. The calculated and measured results agree well.
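The foil activation method mentioned above infers the flux from the induced activity at the end of irradiation via A = N·σ·φ·(1 − e^(−λ·t_irr)), with decay during cooling and counting handled separately. A minimal sketch of solving this for the thermal flux from a gold foil is given below; the foil mass, activity, and timing are hypothetical, and self-shielding, epithermal, and efficiency corrections are omitted.

    # Infer thermal flux from a 197Au(n,gamma)198Au foil activation (simplified):
    # phi = A_eoi / (N_Au * sigma * (1 - exp(-lambda * t_irr))).
    # Foil data are hypothetical; corrections (self-shielding, epithermal, etc.) omitted.
    import math

    AVOGADRO = 6.022e23
    HALF_LIFE_198AU_S = 2.695 * 24 * 3600   # ~2.695 d
    SIGMA_AU_THERMAL = 98.65e-24            # cm^2 (~98.7 b)

    foil_mass_g = 0.010                     # hypothetical gold foil
    t_irr_s = 2.0 * 3600                    # 2 h irradiation
    activity_end_of_irr_bq = 5.0e5          # hypothetical, decay-corrected to end of irradiation

    lam = math.log(2.0) / HALF_LIFE_198AU_S
    n_au = foil_mass_g / 196.97 * AVOGADRO
    saturation = 1.0 - math.exp(-lam * t_irr_s)
    phi = activity_end_of_irr_bq / (n_au * SIGMA_AU_THERMAL * saturation)
    print(f"thermal flux ~ {phi:.3e} n/(cm^2 s)")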
Neutronics Investigations for the Lower Part of a Westinghouse SVEA-96+ Assembly
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murphy, M.F.; Luethi, A.; Seiler, R.
2002-05-15
Accurate critical experiments have been performed for the validation of total fission (F_tot) and 238U-capture (C_8) reaction rate distributions obtained with CASMO-4, HELIOS, BOXER, and MCNP4B for the lower axial region of a real Westinghouse SVEA-96+ fuel assembly. The assembly comprised fresh fuel with an average 235U enrichment of 4.02 wt%, a maximum enrichment of 4.74 wt%, 14 burnable-absorber fuel pins, and full-density water moderation. The experimental configuration investigated was core 1A of the LWR-PROTEUS Phase I project, where 61 different fuel pins, representing ~64% of the assembly, were gamma-scanned individually. Calculated (C) and measured (E) values have been compared in terms of C/E distributions. For F_tot, the standard deviations are 1.2% for HELIOS, 0.9% for CASMO-4, 0.8% for MCNP4B, and 1.7% for BOXER. Standard deviations of 1.1% for HELIOS, CASMO-4, and MCNP4B and 1.2% for BOXER were obtained in the case of C_8. Despite the high degree of accuracy observed on average, it was found that the five burnable-absorber fuel pins investigated showed a noticeable underprediction of F_tot, quite systematically, for the deterministic codes evaluated (average C/E for the burnable-absorber fuel pins in the range 0.974 to 0.988, depending on the code).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mollerach, R.; Leszczynski, F.; Fink, J.
2006-07-01
In 2005 the Argentine Government took the decision to complete the construction of the Atucha-II nuclear power plant, which had been progressing slowly during the previous ten years. Atucha-II is a 745 MWe nuclear station moderated and cooled with heavy water, of German (Siemens) design, located in Argentina. It has a pressure-vessel design with 451 vertical coolant channels, and the fuel assemblies (FA) are clusters of 37 natural UO2 rods with an active length of 530 cm. For the reactor physics area, a revision and update of calculation methods and models (cell, supercell and reactor) was recently carried out, covering cell, supercell (control rod) and core calculations. As a validation of the new models, some benchmark comparisons were done with Monte Carlo calculations with MCNP5. This paper presents comparisons of cell and supercell benchmark problems, based on a slightly idealized model of the Atucha-I core, obtained with the WIMS-D5 and DRAGON codes against MCNP5 results. The Atucha-I core was selected because it is smaller, similar from a neutronic point of view, and more symmetric than Atucha-II. Cell parameters compared include cell k-infinity, relative power levels of the different rings of fuel rods, and some two-group macroscopic cross sections. Supercell comparisons include supercell k-infinity changes due to the control rods (tubes) of steel and hafnium. (authors)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turner, A.; Davis, A.; University of Wisconsin-Madison, Madison, WI 53706
CCFE perform Monte-Carlo transport simulations on large and complex tokamak models such as ITER. Such simulations are challenging since streaming and deep penetration effects are equally important. In order to make such simulations tractable, both variance reduction (VR) techniques and parallel computing are used. It has been found that the application of VR techniques in such models significantly reduces the efficiency of parallel computation due to 'long histories'. VR in MCNP can be accomplished using energy-dependent weight windows. The weight window represents an 'average behaviour' of particles, and large deviations in the arriving weight of a particle give rise to extreme amounts of splitting being performed and a long history. When running on parallel clusters, a long history can have a detrimental effect on the parallel efficiency - if one process is computing the long history, the other CPUs complete their batch of histories and wait idle. Furthermore some long histories have been found to be effectively intractable. To combat this effect, CCFE has developed an adaptation of MCNP which dynamically adjusts the WW where a large weight deviation is encountered. The method effectively 'de-optimises' the WW, reducing the VR performance but this is offset by a significant increase in parallel efficiency. Testing with a simple geometry has shown the method does not bias the result. This 'long history method' has enabled CCFE to significantly improve the performance of MCNP calculations for ITER on parallel clusters, and will be beneficial for any geometry combining streaming and deep penetration effects. (authors)
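For context, a generic illustration of the weight-window game (not CCFE's modified algorithm): a particle arriving with weight w in a region whose window is [w_low, w_up] is split when w > w_up and rouletted when w < w_low, which is why a particle arriving with a weight far above the window can spawn very many daughters and produce the 'long histories' described above.

    # Generic weight-window splitting / Russian roulette (illustrative only; not
    # the CCFE 'long history' modification, which dynamically relaxes the window).
    import math, random

    def weight_window_game(w, w_low, w_up, max_split=1000):
        if w > w_up:                    # split into n daughters of weight w/n
            n = min(math.ceil(w / w_up), max_split)
            return [w / n] * n
        if w < w_low:                   # Russian roulette toward a survival weight
            w_survive = 0.5 * (w_low + w_up)
            return [w_survive] if random.random() < w / w_survive else []
        return [w]                      # inside the window: keep as-is

    # A weight far above the window produces a very large split (capped here).
    print(len(weight_window_game(w=5.0e3, w_low=0.5, w_up=2.5)))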
NASA Astrophysics Data System (ADS)
Shypailo, R. J.; Ellis, K. J.
2011-05-01
During construction of the whole body counter (WBC) at the Children's Nutrition Research Center (CNRC), efficiency calibration was needed to translate acquired counts of 40K to actual grams of potassium for measurement of total body potassium (TBK) in a diverse subject population. The MCNP Monte Carlo n-particle simulation program was used to describe the WBC (54 detectors plus shielding), test individual detector counting response, and create a series of virtual anthropomorphic phantoms based on national reference anthropometric data. Each phantom included an outer layer of adipose tissue and an inner core of lean tissue. Phantoms were designed for both genders representing ages 3.5 to 18.5 years with body sizes from the 5th to the 95th percentile based on body weight. In addition, a spherical surface source surrounding the WBC was modeled in order to measure the effects of subject mass on room background interference. Individual detector measurements showed good agreement with the MCNP model. The background source model came close to agreement with empirical measurements, but showed a trend deviating from unity with increasing subject size. Results from the MCNP simulation of the CNRC WBC agreed well with empirical measurements using BOMAB phantoms. Individual detector efficiency corrections were used to improve the accuracy of the model. Nonlinear multiple regression efficiency calibration equations were derived for each gender. Room background correction is critical in improving the accuracy of the WBC calibration.
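Translating a net 40K count rate into grams of body potassium, as described above, uses the detection efficiency (size- and geometry-dependent, which is what the MCNP phantoms provide), the 1.461 MeV gamma yield of 40K, and the specific activity of 40K in natural potassium. The sketch below is generic and uses approximate literature constants together with hypothetical counting values; it is not the CNRC calibration.

    # TBK (grams of potassium) from a net 40K count rate:
    #   count_rate = efficiency * gamma_yield * specific_activity_40K * grams_K
    # Constants are approximate literature values; the efficiency and count rate
    # are hypothetical (the study derives size-dependent efficiencies from MCNP).
    SPECIFIC_ACTIVITY_40K = 31.0   # Bq of 40K per gram of natural potassium (approx.)
    GAMMA_YIELD_1461 = 0.1066      # 1.461 MeV gammas per 40K decay (approx.)

    def grams_potassium(net_cps: float, efficiency: float) -> float:
        gammas_per_s_per_gram = SPECIFIC_ACTIVITY_40K * GAMMA_YIELD_1461
        return net_cps / (efficiency * gammas_per_s_per_gram)

    print(f"TBK ~ {grams_potassium(net_cps=12.0, efficiency=0.04):.0f} g")  # hypothetical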
NASA Astrophysics Data System (ADS)
Kamali-Zonouzi, P.; Shutt, A.; Nisbet, A.; Bradley, D. A.
2017-11-01
Preclinical investigations of thick microbeams show these to be feasible for use in radiotherapeutic dose delivery. To create the beams we access a radiotherapy x-ray tube that is familiarly used within a conventional clinical environment, coupling this with beam-defining grids. Beam characterisation, of both single beams and arrays, has been performed using both MCNP simulation and direct Gafchromic EBT film dosimetry. As a first step in defining optimal exit-beam profiles over a range of beam energies, simulations were made of the x-ray tube and a number of beam-defining parallel-geometry grids, the latter being made to vary in thickness, slit separation and material composition. For a grid positioned after the treatment applicator, and of similar design to those used in the first part of the study, MCNP simulation and Gafchromic EBT film were then applied in examining the resultant radiation profiles. MCNP simulations and direct dosimetry both show useful thick microbeams to be produced from the x-ray tube, with peak-to-valley dose ratios (PVDRs) in the approximate range 8.8-13.9. Although the potential to create thick microbeams using radiotherapy x-ray tubes and a grid has been demonstrated, Microbeam Radiation Therapy (MRT) would still need to be approved outside of the preclinical setting, a viable treatment technique of clinical interest needing to benefit, for instance, from substantially improved x-ray tube dose rates.
NASA Astrophysics Data System (ADS)
Karriem, Veronica V.
Nuclear reactor design incorporates the study and application of nuclear physics, nuclear thermal hydraulics and nuclear safety. Theoretical models and numerical methods implemented in computer programs are utilized to analyze and design nuclear reactors. The focus of this PhD study is the development of an advanced high-fidelity multi-physics code system to perform reactor core analysis for design and safety evaluations of research TRIGA-type reactors. The fuel management and design code system TRIGSIMS was further developed to fulfill the function of a reactor design and analysis code system for the Pennsylvania State Breazeale Reactor (PSBR). TRIGSIMS, which is currently in use at the PSBR, is a fuel management tool which incorporates the depletion code ORIGEN-S (part of the SCALE system) and the Monte Carlo neutronics solver MCNP. The diffusion theory code ADMARC-H is used within TRIGSIMS to accelerate the MCNP calculations. TRIGSIMS manages the fuel isotopic data and stores it for future burnup calculations. The contribution of this work is the development of an improved version of TRIGSIMS, named TRIGSIMS-TH. TRIGSIMS-TH incorporates a thermal hydraulic module based on the advanced sub-channel code COBRA-TF (CTF). CTF provides the temperature feedback needed in the multi-physics calculations as well as the thermal hydraulics modeling capability of the reactor core. The temperature feedback model uses the CTF-provided local moderator and fuel temperatures for cross-section modeling in the ADMARC-H and MCNP calculations. To perform efficient critical control rod calculations, a methodology for applying a control rod position was implemented in TRIGSIMS-TH, making this code system a modeling and design tool for future core loadings. The new TRIGSIMS-TH is a computer program that interlinks various other functional reactor analysis tools. It consists of MCNP5, ADMARC-H, ORIGEN-S, and CTF. CTF was coupled with both MCNP and ADMARC-H to provide the heterogeneous temperature distribution throughout the core. Each of these codes is written in its own computer language, performs its own function, and outputs its own set of data. TRIGSIMS-TH provides effective data manipulation and transfer between the different codes. With the implementation of the feedback and control-rod-position modeling methodologies, the TRIGSIMS-TH calculations are more accurate and in better agreement with measured data. The PSBR is unique in many ways and there are no "off-the-shelf" codes that can model this design in its entirety. In particular, the PSBR has an open core design, which is cooled by natural convection. Combining several codes into a single system brings many challenges. It also requires substantial knowledge of both the operation and the core design of the PSBR. This reactor has been in operation for decades and there is a fair amount of study and development in both PSBR thermal hydraulics and neutronics. Measured data is also available for various core loadings and can be used for validation activities. The previous studies and developments in PSBR modeling also serve as a guide to assess the findings of the work herein. In order to incorporate new methods and codes into the existing TRIGSIMS, a re-evaluation of various components of the code was performed to assure the accuracy and efficiency of the existing CTF/MCNP5/ADMARC-H multi-physics coupling. A new set of ADMARC-H diffusion coefficients and cross sections was generated using the SERPENT code.
This was needed as the previous data was not generated with thermal hydraulic feedback and the all-rods-out (ARO) position was used as the critical rod position. The B4C was re-evaluated for this update. The data exchange between ADMARC-H and MCNP5 was modified. The basic core model is given flexibility to allow for various changes within the core model, and this feature was implemented in TRIGSIMS-TH. The PSBR core in the new code model can be expanded and changed. This allows the new code to be used as a modeling tool for design and analyses of future core loadings.
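The sketch below is a schematic, runnable toy of the Picard-style alternation a TRIGSIMS-TH-like driver performs between neutronics and thermal hydraulics. The one-zone feedback coefficients are made up; the real code exchanges full spatial fields between MCNP5/ADMARC-H, CTF and ORIGEN-S.

```python
# Schematic toy of neutronics <-> thermal-hydraulics coupling with temperature
# feedback; all coefficients are illustrative, not PSBR data.
def toy_neutronics(fuel_temp_K):
    """Doppler-like feedback: k falls as sqrt(T) rises (illustrative numbers)."""
    return 1.05 - 2.5e-3 * (fuel_temp_K ** 0.5 - 300 ** 0.5)

def toy_thermal_hydraulics(power_MW):
    """Fuel temperature rises linearly with power (illustrative numbers)."""
    return 300.0 + 40.0 * power_MW

def coupled_solve(power_MW=1.0, tol=0.1, max_iter=50):
    temp = 300.0
    for i in range(max_iter):
        k = toy_neutronics(temp)                             # "MCNP/ADMARC-H" step
        new_temp = toy_thermal_hydraulics(power_MW * k)      # "CTF" step
        if abs(new_temp - temp) < tol:                       # feedback converged
            return k, new_temp, i + 1
        temp = new_temp
    return k, temp, max_iter

print(coupled_solve())   # (k_eff, fuel temperature, iterations to converge)
```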
Adequacy of authors’ replies to criticism raised in electronic letters to the editor: cohort study
Delamothe, Tony; Godlee, Fiona; Lundh, Andreas
2010-01-01
Objective To investigate whether substantive criticism in electronic letters to the editor, defined as a problem that could invalidate the research or reduce its reliability, is adequately addressed by the authors. Design Cohort study. Setting BMJ between October 2005 and September 2007. Inclusion criteria Research papers generating substantive criticism in the rapid responses section on bmj.com. Main outcome measures Severity of criticism (minor, moderate, or major) as judged by two editors and extent to which the criticism was addressed by authors (fully, partly, or not) as judged by two editors and the critics. Results A substantive criticism was raised against 105 of 350 (30%, 95% confidence interval 25% to 35%) included research papers, and of these the authors had responded to 47 (45%, 35% to 54%). The severity of the criticism was the same in those papers as in the 58 without author replies (mean score 2.2 in both groups, P=0.72). For the 47 criticisms with replies, there was no relation between the severity of the criticism and the adequacy of the reply, neither as judged by the editors (P=0.88 and P=0.95, respectively) nor by the critics (P=0.83; response rate 85%). However, the critics were much more critical of the replies than the editors (average score 2.3 v 1.4, P<0.001). Conclusions Authors are reluctant to respond to criticisms of their work, although they are not less likely to respond when criticisms are severe. Editors should ensure that authors take relevant criticism seriously and respond adequately to it. PMID:20699306
Robot Sequencing and Visualization Program (RSVP)
NASA Technical Reports Server (NTRS)
Cooper, Brian K.; Maxwell, Scott A.; Hartman, Frank R.; Wright, John R.; Yen, Jeng; Toole, Nicholas T.; Gorjian, Zareh; Morrison, Jack C.
2013-01-01
The Robot Sequencing and Visualization Program (RSVP) is being used in the Mars Science Laboratory (MSL) mission for downlink data visualization and command sequence generation. RSVP reads and writes downlink data products from the operations data server (ODS) and writes uplink data products to the ODS. The primary users of RSVP are members of the Rover Planner team (part of the Integrated Planning and Execution Team (IPE)), who use it to perform traversability/articulation analyses, take activity plan input from the Science and Mission Planning teams, and create a set of rover sequences to be sent to the rover every sol. The primary inputs to RSVP are downlink data products and activity plans in the ODS database. The primary outputs are command sequences to be placed in the ODS for further processing prior to uplink to each rover. RSVP is composed of two main subsystems. The first, called the Robot Sequence Editor (RoSE), understands the MSL activity and command dictionaries and takes care of converting incoming activity level inputs into command sequences. The Rover Planners use the RoSE component of RSVP to put together command sequences and to view and manage command level resources like time, power, temperature, etc. (via a transparent realtime connection to SEQGEN). The second component of RSVP is called HyperDrive, a set of high-fidelity computer graphics displays of the Martian surface in 3D and in stereo. The Rover Planners can explore the environment around the rover, create commands related to motion of all kinds, and see the simulated result of those commands via its underlying tight coupling with flight navigation, motor, and arm software. This software is the evolutionary replacement for the Rover Sequencing and Visualization software used to create command sequences (and visualize the Martian surface) for the Mars Exploration Rover mission.
Writing filter processes for the SAGA editor, appendix G
NASA Technical Reports Server (NTRS)
Kirslis, Peter A.
1985-01-01
The SAGA editor provides a mechanism by which separate processes can be invoked during an editing session to traverse portions of the parse tree being edited. These processes, termed filter processes, read, analyze, and possibly transform the parse tree, returning the result to the editor. By defining new commands with the editor's user-defined command facility, which invoke filter processes, authors of filters can provide complex operations as simple commands. A tree plotter, pretty printer, and Pascal tree transformation program have already been written using this facility. The filter processes are introduced, the parse tree structure is described, and the library interface made available to the programmer is presented. Also discussed is how to compile and run filter processes. Examples are presented to illustrate aspects of each of these areas.
WITHDRAWN: Local causality in a Friedmann-Robertson-Walker spacetime
NASA Astrophysics Data System (ADS)
Christian, Joy
2016-10-01
This article has been withdrawn at the request of the Editors. Soon after the publication of this paper was announced, several experts in the field contacted the Editors to report errors. After extensive review, the Editors unanimously concluded that the results are in obvious conflict with a proven scientific fact, i.e., violation of local realism that has been demonstrated not only theoretically but experimentally in recent experiments. On this basis, the Editors decided to withdraw the paper. As a consequence, pages 67-79 originally occupied by the withdrawn article are missing from the printed issue. The publisher apologizes for any inconvenience this may cause. The full Elsevier Policy on Article Withdrawal can be found at http://www.elsevier.com/locate/withdrawalpolicy.
PDB Editor: a user-friendly Java-based Protein Data Bank file editor with a GUI.
Lee, Jonas; Kim, Sung Hou
2009-04-01
The Protein Data Bank file format is the format most widely used by protein crystallographers and biologists to disseminate and manipulate protein structures. Despite this, there are few user-friendly software packages available to efficiently edit and extract raw information from PDB files. This limitation often leads to many protein crystallographers wasting significant time manually editing PDB files. PDB Editor, written in Java Swing GUI, allows the user to selectively search, select, extract and edit information in parallel. Furthermore, the program is a stand-alone application written in Java which frees users from the hassles associated with platform/operating system-dependent installation and usage. PDB Editor can be downloaded from http://sourceforge.net/projects/pdbeditorjl/.
NASA Technical Reports Server (NTRS)
Raible, E.
1994-01-01
The Panel Library and Editor is a graphical user interface (GUI) builder for the Silicon Graphics IRIS workstation family. The toolkit creates "widgets" which can be manipulated by the user. Its appearance is similar to that of the X-Windows System. The Panel Library is written in C and is used by programmers writing user-friendly mouse-driven applications for the IRIS. GUIs built using the Panel Library consist of "actuators" and "panels." Actuators are buttons, dials, sliders, or other mouse-driven symbols. Panels are groups of actuators that occupy separate windows on the IRIS workstation. The application user can alter variables in the graphics program, or fire off functions with a click on a button. The evolution of data values can be tracked with meters and strip charts, and dialog boxes with text processing can be built. Panels can be stored as icons when not in use. The Panel Editor is a program used to interactively create and test panel library interfaces in a simple and efficient way. The Panel Editor itself uses a panel library interface, so all actions are mouse driven. Extensive context-sensitive on-line help is provided. Programmers can graphically create and test the user interface without writing a single line of code. Once an interface is judged satisfactory, the Panel Editor will dump it out as a file of C code that can be used in an application. The Panel Library (v9.8) and Editor (v1.1) are written in C-Language (63%) and Scheme, a dialect of LISP, (37%) for Silicon Graphics 4D series workstations running IRIX 3.2 or higher. Approximately 10Mb of disk space is required once compiled. 1.5Mb of main memory is required to execute the panel editor. This program is available on a .25 inch streaming magnetic tape cartridge in UNIX tar format for an IRIS, and includes a copy of XScheme, the public-domain Scheme interpreter used by the Panel Editor. The Panel Library Programmer's Manual is included on the distribution media. The Panel Library and Editor were released to COSMIC in 1991. Silicon Graphics, IRIS, and IRIX are trademarks of Silicon Graphics, Inc. X-Window System is a trademark of Massachusetts Institute of Technology.
Galipeau, James; Cobey, Kelly D.; Barbour, Virginia; Baskin, Patricia; Bell-Syer, Sally; Deeks, Jonathan; Garner, Paul; Shamseer, Larissa; Sharon, Straus; Tugwell, Peter; Winker, Margaret; Moher, David
2017-01-01
Background: Scientific editors (i.e., those who make decisions on the content and policies of a journal) have a central role in the editorial process at biomedical journals. However, very little is known about the training needs of these editors or what competencies are required to perform effectively in this role. Methods: We conducted a survey of perceptions and training needs among scientific editors from major editorial organizations around the world, followed by a modified Delphi process in which we invited the same scientific editors to rate the importance of competency-related statements obtained from a previous scoping review. Results: A total of 148 participants completed the survey of perceptions and training needs. At least 80% of participants agreed on six of the 38 skill and expertise-related statements presented to them as being important or very important to their role as scientific editors. At least 80% agreed on three of the 38 statements as necessary skills they perceived themselves as possessing (well or very well). The top five items on participants’ list of top training needs were training in statistics, research methods, publication ethics, recruiting and dealing with peer reviewers, and indexing of journals. The three rounds of the Delphi were completed by 83, 83, and 73 participants, respectively, which ultimately produced a list of 23 “highly rated” competency-related statements and another 86 “included” items. Conclusion: Both the survey and the modified Delphi process will be critical for understanding knowledge and training gaps among scientific editors when designing curriculum around core competencies in the future. PMID:28979768
Callaham, Michael; John, Leslie K
2018-01-05
We define a minimally important difference for the Likert-type scores frequently used in scientific peer review (similar to existing minimally important differences for scores in clinical medicine). The magnitude of score change required to change editorial decisions has not been studied, to our knowledge. Experienced editors at a journal in the top 6% by impact factor were asked how large a change of rating in "overall desirability for publication" was required to trigger a change in their initial decision on an article. Minimally important differences were assessed twice for each editor: once assessing the rating change required to shift the editor away from an initial decision to accept, and the other assessing the magnitude required to shift away from an initial rejection decision. Forty-one editors completed the survey (89% response rate). In the acceptance frame, the median minimally important difference was 0.4 points on a scale of 1 to 5. Editors required a greater rating change to shift from an initial rejection decision; in the rejection frame, the median minimally important difference was 1.2 points. Within each frame, there was considerable heterogeneity: in the acceptance frame, 38% of editors did not change their decision within the maximum available range; in the rejection frame, 51% did not. To our knowledge, this is the first study to determine the minimally important difference for Likert-type ratings of research article quality, or in fact any nonclinical scientific assessment variable. Our findings may be useful for future research assessing whether changes to the peer review process produce clinically meaningful differences in editorial decisionmaking. Copyright © 2017 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
Ethical concerns of nursing reviewers: an international survey.
Broome, Marion; Dougherty, Molly C; Freda, Margaret C; Kearney, Margaret H; Baggs, Judith G
2010-11-01
Editors of scientific literature rely heavily on peer reviewers to evaluate the integrity of research conduct and validity of findings in manuscript submissions. The purpose of this study was to describe the ethical concerns of reviewers of nursing journals. This descriptive cross-sectional study was an anonymous online survey. The findings reported here were part of a larger investigation of experiences of reviewers. Fifty-two editors of nursing journals (six outside the USA) agreed to invite their review panels to participate. A 69-item forced-choice and open-ended survey developed by the authors based on the literature was pilot tested with 18 reviewers before being entered into SurveyMonkey(TM). A total of 1675 reviewers responded with useable surveys. Six questions elicited responses about ethical issues, such as conflict of interest, protection of human research participants, plagiarism, duplicate publication, misrepresentation of data and 'other'. The reviewers indicated whether they had experienced such a concern and notified the editor, and how satisfied they were with the outcome. They provided specific examples. Approximately 20% of the reviewers had experienced various ethical dilemmas. Although the majority reported their concerns to the editor, not all did so, and not all were satisfied with the outcomes. The most commonly reported concern perceived was inadequate protection of human participants. The least common was plagiarism, but this was most often reported to the editor and least often led to a satisfactory outcome. Qualitative responses at the end of the survey indicate this lack of satisfaction was most commonly related to feedback provided on resolution by the editor. The findings from this study suggest several areas that editors should note, including follow up with reviewers when they identify ethical concerns about a manuscript.
Joint Force Quarterly. Number 14, Winter 1996-97
1997-03-01
…of the Joint Chiefs of Staff by the Institute for National Strategic Studies, National Defense University, to promote understanding of the integrated…4219; e-mail: JFQ1@ndu.edu; Internet: http://www.dtic.mil/doctrine. Hans Binnendijk, Director, Institute for National Strategic Studies, Editor-in-Chief…Consulting Editor; Calvin B. Kelley, Copy Editor. ISSN 1070-0692. March 1997…competitors or new global powers
Reyes, Humberto B
2014-01-01
The International Committee of Medical Journal Editors is a leading independent institution providing guidance for the report of biomedical research and health related topics in medical journals. Established in 1978, it is currently constituted by editors of fourteen general medical journals from different countries, plus one representative for the US National Library of Medicine and one representative for the World Association of Biomedical Journal Editors. Since 1978 the Committee provides a document, originally named "Uniform Requirements…", "to help authors, editors, and others involved in peer review and biomedical publishing create and distribute accurate, clear, unbiased medical journal articles". This document has been updated several times and the last version was released in August 2013, now renamed "Recommendations for the Conduct, Reporting, Editing, and Publication of Scholarly Work in Medical Journals", available in www.icmje.org and citable as "ICMJE Recommendations". A vast proportion of medical journals, worldwide, have adopted these recommendations as rules. The ICMJE discusses and provides guidance on several relevant aspects including criteria on authorship, peer review, scientific misconduct, conflicts of interest, clinical trials registration, good editorial practices, the relations between editors and journal owners, the protection of individuals subject to medical research, the solvency of electronic publications, among others. The 2013 ICMJE Annual Meeting took place in Santiago, Chile, in November 4 and 5. The photograph shows attendees at the final session.
Payao: a community platform for SBML pathway model curation
Matsuoka, Yukiko; Ghosh, Samik; Kikuchi, Norihiro; Kitano, Hiroaki
2010-01-01
Summary: Payao is a community-based, collaborative web service platform for gene-regulatory and biochemical pathway model curation. The system combines Web 2.0 technologies and online model visualization functions to enable a collaborative community to annotate and curate biological models. Payao reads the models in Systems Biology Markup Language format, displays them with CellDesigner, a process diagram editor, which complies with the Systems Biology Graphical Notation, and provides an interface for model enrichment (adding tags and comments to the models) for the access-controlled community members. Availability and implementation: Freely available for model curation service at http://www.payaologue.org. Web site implemented in Seaser Framework 2.0 with S2Flex2, MySQL 5.0 and Tomcat 5.5, with all major browsers supported. Contact: kitano@sbi.jp PMID:20371497
Kamide reflects on JGR and the role of editor
NASA Astrophysics Data System (ADS)
Woods, Peter
After serving the space physics community for more than 11 years, Y. Kamide of the Solar-Terrestrial Environment Laboratory at Nagoya University in Toyokawa, Japan, retired as editor of the Journal of Geophysical Research-Space Physics for the Asian/Pacific region. He had been a JGR editor since AGU first opened two editorial offices in Europe and the Asian/Pacific region in 1989. Even as the initial JGR editor in Asia, Kamide was not new to AGU editorial business. Before accepting the JGR position, Kamide served 3 years as the editor in Japan for Geophysical Research Letters.According to Kamide, over the last 5 years, the number of high-quality submissions to JGR in the Asian/Pacific region has increased dramatically, by a factor of 2.5. This increase came mostly from the younger generation of scientists, which bodes well for the future of JGR and space physics in general. Together with the substantial contributions to JGR from the European community, this achievement has been recognized by AGU as proof that JGR is truly an international journal of the highest editorial standards.
ERIC Educational Resources Information Center
Culkin, John; Drexel, John
1981-01-01
Media education specialist John Culkin talks with editor John Drexel about learning to read in the television age--and discusses a new alphabet, UNIFON, that may help solve the literacy crisis. (Editor)
Riley, R W
1983-07-08
They are unalike and far apart, these 13 past editors of The Journal. Between Nathan S. Davis's first issue and William R. Barclay's retirement, there was almost a century of change in medicine, society, the American Medical Association, prose style, and editorial needs. During these years, the editors ranged from the brilliant organizers John B. Hamilton and George H. Simmons to the diligent John H. Hollister and the devoted Johnson F. Hammond. There were editors with the hot determination of James C. Culbertson, John H. Talbott, and Robert H. Moser, and there were those with the cool precision of Austin Smith and Hugh H. Hussey. They varied from Morris Fishbein, who wrote and spoke "with the grace of an eagle in its unhindered soar," to Truman W. Miller, who wrote scarcely a word. Here, briefly, they are together.
Benchmark study for total energy electrons in thick slabs
NASA Technical Reports Server (NTRS)
Jun, I.
2002-01-01
The total energy deposition profiles when high-energy electrons impinge on a thick slab of elemental aluminum, copper, and tungsten have been computed using representative Monte Carlo codes (NOVICE, TIGER, MCNP), and compared in this paper.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sweezy, Jeremy Ed
A photon next-event fluence estimator at a point has been implemented in the Monte Carlo Application Toolkit (MCATK). The next-event estimator provides an expected value estimator for the flux at a point due to all source and collision events. An advantage of the next-event estimator over track-length estimators, which are normally employed in MCATK, is that flux estimates can be made in locations that have no random walk particle tracks. The next-event estimator allows users to calculate radiographs and estimate response for detectors outside of the modeled geometry. The next-event estimator is not yet accessible through the MCATK FlatAPI for C and Fortran. The next-event estimator in MCATK has been tested against MCNP6 using 5 suites of test problems. No issues were found in the MCATK implementation. One issue was found in the exclusion radius approximation in MCNP6. The theory, implementation, and testing are described in this document.
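For orientation, the sketch below evaluates the textbook form of a next-event (point-detector) estimator, in which every source or collision event contributes an expected flux at the detector point attenuated along the straight-line path. It is an illustration of the estimator, not the MCATK implementation.

```python
# Textbook next-event (point-detector) contribution from one event.
import math

def point_detector_contribution(weight, p_mu, optical_depth, distance_cm):
    """
    weight        : particle weight at the event
    p_mu          : probability density (per unit cosine) of emission or
                    scattering toward the detector direction
    optical_depth : integral of the total cross section along the event-to-
                    detector path (mean free paths)
    distance_cm   : event-to-detector distance
    """
    return weight * p_mu * math.exp(-optical_depth) / (2.0 * math.pi * distance_cm ** 2)

# e.g. an isotropic emission (p_mu = 0.5) 10 cm from the detector point,
# through 1.2 mean free paths of material:
print(point_detector_contribution(1.0, 0.5, 1.2, 10.0))
```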
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soltz, R. A.; Danagoulian, A.; Sheets, S.
Theoretical calculations indicate that the value of the Feynman variance, Y2F, for the emitted distribution of neutrons from fissionable material exhibits a strong monotonic dependence on the multiplication, M, of a quantity of special nuclear material. In 2012 we performed a series of measurements at the Passport Inc. facility using a 9-MeV bremsstrahlung CW beam of photons incident on small quantities of uranium with liquid scintillator detectors. For the set of objects studied we observed deviations in the expected monotonic dependence, and these deviations were later confirmed by MCNP simulations. In this report, we modify the theory to account for the contribution from the initial photofission and benchmark the new theory with a series of MCNP simulations on DU, LEU, and HEU objects spanning a wide range of masses and multiplication values.
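For reference, the Feynman statistic in question is the excess variance-to-mean ratio of neutron counts collected in equal time gates; the sketch below computes it for synthetic Poisson data, for which it should be near zero, whereas fission chains in a multiplying object drive it positive.

```python
# Minimal sketch of the Feynman variance-to-mean statistic computed from
# neutron counts recorded in equal time gates.
import numpy as np

def feynman_y(counts_per_gate):
    """Excess variance-to-mean ratio: ~0 for Poisson (uncorrelated) counts,
    positive when fission chains correlate the counts."""
    c = np.asarray(counts_per_gate, dtype=float)
    return c.var(ddof=1) / c.mean() - 1.0

rng = np.random.default_rng(1)
poisson_counts = rng.poisson(5.0, size=10_000)
print(f"Y for Poisson counts ~ {feynman_y(poisson_counts):.3f}")
```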
Andrews, M. T.; Rising, M. E.; Meierbachtol, K.; ...
2018-06-15
When multiple neutrons are emitted in a fission event they are correlated in both energy and their relative angle, which may impact the design of safeguards equipment and other instrumentation for non-proliferation applications. The most recent release of MCNP 6.2 contains the capability to simulate correlated fission neutrons using the event generators CGMF and FREYA. These radiation transport simulations will be post-processed by the detector response code, DRiFT, and compared directly to correlated fission measurements. DRiFT has been previously compared to single detector measurements, and its capabilities have been recently expanded with correlated fission simulations in mind. Finally, this paper details updates to DRiFT specific to correlated fission measurements, including tracking source particle energy of all detector events (and non-events), expanded output formats, and digitizer waveform generation.
Spectral unfolding of fast neutron energy distributions
NASA Astrophysics Data System (ADS)
Mosby, Michelle; Jackman, Kevin; Engle, Jonathan
2015-10-01
The characterization of the energy distribution of a neutron flux is difficult in experiments with constrained geometry where techniques such as time of flight cannot be used to resolve the distribution. The measurement of neutron fluxes in reactors, which often present similar challenges, has been accomplished using radioactivation foils as an indirect probe. Spectral unfolding codes use statistical methods to adjust MCNP predictions of neutron energy distributions using quantified radioactive residuals produced in these foils. We have applied a modification of this established neutron flux characterization technique to experimentally characterize the neutron flux in the critical assemblies at the Nevada National Security Site (NNSS) and the spallation neutron flux at the Isotope Production Facility (IPF) at Los Alamos National Laboratory (LANL). Results of the unfolding procedure are presented and compared with a priori MCNP predictions, and the implications for measurements using the neutron fluxes at these facilities are discussed.
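As an illustration of the adjustment step, the sketch below implements a classic SAND-II-style iterative unfolding on a toy response matrix. The codes actually used at NNSS/IPF may differ, so treat the function and variable names as assumptions.

```python
import numpy as np

def sand_ii_unfold(response, a_meas, phi_prior, n_iter=50):
    """
    response  : (n_foils, n_groups) activation response matrix
    a_meas    : (n_foils,) measured reaction rates / activities
    phi_prior : (n_groups,) a-priori (e.g. MCNP-predicted) group fluxes
    """
    phi = np.array(phi_prior, dtype=float)
    for _ in range(n_iter):
        a_calc = response @ phi                        # activities predicted by current flux
        w = response * phi / a_calc[:, None]           # fractional contribution of each group
        denom = np.clip(w.sum(axis=0), 1e-30, None)    # guard groups with no response
        log_corr = (w * np.log(a_meas / a_calc)[:, None]).sum(axis=0) / denom
        phi *= np.exp(log_corr)                        # multiplicative spectrum adjustment
    return phi

# toy problem: 2 foils, 3 energy groups (under-determined, as unfolding usually is)
R = np.array([[1.0, 0.5, 0.1],
              [0.1, 0.4, 1.0]])
phi_true = np.array([2.0, 1.0, 0.5])
A = R @ phi_true
print(sand_ii_unfold(R, A, np.ones(3)))
```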
NASA Astrophysics Data System (ADS)
Sboev, A. G.; Ilyashenko, A. S.; Vetrova, O. A.
1997-02-01
The method of buckling evaluation, realized in the Monte Carlo code MCS, is described. This method was applied to the calculational analysis of the well known light water experiments TRX-1 and TRX-2. The analysis of this comparison shows that there is no coincidence between Monte Carlo calculations obtained in different ways: the MCS calculations with given experimental bucklings; the MCS calculations with bucklings evaluated on the basis of full-core MCS direct simulations; the full-core MCNP and MCS direct simulations; and the MCNP and MCS calculations where the results of cell calculations are corrected by coefficients taking into account the leakage from the core. Also, the buckling values evaluated by full-core MCS calculations differed from the experimental ones, especially in the case of TRX-1, where this difference corresponded to a 0.5 percent increase in the Keff value.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bily, T.
Thermoluminescent dosimeters represent a very useful tool for measuring gamma field parameters at nuclear research reactors, especially at zero power ones. 7LiF:Mg,Ti and 7LiF:Mg,Cu,P type TL dosimeters enable determination of only the gamma component in a mixed neutron-gamma field. At the VR-1 reactor operated within the Faculty of Nuclear Sciences and Physical Engineering at the Czech Technical University in Prague, the integral characteristics of the gamma-ray field were investigated, especially its spatial distribution and time behaviour, i.e., the influence of non-saturated delayed gamma-ray emission. Measured spatial distributions were compared with Monte Carlo code MCNP5 calculations. Although MCNP cannot generate delayed gamma rays from fission, the relative gamma dose rate distribution is within ±15% of the measured values. The experiments were carried out with core configuration C1 consisting of LEU fuel IRT-4M (19.7%). (author)
Comparison of Tungsten and Molybdenum Based Emitters for Advanced Thermionic Space Nuclear Reactors
NASA Astrophysics Data System (ADS)
Lee, Hsing H.; Dickinson, Jeffrey W.; Klein, Andrew C.; Lamp, Thomas R.
1994-07-01
Variations to the Advanced Thermionic Initiative thermionic fuel element are analyzed. Analysis included neutronic modeling with MCNP for criticality determination and thermal power distribution, and thermionic performance modeling with TFEHX. Changes to the original ATI configuration include the addition of W-HfC wire to the emitter for high temperature creep resistance improvement and substitution of molybdenum for the tungsten base material. Results from MCNP showed that all the tungsten used in the coating and base material must be 100% W-184 to obtain criticality. The presence of molybdenum in the emitter base affects the neutronic performance of the TFE by increasing the emitter neutron absorption cross section. Due to the reduced thermal conductivity for the molybdenum based emitter, a higher temperature is obtained resulting in a greater electrical power production. The thermal conductivity and resistivity of the composite emitter region were derived for the W-Mo composite and used in TFEHX.
Gas Core Reactor Numerical Simulation Using a Coupled MHD-MCNP Model
NASA Technical Reports Server (NTRS)
Kazeminezhad, F.; Anghaie, S.
2008-01-01
Analysis is provided in this report of using two head-on magnetohydrodynamic (MHD) shocks to achieve supercritical nuclear fission in an axially elongated cylinder filled with UF4 gas as an energy source for deep space missions. The motivation for each aspect of the design is explained and supported by theory and numerical simulations. A subsequent report will provide detail on relevant experimental work to validate the concept. Here the focus is on the theory of and simulations for the proposed gas core reactor conceptual design from the onset of shock generations to the supercritical state achieved when the shocks collide. The MHD model is coupled to a standard nuclear code (MCNP) to observe the neutron flux and fission power attributed to the supercritical state brought about by the shock collisions. Throughout the modeling, realistic parameters are used for the initial ambient gaseous state and currents to ensure a resulting supercritical state upon shock collisions.
Cinelli, Giorgia; Tositti, Laura; Mostacci, Domiziano; Baré, Jonathan
2016-05-01
In view of assessing natural radioactivity with on-site quantitative gamma spectrometry, efficiency calibration of NaI(Tl) detectors is investigated. A calibration based on Monte Carlo simulation of detector response is proposed, to render reliable quantitative analysis practicable in field campaigns. The method is developed with reference to contact geometry, in which measurements are taken placing the NaI(Tl) probe directly against the solid source to be analyzed. The Monte Carlo code used for the simulations was MCNP. Experimental verification of the calibration goodness is obtained by comparison with appropriate standards, as reported. On-site measurements yield a quick quantitative assessment of natural radioactivity levels present ((40)K, (238)U and (232)Th). On-site gamma spectrometry can prove particularly useful insofar as it provides information on materials from which samples cannot be taken. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
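The quantity being calibrated is essentially the full-energy-peak efficiency, the fraction of emitted photons that end up in the photopeak; the sketch below shows the basic bookkeeping with made-up numbers (the 10.7% gamma yield is the nominal 40K branching, used here purely as an example).

```python
# Minimal sketch: full-energy-peak efficiency from a calibration measurement,
# the quantity a Monte Carlo model of the NaI(Tl) probe is asked to reproduce.
def fep_efficiency(net_peak_counts, activity_bq, live_time_s, gamma_yield):
    """Counts in the full-energy peak divided by the number of photons emitted."""
    emitted = activity_bq * live_time_s * gamma_yield
    return net_peak_counts / emitted

# e.g. 12,000 net counts from a 5 kBq source counted for 600 s at 10.7% yield (40K)
print(fep_efficiency(12_000, 5_000, 600, 0.107))
```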
Bahreyni Toossi, Mohammad Taghi; Momennezhad, Mehdi; Hashemi, Seyed Mohammad
2012-01-01
Aim Exact knowledge of dosimetric parameters is an essential pre-requisite of an effective treatment in radiotherapy. In order to fulfill this consideration, different techniques have been used, one of which is Monte Carlo simulation. Materials and methods This study used the MCNP-4C code to simulate electron beams from the Neptun 10 PC medical linear accelerator. Output factors for 6, 8 and 10 MeV electrons applied to eleven different conventional fields were both measured and calculated. Results The measurements were carried out by a Wellhofler-Scanditronix dose scanning system. Our findings revealed that output factors acquired by MCNP-4C simulation and the corresponding values obtained by direct measurements are in very good agreement. Conclusion In general, the very good consistency of simulated and measured results is good proof that the goal of this work has been accomplished. PMID:24377010
MCNP study for epithermal neutron irradiation of an isolated liver at the Finnish BNCT facility.
Kotiluoto, P; Auterinen, I
2004-11-01
A successful boron neutron capture treatment (BNCT) of a patient with multiple liver metastases was first given in Italy, by placing the removed organ into the thermal neutron column of the Triga research reactor of the University of Pavia. In Finland, the FiR 1 Triga reactor with an epithermal neutron beam well suited for BNCT has been extensively used to irradiate patients with brain tumors such as glioblastoma and recently also head and neck tumors. In this work we have studied by MCNP Monte Carlo simulations whether it would be beneficial to treat an isolated liver with epithermal neutrons instead of thermal ones. The results show that the epithermal field penetrates deeper into the liver and creates a build-up distribution of the boron dose. Our results strongly encourage further study of the irradiation arrangement of an isolated liver with epithermal neutron fields.
Image enhancement using MCNP5 code and MATLAB in neutron radiography.
Tharwat, Montaser; Mohamed, Nader; Mongy, T
2014-07-01
This work presents a method that can be used to enhance the neutron radiography (NR) image for objects containing highly scattering materials like hydrogen, carbon and other light materials. The method uses the Monte Carlo code MCNP5 to simulate the NR process, obtain the flux distribution for each pixel of the image, and determine the scattered-neutron distribution that causes image blur, and then uses MATLAB to subtract this scattered-neutron distribution from the initial image to improve its quality. This work was performed before the commissioning of the digital NR system in January 2013. The MATLAB enhancement method is quite a good technique in the case of static film-based neutron radiography, while for the neutron imaging (NI) technique, image enhancement and quantitative measurement were performed efficiently using ImageJ software. The enhanced image quality and quantitative measurements are presented in this work. Copyright © 2014 Elsevier Ltd. All rights reserved.
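A minimal sketch of the correction described: the simulated scattered-neutron image is subtracted from the measured radiograph pixel by pixel and the result rescaled. The array contents are placeholders, and whether the real workflow subtracts absolute counts or a scattered fraction is an assumption here.

```python
import numpy as np

def remove_scatter(measured, scattered):
    """Subtract the simulated scattered component (same pixel grid), clip
    negatives, and rescale to the [0, 1] display range."""
    corrected = np.clip(np.asarray(measured, float) - np.asarray(scattered, float),
                        0.0, None)
    return corrected / corrected.max()

raw = np.array([[100., 120.], [90., 110.]])    # measured radiograph (counts)
scat = np.array([[30., 42.], [22., 44.]])      # MCNP5-estimated scattered counts
print(remove_scatter(raw, scat))
```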
NASA Astrophysics Data System (ADS)
Ceccolini, E.; Gerardy, I.; Ródenas, J.; van Dycke, M.; Gallardo, S.; Mostacci, D.
Brachytherapy is an advanced cancer treatment that is minimally invasive, minimising radiation exposure to the surrounding healthy tissues. Microselectron© Nucletron devices with 192Ir source can be used for gynaecological brachytherapy, in patients with vaginal or uterine cancer. Measurements of isodose curves have been performed in a PMMA phantom and compared with Monte Carlo calculations and TPS (Plato software of Nucletron BPS 14.2) evaluation. The isodose measurements have been performed with radiochromic films (Gafchromic EBT©). The dose matrix has been obtained after digitalisation and use of a dose calibration curve obtained with a 6 MV photon beam provided by a medical linear accelerator. A comparison between the calculated and the measured matrix has been performed. The calculated dose matrix is obtained with a simulation using the MCNP5 Monte Carlo code (F4MESH tally).
NASA Astrophysics Data System (ADS)
Ardiyati, Tanti; Rozali, Bang; Kasmudin
2018-02-01
An analysis of radiation penetration through the U-shaped joints of the cast concrete shielding in BATAN's multipurpose gamma irradiator has been carried out. The analysis was performed by calculating the radiation penetration through the U-shaped joints of the concrete shielding using the MCNP computer code. The U-shaped joint is a new design in massive concrete construction in Indonesia and, in its actual application, is joined by a bonding agent. In the MCNP simulation model, eight detectors were located close to the observed irradiation room walls of the concrete shielding. The simulation results indicated that the radiation levels outside the concrete shielding were less than the permissible limit of 2.5 μSv/h, so that workers could safely access the electrical room, control room, water treatment facility and the area outside the irradiation room. The radiation penetration decreased as the density of the material increased.
1988-09-01
analysis phase of the software life cycle (16:1-1). While editing a SADT diagram, the tool should be able to check whether or not structured analysis…diagrams are valid for the SADT's syntax, produce error messages, do error recovery, and perform editing suggestions. Thus, this tool must have the…directed editors are editors which use the syntax of the programming language while editing a program. While text editors treat programs as text, syntax
Guerrilla Violence in Colombia: Examining Causes and Consequences
1994-06-01
…17 million at the end of 1931. The country along with most of…should no longer be a source of support for the masses…Gaitanism, the populist social movement led by Gaitán… (Footnote citations: Pécaut, Daniel, Orden y Violencia: Colombia 1930-1954, Siglo Veintiuno Editores, 1987, pp. 353, 362.)
STARLSE -- Starlink Extensions to the VAX Language Sensitive Editor
NASA Astrophysics Data System (ADS)
Warren-Smith, R. F.
STARLSE is a ``Starlink Sensitive'' editor based on the VAX Language Sensitive Editor (LSE). It exploits the extensibility of LSE to provide additional features which assist in the writing of portable Fortran 77 software with a standard Starlink style. STARLSE is intended mainly for use by those writing ADAM applications and subroutine libraries for distribution as part of the Starlink Software Collection, although it may also be suitable for other software projects. It is designed to integrate with the SST (Simple Software Tools) package.
Medical Department, United States Army. Surgery in World War 2. Neurosurgery. Volume 2
1959-01-01
that there was no obstruction distal to the opening. They might close spontaneously in either area. If the fistula was small and spontaneous closure did…States Army. Editor in Chief: Colonel JOHN BOYD COATES, Jr., MC. Editors for Neurosurgery: R. GLEN SPURLING, M.D.; BARNES WOODHALL, M.D. Associate Editor…M.D.; M. ELAOT RANDOLPH, M.D.; STERLING BUNNELL, M.D. (dec.); ISIDOR S. RAVDIN, M.D.; NORTON CANFIELD, M.D.; ALFRED R. SHANDS, Jr., M.D.; B. NOLAND CARTER
An editor for pathway drawing and data visualization in the Biopathways Workbench.
Byrnes, Robert W; Cotter, Dawn; Maer, Andreia; Li, Joshua; Nadeau, David; Subramaniam, Shankar
2009-10-02
Pathway models serve as the basis for much of systems biology. They are often built using programs designed for the purpose. Constructing new models generally requires simultaneous access to experimental data of diverse types, to databases of well-characterized biological compounds and molecular intermediates, and to reference model pathways. However, few if any software applications provide all such capabilities within a single user interface. The Pathway Editor is a program written in the Java programming language that allows de-novo pathway creation and downloading of LIPID MAPS (Lipid Metabolites and Pathways Strategy) and KEGG lipid metabolic pathways, and of measured time-dependent changes to lipid components of metabolism. Accessed through Java Web Start, the program downloads pathways from the LIPID MAPS Pathway database (Pathway) as well as from the LIPID MAPS web server http://www.lipidmaps.org. Data arises from metabolomic (lipidomic), microarray, and protein array experiments performed by the LIPID MAPS consortium of laboratories and is arranged by experiment. Facility is provided to create, connect, and annotate nodes and processes on a drawing panel with reference to database objects and time course data. Node and interaction layout as well as data display may be configured in pathway diagrams as desired. Users may extend diagrams, and may also read and write data and non-lipidomic KEGG pathways to and from files. Pathway diagrams in XML format, containing database identifiers referencing specific compounds and experiments, can be saved to a local file for subsequent use. The program is built upon a library of classes, referred to as the Biopathways Workbench, that convert between different file formats and database objects. An example of this feature is provided in the form of read/construct/write access to models in SBML (Systems Biology Markup Language) contained in the local file system. Inclusion of access to multiple experimental data types and of pathway diagrams within a single interface, automatic updating through connectivity to an online database, and a focus on annotation, including reference to standardized lipid nomenclature as well as common lipid names, supports the view that the Pathway Editor represents a significant, practicable contribution to current pathway modeling tools.
ERIC Educational Resources Information Center
Hoffert, Barbara; Heilbrun, Margaret; Kuzyk, Raya; Kim, Ann; McCormack, Heather; Katterjohn, Anna; Burns, Ann; Williams, Wilda
2008-01-01
From the fall's cascade of great new books, "Library Journal's" editors select their favorites--a dark rendition of Afghan life, a look at the "self-esteem trap," a celebration of Brooklyn activism, and much more.
Thompson, L.M.; Van Manen, F.T.; Schlarbaum, S.E.; DePoy, M.
2006-01-01
Incorporation of disease resistance is nearly complete for several important North American hardwood species threatened by exotic fungal diseases. The next important step toward species restoration would be to develop reliable tools to delineate ideal restoration sites on a landscape scale. We integrated spatial modeling and remote sensing techniques to delineate potential restoration sites for Butternut (Juglans cinerea L.) trees, a hardwood species being decimated by an exotic fungus, in Mammoth Cave National Park (MCNP), Kentucky. We first developed a multivariate habitat model to determine optimum Butternut habitats within MCNP. Habitat characteristics of 54 known Butternut locations were used in combination with eight topographic and land use data layers to calculate an index of habitat suitability based on Mahalanobis distance (D2). We used a bootstrapping technique to test the reliability of model predictions. Based on a threshold value for the D2 statistic, 75.9% of the Butternut locations were correctly classified, indicating that the habitat model performed well. Because Butternut seedlings require extensive amounts of sunlight to become established, we used canopy cover data to refine our delineation of favorable areas for Butternut restoration. Areas with the most favorable conditions to establish Butternut seedlings were limited to 291.6 ha. Our study provides a useful reference on the amount and location of favorable Butternut habitat in MCNP and can be used to identify priority areas for future Butternut restoration. Given the availability of relevant habitat layers and accurate location records, our approach can be applied to other tree species and areas. ?? 2006 Society for Ecological Restoration International.
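The habitat index described is the Mahalanobis distance of each map pixel's habitat variables from the multivariate mean of the known Butternut locations. The sketch below computes D² with synthetic data: three variables stand in for the study's eight topographic and land-use layers.

```python
# Minimal sketch of a Mahalanobis-distance habitat index: small D^2 means the
# pixel's habitat resembles the places where the species is already found.
import numpy as np

def mahalanobis_d2(candidates, reference):
    """
    candidates : (n_pixels, n_vars) habitat variables for map pixels
    reference  : (n_sites, n_vars) the same variables at known occurrences
    """
    mu = reference.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))
    diff = candidates - mu
    return np.einsum('ij,jk,ik->i', diff, cov_inv, diff)   # D^2 per pixel

rng = np.random.default_rng(2)
ref = rng.normal(size=(54, 3))        # 54 known locations, 3 toy variables
pix = rng.normal(size=(5, 3))         # 5 candidate pixels
print(mahalanobis_d2(pix, ref))
```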
Extensions of the MCNP5 and TRIPOLI4 Monte Carlo Codes for Transient Reactor Analysis
NASA Astrophysics Data System (ADS)
Hoogenboom, J. Eduard; Sjenitzer, Bart L.
2014-06-01
To simulate reactor transients for safety analysis with the Monte Carlo method, the generation and decay of delayed neutron precursors is implemented in the MCNP5 and TRIPOLI4 general purpose Monte Carlo codes. Important new variance reduction techniques like forced decay of precursors in each time interval and the branchless collision method are included to obtain reasonable statistics for the power production per time interval. For simulation of practical reactor transients the feedback effect from the thermal-hydraulics must also be included. This requires coupling of the Monte Carlo code with a thermal-hydraulics (TH) code, providing the temperature distribution in the reactor, which affects the neutron transport via the cross section data. The TH code also provides the coolant density distribution in the reactor, directly influencing the neutron transport. Different techniques for this coupling are discussed. As a demonstration a 3x3 mini fuel assembly with a moving control rod is considered for MCNP5 and a mini core consisting of 3x3 PWR fuel assemblies with control rods and burnable poisons for TRIPOLI4. Results are shown for reactor transients due to control rod movement or withdrawal. The TRIPOLI4 transient calculation is started at low power and includes thermal-hydraulic feedback. The power rises about 10 decades and finally stabilises the reactor power at a much higher level than the initial one. The examples demonstrate that the modified Monte Carlo codes are capable of performing correct transient calculations, taking into account all geometrical and cross section detail.
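One unbiased way to force precursor decay in every time interval is sketched below: a delayed neutron is always emitted with weight reduced by the interval decay probability, while the precursor survives with the complementary weight. This is a hedged illustration of the idea, not necessarily the exact scheme implemented in the modified MCNP5/TRIPOLI4.

```python
import math, random

def force_decay(precursor_weight, decay_const, t_start, dt):
    """Force one delayed-neutron emission in [t_start, t_start + dt]."""
    p_decay = 1.0 - math.exp(-decay_const * dt)
    # decay time sampled from the conditional pdf on the interval
    u = random.random()
    t_decay = t_start - math.log(1.0 - u * p_decay) / decay_const
    delayed_neutron = (precursor_weight * p_decay, t_decay)     # (weight, time)
    surviving_precursor = precursor_weight * math.exp(-decay_const * dt)
    return delayed_neutron, surviving_precursor

random.seed(0)
print(force_decay(precursor_weight=1.0, decay_const=0.08, t_start=0.0, dt=0.5))
```

Because the emitted weight and the surviving weight reproduce the analog expectations exactly, the forcing changes only the variance, not the mean of the tallies.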
NASA Astrophysics Data System (ADS)
Liu, Tianyu; Wolfe, Noah; Lin, Hui; Zieb, Kris; Ji, Wei; Caracappa, Peter; Carothers, Christopher; Xu, X. George
2017-09-01
This paper contains two parts revolving around Monte Carlo transport simulation on Intel Many Integrated Core coprocessors (MIC, also known as Xeon Phi). (1) MCNP 6.1 was recompiled into multithreading (OpenMP) and multiprocessing (MPI) forms respectively without modification to the source code. The new codes were tested on a 60-core 5110P MIC. The test case was FS7ONNi, a radiation shielding problem used in MCNP's verification and validation suite. It was observed that both codes became slower on the MIC than on a 6-core X5650 CPU, by a factor of 4 for the MPI code and, abnormally, 20 for the OpenMP code, and both exhibited limited capability of strong scaling. (2) We have recently added a Constructive Solid Geometry (CSG) module to our ARCHER code to provide better support for geometry modelling in radiation shielding simulation. The functions of this module are frequently called in the particle random walk process. To identify the performance bottleneck we developed a CSG proxy application and profiled the code using the geometry data from FS7ONNi. The profiling data showed that the code was primarily memory latency bound on the MIC. This study suggests that, despite the low initial porting effort, Monte Carlo codes do not naturally lend themselves to the MIC platform, just as they do not to GPUs, and that the memory latency problem needs to be addressed in order to achieve a decent performance gain.
NASA Astrophysics Data System (ADS)
Kouznetsov, A.; Cully, C. M.; Knudsen, D. J.
2016-12-01
Changes in D-region ionization caused by energetic particle precipitation are monitored by the Array for Broadband Observations of VLF/ELF Emissions (ABOVE), a network of receivers deployed across Western Canada. The observed amplitudes and phases of subionospherically propagating VLF signals from distant artificial transmitters depend sensitively on the free electron population created by precipitation of energetic charged particles. These include both primary (electrons, protons and heavier ions) and secondary (cascades of ionized particles and electromagnetic radiation) components. We have designed and implemented a full-scale model to predict the received VLF signals based on first-principle charged particle transport calculations coupled to the Long Wavelength Propagation Capability (LWPC) software. Calculations of ionization rates and free electron densities are based on MCNP-6 (a general-purpose Monte Carlo N-Particle code), taking advantage of its capability for coupled neutron/photon/electron transport and its novel library of cross-sections for low-energy electron and photon interactions with matter. Cosmic ray calculations of background ionization are based on source spectra obtained both from PAMELA direct cosmic-ray spectrum measurements and from the recently implemented MCNP 6 galactic cosmic-ray source, scaled using our (Calgary) neutron monitor measurement results. Conversion from calculated fluxes (MCNP F4 tallies) to ionization rates for low-energy electrons is based on the total ionization cross-sections for oxygen and nitrogen molecules from the National Institute of Standards and Technology. We use our model to explore the complexity of the physical processes affecting VLF propagation.
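The flux-to-ionisation-rate conversion described amounts to folding the calculated group fluxes with the molecular ionisation cross sections and number densities. The sketch below shows that folding with placeholder values rather than the NIST cross-section data or real MCNP tallies.

```python
import numpy as np

def ionization_rate(group_flux, cross_sections, number_densities):
    """
    group_flux       : (n_groups,) particle flux per group [cm^-2 s^-1]
    cross_sections   : dict species -> (n_groups,) ionisation cross section [cm^2]
    number_densities : dict species -> number density [cm^-3]
    returns ion-pair production rate [cm^-3 s^-1]
    """
    return sum(number_densities[s] * np.dot(group_flux, cross_sections[s])
               for s in cross_sections)

# placeholder 3-group example for N2 and O2 at some D-region altitude
flux = np.array([1e3, 5e2, 1e2])
xs = {"N2": np.array([1.0e-16, 2.0e-16, 1.5e-16]),
      "O2": np.array([1.2e-16, 2.2e-16, 1.6e-16])}
n = {"N2": 1.5e14, "O2": 4.0e13}
print(f"{ionization_rate(flux, xs, n):.3e} ion pairs / cm^3 / s")
```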
SOIL - A new open access journal of the European Geosciences Union
NASA Astrophysics Data System (ADS)
Brevik, Eric; Mataix-Solera, Jorge; Pereg, Lily; Quinton, John; Six, Johan; Van Oost, Kristof; Cerdà, Artemi
2014-05-01
The Soil System Sciences (SSS) division of the EGU has been a strong and growing international research force in the last few years. Since the first EGU meeting with SSS participation in 2004 where 200 abstracts were presented in 7 sessions, the contribution of the SSS division has grown considerably, with 1,427 abstracts presented in 57 SSS sessions at the 2013 EGU General Assembly. After 10 years of active participation, the SSS Division has developed a new open access journal, SOIL, which will serve the whole EGU membership. SOIL intends to publish scientific research that will contribute to understanding the Soil System and its interaction with humans and the entire Earth System. The scope of the journal will include all topics that fall within the study of soil science as a discipline, with an emphasis on studies that integrate soil science with other sciences (Soils and plants, Soils and water, Soils and atmosphere, Soils and biogeochemical cycling, Soils and the natural environment, Soils and the human environment, Soils and food security, Soils and biodiversity, Soils and global change, Soils and health, Soil as a resource, Soil systems, Soil degradation (chemical, physical and biological), Soil protection and remediation (including soil monitoring), Soils and methodologies). Manuscript types considered for publication in SOIL are original research articles, review articles, short communications, forum articles, and letters to the editors. SOIL will also publish up to two special issues on thematic subjects per year and encourages conveners of innovative sessions at the EGU meeting to submit proposals for special issues to the executive editor who oversees special issues. As with other EGU journals, SOIL has a two-stage publication process. In the first stage, papers that pass a rapid access-review by one of the editors will immediately be published in SOIL Discussions (SOIL-D). Papers will then be subject to interactive public discussion, during which the referees' comments (anonymous or attributed), additional short comments by other members of the scientific community (attributed), and the author's replies will also be published in SOIL-D. In the second stage, a peer-review and revision process is completed and, if accepted, finalized papers are published in SOIL. To ensure publication precedence for authors, and to provide a lasting record of scientific discussion, SOIL-D and SOIL are both ISSN-registered, permanently archived, and fully citable. SOIL has a team of five executive editors who work together to oversee the running of the journal. Those executive editors, and their areas of primary oversight, are Eric Brevik (Review Article Editor), Jorge Mataix-Solera (Special Issues Editor), John Quinton (Awards and Recognitions Editor), Johan Six (Managing Editor), and Kristof Van Oost (Forum Article Editor). SOIL also has 46 associate editors. Manuscripts can be submitted to SOIL at the journal's website (http://www.soil-journal.net/home.html) beginning in May 2014. The first issue will be published January of 2015. Publication fees will be waived for the first two years of publication.
Han, Min Cheol; Yeom, Yeon Soo; Lee, Hyun Su; Shin, Bangho; Kim, Chan Hyeong; Furuta, Takuya
2018-05-04
In this study, the multi-threading performance of the Geant4, MCNP6, and PHITS codes was evaluated as a function of the number of threads (N) and the complexity of the tetrahedral-mesh phantom. Three tetrahedral-mesh phantoms of varying complexity (simple, moderately complex, and highly complex) were prepared and implemented in the three Monte Carlo codes for photon and neutron transport simulations. Subsequently, for each case, the initialization time, calculation time, and memory usage were measured as a function of the number of threads used in the simulation. It was found that for all codes, the initialization time increased significantly with the complexity of the phantom, but not with the number of threads. Geant4 exhibited a much longer initialization time than the other codes, especially for the complex phantom (MRCP). The improvement in computation speed due to the use of a multi-threaded code was quantified as the speed-up factor, the ratio of the computation speed of the multi-threaded code to that of the single-threaded code. Geant4 showed the best multi-threading performance among the codes considered in this study, with the speed-up factor increasing almost linearly with the number of threads and reaching ~30 when N = 40. PHITS and MCNP6 showed a much smaller increase of the speed-up factor with the number of threads. For PHITS, the speed-up factors remained low even when N = 40. For MCNP6, the increase of the speed-up factor was larger, but it was still less than ~10 when N = 40. As for memory usage, Geant4 was found to use more memory than the other codes. In addition, the memory usage of Geant4 increased more rapidly with the number of threads than that of the other codes, reaching as high as ~74 GB when N = 40 for the complex phantom (MRCP). Notably, the memory usage of PHITS was much lower than that of the other codes, regardless of the complexity of the phantom and the number of threads, and hardly increased with the number of threads even for the MRCP.
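For reference, the speed-up factor used above reduces to the ratio of the single-threaded calculation time to the N-threaded calculation time; the sketch below uses illustrative timings, not the measured values from the study.

    # Speed-up factor S(N) = (calculation time with 1 thread) / (calculation time with N threads).
    def speedup(t_single: float, t_multi: float) -> float:
        return t_single / t_multi

    # Illustrative timings in seconds (placeholders, not the study's measurements).
    timings = {1: 4000.0, 8: 560.0, 16: 300.0, 40: 140.0}
    for n_threads, t in timings.items():
        s = speedup(timings[1], t)
        # Parallel efficiency S(N)/N shows how far the scaling is from ideal (1.0).
        print(f"N = {n_threads:2d}: speed-up = {s:5.1f}, efficiency = {s / n_threads:.2f}")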
SIRE: A Simple Interactive Rule Editor for NICBES
NASA Technical Reports Server (NTRS)
Bykat, Alex
1988-01-01
To support the evolution of domain expertise, and its representation in an expert system knowledge base, a user-friendly rule base editor is mandatory. The Nickel Cadmium Battery Expert System (NICBES), a prototype expert system for the Hubble Space Telescope power storage management system, does not provide such an editor. In the following, a Simple Interactive Rule Base Editor (SIRE) for NICBES is described. SIRE provides a consistent internal representation of the NICBES knowledge base. It supports knowledge presentation and provides a user-friendly, code-language-independent medium for rule addition and modification. SIRE is integrated with NICBES via an interface module, which translates the internal representation into Prolog-style rules (Horn clauses), asserts those rules, and provides a simple mechanism for rule selection by the Prolog inference engine.
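As a toy illustration of the kind of translation the interface module performs, the sketch below maps a simple internal rule record to a Prolog-style Horn clause string; the dictionary layout, predicate names, and example rule are invented for illustration and do not reflect SIRE's actual internal representation.

    # Hypothetical internal rule: IF all conditions hold THEN the conclusion holds.
    rule = {
        "conclusion": ("battery_state", ["Battery", "overcharged"]),
        "conditions": [
            ("voltage", ["Battery", "high"]),
            ("temperature", ["Battery", "rising"]),
        ],
    }

    def to_horn_clause(rule: dict) -> str:
        """Render an internal rule record as a Prolog-style Horn clause string."""
        head_pred, head_args = rule["conclusion"]
        head = f"{head_pred}({', '.join(head_args)})"
        body = ", ".join(f"{pred}({', '.join(args)})" for pred, args in rule["conditions"])
        return f"{head} :- {body}."

    print(to_horn_clause(rule))
    # battery_state(Battery, overcharged) :- voltage(Battery, high), temperature(Battery, rising).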
Correction of β-thalassemia mutant by base editor in human embryos.
Liang, Puping; Ding, Chenhui; Sun, Hongwei; Xie, Xiaowei; Xu, Yanwen; Zhang, Xiya; Sun, Ying; Xiong, Yuanyan; Ma, Wenbin; Liu, Yongxiang; Wang, Yali; Fang, Jianpei; Liu, Dan; Songyang, Zhou; Zhou, Canquan; Huang, Junjiu
2017-11-01
β-Thalassemia is a global health issue caused by mutations in the HBB gene. Among these, the HBB -28 (A>G) mutation is one of the three most common mutations in patients with β-thalassemia in China and Southeast Asia. Correcting this mutation in human embryos may prevent the disease from being passed on to future generations and cure the associated anemia. Here we report the first study using the base editor (BE) system to correct a disease mutation in human embryos. First, we produced a 293T cell line carrying an exogenous HBB -28 (A>G) mutant fragment for evaluating gRNAs and targeting efficiency. We then collected primary skin fibroblasts from a β-thalassemia patient homozygous for the HBB -28 (A>G) mutation. The data showed that the base editor could precisely correct the HBB -28 (A>G) mutation in the patient's primary cells. To model embryos homozygous for the disease mutation, we constructed nuclear transfer embryos by fusing the lymphocytes or skin fibroblasts with enucleated in vitro matured (IVM) oocytes. Notably, the gene correction efficiency in these embryos was over 23.0% with the base editor. Although these embryos were still mosaic, the percentage of repaired blastomeres was over 20.0%. In addition, we found that base editor variants with a narrowed deamination window could promote precise G-to-A conversion at the HBB -28 site in human embryos. Collectively, this study demonstrates the feasibility of curing genetic disease in human somatic cells and embryos with the base editor system.
Harris, Alex H S; Reeder, Rachelle; Hyun, Jenny K
2009-10-01
Journal editors and statistical reviewers are often in the difficult position of catching serious problems in submitted manuscripts after the research is conducted and data have been analyzed. We sought to learn from editors and reviewers of major psychiatry journals what common statistical and design problems they most often find in submitted manuscripts and what they wished to communicate to authors regarding these issues. Our primary goal was to facilitate communication between journal editors/reviewers and researchers/authors and thereby improve the scientific and statistical quality of research and submitted manuscripts. Editors and statistical reviewers of 54 high-impact psychiatry journals were surveyed to learn what statistical or design problems they encounter most often in submitted manuscripts. Respondents completed the survey online. The authors analyzed survey text responses using content analysis procedures to identify major themes related to commonly encountered statistical or research design problems. Editors and reviewers (n=15) who handle manuscripts from 39 different high-impact psychiatry journals responded to the survey. The most commonly cited problems regarded failure to map statistical models onto research questions, improper handling of missing data, not controlling for multiple comparisons, not understanding the difference between equivalence and difference trials, and poor controls in quasi-experimental designs. The scientific quality of psychiatry research and submitted reports could be greatly improved if researchers became sensitive to, or sought consultation on frequently encountered methodological and analytic issues.
Bornmann, Lutz; Daniel, Hans-Dieter
2010-01-01
Background Ratings in journal peer review can be affected by sources of bias. The bias variable investigated here was the information on whether authors had suggested a possible reviewer for their manuscript, and whether the editor had taken up that suggestion or had chosen a reviewer that had not been suggested by the authors. Studies have shown that author-suggested reviewers rate manuscripts more favorably than editor-suggested reviewers do. Methodology/Principal Findings Reviewers' ratings on three evaluation criteria and the reviewers' final publication recommendations were available for 552 manuscripts (in total 1145 reviews) that were submitted to Atmospheric Chemistry and Physics, an interactive open access journal using public peer review (authors' and reviewers' comments are publicly exchanged). Public peer review is supposed to bring a new openness to the reviewing process that will enhance its objectivity. In the statistical analysis the quality of a manuscript was controlled for to prevent favorable reviewers' ratings from being attributable to quality instead of to the bias variable. Conclusions/Significance Our results agree with those from other studies that editor-suggested reviewers rated manuscripts between 30% and 42% less favorably than author-suggested reviewers. Against this backdrop journal editors should consider either doing without the use of author-suggested reviewers or, if they are used, bringing in more than one editor-suggested reviewer for the review process (so that the review by author-suggested reviewers can be put in perspective). PMID:20976226
Ethical dilemmas in scientific publication: pitfalls and solutions for editors.
Gollogly, Laragh; Momen, Hooman
2006-08-01
Editors of scientific journals need to be conversant with the mechanisms by which scientific misconduct is amplified by publication practices. This paper provides definitions, ways to document the extent of the problem, and examples of editorial attempts to counter fraud. Fabrication, falsification, duplication, ghost authorship, gift authorship, lack of ethics approval, non-disclosure, 'salami' publication, conflicts of interest, auto-citation, duplicate submission, duplicate publications, and plagiarism are common problems. Editorial misconduct includes failure to observe due process, undue delay in reaching decisions and communicating these to authors, inappropriate review procedures, and confounding a journal's content with its advertising or promotional potential. Editors also can be admonished by their peers for failure to investigate suspected misconduct, failure to retract when indicated, and failure to abide voluntarily by the six main sources of relevant international guidelines on research, its reporting and editorial practice. Editors are in a good position to promulgate reasonable standards of practice, and can start by using consensus guidelines on publication ethics to state explicitly how their journals function. Reviewers, editors, authors and readers all then have a better chance to understand, and abide by, the rules of publishing.
Perez-Riverol, Yasset; Wang, Rui; Hermjakob, Henning; Müller, Markus; Vesada, Vladimir; Vizcaíno, Juan Antonio
2014-01-01
Data processing, management and visualization are central and critical components of a state-of-the-art high-throughput mass spectrometry (MS)-based proteomics experiment, and are often some of the most time-consuming steps, especially for labs without much bioinformatics support. The growing interest in the field of proteomics has triggered an increase in the development of new software libraries, including freely available and open-source software. Even though the objectives of these libraries and packages, which range from database search analysis to post-processing of the identification results, can vary significantly, they usually share a number of features. Common use cases include the handling of protein and peptide sequences, the parsing of results from the output files of various proteomics search engines, and the visualization of MS-related information (including mass spectra and chromatograms). In this review, we provide an overview of the existing software libraries and open-source frameworks, and also give information on some of the freely available applications that make use of them. This article is part of a Special Issue entitled: Computational Proteomics in the Post-Identification Era. Guest Editors: Martin Eisenacher and Christian Stephan. PMID:23467006
Horatio Alger in the Newsroom: Social Origins of American Editors
ERIC Educational Resources Information Center
Hart, Jack R.
1976-01-01
Concludes that American newspaper editors of the late nineteenth and early twentieth centuries came from elite social backgrounds, which is contrary to the rags-to-riches image fostered by previous historians. (RB)
Genetics Home Reference: hemophilia
... Bird TD, Ledbetter N, Mefford HC, Smith RJH, Stephens K, editors. GeneReviews® [Internet]. Seattle (WA): University of ... Bird TD, Ledbetter N, Mefford HC, Smith RJH, Stephens K, editors. GeneReviews® [Internet]. Seattle (WA): University of ...
Genetics Home Reference: pontocerebellar hypoplasia
... Bird TD, Ledbetter N, Mefford HC, Smith RJH, Stephens K, editors. GeneReviews® [Internet]. Seattle (WA): University of ... Bird TD, Ledbetter N, Mefford HC, Smith RJH, Stephens K, editors. GeneReviews® [Internet]. Seattle (WA): University of ...
Profile: Institute of Society, Ethics and the Life Sciences
ERIC Educational Resources Information Center
Callahan, Daniel
1971-01-01
Describes an institute founded to examine moral, ethical, and legal issues raised by possibilities of euthanasia, genetic engineering, behavior control, population control, and improved disease control. Indicates scope of present research. (Editor/AL)
Genetics Home Reference: primary hyperoxaluria
... Bean LJH, Bird TD, Ledbetter N, Mefford HC, Smith RJH, Stephens K, editors. GeneReviews® [Internet]. Seattle (WA): ... Bean LJH, Bird TD, Ledbetter N, Mefford HC, Smith RJH, Stephens K, editors. GeneReviews® [Internet]. Seattle (WA): ...
Genetics Home Reference: oculocutaneous albinism
... Bean LJH, Bird TD, Ledbetter N, Mefford HC, Smith RJH, Stephens K, editors. GeneReviews® [Internet]. Seattle (WA): ... Bean LJH, Bird TD, Ledbetter N, Mefford HC, Smith RJH, Stephens K, editors. GeneReviews® [Internet]. Seattle (WA): ...
Genetics Home Reference: cutis laxa
... Bean LJH, Bird TD, Ledbetter N, Mefford HC, Smith RJH, Stephens K, editors. GeneReviews® [Internet]. Seattle (WA): ... Bean LJH, Bird TD, Ledbetter N, Mefford HC, Smith RJH, Stephens K, editors. GeneReviews® [Internet]. Seattle (WA): ...
Genetics Home Reference: galactosemia
... Bean LJH, Bird TD, Ledbetter N, Mefford HC, Smith RJH, Stephens K, editors. GeneReviews® [Internet]. Seattle (WA): ... Bean LJH, Bird TD, Ledbetter N, Mefford HC, Smith RJH, Stephens K, editors. GeneReviews® [Internet]. Seattle (WA): ...
LDRD Final Review: Radiation Transport Calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goorley, John Timothy; Morgan, George Lake; Lestone, John Paul
2017-06-22
Both high-fidelity and toy simulations are being used to understand measured signals and improve the Area 11 NDSE diagnostic. We continue to gain confidence in the ability of MCNP to simulate neutron and photon transport from source to radiation detector.
2002-06-13
KENNEDY SPACE CENTER, FLA. -- The 2002 Florida Press Association and Florida Society of Newspaper Editors Convention offers a panel on space. At the podium is Bob Stover, managing editor, Florida Today. Panel participants enjoying a laugh are (left to right) Craig Covault, senior editor, Aviation Week; Howard Benedict, retired AP reporter; JoAnn Morgan, director, External Relations and Business Development, Kennedy Space Center; and Marcia Dunn, AP reporter. The convention was held at the Debus Center, KSC Visitors Complex. Also speaking at the convention were Center Director Roy Bridges and NASA Associate Deputy Administrator Dr. Daniel Mulville.
Arthroscopy Journal Prizes Are Major Decisions.
Lubowitz, James H; Brand, Jefferson C; Provencher, Matthew T; Rossi, Michael J
2016-01-01
According to the Harvard Business Review, the optimal number of people in a decision-making group is no more than 8. Thus, it is no surprise that 18 Arthroscopy journal associate editors had difficulty making a major decision. In the end, 18 editors did successfully select the 2015 winner of the Best Comparative Study Prize. All studies have limitations, but from a statistical standpoint, the editors believe that the conclusions of the winning study are likely correct. Copyright © 2016 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.
Chew, Mabel; Villanueva, Elmer V; Van Der Weyden, Martin B
2007-01-01
Objective (1) To analyse trends in the journal impact factor (IF) of seven general medical journals (Ann Intern Med, BMJ, CMAJ, JAMA, Lancet, Med J Aust and N Engl J Med) over 12 years; and (2) to ascertain the views of these journals' past and present Editors on factors that had affected their journals' IFs during their tenure, including direct editorial policies. Design Retrospective analysis of IF data from ISI Web of Knowledge Journal Citation Reports—Science Edition, 1994 to 2005, and interviews with Editors-in-Chief. Setting Medical journal publishing. Participants Ten Editors-in-Chief of the journals, except Med J Aust, who served between 1999 and 2004. Main outcome measures IFs and component numerator and denominator data for the seven general medical journals (1994 to 2005) were collected. IFs are calculated using the formula: (Citations in year z to articles published in years x and y)/(Number of citable articles published in years x and y), where z is the current year and x and y are the previous two years. Editors' views on factors that had affected their journals' IFs were also obtained. Results IFs generally rose over the 12-year period, with the N Engl J Med having the highest IF throughout. However, percentage rises in IF relative to the baseline year of 1994 were greatest for CMAJ (about 500%) and JAMA (260%). Numerators for most journals tended to rise over this period, while denominators tended to be stable or to fall, although not always in a linear fashion. Nine of ten eligible editors were interviewed. Possible reasons given for rises in citation counts included: active recruitment of high-impact articles by courting researchers; offering authors better services; boosting the journal's media profile; more careful article selection; and increases in article citations. Most felt that going online had not affected citations. Most had no deliberate policy to publish fewer articles (lowering the IF denominator), which was sometimes the unintended result of other editorial policies. The two Editors who deliberately published fewer articles did so as they realized IFs were important to authors. Concerns about the accuracy of ISI counting for the IF denominator prompted some to routinely check their IF data with ISI. All Editors had mixed feelings about using IFs to evaluate journals and academics, and mentioned the tension between aiming to improve IFs and ‘keeping their constituents [clinicians] happy.’ Conclusions IFs of the journals studied rose in the 12-year period due to rising numerators and/or falling denominators, to varying extents. Journal Editors perceived that this occurred for various reasons, including deliberate editorial practices. The vulnerability of the IF to editorial manipulation and Editors' dissatisfaction with it as the sole measure of journal quality lend weight to the need for complementary measures. PMID:17339310
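The impact factor definition quoted above amounts to a single ratio; a minimal sketch follows, using clearly hypothetical citation and article counts rather than data for any of the journals studied.

    def impact_factor(citations_to_prev_two_years: int,
                      citable_items_prev_two_years: int) -> float:
        """IF for year z: citations in year z to items published in years x and y,
        divided by the number of citable items published in years x and y."""
        return citations_to_prev_two_years / citable_items_prev_two_years

    # Hypothetical example: 12,000 citations in year z to 500 citable articles
    # published in the two preceding years gives an IF of 24.0.
    print(impact_factor(12_000, 500))

The sketch also makes the two levers discussed by the Editors explicit: the numerator rises with citations attracted, while the denominator falls when fewer citable articles are published.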