McStas 1.1: a tool for building neutron Monte Carlo simulations
NASA Astrophysics Data System (ADS)
Lefmann, K.; Nielsen, K.; Tennant, A.; Lake, B.
2000-03-01
McStas is a project to develop general tools for the creation of simulations of neutron scattering experiments. In this paper, we briefly introduce McStas and describe a particular application of the program: the Monte Carlo calculation of the resolution function of a standard triple-axis neutron scattering instrument. The method compares well with the analytical calculations of Popovici.
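Conceptually, the Monte Carlo resolution calculation reduces to collecting the (Q, ω) coordinates and weights of the rays that reach the detector and characterizing their spread. A minimal numpy sketch of that final step (illustrative only, not McStas code; the numbers are made up):

```python
import numpy as np

def resolution_function(q_omega, weights):
    """Gaussian approximation of a triple-axis resolution function:
    weighted covariance of (Qx, Qy, Qz, hw) over detected MC rays,
    and its inverse (the resolution matrix)."""
    mean = np.average(q_omega, axis=0, weights=weights)
    d = q_omega - mean
    cov = (weights[:, None] * d).T @ d / weights.sum()
    return mean, cov, np.linalg.inv(cov)

# toy usage with synthetic rays around a nominal (Q, hw) setting
rng = np.random.default_rng(0)
rays = rng.multivariate_normal([1.5, 0.0, 0.0, 5.0],
                               np.diag([1e-4, 4e-4, 2e-4, 1e-2]), 10_000)
mean, cov, M = resolution_function(rays, np.ones(len(rays)))
```

An analytical result such as Popovici's can then be compared against the covariance matrix directly.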
Monte Carlo simulations of neutron-scattering instruments using McStas
NASA Astrophysics Data System (ADS)
Nielsen, K.; Lefmann, K.
2000-06-01
Monte Carlo simulations have become an essential tool for improving the performance of neutron-scattering instruments, since the sophistication of modern instrument designs has outgrown purely analytical methods. The program McStas, being developed at Risø National Laboratory, includes an extension language that makes it easy to adapt it to the particular requirements of individual instruments, and thus provides a powerful and flexible tool for constructing such simulations. McStas has been successfully applied in such areas as neutron guide design, flux optimization, non-Gaussian resolution functions of triple-axis spectrometers, and time-focusing in time-of-flight instruments.
New developments in the McStas neutron instrument simulation package
NASA Astrophysics Data System (ADS)
Willendrup, P. K.; Knudsen, E. B.; Klinkby, E.; Nielsen, T.; Farhi, E.; Filges, U.; Lefmann, K.
2014-07-01
The McStas neutron ray-tracing software package is a versatile tool for building accurate simulators of neutron scattering instruments at reactors and at short- and long-pulsed spallation sources such as the European Spallation Source. McStas is extensively used for design and optimization of instruments, virtual experiments, data analysis and user training. McStas was founded as a scientific, open-source collaborative code in 1997. This contribution presents the project in its current state and gives an overview of the main new developments in McStas 2.0 (December 2012) and McStas 2.1 (expected fall 2013), including many new components, component parameter uniformisation, partial loss of backward compatibility, updated source brilliance descriptions, developments toward new tools and user interfaces, web interfaces and a new method for estimating beam losses and background from neutron optics.
NASA Astrophysics Data System (ADS)
Guerra, Pedro; Udías, José M.; Herranz, Elena; Santos-Miranda, Juan Antonio; Herraiz, Joaquín L.; Valdivieso, Manlio F.; Rodríguez, Raúl; Calama, Juan A.; Pascau, Javier; Calvo, Felipe A.; Illana, Carlos; Ledesma-Carbayo, María J.; Santos, Andrés
2014-12-01
This work analysed the feasibility of using a fast, customized Monte Carlo (MC) method to perform accurate computation of dose distributions during pre- and intraplanning of intraoperative electron radiation therapy (IOERT) procedures. The MC method that was implemented, which has been integrated into a specific innovative simulation and planning tool, is able to simulate the fate of thousands of particles per second, and it was the aim of this work to determine the level of interactivity that could be achieved. The planning workflow enabled calibration of the imaging and treatment equipment, as well as manipulation of the surgical frame and insertion of the protection shields around the organs at risk and other beam modifiers. In this way, the multidisciplinary team involved in IOERT has all the tools necessary to perform complex MC dose simulations adapted to their equipment in an efficient and transparent way. To assess the accuracy and reliability of this MC technique, dose distributions for a monoenergetic source were compared with those obtained using a general-purpose software package used widely in medical physics applications. Once accuracy of the underlying simulator was confirmed, a clinical accelerator was modelled and experimental measurements in water were conducted. A comparison was made with the output from the simulator to identify the conditions under which accurate dose estimations could be obtained in less than 3 min, which is the threshold imposed to allow for interactive use of the tool in treatment planning. Finally, a clinically relevant scenario, namely early-stage breast cancer treatment, was simulated with pre- and intraoperative volumes to verify that it was feasible to use the MC tool intraoperatively and to adjust dose delivery based on the simulation output, without compromising accuracy. The workflow provided a satisfactory model of the treatment head and the imaging system, enabling proper configuration of the treatment planning system and providing good accuracy in the dose simulation.
Monte Carlo simulation of inverse geometry x-ray fluoroscopy using a modified MC-GPU framework
Dunkerley, David A. P.; Tomkowiak, Michael T.; Slagowski, Jordan M.; McCabe, Bradley P.; Funk, Tobias; Speidel, Michael A.
2015-01-01
Scanning-Beam Digital X-ray (SBDX) is a technology for low-dose fluoroscopy that employs inverse geometry x-ray beam scanning. To assist with rapid modeling of inverse geometry x-ray systems, we have developed a Monte Carlo (MC) simulation tool based on the MC-GPU framework. MC-GPU version 1.3 was modified to implement a 2D array of focal spot positions on a plane, with individually adjustable x-ray outputs, each producing a narrow x-ray beam directed toward a stationary photon-counting detector array. Geometric accuracy and blurring behavior in tomosynthesis reconstructions were evaluated from simulated images of a 3D arrangement of spheres. The artifact spread function from simulation agreed with experiment to within 1.6% (rRMSD). Detected x-ray scatter fraction was simulated for two SBDX detector geometries and compared to experiments. For the current SBDX prototype (10.6 cm wide by 5.3 cm tall detector), x-ray scatter fraction measured 2.8–6.4% (18.6–31.5 cm acrylic, 100 kV), versus 2.1–4.5% in MC simulation. Experimental trends in scatter versus detector size and phantom thickness were observed in simulation. For dose evaluation, an anthropomorphic phantom was imaged using regular and regional adaptive exposure (RAE) scanning. The reduction in kerma-area-product resulting from RAE scanning was 45% in radiochromic film measurements, versus 46% in simulation. The integral kerma calculated from TLD measurement points within the phantom was 57% lower when using RAE, versus 61% lower in simulation. This MC tool may be used to estimate tomographic blur, detected scatter, and dose distributions when developing inverse geometry x-ray systems. PMID:26113765
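The scatter-fraction figures above come from tallies of this kind: the detected signal is split into photons that scattered at least once and those that did not. A generic numpy sketch of such a tally (illustrative, not MC-GPU code):

```python
import numpy as np

def scatter_fraction(energy_dep, n_scatter_events):
    """Detected scatter fraction from per-photon MC tallies.

    energy_dep       : (N,) energy each detected photon deposited
    n_scatter_events : (N,) number of scatter interactions it underwent
    """
    scattered = energy_dep[n_scatter_events > 0].sum()
    return scattered / energy_dep.sum()
```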
Enhanced Master Controller Unit Tester
NASA Technical Reports Server (NTRS)
Benson, Patricia; Johnson, Yvette; Johnson, Brian; Williams, Philip; Burton, Geoffrey; McCoy, Anthony
2007-01-01
The Enhanced Master Controller Unit Tester (EMUT) software is a tool for development and testing of software for a master controller (MC) flight computer. The primary function of the EMUT software is to simulate interfaces between the MC computer and external analog and digital circuitry (including other computers) in a rack of equipment to be used in scientific experiments. The simulations span the range of nominal, off-nominal, and erroneous operational conditions, enabling the testing of MC software before all the equipment becomes available.
Häggström, Ida; Beattie, Bradley J; Schmidtlein, C Ross
2016-06-01
To develop and evaluate a fast and simple tool called dPETSTEP (Dynamic PET Simulator of Tracers via Emission Projection), for dynamic PET simulations as an alternative to Monte Carlo (MC), useful for educational purposes and evaluation of the effects of the clinical environment, postprocessing choices, etc., on dynamic and parametric images. The tool was developed in MATLAB using both new and previously reported modules of PETSTEP (PET Simulator of Tracers via Emission Projection). Time activity curves are generated for each voxel of the input parametric image, whereby effects of imaging system blurring, counting noise, scatters, randoms, and attenuation are simulated for each frame. Each frame is then reconstructed into images according to the user specified method, settings, and corrections. Reconstructed images were compared to MC data, and simple Gaussian noised time activity curves (GAUSS). dPETSTEP was 8000 times faster than MC. Dynamic images from dPETSTEP had a root mean square error that was within 4% on average of that of MC images, whereas the GAUSS images were within 11%. The average bias in dPETSTEP and MC images was the same, while GAUSS differed by 3 percentage points. Noise profiles in dPETSTEP images conformed well to MC images, confirmed visually by scatter plot histograms, and statistically by tumor region of interest histogram comparisons that showed no significant differences (p < 0.01). Compared to GAUSS, dPETSTEP images and noise properties agreed better with MC. The authors have developed a fast and easy one-stop solution for simulations of dynamic PET and parametric images, and demonstrated that it generates both images and subsequent parametric images with very similar noise properties to those of MC images, in a fraction of the time. They believe dPETSTEP to be very useful for generating fast, simple, and realistic results; however, since it uses simple scatter and random models it may not be suitable for studies investigating these phenomena. dPETSTEP can be downloaded free of cost from https://github.com/CRossSchmidtlein/dPETSTEP.
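The per-frame chain described above (blur, project, add counting noise, reconstruct) can be sketched in a few lines of Python. This is a toy stand-in for the MATLAB tool, assuming scikit-image's radon/iradon for the projector, and omitting attenuation, scatters and randoms:

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.transform import radon, iradon

def simulate_frame(activity, theta, fwhm_px=4.0, frame_counts=5e5, rng=None):
    """One simplified dynamic-PET frame: system blurring, forward
    projection, Poisson counting noise, FBP reconstruction."""
    rng = rng if rng is not None else np.random.default_rng()
    sino = radon(gaussian_filter(activity, fwhm_px / 2.355), theta)
    sino *= frame_counts / sino.sum()          # scale to expected counts
    noisy = rng.poisson(sino).astype(float)    # counting noise
    return iradon(noisy, theta, filter_name="ramp")

theta = np.linspace(0.0, 180.0, 180, endpoint=False)
frame_activity = np.zeros((128, 128))
frame_activity[48:80, 48:80] = 1.0             # toy uptake region
recon = simulate_frame(frame_activity, theta)
```

Repeating this per frame of the input time activity curves yields a dynamic image series of the kind compared against MC above.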
Development of a polarized neutron beam line at Algerian research reactors using McStas software
NASA Astrophysics Data System (ADS)
Makhloufi, M.; Salah, H.
2017-02-01
Unpolarized instruments have long been studied and designed using the McStas simulation tool, but only recently have models become available in McStas for simulating polarized neutron scattering instruments. In the present contribution, we used McStas to design a polarized neutron beam line, taking advantage of the reflectometer and diffractometer spectrometers available in Algeria. Both thermal and cold neutrons were considered. Polarization is achieved with two types of supermirror polarizers, FeSi and CoCu, provided by the HZB institute. For the sake of performance comparison, the polarizers were characterized and their characteristics reproduced. The simulated instruments are reported. A flipper and electromagnets for the guide field were developed. Further developments, including analyzers and upgrades of the existing spectrometers, are underway.
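For reference, the figure of merit for such a beam line is the beam polarization computed from spin-resolved intensities. A small sketch with illustrative supermirror reflectivities (made-up numbers, not the HZB characterisation data):

```python
import numpy as np

def beam_polarization(n_up, n_down):
    """P = (N+ - N-) / (N+ + N-) from spin-resolved detector counts."""
    return (n_up - n_down) / (n_up + n_down)

# illustrative spin-dependent supermirror reflectivities
r_up, r_down = 0.98, 0.05
n0 = 1e6                                  # unpolarized beam: half per spin state
n_up, n_down = 0.5 * n0 * r_up, 0.5 * n0 * r_down
P = beam_polarization(n_up, n_down)       # ~0.90
flipping_ratio = n_up / n_down            # ~19.6
```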
Varshney, Rickul; Frenkiel, Saul; Nguyen, Lily H P; Young, Meredith; Del Maestro, Rolando; Zeitouni, Anthony; Tewfik, Marc A
2014-01-01
The technical challenges of endoscopic sinus surgery (ESS) and the high risk of complications support the development of alternative modalities to train residents in these procedures. Virtual reality simulation is becoming a useful tool for training the skills necessary for minimally invasive surgery; however, there are currently no ESS virtual reality simulators available with valid evidence supporting their use in resident education. Our aim was to develop a new rhinology simulator, as well as to define potential performance metrics for trainee assessment. The McGill simulator for endoscopic sinus surgery (MSESS), a new sinus surgery virtual reality simulator with haptic feedback, was developed (a collaboration between the McGill University Department of Otolaryngology-Head and Neck Surgery, the Montreal Neurological Institute Simulation Lab, and the National Research Council of Canada). A panel of experts in education, performance assessment, rhinology, and skull base surgery convened to identify core technical abilities that would need to be taught by the simulator, as well as performance metrics to be developed and captured. The MSESS allows the user to perform basic sinus surgery skills, such as an ethmoidectomy and sphenoidotomy, through the use of endoscopic tools in a virtual nasal model. The performance metrics were developed by an expert panel and include measurements of safety, quality, and efficiency of the procedure. The MSESS incorporates novel technological advancements to create a realistic platform for trainees. To our knowledge, this is the first simulator to combine novel tools such as the endonasal wash and elaborate anatomic deformity with advanced performance metrics for ESS.
McStas 1.7 - a new version of the flexible Monte Carlo neutron scattering package
NASA Astrophysics Data System (ADS)
Willendrup, Peter; Farhi, Emmanuel; Lefmann, Kim
2004-07-01
Current neutron instrumentation is both complex and expensive, and accurate simulation has become essential both for building new instruments and for using them effectively. The McStas neutron ray-trace simulation package is a versatile tool for producing such simulations, developed in collaboration between Risø and ILL. The new version (1.7) has many improvements, among them added support for the popular Microsoft Windows platform. This presentation will demonstrate a selection of the new features through a simulation of the ILL IN6 beamline.
Fiorina, E; Ferrero, V; Pennazio, F; Baroni, G; Battistoni, G; Belcari, N; Cerello, P; Camarlinghi, N; Ciocca, M; Del Guerra, A; Donetti, M; Ferrari, A; Giordanengo, S; Giraudo, G; Mairani, A; Morrocchi, M; Peroni, C; Rivetti, A; Da Rocha Rolo, M D; Rossi, S; Rosso, V; Sala, P; Sportelli, G; Tampellini, S; Valvo, F; Wheadon, R; Bisogni, M G
2018-05-07
Hadrontherapy is a method for treating cancer with very targeted dose distributions and enhanced radiobiological effects. To fully exploit these advantages, in vivo range monitoring systems are required. These devices measure, preferably during the treatment, the secondary radiation generated by the beam-tissue interactions. However, since correlation of the secondary radiation distribution with the dose is not straightforward, Monte Carlo (MC) simulations are very important for treatment quality assessment. The INSIDE project constructed an in-beam PET scanner to detect signals generated by the positron-emitting isotopes resulting from projectile-target fragmentation. In addition, a FLUKA-based simulation tool was developed to predict the corresponding reference PET images using a detailed scanner model. The INSIDE in-beam PET was used to monitor two consecutive proton treatment sessions on a patient at the Italian Center for Oncological Hadrontherapy (CNAO). The reconstructed PET images were updated every 10 s providing a near real-time quality assessment. By half-way through the treatment, the statistics of the measured PET images were already significant enough to be compared with the simulations with average differences in the activity range less than 2.5 mm along the beam direction. Without taking into account any preferential direction, differences within 1 mm were found. In this paper, the INSIDE MC simulation tool is described and the results of the first in vivo agreement evaluation are reported. These results have justified a clinical trial, in which the MC simulation tool will be used on a daily basis to study the compliance tolerances between the measured and simulated PET images.
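A common way to quantify the activity-range agreement reported above is the depth at which the distal falloff of a 1D activity profile crosses a fixed fraction of its maximum. A sketch of that comparison (illustrative, not the INSIDE analysis code):

```python
import numpy as np

def distal_range(z, profile, level=0.5):
    """Depth where the distal falloff of a 1D activity profile crosses
    level * max, found by linear interpolation from the distal side."""
    thr = level * profile.max()
    i_max = profile.argmax()
    for i in range(len(profile) - 1, i_max, -1):
        if profile[i - 1] >= thr > profile[i]:
            f = (profile[i - 1] - thr) / (profile[i - 1] - profile[i])
            return z[i - 1] + f * (z[i] - z[i - 1])
    return np.nan

# range agreement between measured and simulated activity profiles:
# dz = distal_range(z, measured) - distal_range(z, simulated)
```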
NASA Astrophysics Data System (ADS)
Prettyman, T. H.; Gardner, R. P.; Verghese, K.
1993-08-01
A new specific purpose Monte Carlo code called McENL for modeling the time response of epithermal neutron lifetime tools is described. The weight windows technique, employing splitting and Russian roulette, is used with an automated importance function based on the solution of an adjoint diffusion model to improve the code efficiency. Complete composition and density correlated sampling is also included in the code, and can be used to study the effect on tool response of small variations in the formation, borehole, or logging tool composition and density. An illustration of the latter application is given for the density of a thermal neutron filter. McENL was benchmarked against test-pit data for the Mobil pulsed neutron porosity tool and was found to be very accurate. Results of the experimental validation and details of code performance are presented.
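The weight-windows technique the code relies on can be summarized compactly: particles above the window are split, particles below it play Russian roulette, and the expected total weight is conserved in both branches. A language-neutral sketch in Python (illustrative, not McENL code):

```python
import numpy as np

def apply_weight_window(particles, w_low, w_high, rng):
    """Weight window: split heavy particles, play Russian roulette with
    light ones. Expected total weight is conserved by both branches.

    particles : list of dicts carrying at least a 'weight' key
    """
    out = []
    for p in particles:
        w = p["weight"]
        if w > w_high:                          # splitting
            n = int(np.ceil(w / w_high))
            for _ in range(n):
                out.append({**p, "weight": w / n})
        elif w < w_low:                         # Russian roulette
            if rng.random() < w / w_low:        # survivors absorb the lost weight
                out.append({**p, "weight": w_low})
        else:
            out.append(p)
    return out
```

In McENL the window bounds are driven by an importance function from an adjoint diffusion solution; here they would simply be user inputs.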
Assessing the convergence of LHS Monte Carlo simulations of wastewater treatment models.
Benedetti, Lorenzo; Claeys, Filip; Nopens, Ingmar; Vanrolleghem, Peter A
2011-01-01
Monte Carlo (MC) simulation appears to be the only currently adopted tool to estimate global sensitivities and uncertainties in wastewater treatment modelling. Such models are highly complex, dynamic and non-linear, requiring long computation times, especially in the scope of MC simulation, due to the large number of simulations usually required. However, no stopping rule to decide on the number of simulations required to achieve a given confidence in the MC simulation results has been adopted so far in the field. In this work, a pragmatic method is proposed to minimize the computation time by using a combination of several criteria. It makes no use of prior knowledge about the model, is very simple, intuitive and can be automated: all convenient features in engineering applications. A case study is used to show an application of the method, and the results indicate that the required number of simulations strongly depends on the model output(s) selected, and on the type and desired accuracy of the analysis conducted. Hence, no prior indication is available regarding the necessary number of MC simulations, but the proposed method is capable of dealing with these variations and stopping the calculations after convergence is reached.
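One simple instance of such a stopping rule: run the model in batches and stop when the 95% confidence half-width of the running mean is within a chosen tolerance for every monitored output. A sketch of one such criterion (the paper combines several; `run_model` and `next_lhs_sample` are hypothetical stand-ins):

```python
import numpy as np

def mc_converged(samples, rel_tol=0.02, z=1.96, n_min=50):
    """Stop when the 95% CI half-width of the running mean is within
    rel_tol of the mean for every model output (column)."""
    s = np.asarray(samples, dtype=float)
    if len(s) < n_min:
        return False
    half_width = z * s.std(axis=0, ddof=1) / np.sqrt(len(s))
    return bool(np.all(half_width <= rel_tol * np.abs(s.mean(axis=0))))

# while not mc_converged(results):
#     results.append(run_model(next_lhs_sample()))
```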
Kim, Sangroh; Yoshizumi, Terry T; Toncheva, Greta; Frush, Donald P; Yin, Fang-Fang
2010-03-01
The purpose of this study was to establish a dose estimation tool with Monte Carlo (MC) simulations. A 5-y-old paediatric anthropomorphic phantom was computed tomography (CT) scanned to create a voxelised phantom and used as an input for the abdominal cone-beam CT in a BEAMnrc/EGSnrc MC system. An X-ray tube model of the Varian On-Board Imager® was built in the MC system. To validate the model, the absorbed doses at each organ location for standard-dose and low-dose modes were measured in the physical phantom with MOSFET detectors; effective doses were also calculated. The MC simulation results were comparable to the MOSFET measurements. This voxelised phantom approach could produce a more accurate dose estimation than the stylised phantom method. This model can be easily applied to multi-detector CT dosimetry.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rodrigues, A; Wu, Q; Sawkey, D
Purpose: DEAR is a radiation therapy technique utilizing synchronized motion of gantry and couch during delivery to optimize dose distribution homogeneity and penumbra for treatment of superficial disease. Dose calculation for DEAR is not yet supported by commercial TPSs. The purpose of this study is to demonstrate the feasibility of using a web-based Monte Carlo (MC) simulation tool (VirtuaLinac) to calculate dose distributions for a DEAR delivery. Methods: MC simulations were run through VirtuaLinac, which is based on the GEANT4 platform. VirtuaLinac utilizes detailed linac head geometry and material models, validated phase space files, and a voxelized phantom. The input was expanded to include an XML file for simulation of varying mechanical axes as a function of MU. A DEAR XML plan was generated, used in the MC simulation, and delivered on a TrueBeam in Developer Mode. Radiographic film wrapped on a cylindrical phantom (12.5 cm radius) measured dose at a depth of 1.5 cm and was compared to the simulation results. Results: A DEAR plan was simulated using an energy of 6 MeV and a 3×10 cm² cut-out in a 15×15 cm² applicator for a delivery of a 90° arc. The resulting data were found to provide qualitative and quantitative evidence that the simulation platform could be used as the basis for DEAR dose calculations. The resulting unwrapped 2D dose distributions agreed well in the cross-plane direction along the arc, with field sizes of 18.4 and 18.2 cm and penumbrae of 1.9 and 2.0 cm for measurements and simulations, respectively. Conclusion: Preliminary feasibility of a DEAR delivery using a web-based MC simulation platform has been demonstrated. This tool will benefit treatment planning for DEAR as a benchmark for developing other model-based algorithms, allowing efficient optimization of trajectories, and quality assurance of plans without the need for extensive measurements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Y; Southern Medical University, Guangzhou; Tian, Z
Purpose: Monte Carlo (MC) simulation is an important tool for solving radiotherapy and medical imaging problems, but low computational efficiency hinders its wide application. Conventionally, MC is performed in a particle-by-particle fashion. The lack of control over particle trajectories is a main cause of low efficiency in some applications. Take cone beam CT (CBCT) projection simulation as an example: a significant amount of computation is wasted on transporting photons that never reach the detector. To solve this problem, we propose an innovative MC simulation scheme with a path-by-path sampling method. Methods: Consider a photon path starting at the x-ray source. After going through a set of interactions, it ends at the detector. In the proposed scheme, we sampled an entire photon path each time. The Metropolis-Hastings algorithm was employed to accept or reject a sampled path based on a calculated acceptance probability, in order to maintain the correct relative probabilities among different paths, which are governed by photon transport physics. We developed a package gMMC on GPU with this new scheme implemented. The performance of gMMC was tested on a sample problem of CBCT projection simulation for a homogeneous object. The results were compared to those obtained using gMCDRR, a GPU-based MC tool with the conventional particle-by-particle simulation scheme. Results: Calculated scattered photon signals in gMMC agreed with those from gMCDRR with a relative difference of 3%. It took 3.1 h for gMCDRR to simulate 7.8e11 photons and 246.5 s for gMMC to simulate 1.4e10 paths. Under this setting, both results attained the same ∼2% statistical uncertainty. Hence, a speed-up factor of ∼45.3 was achieved by this new path-by-path simulation scheme, in which all the computations are spent on photons contributing to the detector signal. Conclusion: We proposed a novel path-by-path simulation scheme that enables a significant efficiency enhancement for MC particle transport simulations.
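The core of the path-by-path scheme is a standard Metropolis-Hastings accept/reject step applied to whole paths rather than to single interactions. A generic sketch assuming a symmetric path proposal (illustrative, not the gMMC kernel):

```python
import numpy as np

def metropolis_paths(propose, log_prob, n_steps, x0, rng):
    """Metropolis-Hastings over whole photon paths.

    propose(x, rng) -> candidate path (symmetric proposal assumed)
    log_prob(x)     -> log of the transport-physics path probability
    Every retained path ends on the detector by construction, so no
    work is spent on photons that miss it.
    """
    x, lp = x0, log_prob(x0)
    chain = []
    for _ in range(n_steps):
        y = propose(x, rng)
        lq = log_prob(y)
        if np.log(rng.random()) < lq - lp:   # accept with prob min(1, p(y)/p(x))
            x, lp = y, lq
        chain.append(x)
    return chain
```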
Souris, Kevin; Lee, John Aldo; Sterpin, Edmond
2016-04-01
Accuracy in proton therapy treatment planning can be improved using Monte Carlo (MC) simulations. However, the long computation time of such methods hinders their use in clinical routine. This work aims to develop a fast multipurpose Monte Carlo simulation tool for proton therapy using massively parallel central processing unit (CPU) architectures. A new Monte Carlo code, called MCsquare (many-core Monte Carlo), has been designed and optimized for the last generation of Intel Xeon processors and Intel Xeon Phi coprocessors. These massively parallel architectures offer the flexibility and the computational power suitable to MC methods. The class-II condensed history algorithm of MCsquare provides a fast and yet accurate method of simulating heavy charged particles such as protons, deuterons, and alphas inside voxelized geometries. Hard ionizations, with energy losses above a user-specified threshold, are simulated individually, while soft events are regrouped in a multiple scattering theory. Elastic and inelastic nuclear interactions are sampled from ICRU 63 differential cross sections, thereby allowing for the computation of prompt gamma emission profiles. MCsquare has been benchmarked with the GATE/GEANT4 Monte Carlo application for homogeneous and heterogeneous geometries. Comparisons with GATE/GEANT4 for various geometries show deviations within 2%–1 mm. In spite of the limited memory bandwidth of the coprocessor, simulation time is below 25 s for 10⁷ primary 200 MeV protons in average soft tissues using all Xeon Phi and CPU resources embedded in a single desktop unit. MCsquare exploits the flexibility of CPU architectures to provide a multipurpose MC simulation tool. Optimized code enables the use of accurate MC calculation within a reasonable computation time, adequate for clinical practice. MCsquare also simulates prompt gamma emission and can thus be used for in vivo range verification.
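A class-II condensed-history step can be paraphrased as: deposit sub-threshold energy losses continuously through a restricted stopping power, and sample hard ionizations discretely. An illustrative single-step sketch (not MCsquare code; at most one hard event per short step):

```python
import numpy as np

def condensed_history_step(E, step, restricted_S, hard_sigma, sample_hard_loss, rng):
    """One class-II condensed-history step (sketch).

    Soft energy losses (below the threshold) are deposited continuously
    via the restricted stopping power restricted_S(E) [MeV/cm]; hard
    ionizations are sampled discretely from their macroscopic cross
    section hard_sigma(E) [1/cm].
    """
    E -= restricted_S(E) * step                     # grouped soft losses
    if rng.random() < 1.0 - np.exp(-hard_sigma(E) * step):
        E -= sample_hard_loss(E, rng)               # individual hard ionization
    return E
```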
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kroniger, K; Herzog, M; Landry, G
2015-06-15
Purpose: We describe and demonstrate a fast analytical tool for prompt-gamma emission prediction based on filter functions applied to the depth dose profile. We present the implementation in a treatment planning system (TPS) of the same algorithm for positron emitter distributions. Methods: The prediction of the desired observable is based on the convolution of filter functions with the depth dose profile. For both prompt-gammas and positron emitters, the results of Monte Carlo simulations (MC) are compared with those of the analytical tool. For prompt-gamma emission from inelastic proton-induced reactions, homogeneous and inhomogeneous phantoms along with patient data are used as irradiation targets of mono-energetic proton pencil beams. The accuracy of the tool is assessed in terms of the shape of the analytically calculated depth profiles and their absolute yields, compared to MC. For the positron emitters, the method is implemented in a research RayStation TPS and compared to MC predictions. Digital phantoms and patient data are used and positron emitter spatial density distributions are analyzed. Results: Calculated prompt-gamma profiles agree with MC within 3% in terms of absolute yield and reproduce the correct shape. Based on an arbitrary reference material and by means of 6 filter functions (one per chemical element), profiles in any other material composed of those elements can be predicted. The TPS-implemented algorithm is accurate enough to enable, via the analytically calculated positron emitter profiles, detection of range differences between the TPS and MC with errors of the order of 1–2 mm. Conclusion: The proposed analytical method predicts prompt-gamma and positron emitter profiles which generally agree with the distributions obtained by a full MC. The implementation of the tool in a TPS shows that reliable profiles can be obtained directly from the dose calculated by the TPS, without the need for a full MC simulation.
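The filter-function method itself reduces to one convolution per element: convolve the depth dose profile with the element's filter function and sum, weighted by the elemental fractions of the target. A sketch (hypothetical array layout, not the published implementation):

```python
import numpy as np

def prompt_gamma_profile(depth_dose, filters, fractions):
    """Prompt-gamma depth profile as a sum of elemental convolutions.

    depth_dose : (N,) depth dose profile of the pencil beam
    filters    : dict element -> (M,) filter function (one per element)
    fractions  : dict element -> elemental weight fraction of the target
    """
    return sum(f * np.convolve(depth_dose, filters[el], mode="same")
               for el, f in fractions.items())
```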
Dosimetry applications in GATE Monte Carlo toolkit.
Papadimitroulas, Panagiotis
2017-09-01
Monte Carlo (MC) simulations are a well-established method for studying physical processes in medical physics. The purpose of this review is to present GATE dosimetry applications on diagnostic and therapeutic simulated protocols. There is a significant need for accurate quantification of the absorbed dose in several specific applications such as preclinical and pediatric studies. GATE is an open-source MC toolkit for simulating imaging, radiotherapy (RT) and dosimetry applications in a user-friendly environment, which is well validated and widely accepted by the scientific community. In RT applications, during treatment planning, it is essential to accurately assess the deposited energy and the absorbed dose per tissue/organ of interest, as well as the local statistical uncertainty. Several types of realistic dosimetric applications are described including: molecular imaging, radio-immunotherapy, radiotherapy and brachytherapy. GATE has been efficiently used in several applications, such as Dose Point Kernels, S-values, Brachytherapy parameters, and has been compared against various MC codes which are considered as standard tools for decades. Furthermore, the presented studies show reliable modeling of particle beams when comparing experimental with simulated data. Examples of different dosimetric protocols are reported for individualized dosimetry and simulations combining imaging and therapy dose monitoring, with the use of modern computational phantoms. Personalization of medical protocols can be achieved by combining GATE MC simulations with anthropomorphic computational models and clinical anatomical data. This is a review study, covering several dosimetric applications of GATE, and the different tools used for modeling realistic clinical acquisitions with accurate dose assessment.
NASA Astrophysics Data System (ADS)
Cros, Maria; Joemai, Raoul M. S.; Geleijns, Jacob; Molina, Diego; Salvadó, Marçal
2017-08-01
This study aims to develop and test software for assessing and reporting doses for standard patients undergoing computed tomography (CT) examinations in a 320 detector-row cone-beam scanner. The software, called SimDoseCT, is based on the Monte Carlo (MC) simulation code, which was developed to calculate organ doses and effective doses in ICRP anthropomorphic adult reference computational phantoms for acquisitions with the Aquilion ONE CT scanner (Toshiba). MC simulation was validated by comparing CTDI measurements within standard CT dose phantoms with results from simulation under the same conditions. SimDoseCT consists of a graphical user interface connected to a MySQL database, which contains the look-up-tables that were generated with MC simulations for volumetric acquisitions at different scan positions along the phantom using any tube voltage, bow tie filter, focal spot and nine different beam widths. Two different methods were developed to estimate organ doses and effective doses from acquisitions using other available beam widths in the scanner. A correction factor was used to estimate doses in helical acquisitions. Hence, the user can select any available protocol in the Aquilion ONE scanner for a standard adult male or female and obtain the dose results through the software interface. Agreement within 9% between CTDI measurements and simulations allowed the validation of the MC program. Additionally, the algorithm for dose reporting in SimDoseCT was validated by comparing dose results from this tool with those obtained from MC simulations for three volumetric acquisitions (head, thorax and abdomen). The comparison was repeated using eight different collimations and also for another collimation in a helical abdomen examination. The results showed differences of 0.1 mSv or less for absolute dose in most organs and also in the effective dose calculation. The software provides a suitable tool for dose assessment in standard adult patients undergoing CT examinations in a 320 detector-row cone-beam scanner.
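The dose-reporting step then reduces to summing precomputed lookup-table entries over the scan positions covered by the acquisition. A sketch of that idea with a hypothetical table layout (the actual SimDoseCT schema is not described in this abstract):

```python
import numpy as np

def organ_dose(lut, positions, mAs_per_rotation):
    """Organ doses from MC lookup tables (sketch of the SimDoseCT idea).

    lut       : (n_positions, n_organs) dose per 100 mAs at each couch
                position, for a given kV / bow-tie / beam width
                (hypothetical layout)
    positions : indices of the acquisitions along the phantom
    """
    return lut[positions].sum(axis=0) * (mAs_per_rotation / 100.0)

# helical scans: apply the reported correction factor to the axial estimate
# dose_helical = helical_factor * organ_dose(lut, positions, mAs)
```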
Kern, Christoph
2016-03-23
This report describes two software tools that, when used as front ends for the three-dimensional backward Monte Carlo atmospheric-radiative-transfer model (RTM) McArtim, facilitate the generation of lookup tables of volcanic-plume optical-transmittance characteristics in the ultraviolet/visible-spectral region. In particular, the differential optical depth and derivatives thereof (that is, weighting functions), with regard to a change in SO2 column density or aerosol optical thickness, can be simulated for a specific measurement geometry and a representative range of plume conditions. These tables are required for the retrieval of SO2 column density in volcanic plumes, using the simulated radiative-transfer/differential optical-absorption spectroscopic (SRT-DOAS) approach outlined by Kern and others (2012). This report, together with the software tools published online, is intended to make this sophisticated SRT-DOAS technique available to volcanologists and gas geochemists in an operational environment, without the need for an in-depth treatment of the underlying principles or the low-level interface of the RTM McArtim.
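The weighting functions mentioned here are derivatives of the modelled differential optical depth with respect to SO2 column density or aerosol optical thickness, and can be approximated by finite differences over RTM runs. A sketch (illustrative, not the McArtim interface):

```python
def weighting_function(rtm_tau, x0, dx):
    """Weighting function d(tau)/dx by central finite difference.

    rtm_tau(x) -> wavelength-resolved differential optical depth from
    the RTM for SO2 column density (or aerosol optical thickness) x.
    """
    return (rtm_tau(x0 + dx) - rtm_tau(x0 - dx)) / (2.0 * dx)
```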
Designing new guides and instruments using McStas
NASA Astrophysics Data System (ADS)
Farhi, E.; Hansen, T.; Wildes, A.; Ghosh, R.; Lefmann, K.
With the increasing complexity of modern neutron-scattering instruments, the need for powerful tools to optimize their geometry and physical performance (flux, resolution, divergence, etc.) has become essential. As the usual analytical methods reach their limit of validity in the description of fine effects, the use of Monte Carlo simulations, which can handle such effects, has become widespread. The McStas program was developed at Risø National Laboratory in order to provide neutron scattering instrument scientists with an efficient and flexible tool for building Monte Carlo simulations of guides, neutron optics and instruments [1]. To date, the McStas package has been extensively used at the Institut Laue-Langevin, Grenoble, France, for various studies including cold and thermal guides with ballistic geometry, diffractometers, triple-axis, backscattering and time-of-flight spectrometers [2]. In this paper, we present some simulation results concerning different guide geometries that may be used in the future at the Institut Laue-Langevin. Gain factors ranging from two to five may be obtained for the integrated intensities, depending on the exact geometry, the guide coatings and the source.
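As a feel for what such guide studies compute, the transmission of a straight guide can be toy-modelled in a few lines: unfold the wall reflections and weight each ray by a reflectivity per bounce. This is a deliberately crude 2D sketch with an idealised step reflectivity (real studies would use the McStas guide components):

```python
import numpy as np

def guide_transmission(n, width, length, m, wavelength, rng):
    """Toy 2D MC estimate of straight-guide transmission.

    Supermirror reflectivity is idealised as 0.99 below the cutoff
    angle m * theta_c(lambda) and 0 above it; theta_c for natural Ni
    is taken as ~0.1 deg per angstrom.
    """
    theta_c = m * np.radians(0.1) * wavelength       # cutoff angle (rad)
    x = rng.uniform(-width / 2, width / 2, n)        # entry position (m)
    th = rng.uniform(-2 * theta_c, 2 * theta_c, n)   # entry divergence
    x_end = x + length * np.tan(th)                  # unfolded exit position
    n_bounce = np.abs(np.floor((x_end + width / 2) / width)).astype(int)
    ok = (np.abs(th) <= theta_c) | (n_bounce == 0)   # above cutoff: lost at first bounce
    return np.where(ok, 0.99 ** n_bounce, 0.0).mean()

rng = np.random.default_rng(7)
t = guide_transmission(100_000, width=0.03, length=30.0, m=2, wavelength=4.0, rng=rng)
```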
Seng, Bunrith; Kaneko, Hidehiro; Hirayama, Kimiaki; Katayama-Hirayama, Keiko
2012-01-01
This paper presents a mathematical model of vertical water movement and a performance evaluation of the model in static pile composting operated with neither air supply nor turning. The vertical moisture content (MC) model was developed with consideration of evaporation (internal and external evaporation), diffusion (liquid and vapour diffusion) and percolation, whereas additional water from substrate decomposition and irrigation was not taken into account. The evaporation term in the model was established on the basis of reference evaporation of the materials at known temperature, MC and relative humidity of the air. Diffusion of water vapour was estimated as functions of relative humidity and temperature, whereas diffusion of liquid water was empirically obtained from experiment by adopting Fick's law. Percolation was estimated by following Darcy's law. The model was applied to a column of composting wood chips with an initial MC of 60%. The simulation program was run for four weeks with calculation span of 1 s. The simulated results were in reasonably good agreement with the experimental results. Only a top layer (less than 20 cm) had a considerable MC reduction; the deeper layers were comparable to the initial MC, and the bottom layer was higher than the initial MC. This model is a useful tool to estimate the MC profile throughout the composting period, and could be incorporated into biodegradation kinetic simulation of composting.
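The balance the model solves can be sketched as an explicit finite-difference update per layer: Fickian diffusion between layers, an evaporation sink, and downward percolation. A minimal sketch with assumed closure functions (not the paper's calibrated model; vapour diffusion and boundary layers are treated crudely):

```python
import numpy as np

def step_moisture(mc, dz, dt, d_liq, evap, perc):
    """One explicit step of a 1D moisture-content (MC) balance:
    Fickian liquid diffusion, an evaporation sink and a Darcy-like
    downward percolation flux. mc[0] is the top layer."""
    new = mc.copy()
    new[1:-1] += dt * d_liq * (mc[2:] - 2.0 * mc[1:-1] + mc[:-2]) / dz ** 2
    new -= dt * evap(mc)                    # internal/external evaporation
    q = dt * perc(mc[:-1])                  # water percolating down one layer
    new[:-1] -= q
    new[1:] += q
    return new

# illustrative closures (assumed forms, not calibrated values):
evap = lambda mc: 1e-4 * mc * np.linspace(1.0, 0.0, mc.size)  # strongest at top
perc = lambda mc: 5e-4 * np.clip(mc - 0.55, 0.0, None)        # above field capacity
profile = np.full(20, 0.60)                                   # 60% initial MC
for _ in range(1000):
    profile = step_moisture(profile, dz=0.05, dt=60.0, d_liq=1e-7,
                            evap=evap, perc=perc)
```

Qualitatively, such a scheme reproduces the reported behaviour: the top layer dries out while deeper layers stay near, and the bottom layer above, the initial moisture content.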
Feaster, Toby D.; Benedict, Stephen T.; Clark, Jimmy M.; Bradley, Paul M.; Conrads, Paul
2014-01-01
As part of an ongoing effort by the U.S. Geological Survey to expand the understanding of relations among hydrologic, geochemical, and ecological processes that affect fish-tissue mercury concentrations within the Edisto River Basin, analyses and simulations of the hydrology of the Edisto River Basin were made using the topography-based hydrological model (TOPMODEL). A primary focus of the investigation was to assess the potential for scaling up a previous application of TOPMODEL for the McTier Creek watershed, which is a small headwater catchment to the Edisto River Basin. Scaling up was done in a step-wise manner, beginning with applying the calibration parameters, meteorological data, and topographic-wetness-index data from the McTier Creek TOPMODEL to the Edisto River TOPMODEL. Additional changes were made for subsequent simulations, culminating in the best simulation, which included meteorological and topographic wetness index data from the Edisto River Basin and updated calibration parameters for some of the TOPMODEL calibration parameters. The scaling-up process resulted in nine simulations being made. Simulation 7 best matched the streamflows at station 02175000, Edisto River near Givhans, SC, which was the downstream limit for the TOPMODEL setup, and was obtained by adjusting the scaling factor, including streamflow routing, and using NEXRAD precipitation data for the Edisto River Basin. The Nash-Sutcliffe coefficient of model-fit efficiency and Pearson's correlation coefficient for simulation 7 were 0.78 and 0.89, respectively. Comparison of goodness-of-fit statistics between measured and simulated daily mean streamflow for the McTier Creek and Edisto River models showed that with calibration, the Edisto River TOPMODEL produced slightly better results than the McTier Creek model, despite the substantial difference in the drainage-area size at the outlet locations for the two models (30.7 and 2,725 square miles, respectively). Along with the TOPMODEL hydrologic simulations, a visualization tool (the Edisto River Data Viewer) was developed to help assess trends and influencing variables in the stream ecosystem. Incorporated into the visualization tool were the water-quality load models TOPLOAD, TOPLOAD–H, and LOADEST. Because the focus of this investigation was on scaling up the models from McTier Creek, water-quality concentrations that were previously collected in the McTier Creek Basin were used in the water-quality load models.
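For reference, the two goodness-of-fit statistics quoted above are standard and compute as follows:

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pearson_r(obs, sim):
    """Pearson correlation between observed and simulated daily flows."""
    return np.corrcoef(obs, sim)[0, 1]

# e.g. nash_sutcliffe(q_obs, q_sim) -> 0.78 and pearson_r(q_obs, q_sim) -> 0.89
# for simulation 7 above (q_obs, q_sim being the daily mean streamflow series)
```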
Improving the sampling efficiency of Monte Carlo molecular simulations: an evolutionary approach
NASA Astrophysics Data System (ADS)
Leblanc, Benoit; Braunschweig, Bertrand; Toulhoat, Hervé; Lutton, Evelyne
We present a new approach for improving the convergence of Monte Carlo (MC) simulations of molecular systems belonging to complex energetic landscapes: the problem is redefined in terms of the dynamic allocation of MC move frequencies depending on their past efficiency, measured with respect to a relevant sampling criterion. We introduce various empirical criteria with the aim of accounting for proper convergence in phase space sampling. The dynamic allocation is performed over parallel simulations by means of a new evolutionary algorithm involving 'immortal' individuals. The method is benchmarked against conventional procedures on a model for melt linear polyethylene. We record significant improvements in sampling efficiency, and thus in computational load, while the optimal sets of move frequencies may provide interesting physical insight into the particular systems simulated. This last aspect should provide a new tool for designing more efficient new MC moves.
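The allocation step itself can be as simple as renormalizing move frequencies by their measured past efficiency, with a floor so no move type dies out. A sketch of one such update rule (the paper's evolutionary algorithm over parallel simulations is more elaborate):

```python
import numpy as np

def reallocate_frequencies(freqs, efficiency, floor=0.02, damping=0.5):
    """Re-weight MC move-type frequencies by their measured past
    efficiency (sampling progress per CPU cost), with a floor so that
    no move type is ever switched off entirely."""
    score = np.maximum(efficiency, 0.0) + floor
    target = score / score.sum()
    new = (1.0 - damping) * freqs + damping * target
    return new / new.sum()
```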
Medical Simulations for Exploration Medicine
NASA Technical Reports Server (NTRS)
Reyes, David; Suresh, Rahul; Pavela, James; Urbina, Michelle; Mindock, Jennifer; Antonsen, Erik
2018-01-01
Medical simulation is a useful tool that can be used to train personnel, develop medical processes, and assist cross-disciplinary communication. Medical simulations have been used in the past at NASA for these purposes; however, they are usually created ad hoc, and a stepwise approach to scenario development has not previously been used. The NASA Exploration Medical Capability (ExMC) created a medical scenario development tool to test medical procedures, technologies, and concepts of operation, and for use in systems engineering (SE) processes.
Monte Carlo simulations in radiotherapy dosimetry.
Andreo, Pedro
2018-06-27
The use of the Monte Carlo (MC) method in radiotherapy dosimetry has increased almost exponentially in the last decades. Its widespread use in the field has turned this computer simulation technique into a common tool for reference and treatment planning dosimetry calculations. This work reviews the different MC calculations made on dosimetric quantities, like stopping-power ratios and perturbation correction factors required for reference ionization chamber dosimetry, as well as the fully realistic MC simulations currently available on clinical accelerators, detectors and patient treatment planning. Issues are raised that include the necessity for consistency in the data throughout the entire dosimetry chain in reference dosimetry, and how Bragg-Gray theory breaks down for small photon fields. Both aspects are less critical for MC treatment planning applications, but there are important constraints like tissue characterization and its patient-to-patient variability, which, together with the conversion between dose-to-water and dose-to-tissue, are analysed in detail. Although these constraints are common to all methods and algorithms used in different types of treatment planning systems, they mean that the uncertainties involved in MC treatment planning still remain "uncertain".
NASA Astrophysics Data System (ADS)
Busi, Matteo; Olsen, Ulrik L.; Knudsen, Erik B.; Frisvad, Jeppe R.; Kehres, Jan; Dreier, Erik S.; Khalil, Mohamad; Haldrup, Kristoffer
2018-03-01
Spectral computed tomography is an emerging imaging method based on recently developed energy-discriminating photon-counting detectors (PCDs). This technique enables measurements at isolated high-energy ranges, in which the dominant interaction between the x-rays and the sample is incoherent scattering. The scattered radiation causes a loss of contrast in the results, and its correction has proven to be a complex problem due to its dependence on energy, material composition, and geometry. Monte Carlo simulations can utilize a physical model to estimate the scattering contribution to the signal, at the cost of high computational time. We present a fast Monte Carlo simulation tool, based on McXtrace, to predict the energy-resolved radiation being scattered and absorbed by objects of complex shapes. We validate the tool through measurements using a single CdTe PCD (Multix ME-100) and use it for scattering correction in a simulation of a spectral CT. We found the correction to account for up to 7% relative amplification in the reconstructed linear attenuation. It is a useful tool for x-ray CT to obtain more accurate material discrimination, especially in the high-energy range, where incoherent scattering interactions become dominant (>50 keV).
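Once a simulated scatter estimate is available, the correction amounts to removing it from each energy bin of the measured transmission before the Beer-Lambert log. A sketch of that step (illustrative variable names, not the McXtrace-based tool):

```python
import numpy as np

def scatter_corrected_mu(I_meas, I_scatter, I0, path_len):
    """Per-energy-bin scatter correction of a transmission measurement:
    remove the simulated incoherent-scatter contribution before taking
    the Beer-Lambert log, then convert to linear attenuation."""
    primary = np.clip(I_meas - I_scatter, 1e-12, None)   # avoid log(<=0)
    return -np.log(primary / I0) / path_len
```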
SU-E-T-25: Real Time Simulator for Designing Electron Dual Scattering Foil Systems.
Carver, R; Hogstrom, K; Price, M; Leblanc, J; Harris, G
2012-06-01
To create a user-friendly, accurate, real-time computer simulator to facilitate the design of dual foil scattering systems for electron beams on radiotherapy accelerators. The simulator should allow for a relatively quick, initial design that can be refined and verified with subsequent Monte Carlo (MC) calculations and measurements. The simulator consists of an analytical algorithm for calculating electron fluence and a graphical user interface (GUI) C++ program. The algorithm predicts electron fluence using Fermi-Eyges multiple Coulomb scattering theory with a refined Moliere formalism for scattering powers. The simulator also estimates central-axis x-ray dose contamination from the dual foil system. Once the geometry of the beamline is specified, the simulator allows the user to continuously vary primary scattering foil material and thickness, secondary scattering foil material and Gaussian shape (thickness and sigma), and beam energy. The beam profile and x-ray contamination are displayed in real time. The simulator was tuned by comparison of off-axis electron fluence profiles with those calculated using EGSnrc MC. Over the energy range 7-20 MeV and using present foils on the Elekta radiotherapy accelerator, the simulator profiles agreed to within 2% of MC profiles within 20 cm of the central axis. The x-ray contamination predictions matched measured data to within 0.6%. The calculation time was approximately 100 ms using a single processor, which allows for real-time variation of foil parameters using sliding bars. A real-time dual scattering foil system simulator has been developed. The tool has been useful in a project to redesign an electron dual scattering foil system for one of our radiotherapy accelerators. The simulator has also been useful as an instructional tool for our medical physics graduate students.
Real-time simulator for designing electron dual scattering foil systems.
Carver, Robert L; Hogstrom, Kenneth R; Price, Michael J; LeBlanc, Justin D; Pitcher, Garrett M
2014-11-08
The purpose of this work was to develop a user friendly, accurate, real-time computer simulator to facilitate the design of dual foil scattering systems for electron beams on radiotherapy accelerators. The simulator allows for a relatively quick, initial design that can be refined and verified with subsequent Monte Carlo (MC) calculations and measurements. The simulator also is a powerful educational tool. The simulator consists of an analytical algorithm for calculating electron fluence and X-ray dose and a graphical user interface (GUI) C++ program. The algorithm predicts electron fluence using Fermi-Eyges multiple Coulomb scattering theory with the reduced Gaussian formalism for scattering powers. The simulator also estimates central-axis and off-axis X-ray dose arising from the dual foil system. Once the geometry of the accelerator is specified, the simulator allows the user to continuously vary primary scattering foil material and thickness, secondary scattering foil material and Gaussian shape (thickness and sigma), and beam energy. The off-axis electron relative fluence or total dose profile and central-axis X-ray dose contamination are computed and displayed in real time. The simulator was validated by comparison of off-axis electron relative fluence and X-ray percent dose profiles with those calculated using EGSnrc MC. Over the energy range 7-20 MeV, using present foils on an Elekta radiotherapy accelerator, the simulator was able to reproduce MC profiles to within 2% out to 20 cm from the central axis. The central-axis X-ray percent dose predictions matched measured data to within 0.5%. The calculation time was approximately 100 ms using a single Intel 2.93 GHz processor, which allows for real-time variation of foil geometrical parameters using slider bars. This work demonstrates how the user-friendly GUI and real-time nature of the simulator make it an effective educational tool for gaining a better understanding of the effects that various system parameters have on a relative dose profile. This work also demonstrates a method for using the simulator as a design tool for creating custom dual scattering foil systems in the clinical range of beam energies (6-20 MeV).
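The Fermi-Eyges machinery behind both simulator versions reduces, for the off-axis fluence, to a Gaussian whose variance is the second scattering-power moment evaluated at the measurement plane. A compact sketch of that relation (illustrative, not the simulator's full algorithm):

```python
import numpy as np

def fermi_eyges_sigma(z, T, z_det):
    """Spatial spread at the detector plane from Fermi-Eyges theory:
    sigma_x^2 = A2 = integral of T(z) * (z_det - z)^2 dz, where T(z) is
    the angular scattering power along the beamline (foils, windows, air)."""
    return np.sqrt(np.trapz(T * (z_det - z) ** 2, z))

def off_axis_fluence(r, sigma):
    """Relative off-axis fluence of the scattered Gaussian beam."""
    return np.exp(-r ** 2 / (2.0 * sigma ** 2))
```

Because A2 is a simple integral over the beamline, varying a foil thickness only rescales T(z) locally, which is what makes the ~100 ms interactive recomputation above feasible.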
Using McStas for modelling complex optics, using simple building bricks
NASA Astrophysics Data System (ADS)
Willendrup, Peter K.; Udby, Linda; Knudsen, Erik; Farhi, Emmanuel; Lefmann, Kim
2011-04-01
The McStas neutron ray-tracing simulation package is a versatile tool for producing accurate neutron simulations, extensively used for design and optimization of instruments, virtual experiments, data analysis and user training. In McStas, component organization and simulation flow are intrinsically linear: the neutron interacts with the beamline components in sequential order, one by one. Historically, a beamline component with several parts had to be implemented with a complete, internal description of all these parts, e.g. a guide component including all four mirror plates and the logic required to allow scattering between the mirrors. For quite a while, users have requested the ability to allow “components inside components”, or meta-components, making it possible to combine the functionality of several simple components to achieve more complex behaviour, i.e. four single mirror plates together defining a guide. We show here that it is now possible to define meta-components in McStas, and present a set of detailed, validated examples including a guide with an embedded, wedged, polarizing mirror system of the Helmholtz-Zentrum Berlin type.
Scaling up watershed model parameters--Flow and load simulations of the Edisto River Basin
Feaster, Toby D.; Benedict, Stephen T.; Clark, Jimmy M.; Bradley, Paul M.; Conrads, Paul
2014-01-01
The Edisto River is the longest and largest river system completely contained in South Carolina and is one of the longest free-flowing blackwater rivers in the United States. The Edisto River basin also has fish-tissue mercury concentrations that are some of the highest recorded in the United States. As part of an effort by the U.S. Geological Survey to expand the understanding of relations among hydrologic, geochemical, and ecological processes that affect fish-tissue mercury concentrations within the Edisto River basin, analyses and simulations of the hydrology of the Edisto River basin were made with the topography-based hydrological model (TOPMODEL). The potential for scaling up a previous application of TOPMODEL for the McTier Creek watershed, a small headwater catchment in the Edisto River basin, was assessed. Scaling up was done in a stepwise process beginning with applying the calibration parameters, meteorological data, and topographic wetness index data from the McTier Creek TOPMODEL to the Edisto River TOPMODEL. Additional changes were made in subsequent simulations, culminating in the best simulation, which included meteorological and topographic wetness index data from the Edisto River basin and updated values for some of the TOPMODEL calibration parameters. Comparison of goodness-of-fit statistics between measured and simulated daily mean streamflow for the two models showed that, with calibration, the Edisto River TOPMODEL produced slightly better results than the McTier Creek model, despite the significant difference in drainage-area size at the outlet locations for the two models (30.7 square miles for McTier Creek and 2,725 square miles for the Edisto River). Along with the TOPMODEL hydrologic simulations, a visualization tool (the Edisto River Data Viewer) was developed to help assess trends and influencing variables in the stream ecosystem. Incorporated into the visualization tool were the water-quality load models TOPLOAD, TOPLOAD-H, and LOADEST. Because the focus of this investigation was on scaling up the models from McTier Creek, water-quality concentrations that were previously collected in the McTier Creek basin were used in the water-quality load models.
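TOPMODEL's variable-source-area concept rests on the topographic wetness index, ln(a / tan β), which both watershed applications above carry through their topographic wetness index data. A minimal sketch of the per-cell computation (illustrative only, not the USGS code):

```python
import numpy as np

def topographic_wetness_index(upslope_area_m2, slope_rad, cell_width_m):
    """TOPMODEL topographic wetness index, ln(a / tan(beta)).

    a is the specific upslope area (contributing area per unit contour
    length) and beta the local surface slope; higher values mark
    saturation-prone (variable source) areas.
    """
    a = upslope_area_m2 / cell_width_m
    return np.log(a / np.tan(slope_rad))

# Example cell: 3000 m2 draining through a 30 m cell on a 2-degree slope
print(topographic_wetness_index(3000.0, np.radians(2.0), 30.0))
```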
A Methodology to Assess UrbanSim Scenarios
2012-09-01
Turn-based strategy games and simulations are vital tools for military training and education; the report also discusses augmented reality simulations, increased automation and artificial intelligence simulation, and massively multiplayer online games (MMOG), among other technologies. [The remainder of this DTIC record (an acronym list and report-documentation-page fields) was garbled in extraction.]
A fast and complete GEANT4 and ROOT Object-Oriented Toolkit: GROOT
NASA Astrophysics Data System (ADS)
Lattuada, D.; Balabanski, D. L.; Chesnevskaya, S.; Costa, M.; Crucillà, V.; Guardo, G. L.; La Cognata, M.; Matei, C.; Pizzone, R. G.; Romano, S.; Spitaleri, C.; Tumino, A.; Xu, Y.
2018-01-01
Present and future gamma-beam facilities represent a great opportunity to validate and evaluate the cross-sections of many photonuclear reactions at near-threshold energies. Monte Carlo (MC) simulations are very important for evaluating reaction rates and maximizing detection efficiency but, unfortunately, they can be very CPU-time consuming and in some cases very hard to reproduce, especially when exploring near-threshold cross-sections. We developed software that makes use of the validated GEANT4 tracking libraries and the n-body event generator of ROOT in order to provide a fast, reliable and complete MC tool to be used for nuclear physics experiments. This tool is intended to be used for photonuclear reactions at γ-beam facilities with ELISSA (ELI Silicon Strip Array), a new detector array under development at the Extreme Light Infrastructure - Nuclear Physics (ELI-NP). We discuss the results of MC simulations performed to evaluate the effects of the electromagnetically induced background, of the straggling due to the target thickness, and of the resolution of the silicon detectors.
Instrumental resolution of the chopper spectrometer 4SEASONS evaluated by Monte Carlo simulation
NASA Astrophysics Data System (ADS)
Kajimoto, Ryoichi; Sato, Kentaro; Inamura, Yasuhiro; Fujita, Masaki
2018-05-01
We performed simulations of the resolution function of the 4SEASONS spectrometer at J-PARC by using the Monte Carlo simulation package McStas. The simulations showed reasonably good agreement with analytical calculations of energy and momentum resolutions by using a simplified description. We implemented new functionalities in Utsusemi, the standard data analysis tool used in 4SEASONS, to enable visualization of the simulated resolution function and predict its shape for specific experimental configurations.
Fast scattering simulation tool for multi-energy x-ray imaging
NASA Astrophysics Data System (ADS)
Sossin, A.; Tabary, J.; Rebuffel, V.; Létang, J. M.; Freud, N.; Verger, L.
2015-12-01
A combination of Monte Carlo (MC) and deterministic approaches was employed as a means of creating a simulation tool capable of providing energy resolved x-ray primary and scatter images within a reasonable time interval. Libraries of Sindbad, a previously developed x-ray simulation software, were used in the development. The scatter simulation capabilities of the tool were validated through simulation with the aid of GATE and through experimentation by using a spectrometric CdTe detector. A simple cylindrical phantom with cavities and an aluminum insert was used. Cross-validation with GATE showed good agreement with a global spatial error of 1.5% and a maximum scatter spectrum error of around 6%. Experimental validation also supported the accuracy of the simulations obtained from the developed software with a global spatial error of 1.8% and a maximum error of around 8.5% in the scatter spectra.
Borzov, Egor; Daniel, Shahar; Bar‐Deroma, Raquel
2016-01-01
Total skin electron irradiation (TSEI) is a complex technique which requires many nonstandard measurements and dosimetric procedures. The purpose of this work was to validate measured dosimetry data by Monte Carlo (MC) simulations using EGSnrc-based codes (BEAMnrc and DOSXYZnrc). Our MC simulations consisted of two major steps. In the first step, the incident electron beam parameters (energy spectrum, FWHM, mean angular spread) were adjusted to match the measured data (PDD and profile) at SSD = 100 cm for an open field. In the second step, these parameters were used to calculate dose distributions at the treatment distance of 400 cm. MC simulations of dose distributions from single and dual fields at the treatment distance were performed in a water phantom, and dose distribution from the full treatment with six dual fields was simulated in a CT-based anthropomorphic phantom. MC calculations were compared to the available set of measurements used in clinical practice. For one direct field, MC-calculated PDDs agreed within 3%/1 mm with the measurements, and lateral profiles agreed within 3% with the measured data. For the OF, the measured and calculated results were within 2% agreement. The optimal angle of 17° was confirmed for the dual field setup. The MC-calculated multiplication factor (B12-factor), which relates the skin dose for the whole treatment to the dose from one calibration field, was 2.9 and 2.8 for the setups with and without a degrader, respectively; the measured B12-factor was 2.8 for both setups. The difference between calculated and measured values was within 3.5%. It was found that a degrader provides a more homogeneous dose distribution. The measured X-ray contamination for the full treatment was 0.4%, compared to 0.5% obtained with the MC calculation. Feasibility of MC simulation in an anthropomorphic phantom for a full TSEI treatment was demonstrated and is reported for the first time in the literature. The results of our MC calculations were found to be in general agreement with the measurements, providing a promising tool for further studies of dose distribution calculations in TSEI. PACS number(s): 87.10.Rt, 87.55.K, 87.55.ne PMID:27455502
FY17 Status Report on NEAMS Neutronics Activities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, C. H.; Jung, Y. S.; Smith, M. A.
2017-09-30
Under the U.S. DOE NEAMS program, the high-fidelity neutronics code system has been developed to support the multiphysics modeling and simulation capability named SHARP. The neutronics code system includes the high-fidelity neutronics code PROTEUS, the cross section library and preprocessing tools, the multigroup cross section generation code MC2-3, the in-house mesh generation tool, the perturbation and sensitivity analysis code PERSENT, and post-processing tools. The main objectives of the NEAMS neutronics activities in FY17 are to continue development of an advanced nodal solver in PROTEUS for use in nuclear reactor design and analysis projects, implement a simplified sub-channel based thermal-hydraulic (T/H) capability into PROTEUS to efficiently compute the thermal feedback, improve the performance of PROTEUS-MOCEX using numerical acceleration and code optimization, improve the cross section generation tools including MC2-3, and continue to perform verification and validation tests for PROTEUS.
Radio Frequency Scanning and Simulation of Oriented Strand Board Material Property
NASA Astrophysics Data System (ADS)
Liu, Xiaojian; Zhang, Jilei; Steele, Philip H.; Donohoe, J. Patrick
2008-02-01
Oriented strandboard (OSB) is a wood composite product with the largest market share in U.S. residential and commercial construction. Wood specific gravity (SG) and moisture content (MC) play an important role in the OSB manufacturing process. They are two of the critical variables that manufacturers are required to monitor, locate, and control in order to produce a product with consistent quality. In this study, radio frequency scanning nondestructive evaluation (NDE) technology was used to evaluate the local-area MC and SG of OSB panels following panel production by hot pressing. A finite element software simulation tool was used to optimize the sensor geometry and to investigate the interaction between the electromagnetic field and the wood dielectric properties. Our results indicate the RF scanning response is closely correlated to the MC and SG variations in OSB panels. Radio frequency NDE appears to have potential as an effective method for ensuring OSB panel quality during manufacturing.
Peer-to-peer Monte Carlo simulation of photon migration in topical applications of biomedical optics
NASA Astrophysics Data System (ADS)
Doronin, Alexander; Meglinski, Igor
2012-09-01
In the framework of further development of the unified approach to photon migration in complex turbid media, such as biological tissues, we present a peer-to-peer (P2P) Monte Carlo (MC) code. Object-oriented programming is used to generalize the MC model for multipurpose use in various applications of biomedical optics. The online user interface providing multiuser access is developed using modern web technologies, such as Microsoft Silverlight and ASP.NET. The emerging P2P network, utilizing computers with different types of compute unified device architecture (CUDA)-capable graphics processing units (GPUs), is applied for acceleration and to overcome the limitations imposed by multiuser access in the online MC computational tool. The developed P2P MC was validated by comparing the results of simulation of diffuse reflectance and fluence rate distribution for a semi-infinite scattering medium with known analytical results, results of the adding-doubling method, and other GPU-based MC techniques developed in the past. The best speedup of processing multiuser requests, in a range of 4 to 35 s, was achieved using single-precision computing, while double-precision computing for floating-point arithmetic operations provides higher accuracy.
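The kernel of any such photon-migration MC is small: exponential free paths and an implicit-capture weight reduction at each interaction. A toy 1D sketch of that core loop (single-threaded Python, purely illustrative of what the GPU kernels parallelize; coefficients and binning are arbitrary):

```python
import math, random

def photon_depth_dose(mu_a, mu_s, n_photons=10_000, dz=0.1, zmax=5.0):
    """Minimal 1D photon-migration MC with implicit capture: photons
    random-walk along z with exponential free paths and direction
    flips; the weight lost at each event is tallied into a depth
    histogram (a toy version of a fluence/absorption map).
    """
    mu_t = mu_a + mu_s
    albedo = mu_s / mu_t
    tally = [0.0] * int(zmax / dz)
    for _ in range(n_photons):
        z, uz, w = 0.0, 1.0, 1.0
        while w > 1e-4:
            z += uz * (-math.log(1.0 - random.random()) / mu_t)
            if z < 0.0 or z >= zmax:
                break                        # photon left the slab
            tally[int(z / dz)] += w * (1.0 - albedo)
            w *= albedo                      # implicit capture
            uz = random.choice((-1.0, 1.0))  # isotropic in 1D
    return tally

print(sum(photon_depth_dose(mu_a=0.1, mu_s=10.0)))  # total absorbed weight
```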
NASA Astrophysics Data System (ADS)
Jung, Hyunuk; Shin, Jungsuk; Chung, Kwangzoo; Han, Youngyih; Kim, Jinsung; Choi, Doo Ho
2015-05-01
The aim of this study was to develop an independent dose verification system using a Monte Carlo (MC) calculation method for intensity modulated radiation therapy (IMRT) conducted with a Varian Novalis Tx (Varian Medical Systems, Palo Alto, CA, USA) equipped with a high-definition multi-leaf collimator (HD-120 MLC). The Geant4 framework was used to implement a dose calculation system that accurately predicted the delivered dose. For this purpose, the Novalis Tx Linac head was modeled according to the specifications acquired from the manufacturer. Subsequently, MC simulations were performed by varying the mean energy, energy spread, and electron spot radius to determine optimum values for irradiation with 6-MV X-ray beams from the Novalis Tx system. Computed percentage depth dose curves (PDDs) and lateral profiles were compared to measurements obtained with an ionization chamber (CC13). To validate the IMRT simulation with the MC model we developed, we calculated a simple IMRT field and compared the result with EBT3 film measurements in a water-equivalent solid phantom. Clinical cases, such as prostate cancer treatment plans, were then selected, and MC simulations were performed. The accuracy of the simulation was assessed against the EBT3 film measurements by using a gamma-index criterion. The optimal MC model parameters to specify the beam characteristics were a 6.8-MeV mean energy, a 0.5-MeV energy spread, and a 3-mm electron spot radius. The accuracy of these parameters was determined by comparison of MC simulations with measurements. The PDDs and the lateral profiles of the MC simulation deviated from the measurements by 1% and 2%, respectively, on average. The computed simple MLC fields agreed with the EBT3 measurements with a 95% passing rate under a 3%/3-mm gamma-index criterion. Additionally, in applying our model to clinical IMRT plans, we found that the MC calculations and the EBT3 measurements agreed well, with a passing rate greater than 95% on average under a 3%/3-mm gamma-index criterion. In summary, the Novalis Tx Linac head equipped with an HD-120 MLC was successfully modeled on the Geant4 platform, and the model was successfully validated by comparisons with measurements. The MC model we have developed can be a useful tool for pretreatment quality assurance of IMRT plans and for commissioning of radiotherapy treatment planning.
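The 3%/3-mm gamma-index test used above for film comparison combines a dose-difference criterion with a distance-to-agreement criterion. A minimal 1D sketch of the computation (global normalization; clinical tools evaluate this in 2D/3D with interpolation):

```python
import numpy as np

def gamma_index_1d(dose_ref, dose_eval, x_mm, dd=0.03, dta_mm=3.0):
    """1D global gamma index with a 3%/3 mm criterion.

    For each reference point, gamma is the minimum over evaluated
    points of sqrt((dx/DTA)^2 + (dD/DD)^2); a point passes if gamma <= 1.
    """
    d_max = dose_ref.max()
    gammas = np.empty_like(dose_ref)
    for i, (xi, di) in enumerate(zip(x_mm, dose_ref)):
        dx = (x_mm - xi) / dta_mm
        dD = (dose_eval - di) / (dd * d_max)
        gammas[i] = np.sqrt(dx**2 + dD**2).min()
    return gammas

x = np.linspace(-50.0, 50.0, 201)
ref = np.exp(-x**2 / 800.0)                       # toy beam profile
passing = (gamma_index_1d(ref, ref * 1.02, x) <= 1.0).mean()
print(passing)                                     # ~1.0 for a 2% scale error
```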
SU-F-J-95: Impact of Shape Complexity On the Accuracy of Gradient-Based PET Volume Delineation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dance, M; Wu, G; Gao, Y
2016-06-15
Purpose: Explore the correlation of tumor shape complexity with PET target volume accuracy when delineated with a gradient-based segmentation tool. Methods: A total of 24 clinically realistic digital PET Monte Carlo (MC) phantoms of NSCLC were used in the study. The phantoms simulated 29 thoracic lesions (lung primary and mediastinal lymph nodes) of varying size, shape, location, and 18F-FDG activity. A program was developed to calculate a curvature vector along the outline, and the standard deviation of this vector was used as a metric to quantify a shape's “complexity score”. This complexity score was calculated for standard geometric shapes and MC-generated target volumes in PET phantom images. All lesions were contoured using a commercially available gradient-based segmentation tool, and the differences in volume from the MC-generated volumes were calculated as the measure of the accuracy of segmentation. Results: The average absolute percent difference in volumes between the MC volumes and gradient-based volumes was 11% (0.4%–48.4%). The complexity score showed strong correlation with standard geometric shapes. However, no relationship was found between the complexity score and the accuracy of segmentation by the gradient-based tool on MC-simulated tumors (R² = 0.156). When the lesions were grouped into primary lung lesions and mediastinal/mediastinal-adjacent lesions, the average absolute percent differences in volumes were 6% and 29%, respectively. The former group is more isolated and the latter is more surrounded by tissue with relatively high SUV background. Conclusion: The shape complexity of NSCLC lesions has little effect on the accuracy of the gradient-based segmentation method and thus is not a good predictor of uncertainty in target volume delineation. Location of a lesion within a relatively high SUV background may play a more significant role in the accuracy of gradient-based segmentation.
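The "complexity score" above is the standard deviation of a curvature vector along the outline. One plausible discretization of that metric (the exact formulation is not spelled out in the abstract, so this sketch is an assumption):

```python
import numpy as np

def complexity_score(contour_xy):
    """Shape 'complexity score' as the standard deviation of the signed
    curvature sampled along a closed outline (one possible discretization).
    """
    x, y = contour_xy[:, 0], contour_xy[:, 1]
    dx, dy = np.gradient(x), np.gradient(y)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    curvature = (dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5
    return np.std(curvature)

theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
circle = np.c_[np.cos(theta), np.sin(theta)]
print(complexity_score(circle))   # ~0 for a circle, larger for lobed shapes
```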
Chang, Hsiao-Han; Worby, Colin J.; Yeka, Adoke; Nankabirwa, Joaniter; Kamya, Moses R.; Staedke, Sarah G.; Hubbart, Christina; Amato, Roberto; Kwiatkowski, Dominic P.
2017-01-01
As many malaria-endemic countries move towards elimination of Plasmodium falciparum, the most virulent human malaria parasite, effective tools for monitoring malaria epidemiology are urgent priorities. P. falciparum population genetic approaches offer promising tools for understanding transmission and spread of the disease, but a high prevalence of multi-clone or polygenomic infections can render estimation of even the most basic parameters, such as allele frequencies, challenging. A previous method, COIL, was developed to estimate complexity of infection (COI) from single nucleotide polymorphism (SNP) data, but relies on monogenomic infections to estimate allele frequencies or requires external allele frequency data which may not be available. Estimates limited to monogenomic infections may not be representative, however, and when the average COI is high, they can be difficult or impossible to obtain. Therefore, we developed THE REAL McCOIL, Turning HEterozygous SNP data into Robust Estimates of ALlele frequency, via Markov chain Monte Carlo, and Complexity Of Infection using Likelihood, to incorporate polygenomic samples and simultaneously estimate allele frequency and COI. This approach was tested via simulations and then applied to SNP data from cross-sectional surveys performed in three Ugandan sites with varying malaria transmission. We show that THE REAL McCOIL consistently outperforms COIL on simulated data, particularly when most infections are polygenomic. Using field data we show that, unlike with COIL, we can distinguish epidemiologically relevant differences in COI between and within these sites. Surprisingly, for example, we estimated high average COI in a peri-urban subregion with lower transmission intensity, suggesting that many of these cases were imported from surrounding regions with higher transmission intensity. THE REAL McCOIL therefore provides a robust tool for understanding the molecular epidemiology of malaria across transmission settings. PMID:28125584
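The statistical leverage in such COI estimators comes from a simple relation: a biallelic site appears heterozygous unless every one of the c clones happens to carry the same allele. A sketch of that likelihood building block (the full method wraps this in MCMC over all loci and samples):

```python
import numpy as np

def p_heterozygous(allele_freq, coi):
    """Probability that a biallelic SNP appears heterozygous in an
    infection carrying `coi` independent parasite clones: the site is
    heterozygous unless all clones share the same allele.
    """
    p = np.asarray(allele_freq)
    return 1.0 - p**coi - (1.0 - p)**coi

# With a 50% allele frequency, 3 clones yield a heterozygous call 75% of the time
print(p_heterozygous(0.5, 3))   # 0.75
```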
Concepts for dose determination in flat-detector CT
NASA Astrophysics Data System (ADS)
Kyriakou, Yiannis; Deak, Paul; Langner, Oliver; Kalender, Willi A.
2008-07-01
Flat-detector computed tomography (FD-CT) scanners provide large irradiation fields of typically 200 mm in the cranio-caudal direction. In consequence, dose assessment according to the current definition of the computed tomography dose index, CTDI(L = 100 mm) where L is the integration length, would demand larger ionization chambers and phantoms, which do not appear practical. We investigated the usefulness of the CTDI concept and practical dosimetry approaches for FD-CT by measurements and Monte Carlo (MC) simulations. An MC simulation tool (ImpactMC, VAMP GmbH, Erlangen, Germany) was used to assess the dose characteristics and was calibrated with measurements of air kerma. For validation purposes, measurements were performed on an Axiom Artis C-arm system (Siemens Medical Solutions, Forchheim, Germany) equipped with a flat detector of 40 cm × 30 cm. The dose was assessed for 70 kV and 125 kV in cylindrical PMMA phantoms of 160 mm and 320 mm diameter with a varying phantom length from 150 to 900 mm. MC simulation results were compared to the values obtained with calibrated ionization chambers of 100 mm and 250 mm length and to thermoluminescence (TLD) dose profiles. The MC simulations were used to calculate the efficiency of the CTDI(L) determination with respect to the desired CTDI∞. Both the MC simulation results and the dose distributions obtained by MC simulation were in very good agreement with the CTDI measurements and with the reference TLD profiles, respectively, to within 5%. Standard CTDI phantoms, which have a z-extent of 150 mm, underestimate the dose at the center by up to 55%, whereas a z-extent of ≥600 mm appears to be sufficient for FD-CT; the baseline value of the respective profile was within 1% of the reference baseline. As expected, the measurements with ionization chambers of 100 mm and 250 mm offer limited accuracy, whereas an increased integration length of ≥600 mm appeared to be necessary to approximate CTDI∞ to within 1%. MC simulations appear to offer a practical and accurate way of assessing conversion factors for arbitrary dosimetry setups using a standard pencil chamber to provide estimates of CTDI∞. This would eliminate the need for extra-long phantoms and ionization chambers or excessive amounts of TLDs.
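The quantities involved are all integrals of the longitudinal dose profile, CTDI(L) = (1/nT) ∫ D(z) dz over [-L/2, +L/2], so the "efficiency" of a finite integration length is just a ratio of two such integrals. A numerical sketch with a toy profile (shape and lengths illustrative only):

```python
import numpy as np

def ctdi(dose_mGy, z_mm, n_slices, slice_mm, L_mm=None):
    """CTDI(L) = (1 / (n*T)) * integral of D(z) over [-L/2, L/2].

    Integrating the full long profile approximates CTDI_inf; cropping
    to L = 100 mm mimics the standard pencil-chamber measurement.
    """
    if L_mm is not None:
        mask = np.abs(z_mm) <= L_mm / 2.0
        z_mm, dose_mGy = z_mm[mask], dose_mGy[mask]
    return np.trapz(dose_mGy, z_mm) / (n_slices * slice_mm)

z = np.linspace(-450, 450, 2001)          # 900 mm long phantom
profile = np.exp(-np.abs(z) / 120.0)      # toy profile with scatter tails
efficiency = ctdi(profile, z, 1, 100, L_mm=100) / ctdi(profile, z, 1, 100)
print(efficiency)                          # fraction of CTDI_inf captured at L=100 mm
```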
WE-H-BRA-04: Biological Geometries for the Monte Carlo Simulation Toolkit TOPAS-nBio
DOE Office of Scientific and Technical Information (OSTI.GOV)
McNamara, A; Held, K; Paganetti, H
2016-06-15
Purpose: New advances in radiation therapy are most likely to come from the complex interface of physics, chemistry and biology. Computational simulations offer a powerful tool for quantitatively investigating radiation interactions with biological tissue and can thus help bridge the gap between physics and biology. The aim of TOPAS-nBio is to provide a comprehensive tool to generate advanced radiobiology simulations. Methods: TOPAS wraps and extends the Geant4 Monte Carlo (MC) simulation toolkit. TOPAS-nBio is an extension to TOPAS which utilizes the physics processes in Geant4-DNA to model biological damage from very low energy secondary electrons. Specialized cell, organelle and molecular geometries were designed for the toolkit. Results: TOPAS-nBio gives the user the capability of simulating biological geometries, ranging from the micron scale (e.g. cells and organelles) to complex nano-scale geometries (e.g. DNA and proteins). The user interacts with TOPAS-nBio through easy-to-use input parameter files. For example, in a simple cell simulation the user can specify the cell type and size as well as the type, number and size of included organelles. For more detailed nuclear simulations, the user can specify chromosome territories containing chromatin fiber loops, the latter comprising nucleosomes on a double helix. The chromatin fibers can be arranged in simple rigid geometries or within fractal globules, mimicking realistic chromosome territories. TOPAS-nBio also provides users with the capability of reading Protein Data Bank 3D structural files to simulate radiation damage to proteins or nucleic acids, e.g. histones or RNA. TOPAS-nBio has been validated by comparing results to other track-structure simulation software and published experimental measurements. Conclusion: TOPAS-nBio provides users with a comprehensive MC simulation tool for radiobiological simulations, giving users without advanced programming skills the ability to design and run complex simulations.
kmos: A lattice kinetic Monte Carlo framework
NASA Astrophysics Data System (ADS)
Hoffmann, Max J.; Matera, Sebastian; Reuter, Karsten
2014-07-01
Kinetic Monte Carlo (kMC) simulations have emerged as a key tool for microkinetic modeling in heterogeneous catalysis and other materials applications. Systems, where site-specificity of all elementary reactions allows a mapping onto a lattice of discrete active sites, can be addressed within the particularly efficient lattice kMC approach. To this end we describe the versatile kmos software package, which offers a most user-friendly implementation, execution, and evaluation of lattice kMC models of arbitrary complexity in one- to three-dimensional lattice systems, involving multiple active sites in periodic or aperiodic arrangements, as well as site-resolved pairwise and higher-order lateral interactions. Conceptually, kmos achieves a maximum runtime performance which is essentially independent of lattice size by generating code for the efficiency-determining local update of available events that is optimized for a defined kMC model. For this model definition and the control of all runtime and evaluation aspects kmos offers a high-level application programming interface. Usage proceeds interactively, via scripts, or a graphical user interface, which visualizes the model geometry, the lattice occupations and rates of selected elementary reactions, while allowing on-the-fly changes of simulation parameters. We demonstrate the performance and scaling of kmos with the application to kMC models for surface catalytic processes, where for given operation conditions (temperature and partial pressures of all reactants) central simulation outcomes are catalytic activity and selectivities, surface composition, and mechanistic insight into the occurrence of individual elementary processes in the reaction network.
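The efficiency-determining update that kmos generates optimized code for is the classic rejection-free kMC step: pick one event with probability proportional to its rate, then advance the clock by an exponentially distributed increment. A generic sketch of that step (not kmos's own API, which works through its lattice/project classes):

```python
import math, random

def kmc_step(rates):
    """One rejection-free (Gillespie/BKL) kinetic Monte Carlo step:
    select an event with probability proportional to its rate and
    draw the corresponding exponential time increment.
    """
    k_tot = sum(rates)
    r = random.random() * k_tot
    acc, chosen = 0.0, 0
    for i, k in enumerate(rates):
        acc += k
        if r < acc:
            chosen = i
            break
    dt = -math.log(1.0 - random.random()) / k_tot
    return chosen, dt

# e.g. adsorption, diffusion, and reaction events with toy rates (1/s)
event, dt = kmc_step([1e3, 5e2, 1e1])
print(event, dt)
```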
Analysis of Error Propagation Within Hierarchical Air Combat Models
2016-06-01
A thesis analyzing error propagation within hierarchical air combat models, built around a model of a two-versus-two air engagement between jet fighters in the stochastic, agent-based Map Aware Non-uniform Automata (MANA) simulation. [The remainder of this DTIC record (an acronym list and citation fragments, including McIntosh's MANA-V supplementary manual) was garbled in extraction.]
Xu, Yuan; Bai, Ti; Yan, Hao; Ouyang, Luo; Pompos, Arnold; Wang, Jing; Zhou, Linghong; Jiang, Steve B.; Jia, Xun
2015-01-01
Cone-beam CT (CBCT) has become the standard image guidance tool for patient setup in image-guided radiation therapy. However, due to its large illumination field, scattered photons severely degrade its image quality. While kernel-based scatter correction methods have been used routinely in the clinic, it is still desirable to develop Monte Carlo (MC) simulation-based methods due to their accuracy. However, the high computational burden of the MC method has prevented routine clinical application. This paper reports our recent development of a practical method of MC-based scatter estimation and removal for CBCT. In contrast with conventional MC approaches that estimate scatter signals using a scatter-contaminated CBCT image, our method used a planning CT image for MC simulation, which has the advantages of accurate image intensity and absence of image truncation. In our method, the planning CT was first rigidly registered with the CBCT. Scatter signals were then estimated via MC simulation. After scatter signals were removed from the raw CBCT projections, a corrected CBCT image was reconstructed. The entire workflow was implemented on a GPU platform for high computational efficiency. Strategies such as projection denoising, CT image downsampling, and interpolation along the angular direction were employed to further enhance the calculation speed. We studied the impact of key parameters in the workflow on the resulting accuracy and efficiency, based on which the optimal parameter values were determined. Our method was evaluated in numerical simulation, phantom, and real patient cases. In the simulation cases, our method reduced mean HU errors from 44 HU to 3 HU and from 78 HU to 9 HU in the full-fan and the half-fan cases, respectively. In both the phantom and the patient cases, image artifacts caused by scatter, such as ring artifacts around the bowtie area, were reduced. With all the techniques employed, we achieved computation time of less than 30 sec including the time for both the scatter estimation and CBCT reconstruction steps. The efficacy of our method and its high computational efficiency make our method attractive for clinical use. PMID:25860299
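The subtraction stage of this workflow is straightforward once the MC scatter estimate exists. A sketch of that one step (the smoothing kernel and clipping are illustrative choices; CT-to-CBCT registration, the GPU MC itself, and reconstruction are separate stages in the published workflow):

```python
import numpy as np

def scatter_correct(raw_projection, mc_scatter, kernel=5):
    """Subtraction step of MC-based CBCT scatter removal: smooth the
    (noisy) MC scatter estimate along detector rows, subtract it from
    the raw projection, and clip negative values.
    """
    k = np.ones(kernel) / kernel
    smoothed = np.apply_along_axis(
        lambda row: np.convolve(row, k, mode="same"), 1, mc_scatter)
    return np.clip(raw_projection - smoothed, 0.0, None)

raw = np.full((4, 16), 100.0)                               # toy projection
scatter = 20.0 + np.random.default_rng(0).normal(0, 2, raw.shape)
print(scatter_correct(raw, scatter).mean())                  # ~80 after removal
```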
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wiebe, J; Department of Physics and Astronomy, University of Calgary, Calgary, AB; Ploquin, N
2014-08-15
Monte Carlo (MC) simulation is accepted as the most accurate method to predict dose deposition when compared to other methods in radiation treatment planning. Current dose calculation algorithms used for treatment planning can become inaccurate when small radiation fields and tissue inhomogeneities are present. At our centre the Novalis Classic linear accelerator (linac) is used for Stereotactic Radiosurgery (SRS). The first MC model to date of the Novalis Classic linac was developed at our centre using the Geant4 Application for Tomographic Emission (GATE) simulation platform. GATE is relatively new, open-source MC software built from CERN's Geometry and Tracking 4 (Geant4) toolkit. The linac geometry was modeled using manufacturer specifications, as well as in-house measurements of the micro MLCs. Among multiple model parameters, the initial electron beam was adjusted so that calculated depth dose curves agreed with measured values. Simulations were run on the European Grid Infrastructure through GateLab. Simulation time is approximately 8 hours on GateLab for a complete head model simulation to acquire a phase space file. Current results have a majority of points within 3% of the measured dose values for square field sizes ranging from 6×6 mm² to 98×98 mm² (maximum field size on the Novalis Classic linac) at 100 cm SSD. The x-ray spectrum was determined from the MC data as well. The model provides an investigation into GATE's capabilities and has the potential to be used as a research tool and an independent dose calculation engine for clinical treatment plans.
European Space Agency (ESA) Mission Specialist Nicollier trains in JSC's WETF
NASA Technical Reports Server (NTRS)
1987-01-01
European Space Agency (ESA) Mission Specialist (MS) Claude Nicollier (left) is briefed by Randall S. McDaniel on Space Shuttle extravehicular activity (EVA) tools and equipment prior to donning an extravehicular mobility unit and participating in an underwater EVA simulation in JSC's Weightless Environment Training Facility (WETF) Bldg 29 pool. Nicollier is holding the EMU mini workstation. Other equipment on the table includes EVA tool caddies and EVA crewmember safety tethers.
NASA Astrophysics Data System (ADS)
Lukman, Iing; Ibrahim, Noor A.; Daud, Isa B.; Maarof, Fauziah; Hassan, Mohd N.
2002-03-01
Survival analysis algorithms are often applied in the data mining process. Cox regression is one of the survival analysis tools that has been used in many areas, and it can be used to analyze the failure times of aircraft crashes. Another survival analysis tool is competing risks, where we have more than one cause of failure acting simultaneously. Lunn and McNeil analyzed competing risks in the survival model using Cox regression with censored data. The modified Lunn-McNeil technique is a simplification of the Lunn-McNeil technique. The Kalbfleisch-Prentice technique involves fitting models separately for each type of failure, treating other failure types as censored. To compare the two techniques (the modified Lunn-McNeil and the Kalbfleisch-Prentice), a simulation study was performed. Samples with various sizes and censoring percentages were generated and fitted using both techniques. The study was conducted by comparing the inference of the models using the Root Mean Square Error (RMSE), power tests, and Schoenfeld residual analysis. The power tests in this study were the likelihood ratio test, the Rao score test, and the Wald statistic. The Schoenfeld residual analysis was conducted to check the proportionality of the model through its covariates. The estimated parameters were computed for the cause-specific hazard situation. Results showed that the modified Lunn-McNeil technique was better than the Kalbfleisch-Prentice technique based on the RMSE measurement and Schoenfeld residual analysis. However, the Kalbfleisch-Prentice technique was better than the modified Lunn-McNeil technique based on the power test measurements.
Space Object Collision Probability via Monte Carlo on the Graphics Processing Unit
NASA Astrophysics Data System (ADS)
Vittaldev, Vivek; Russell, Ryan P.
2017-09-01
Fast and accurate collision probability computations are essential for protecting space assets. Monte Carlo (MC) simulation is the most accurate but computationally intensive method. A Graphics Processing Unit (GPU) is used to parallelize the computation and reduce the overall runtime. Using MC techniques to compute the collision probability is common in the literature as the benchmark. An optimized implementation on the GPU, however, is a challenging problem and is the main focus of the current work. The MC simulation takes samples from the uncertainty distributions of the Resident Space Objects (RSOs) at any time during a time window of interest and outputs the separations at closest approach. Therefore, any uncertainty propagation method may be used, and the collision probability is automatically computed as a function of RSO collision radii. Integration using a fixed time step and a quartic interpolation after every Runge-Kutta step ensures that no close approaches are missed. Two orders of magnitude speedups over a serial CPU implementation are shown, and speedups improve moderately with higher-fidelity dynamics. The tool makes the MC approach tractable on a single workstation, and can be used as a final product, or for verifying surrogate and analytical collision probability methods.
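Stripped of the trajectory propagation, the MC collision-probability estimate is a counting exercise over the relative-state uncertainty at closest approach. A minimal CPU sketch under a Gaussian miss-distance assumption (the GPU tool's value is doing the full propagation for every sample; all numbers here are illustrative):

```python
import numpy as np

def collision_probability(mean_miss_km, cov_km2, radius_km, n=1_000_000):
    """Brute-force MC collision probability: sample the relative miss
    vector at closest approach from a Gaussian and count samples that
    fall inside the combined collision radius.
    """
    rng = np.random.default_rng(1)
    samples = rng.multivariate_normal(mean_miss_km, cov_km2, size=n)
    hits = (np.linalg.norm(samples, axis=1) < radius_km).sum()
    return hits / n

pc = collision_probability([0.5, 0.2], [[0.04, 0.0], [0.0, 0.09]], 0.01)
print(pc)
```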
Simulation of streamflow in the McTier Creek watershed, South Carolina
Feaster, Toby D.; Golden, Heather E.; Odom, Kenneth R.; Lowery, Mark A.; Conrads, Paul; Bradley, Paul M.
2010-01-01
The McTier Creek watershed is located in the Sand Hills ecoregion of South Carolina and is a small catchment within the Edisto River Basin. Two watershed hydrology models were applied to the McTier Creek watershed as part of a larger scientific investigation to expand the understanding of relations among hydrologic, geochemical, and ecological processes that affect fish-tissue mercury concentrations within the Edisto River Basin. The two models are the topography-based hydrological model (TOPMODEL) and the grid-based mercury model (GBMM). TOPMODEL uses the variable-source area concept for simulating streamflow, and GBMM uses a spatially explicit modified curve-number approach for simulating streamflow. The hydrologic output from TOPMODEL can be used explicitly to simulate the transport of mercury in separate applications, whereas the hydrology output from GBMM is used implicitly in the simulation of mercury fate and transport in GBMM. The modeling efforts were a collaboration between the U.S. Geological Survey and the U.S. Environmental Protection Agency, National Exposure Research Laboratory. Calibrations of TOPMODEL and GBMM were done independently while using the same meteorological data and the same period of record of observed data. Two U.S. Geological Survey streamflow-gaging stations were available for comparison of observed daily mean flow with simulated daily mean flow: station 02172300, McTier Creek near Monetta, South Carolina, and station 02172305, McTier Creek near New Holland, South Carolina. The period of record at the Monetta gage covers a broad range of hydrologic conditions, including a drought and a significant wet period. Calibrating the models under these extreme conditions along with the normal flow conditions included in the record enhances the robustness of the two models. Several quantitative assessments of the goodness of fit between model simulations and the observed daily mean flows were done. These included the Nash-Sutcliffe coefficient of model-fit efficiency index, Pearson's correlation coefficient, the root mean square error, the bias, and the mean absolute error. In addition, a number of graphical tools were used to assess how well the models captured the characteristics of the observed data at the Monetta and New Holland streamflow-gaging stations. The graphical tools included temporal plots of simulated and observed daily mean flows, flow-duration curves, single-mass curves, and various residual plots. The results indicated that TOPMODEL and GBMM generally produced simulations that reasonably capture the quantity, variability, and timing of the observed streamflow. For the periods modeled, the total volume of simulated daily mean flows as compared to the total volume of the observed daily mean flow from TOPMODEL was within 1 to 5 percent, and the total volume from GBMM was within 1 to 10 percent. A noticeable characteristic of the simulated hydrographs from both models is the complexity of balancing groundwater recession and flow at the streamgage when flows peak and recede rapidly. However, GBMM results indicate that groundwater recession, which affects the receding limb of the hydrograph, was more difficult to estimate with the spatially explicit curve number approach.
Although the purpose of this report is not to directly compare both models, given the characteristics of the McTier Creek watershed and the fact that GBMM uses the spatially explicit curve number approach as compared to the variable-source-area concept in TOPMODEL, GBMM was able to capture the flow characteristics reasonably well.
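Of the goodness-of-fit statistics listed above, the Nash-Sutcliffe efficiency is the least self-explanatory: it compares model residuals against the variance of the observations, so 1 is a perfect match and 0 means the model does no better than the observed mean. A compact sketch:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe model-fit efficiency:
    NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
    """
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2)

print(nash_sutcliffe([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8]))
```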
Subtle Monte Carlo Updates in Dense Molecular Systems.
Bottaro, Sandro; Boomsma, Wouter; E Johansson, Kristoffer; Andreetta, Christian; Hamelryck, Thomas; Ferkinghoff-Borg, Jesper
2012-02-14
Although Markov chain Monte Carlo (MC) simulation is a potentially powerful approach for exploring conformational space, it has been unable to compete with molecular dynamics (MD) in the analysis of high density structural states, such as the native state of globular proteins. Here, we introduce a kinetic algorithm, CRISP, that greatly enhances the sampling efficiency in all-atom MC simulations of dense systems. The algorithm is based on an exact analytical solution to the classic chain-closure problem, making it possible to express the interdependencies among degrees of freedom in the molecule as correlations in a multivariate Gaussian distribution. We demonstrate that our method reproduces structural variation in proteins with greater efficiency than current state-of-the-art Monte Carlo methods and has real-time simulation performance on par with molecular dynamics simulations. The presented results suggest our method as a valuable tool in the study of molecules in atomic detail, offering a potential alternative to molecular dynamics for probing long time-scale conformational transitions.
'spup' - an R package for uncertainty propagation in spatial environmental modelling
NASA Astrophysics Data System (ADS)
Sawicka, Kasia; Heuvelink, Gerard
2016-04-01
Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability, including case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. Selected static and interactive visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy-to-apply tool and can be used in multi-disciplinary research and model-based decision support.
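The propagation core that 'spup' implements can be illustrated in a few lines: stratified (Latin hypercube) sampling of uncertain inputs, one model run per realization, and summary statistics on the output. A Python sketch of the idea (spup itself is an R package; the two-parameter model and its distributions here are hypothetical):

```python
import numpy as np
from scipy.stats import norm

def lhs_normal(n, mean, sd, rng):
    """Latin hypercube sample of a normal variable: one draw per
    equal-probability stratum, shuffled to break ordering."""
    u = (np.arange(n) + rng.random(n)) / n
    rng.shuffle(u)
    return norm.ppf(np.clip(u, 1e-12, 1.0 - 1e-12), loc=mean, scale=sd)

def propagate(model, n=10_000, seed=42):
    """MC uncertainty propagation: sample uncertain inputs, run the
    model per realization, summarize the output distribution."""
    rng = np.random.default_rng(seed)
    a = lhs_normal(n, 2.0, 0.3, rng)   # hypothetical uncertain input
    b = lhs_normal(n, 5.0, 1.0, rng)   # hypothetical uncertain parameter
    out = model(a, b)
    return out.mean(), out.std()

mean, sd = propagate(lambda a, b: a * b + np.sqrt(b))
print(mean, sd)
```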
'spup' - an R package for uncertainty propagation analysis in spatial environmental modelling
NASA Astrophysics Data System (ADS)
Sawicka, Kasia; Heuvelink, Gerard
2017-04-01
Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability or the ability to deal with case studies involving spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. Selected visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy-to-apply tool and can be used in multi-disciplinary research and model-based decision support.
D'Amours, Michel; Pouliot, Jean; Dagnault, Anne; Verhaegen, Frank; Beaulieu, Luc
2011-12-01
Brachytherapy planning software relies on the Task Group report 43 dosimetry formalism. This formalism, based on a water approximation, neglects various heterogeneous materials present during treatment. Various studies have suggested that these heterogeneities should be taken into account to improve the treatment quality. The present study sought to demonstrate the feasibility of incorporating Monte Carlo (MC) dosimetry within an inverse planning algorithm to improve the dose conformity and increase the treatment quality. The method was based on precalculated dose kernels in full patient geometries, representing the dose distribution of a brachytherapy source at a single dwell position using MC simulations and the Geant4 toolkit. These dose kernels are used by the inverse planning by simulated annealing tool to produce a fast MC-based plan. A test was performed for an interstitial brachytherapy breast treatment using two different high-dose-rate brachytherapy sources: the microSelectron iridium-192 source and the electronic brachytherapy source Axxent operating at 50 kVp. A research version of the inverse planning by simulated annealing algorithm was combined with MC to provide a method to fully account for the heterogeneities in dose optimization, using the MC method. The effect of the water approximation was found to depend on photon energy, with greater dose attenuation for the lower energies of the Axxent source compared with iridium-192. For the latter, an underdosage of 5.1% for the dose received by 90% of the clinical target volume was found. A new method to optimize afterloading brachytherapy plans that uses MC dosimetric information was developed. Including computed tomography-based information in MC dosimetry in the inverse planning process was shown to take into account the full range of scatter and heterogeneity conditions. This led to significant dose differences compared with the Task Group report 43 approach for the Axxent source. Copyright © 2011 Elsevier Inc. All rights reserved.
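The optimization loop this enables is ordinary simulated annealing over dwell times, with the expensive physics frozen into precalculated per-dwell MC dose kernels. A minimal sketch (the quadratic objective and move size are illustrative; clinical IPSA objectives include dose-volume constraints):

```python
import math, random

def anneal_dwell_times(kernels, target, n_iter=20_000, t0=1.0):
    """Simulated-annealing dwell-time optimization against precalculated
    per-dwell dose kernels: kernels[d][v] is the MC dose to voxel v per
    unit dwell time at position d.
    """
    nd, nv = len(kernels), len(target)
    times = [1.0] * nd

    def cost(t):
        return sum((sum(kernels[d][v] * t[d] for d in range(nd))
                    - target[v]) ** 2 for v in range(nv))

    current = cost(times)
    for i in range(n_iter):
        temp = t0 * (1.0 - i / n_iter) + 1e-9        # linear cooling
        d = random.randrange(nd)
        old = times[d]
        times[d] = max(0.0, old + random.uniform(-0.1, 0.1))
        trial = cost(times)
        if trial < current or random.random() < math.exp((current - trial) / temp):
            current = trial                           # accept move
        else:
            times[d] = old                            # undo move
    return times, current

kernels = [[1.0, 0.2], [0.2, 1.0]]                    # 2 dwells x 2 voxels
print(anneal_dwell_times(kernels, target=[2.0, 2.0], n_iter=5_000))
```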
NASA Astrophysics Data System (ADS)
Kurosu, Keita; Takashina, Masaaki; Koizumi, Masahiko; Das, Indra J.; Moskvin, Vadim P.
2014-10-01
Although the three general-purpose Monte Carlo (MC) simulation tools Geant4, FLUKA and PHITS have been used extensively, differences in calculation results have been reported. The major causes are the implementation of the physical models, the preset value of the ionization potential, and the definition of the maximum step size. In order to achieve artifact-free MC simulation, an optimized parameter list for each simulation system is required. Several authors have already proposed such optimized lists, but those studies were performed with simple systems such as a water phantom alone. Since particle beams undergo transport, interaction and electromagnetic processes during beam delivery, establishing an optimized parameter list for the whole beam delivery system is therefore of major importance. The purpose of this study was to determine the optimized parameter list for GATE and PHITS using a computational model of a proton treatment nozzle. The simulation was performed with a broad scanning proton beam. The influences of the customizing parameters on the percentage depth dose (PDD) profile and the proton range were investigated by comparison with the results of FLUKA, and the optimal parameters were then determined. The PDD profile and the proton range obtained with our optimized parameter list showed different characteristics from the results obtained with the simple system. This led to the conclusion that the physical model, particle transport mechanics and different geometry-based descriptions need accurate customization in planning computational experiments for artifact-free MC simulation.
Verleker, Akshay Prabhu; Shaffer, Michael; Fang, Qianqian; Choi, Mi-Ran; Clare, Susan; Stantz, Keith M
2016-12-01
Three-dimensional photon dosimetry in tissue is critical in designing optical therapeutic protocols to trigger light-activated drug release. The objective of this study is to investigate the feasibility of a Monte Carlo-based optical therapy planning software by developing dosimetry tools to characterize and cross-validate the local photon fluence in brain tissue, as part of a long-term strategy to quantify the effects of photoactivated drug release in brain tumors. An existing GPU-based 3D Monte Carlo (MC) code was modified to simulate near-infrared photon transport with differing laser beam profiles within phantoms of skull bone (B), white matter (WM), and gray matter (GM). A novel titanium-based optical dosimetry probe with isotropic acceptance was used to validate the local photon fluence, and an empirical model of photon transport was developed to significantly decrease execution time for clinical application. Differences between the MC and the dosimetry probe measurements were on average 11.27%, 13.25%, and 11.81% along the illumination beam axis, and 9.4%, 12.06%, and 8.91% perpendicular to the beam axis for the WM, GM, and B phantoms, respectively. For a heterogeneous head phantom, the measured percent errors were 17.71% and 18.04% along and perpendicular to the beam axis. The empirical algorithm was validated by probe measurements and matched the MC results (R² = 0.99), with average percent errors of 10.1%, 45.2%, and 22.1% relative to probe measurements, and 22.6%, 35.8%, and 21.9% relative to the MC, for the WM, GM, and B phantoms, respectively. The simulation time for the empirical model was 6 s, versus 8 h for the GPU-based Monte Carlo, for a head phantom simulation. These tools provide the capability to develop and optimize treatment plans for optimal release of pharmaceuticals in the treatment of cancer. Future work will test and validate these novel delivery and release mechanisms in vivo.
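A standard analytical cross-check for such fluence measurements is the infinite-medium diffusion approximation for an isotropic point source. A sketch of that closed form (a common validation reference for photon-migration MC codes, not necessarily the empirical model fitted in the study):

```python
import numpy as np

def fluence_diffusion(r_cm, mu_a, mu_s_prime):
    """Steady-state diffusion-approximation fluence of an isotropic
    point source in an infinite medium:
    phi(r) = exp(-mu_eff * r) / (4 * pi * D * r).
    """
    D = 1.0 / (3.0 * (mu_a + mu_s_prime))   # diffusion coefficient, cm
    mu_eff = np.sqrt(mu_a / D)              # effective attenuation, 1/cm
    return np.exp(-mu_eff * r_cm) / (4.0 * np.pi * D * r_cm)

# Tissue-like optical properties: mu_a = 0.1 /cm, mu_s' = 10 /cm
print(fluence_diffusion(np.array([0.5, 1.0, 2.0]), mu_a=0.1, mu_s_prime=10.0))
```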
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dias, M F; Department of Radiation Oncology, Francis H. Burr Proton Therapy Center Massachusetts General Hospital; Seco, J
Purpose: Research in carbon imaging has been growing over the past years as a way to increase treatment accuracy and patient positioning in carbon therapy. The purpose of this tool is to allow a fast and flexible way to generate CDRR data without the need to use Monte Carlo (MC) simulations. It can also be used to predict future clinically measured data. Methods: A Python interface has been developed, which uses information from CT or 4DCT and the treatment calibration curve to compute the Water Equivalent Path Length (WEPL) of carbon ions. A GPU-based ray tracing algorithm computes the WEPL of each individual carbon traveling through the CT voxels. A multiple peak detection method to estimate high-contrast margin positioning has been implemented (described elsewhere). MC simulations have been used to simulate carbon depth dose curves in order to simulate the response of a range detector. Results: The tool allows the upload of CT or 4DCT images. The user has the possibility to select the phase/slice of interest as well as the beam geometry (position, angle…). The WEPL is represented as a range detector, which can be used to assess range dilution and multiple peak detection effects. The tool also provides knowledge of the minimum energy that should be considered for imaging purposes. The multiple peak detection method has been used in a lung tumor case, showing an accuracy of 1 mm in determining the exact interface position. Conclusion: The tool offers an easy and fast way to simulate carbon imaging data. It can be used for educational and for clinical purposes, allowing the user to test beam energies and angles before real acquisition. An analysis add-on is being developed, where the user will have the opportunity to select different reconstruction methods and detector types (range or energy). Fundacao para a Ciencia e a Tecnologia (FCT), PhD Grant number SFRH/BD/85749/2012.
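The WEPL itself is a ray sum of relative stopping power times intersection length over the CT voxels, which is what the GPU ray tracer accumulates per carbon. A scalar sketch (toy RSP values; the calibration curve supplies real ones):

```python
import numpy as np

def wepl_along_ray(rsp_values, step_lengths_mm):
    """Water-equivalent path length of one carbon-ion ray:
    WEPL = sum(RSP_i * l_i) over the voxels the ray crosses.

    rsp_values: relative stopping powers from the CT calibration curve;
    step_lengths_mm: intersection length of the ray with each voxel.
    """
    return float(np.dot(rsp_values, step_lengths_mm))

# Ray crossing 3 mm of soft tissue, 10 mm of lung, 3 mm of bone (toy RSPs)
print(wepl_along_ray([1.04, 0.30, 1.60], [3.0, 10.0, 3.0]))  # mm of water
```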
Investigation of Advanced Dose Verification Techniques for External Beam Radiation Treatment
NASA Astrophysics Data System (ADS)
Asuni, Ganiyu Adeniyi
Intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT) have been introduced in radiation therapy to achieve highly conformal dose distributions around the tumour while minimizing dose to surrounding normal tissues. These techniques have increased the need for comprehensive quality assurance tests to verify that customized patient treatment plans are accurately delivered during treatment. in vivo dose verification, performed during treatment delivery, confirms that the actual dose delivered is the same as the prescribed dose, helping to reduce treatment delivery errors. in vivo measurements may be accomplished using entrance or exit detectors. The objective of this project is to investigate a novel entrance detector designed for in vivo dose verification. This thesis is separated into three main investigations, focusing on a prototype entrance transmission detector (TRD) developed by IBA Dosimetry, Germany. First, contaminant electrons generated by the TRD in a 6 MV photon beam were investigated using Monte Carlo (MC) simulation. This study demonstrates that modification of the contaminant electron model in the treatment planning system is required for accurate patient dose calculation in buildup regions when using the device. Second, the ability of the TRD to accurately measure dose from IMRT and VMAT was investigated by characterising the spatial resolution of the device. This was accomplished by measuring the point spread function, with further validation provided by MC simulation. Comparisons of measured and calculated doses show that the spatial resolution of the TRD allows for measurement of clinical IMRT fields within acceptable tolerance. Finally, a new general research tool was developed to perform MC simulations for VMAT and IMRT treatments, simultaneously tracking dose deposition in both the patient CT geometry and an arbitrary planar detector system, generalized to handle either entrance or exit orientations. It was demonstrated that the tool accurately simulates dose to the patient CT and planar detector geometries. The tool has been made freely available to the medical physics research community to help advance the development of in vivo planar detectors. In conclusion, this thesis presents several investigations that improve the understanding of a novel entrance detector designed for patient in vivo dosimetry.
NASA Astrophysics Data System (ADS)
Lerendegui-Marco, J.; Cortés-Giraldo, M. A.; Guerrero, C.; Quesada, J. M.; Meo, S. Lo; Massimi, C.; Barbagallo, M.; Colonna, N.; Mancussi, D.; Mingrone, F.; Sabaté-Gilarte, M.; Vannini, G.; Vlachoudis, V.; Aberle, O.; Andrzejewski, J.; Audouin, L.; Bacak, M.; Balibrea, J.; Bečvář, F.; Berthoumieux, E.; Billowes, J.; Bosnar, D.; Brown, A.; Caamaño, M.; Calviño, F.; Calviani, M.; Cano-Ott, D.; Cardella, R.; Casanovas, A.; Cerutti, F.; Chen, Y. H.; Chiaveri, E.; Cortés, G.; Cosentino, L.; Damone, L. A.; Diakaki, M.; Domingo-Pardo, C.; Dressler, R.; Dupont, E.; Durán, I.; Fernández-Domínguez, B.; Ferrari, A.; Ferreira, P.; Finocchiaro, P.; Göbel, K.; Gómez-Hornillos, M. B.; García, A. R.; Gawlik, A.; Gilardoni, S.; Glodariu, T.; Gonçalves, I. F.; González, E.; Griesmayer, E.; Gunsing, F.; Harada, H.; Heinitz, S.; Heyse, J.; Jenkins, D. G.; Jericha, E.; Käppeler, F.; Kadi, Y.; Kalamara, A.; Kavrigin, P.; Kimura, A.; Kivel, N.; Kokkoris, M.; Krtička, M.; Kurtulgil, D.; Leal-Cidoncha, E.; Lederer, C.; Leeb, H.; Lonsdale, S. J.; Macina, D.; Marganiec, J.; Martínez, T.; Masi, A.; Mastinu, P.; Mastromarco, M.; Maugeri, E. A.; Mazzone, A.; Mendoza, E.; Mengoni, A.; Milazzo, P. M.; Musumarra, A.; Negret, A.; Nolte, R.; Oprea, A.; Patronis, N.; Pavlik, A.; Perkowski, J.; Porras, I.; Praena, J.; Radeck, D.; Rauscher, T.; Reifarth, R.; Rout, P. C.; Rubbia, C.; Ryan, J. A.; Saxena, A.; Schillebeeckx, P.; Schumann, D.; Smith, A. G.; Sosnin, N. V.; Stamatopoulos, A.; Tagliente, G.; Tain, J. L.; Tarifeño-Saldivia, A.; Tassan-Got, L.; Valenta, S.; Variale, V.; Vaz, P.; Ventura, A.; Vlastou, R.; Wallner, A.; Warren, S.; Woods, P. J.; Wright, T.; Žugec, P.
2017-09-01
Monte Carlo (MC) simulations are an essential tool for determining fundamental features of a neutron beam, such as the neutron flux or the γ-ray background, that sometimes cannot be measured, or at least not at every position or energy range. Until recently, the most widely used MC codes in this field had been MCNPX and FLUKA. However, the Geant4 toolkit has also become a competitive code for the transport of neutrons after the development of the native Geant4 format for neutron data libraries, G4NDL. In this context, we present the Geant4 simulations of the neutron spallation target of the n_TOF facility at CERN, performed with version 10.1.1 of the toolkit. The first goal was the validation of the intra-nuclear cascade models implemented in the code using, as benchmark, the characteristics of the neutron beam measured at the first experimental area (EAR1), especially the neutron flux and energy distribution, and the time distribution of neutrons of equal kinetic energy, the so-called Resolution Function. The second goal was the development of a Monte Carlo tool aimed at providing useful calculations for both the analysis and planning of the upcoming measurements at the new experimental area (EAR2) of the facility.
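As a hedged illustration of the Resolution Function idea (the distribution of arrival times for neutrons of equal kinetic energy), the following Python sketch histograms time of flight for monoenergetic neutrons whose path length varies due to moderation. It is not the n_TOF analysis code; the flight path, moderation model and all numerical values are invented for illustration.

```python
# Hedged sketch: Resolution Function as the arrival-time spread of
# monoenergetic neutrons; all parameters are illustrative.
import numpy as np

M_N = 939.565e6          # neutron rest mass, eV/c^2
C = 299.792458           # speed of light, m/us
L_NOMINAL = 185.0        # assumed nominal flight path to EAR1, m

def arrival_time(energy_ev, extra_path_m):
    """Time of flight (us) for a neutron of given kinetic energy, including
    the extra path travelled inside the target-moderator assembly."""
    gamma = 1.0 + energy_ev / M_N
    beta = np.sqrt(1.0 - 1.0 / gamma**2)
    return (L_NOMINAL + extra_path_m) / (beta * C)

# Toy events: 1 eV neutrons with an exponentially distributed moderation path.
rng = np.random.default_rng(0)
extra_path = rng.exponential(0.02, size=100_000)      # m
times = arrival_time(1.0, extra_path)

# The Resolution Function for this energy bin is the normalized histogram
# of arrival times.
rf, edges = np.histogram(times, bins=200, density=True)
```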
The Ultimate Monte Carlo: Studying Cross-Sections With Cosmic Rays
NASA Technical Reports Server (NTRS)
Wilson, Thomas L.
2007-01-01
The high-energy physics community has been discussing for years the need to bring together the three principal disciplines that study hadron cross-section physics - ground-based accelerators, cosmic-ray experiments in space, and air shower research. Only recently have NASA investigators begun discussing the use of space-borne cosmic-ray payloads to bridge the gap between accelerator physics and air shower work using cosmic-ray measurements. The common tool used in these three realms of high-energy hadron physics is the Monte Carlo (MC). Yet the obvious has not been considered - using a single MC for simulating the entire relativistic energy range (GeV to EeV). The task is daunting due to large uncertainties in accelerator, space, and atmospheric cascade measurements. These include inclusive versus exclusive cross-section measurements, primary composition, interaction dynamics, and possible new physics beyond the standard model. However, the discussion of a common tool or ultimate MC might be the very thing that could begin to unify these independent groups into a common purpose. The Offline ALICE concept of a Virtual MC at CERN's Large Hadron Collider (LHC) will be discussed as a rudimentary beginning of this idea, and as a possible forum for carrying it forward in the future as LHC data emerges.
Monte Carlo simulations of backscattering process in dislocation-containing SrTiO3 single crystal
NASA Astrophysics Data System (ADS)
Jozwik, P.; Sathish, N.; Nowicki, L.; Jagielski, J.; Turos, A.; Kovarik, L.; Arey, B.
2014-05-01
Studies of defect formation in crystals are of obvious importance in electronics, nuclear engineering and other disciplines where materials are exposed to different forms of irradiation. Rutherford Backscattering/Channeling (RBS/C) and Monte Carlo (MC) simulations are the most convenient tools for this purpose, as they allow one to determine several features of lattice defects: their type, concentration and damage accumulation kinetics. On the other hand, various irradiation conditions can be efficiently modeled by the ion irradiation method without inducing radioactivity in the sample. The combination of ion irradiation with channeling experiments and MC simulations thus appears to be a most versatile method in studies of radiation damage in materials. The paper presents the results of such a study performed on SrTiO3 (STO) single crystals irradiated with 320 keV Ar ions. The samples were also analyzed using HRTEM as a complementary method, which enables the measurement of geometrical parameters of crystal lattice deformation in the vicinity of dislocations. Once the parameters and their variations within a distance of several lattice constants from the dislocation core are known, they may be used in MC simulations for the quantitative determination of dislocation depth distribution profiles. The final outcome of the deconvolution procedure is the cross-section values calculated for the two types of defects observed (RDA and dislocations).
NASA Technical Reports Server (NTRS)
Sud, Y. C.; Lee, D.; Oreopoulos, L.; Barahona, D.; Nenes, A.; Suarez, M. J.
2012-01-01
A revised version of the Microphysics of clouds with Relaxed Arakawa-Schubert and Aerosol-Cloud interaction (McRAS-AC) scheme, including, among others, the Barahona and Nenes ice nucleation parameterization, is implemented in the GEOS-5 AGCM. Various fields from a 10-year long integration of the AGCM with McRAS-AC were compared with their counterparts from an integration of the baseline GEOS-5 AGCM, and with satellite data as observations. Generally, McRAS-AC reduced biases in cloud fields, and cloud radiative effects were much better reproduced over most regions of the Earth. Two weaknesses are identified in the McRAS-AC runs, namely, too few cloud particles around 40S-60S, and too high a cloud water path during northern hemisphere summer over the Gulf Stream and North Pacific. Sensitivity analyses showed that these biases potentially originated from biases in the aerosol input. The first bias was largely eliminated in a sensitivity test using 50% smaller aerosol particles, while the second bias was much reduced when interactive aerosol chemistry was turned on. The main drawback of McRAS-AC is a dearth of low-level marine stratus clouds, probably due to the lack of dry convection, which has not yet been implemented in the cloud scheme. Despite these biases, McRAS-AC simulates realistic clouds and optical properties that can improve further with better aerosol input. It thereby has the potential to be a valuable tool for climate modeling research, because its aerosol indirect effect simulation capability, involving prediction of cloud particle number concentration and effective particle size for both convective and stratiform clouds, is quite realistic.
Acceleration of Monte Carlo SPECT simulation using convolution-based forced detection
NASA Astrophysics Data System (ADS)
de Jong, H. W. A. M.; Slijpen, E. T. P.; Beekman, F. J.
2001-02-01
Monte Carlo (MC) simulation is an established tool to calculate photon transport through tissue in Emission Computed Tomography (ECT). Since the first appearance of MC, a large variety of variance reduction techniques (VRTs) have been introduced to speed up these notoriously slow simulations. One example of a very effective and established VRT is known as forced detection (FD). In standard FD the path from the photon's scatter position to the camera is chosen stochastically from the appropriate probability density function (PDF), modeling the distance-dependent detector response. In order to speed up MC, the authors propose a convolution-based FD (CFD) which involves replacing the sampling of the PDF by a convolution with a kernel which depends on the position of the scatter event. The authors validated CFD for parallel-hole Single Photon Emission Computed Tomography (SPECT) using a digital thorax phantom. Comparison of projections estimated with CFD and standard FD shows that both estimates converge to practically identical projections (maximum bias 0.9% of peak projection value), despite the slightly different photon paths used in CFD and standard FD. Projections generated with CFD converge, however, to a noise-free projection up to one or two orders of magnitude faster, which is extremely useful in many applications such as model-based image reconstruction.
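The central CFD step can be sketched as follows; this is a hedged illustration, not the authors' implementation, and the kernel model (a Gaussian whose width grows linearly with scatter-to-detector distance) and all parameters are assumptions:

```python
# Hedged sketch of convolution-based forced detection (CFD): each scatter
# event's weight is spread over the projection with a distance-dependent
# kernel instead of sampling one detector position from the PDF.
import numpy as np
from scipy.ndimage import gaussian_filter

NPIX = 128
projection = np.zeros((NPIX, NPIX))

def add_scatter_event(proj, x_pix, y_pix, weight, dist_to_detector_cm):
    """Deposit one photon's weight, blurred by an assumed distance-dependent
    collimator-detector response (sigma grows linearly with distance)."""
    point = np.zeros_like(proj)
    point[y_pix, x_pix] = weight
    sigma = 0.5 + 0.05 * dist_to_detector_cm    # assumed response model
    proj += gaussian_filter(point, sigma)       # brute force, for clarity

add_scatter_event(projection, 64, 64, weight=1.0, dist_to_detector_cm=10.0)
```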
NASA Astrophysics Data System (ADS)
Liu, Shaoying; King, Michael A.; Brill, Aaron B.; Stabin, Michael G.; Farncombe, Troy H.
2008-02-01
Monte Carlo (MC) is a well-utilized tool for simulating photon transport in single photon emission computed tomography (SPECT) due to its ability to accurately model physical processes of photon transport. As a consequence of this accuracy, it suffers from a relatively low detection efficiency and long computation time. One technique used to improve the speed of MC modeling is the effective and well-established variance reduction technique (VRT) known as forced detection (FD). With this method, photons are followed as they traverse the object under study but are then forced to travel in the direction of the detector surface, whereby they are detected at a single detector location. Another method, called convolution-based forced detection (CFD), is based on the fundamental idea of FD with the exception that photons are detected at multiple detector locations, with their distribution determined by a distance-dependent blurring kernel. In order to further increase the speed of MC, a method named multiple projection convolution-based forced detection (MP-CFD) is presented. Rather than forcing photons to hit a single detector, the MP-CFD method follows the photon transport through the object but then, at each scatter site, forces the photon to interact with a number of detectors at a variety of angles surrounding the object. This way, it is possible to simulate all the projection images of a SPECT simulation in parallel, rather than as independent projections. The result is vastly improved simulation time, as much of the computational load of simulating photon transport through the object is incurred only once for all projection angles. The results of the proposed MP-CFD method agree well with the experimental data in measurements of the point spread function (PSF), producing a correlation coefficient (r²) of 0.99 compared to experimental data. The speed of MP-CFD is shown to be about 60 times faster than a regular forced detection MC program with similar results.
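A hedged sketch of the MP-CFD idea follows; geometry, attenuation and pixel parameters are placeholders, and the distance-dependent blurring kernel is omitted for brevity:

```python
# Hedged sketch of MP-CFD: at each scatter site the photon's weight is forced
# toward every projection angle at once, so all SPECT projections accumulate
# in a single transport pass. All parameters are illustrative.
import numpy as np

N_ANGLES, NPIX, R = 60, 128, 30.0     # projections, pixels, orbit radius (cm)
PIX_CM = 0.5                          # assumed pixel size
MU = 0.15                             # assumed attenuation coefficient (1/cm)
projections = np.zeros((N_ANGLES, NPIX, NPIX))
angles = np.linspace(0.0, 2.0 * np.pi, N_ANGLES, endpoint=False)

def force_to_all_detectors(projs, pos, weight):
    """Deposit the (attenuated) weight of one scatter site on every detector."""
    x, y, z = pos
    for i, th in enumerate(angles):
        u = -x * np.sin(th) + y * np.cos(th)        # transverse coordinate
        d = R - (x * np.cos(th) + y * np.sin(th))   # distance to detector plane
        iu = int(round(u / PIX_CM)) + NPIX // 2
        iv = int(round(z / PIX_CM)) + NPIX // 2
        if 0 <= iu < NPIX and 0 <= iv < NPIX:
            projs[i, iv, iu] += weight * np.exp(-MU * d)

force_to_all_detectors(projections, pos=(2.0, -1.0, 0.0), weight=1.0)
```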
Perfetti, Christopher M.; Rearden, Bradley T.
2016-03-01
The sensitivity and uncertainty analysis tools of the ORNL SCALE nuclear modeling and simulation code system that have been developed over the last decade have proven indispensable for numerous application and design studies for nuclear criticality safety and reactor physics. SCALE contains tools for analyzing the uncertainty in the eigenvalue of critical systems, but cannot quantify uncertainty in important neutronic parameters such as multigroup cross sections, fuel fission rates, activation rates, and neutron fluence rates with realistic three-dimensional Monte Carlo simulations. A more complete understanding of the sources of uncertainty in these design-limiting parameters could lead to improvements in process optimization and reactor safety, and help inform regulators when setting operational safety margins. A novel approach for calculating eigenvalue sensitivity coefficients, known as the CLUTCH method, was recently explored as academic research and has been found to accurately and rapidly calculate sensitivity coefficients in criticality safety applications. The work presented here describes a new method, known as the GEAR-MC method, which extends the CLUTCH theory for calculating eigenvalue sensitivity coefficients to enable sensitivity coefficient calculations and uncertainty analysis for a generalized set of neutronic responses using high-fidelity continuous-energy Monte Carlo calculations. Here, several criticality safety systems were examined to demonstrate proof of principle for the GEAR-MC method, and GEAR-MC was seen to produce response sensitivity coefficients that agreed well with reference direct perturbation sensitivity coefficients.
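The reference direct perturbation approach mentioned above can be illustrated with a minimal sketch. The run_keff interface is hypothetical (a stand-in for rerunning the transport code with a scaled cross section), not part of SCALE or GEAR-MC:

```python
# Hedged sketch of a direct perturbation sensitivity coefficient,
# S = (dk/k) / (dsigma/sigma), from central differences of MC eigenvalue runs.
def direct_perturbation_sensitivity(run_keff, p=0.01):
    """run_keff(scale) is assumed to return k-eff with the target cross
    section multiplied by `scale` (hypothetical interface)."""
    k_plus = run_keff(1.0 + p)
    k_minus = run_keff(1.0 - p)
    k0 = run_keff(1.0)
    return (k_plus - k_minus) / (2.0 * p * k0)

# Toy model standing in for the transport code: k-eff ~ linear in the scale.
print(direct_perturbation_sensitivity(lambda s: 1.0 + 0.2 * (s - 1.0)))  # ~0.2
```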
Greco, Cristina; Jiang, Ying; Chen, Jeff Z Y; Kremer, Kurt; Daoulas, Kostas Ch
2016-11-14
Self-Consistent Field (SCF) theory serves as an efficient tool for studying mesoscale structure and thermodynamics of polymeric liquid crystals (LC). We investigate how some of the intrinsic approximations of SCF affect the description of the thermodynamics of polymeric LC, using a coarse-grained model. Polymer nematics are represented as discrete worm-like chains (WLC) where non-bonded interactions are defined by combining an isotropic repulsive and an anisotropic attractive Maier-Saupe (MS) potential. The range of the potentials, σ, controls the strength of correlations due to non-bonded interactions. Increasing σ (which can be seen as an increase of coarse-graining) while preserving the integrated strength of the potentials reduces correlations. The model is studied with particle-based Monte Carlo (MC) simulations and SCF theory, which uses partial enumeration to describe discrete WLC. In MC simulations the Helmholtz free energy is calculated as a function of the strength of MS interactions to obtain reference thermodynamic data. To calculate the free energy of the nematic branch with respect to the disordered melt, we employ a special thermodynamic integration (TI) scheme invoking an external field to bypass the first-order isotropic-nematic transition. Methodological aspects which have not been discussed in earlier applications of TI to LC are considered. Special attention is given to the rotational Goldstone mode. The free-energy landscape in MC and SCF is directly compared. For moderate σ the differences highlight the importance of local non-bonded orientation correlations between segments, which SCF neglects. Simple renormalization of parameters in SCF cannot compensate for the missing correlations. Increasing σ reduces correlations and SCF reproduces well the free energy in MC simulations.
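The TI step can be summarized in a short sketch; this is a generic trapezoidal thermodynamic integration under an assumed linear coupling, not the authors' scheme with the external field:

```python
# Hedged sketch of thermodynamic integration: Delta F is the integral over
# lambda of the ensemble average <dU/dlambda>, one average per MC run.
import numpy as np

def thermodynamic_integration(mean_dU_dlambda, lambdas):
    """Trapezoidal estimate of Delta F from per-run averages."""
    return float(np.sum((mean_dU_dlambda[1:] + mean_dU_dlambda[:-1])
                        * np.diff(lambdas) / 2.0))

lambdas = np.linspace(0.0, 1.0, 11)
mean_dU = -2.0 * lambdas**2        # placeholder averages from the MC runs
print(thermodynamic_integration(mean_dU, lambdas))
```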
2006-10-01
The objective was to construct a bridge between existing and future microscopic simulation codes ( kMC , MD, MC, BD, LB etc.) and traditional, continuum...kinetic Monte Carlo, kMC , equilibrium MC, Lattice-Boltzmann, LB, Brownian Dynamics, BD, or general agent-based, AB) simulators. It also, fortuitously...cond-mat/0310460 at arXiv.org. 27. Coarse Projective kMC Integration: Forward/Reverse Initial and Boundary Value Problems", R. Rico-Martinez, C. W
Villegas, Manuel; Huiliñir, Cesar
2014-12-01
This study focuses on the kinetics of the biodegradation of volatile solids (VS) of sewage sludge for biodrying under different initial moisture contents (Mc) and air-flow rates (AFR). For the study, a 3² factorial design was used, whose factors were AFR (1, 2 or 3 L/min kg TS) and initial Mc (59%, 68% and 78% w.b.). Using seven kinetic models and a nonlinear regression method, kinetic parameters were estimated and the models were analyzed with two statistical indicators. An initial Mc of around 68% increases the matrix temperature and VS consumption, with higher moisture removal at lower initial Mc values. Lower AFRs gave higher matrix temperatures and VS consumption, while higher AFRs increased water removal. The kinetic models proposed successfully simulate VS biodegradation, with root mean square errors (RMSE) between 0.007929 and 0.02744, and they can be used as a tool for satisfactory prediction of VS in biodrying. Copyright © 2014 Elsevier Ltd. All rights reserved.
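The fitting procedure can be illustrated with one candidate model; this hedged sketch assumes a simple first-order decay VS(t) = VS0·exp(-kt) and invented data points, not the study's measurements or its seven models:

```python
# Hedged sketch of nonlinear regression of a first-order VS decay model
# and the RMSE indicator; data are illustrative only.
import numpy as np
from scipy.optimize import curve_fit

def first_order(t, vs0, k):
    return vs0 * np.exp(-k * t)

t_days = np.array([0, 2, 4, 6, 8, 10], dtype=float)
vs_frac = np.array([0.80, 0.76, 0.72, 0.69, 0.66, 0.64])   # toy VS fractions

params, _ = curve_fit(first_order, t_days, vs_frac, p0=(0.8, 0.02))
rmse = np.sqrt(np.mean((vs_frac - first_order(t_days, *params)) ** 2))
print(f"VS0={params[0]:.3f}, k={params[1]:.4f} 1/day, RMSE={rmse:.5f}")
```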
NASA Astrophysics Data System (ADS)
Bieda, Bogusław; Grzesik, Katarzyna
2017-11-01
The study proposes a stochastic approach based on Monte Carlo (MC) simulation for the life cycle assessment (LCA) method, limited to a life cycle inventory (LCI) study, for rare earth element (REE) recovery from secondary-material production processes, applied to the New Krankberg Mine in Sweden. The MC method is recognized as an important tool in science and can be considered the most effective approach for quantifying uncertainties. The use of a stochastic approach characterizes uncertainties better than a deterministic method. The uncertainty of data can be expressed by defining a probability distribution for those data (e.g. through the standard deviation or variance). The data used in this study are obtained from: (i) site-specific measured or calculated data, (ii) values based on the literature, (iii) the ecoinvent process "rare earth concentrate, 70% REO, from bastnäsite, at beneficiation". Environmental emissions (e.g. particulates, uranium-238, thorium-232), energy and REEs (La, Ce, Nd, Pr, Sm, Dy, Eu, Tb, Y, Sc, Yb, Lu, Tm, Y, Gd) have been inventoried. The study is based on a reference case for the year 2016. The combination of MC analysis with sensitivity analysis is the best solution for quantifying the uncertainty in LCI/LCA. LCA results may be uncertain to a certain degree, but this uncertainty can be captured with the help of the MC method.
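The stochastic LCI idea can be sketched as follows; the flows and lognormal parameters are invented for illustration and are not the study's inventory data:

```python
# Hedged sketch of MC uncertainty propagation in an LCI: each flow gets a
# probability distribution and sampling propagates the uncertainty.
import numpy as np

rng = np.random.default_rng(42)
N = 10_000

# Assumed lognormal flows (mean and sigma of the underlying normal).
flows = {
    "particulates": (np.log(0.8), 0.20),
    "uranium-238": (np.log(1.5e-4), 0.35),
    "thorium-232": (np.log(2.0e-4), 0.35),
}

samples = {k: rng.lognormal(mu, s, N) for k, (mu, s) in flows.items()}
total = sum(samples.values())          # toy aggregate indicator
print(f"mean={total.mean():.4f}, "
      f"2.5%={np.percentile(total, 2.5):.4f}, "
      f"97.5%={np.percentile(total, 97.5):.4f}")
```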
Bayesian inversion using a geologically realistic and discrete model space
NASA Astrophysics Data System (ADS)
Jaeggli, C.; Julien, S.; Renard, P.
2017-12-01
Since the early days of groundwater modeling, inverse methods have played a crucial role. Many research and engineering groups aim to infer extensive knowledge of aquifer parameters from a sparse set of observations. Despite decades of dedicated research on this topic, there are still several major issues to be solved. In the hydrogeological framework, one is often confronted with underground structures that present very sharp contrasts of geophysical properties. In particular, subsoil structures such as karst conduits, channels, faults, or lenses strongly influence groundwater flow and transport behavior of the underground. For this reason it can be essential to identify their location and shape very precisely. Unfortunately, when inverse methods are specially trained to consider such complex features, their computational effort often becomes unaffordably high. The following work is an attempt to solve this dilemma. We present a new method that is, in some sense, a compromise between the ergodicity of Markov chain Monte Carlo (McMC) methods and the efficient handling of data by ensemble-based Kalman filters. The realistic and complex random fields are generated by a Multiple-Point Statistics (MPS) tool. Nonetheless, the method is applicable with any conditional geostatistical simulation tool. Furthermore, the algorithm is independent of any parametrization, which becomes most important when two parametric systems are equivalent (permeability and resistivity, speed and slowness, etc.). When compared to two existing McMC schemes, the computational effort was divided by a factor of 12.
NASA Astrophysics Data System (ADS)
Schiavi, A.; Senzacqua, M.; Pioli, S.; Mairani, A.; Magro, G.; Molinelli, S.; Ciocca, M.; Battistoni, G.; Patera, V.
2017-09-01
Ion beam therapy is a rapidly growing technique for tumor radiation therapy. Ions allow for a high dose deposition in the tumor region, while sparing the surrounding healthy tissue. For this reason, the highest possible accuracy in the calculation of dose and its spatial distribution is required in treatment planning. On one hand, commonly used treatment planning software solutions adopt a simplified beam-body interaction model by remapping pre-calculated dose distributions into a 3D water-equivalent representation of the patient morphology. On the other hand, Monte Carlo (MC) simulations, which explicitly take into account all the details in the interaction of particles with human tissues, are considered to be the most reliable tool to address the complexity of mixed field irradiation in a heterogeneous environment. However, full MC calculations are not routinely used in clinical practice because they typically demand substantial computational resources. Therefore MC simulations are usually only used to check treatment plans for a restricted number of difficult cases. The advent of general-purpose GPU programming prompted the development of trimmed-down MC-based dose engines which can significantly reduce the time needed to recalculate a treatment plan with respect to standard MC codes on CPU hardware. In this work, we report on the development of fred, a new MC simulation platform for treatment planning in ion beam therapy. The code can transport particles through a 3D voxel grid using a class II MC algorithm. Both primary and secondary particles are tracked and their energy deposition is scored along the trajectory. Effective models for particle-medium interaction have been implemented, balancing accuracy in dose deposition with computational cost. Currently, the most refined module is the transport of proton beams in water: single pencil beam dose-depth distributions obtained with fred agree with those produced by standard MC codes within 1-2% of the Bragg peak in the therapeutic energy range. A comparison with measurements taken at the CNAO treatment center shows that the lateral dose tails are reproduced within 2% in the field size factor test up to 20 cm. The tracing kernel can run on GPU hardware, achieving 10 million primaries per second on a single card. This performance allows one to recalculate a proton treatment plan with 1% of the total particles in just a few minutes.
Methods for Monte Carlo simulations of biomacromolecules
Vitalis, Andreas; Pappu, Rohit V.
2010-01-01
The state of the art for Monte Carlo (MC) simulations of biomacromolecules is reviewed. Available methodologies for sampling conformational equilibria and associations of biomacromolecules in the canonical ensemble, given a continuum description of the solvent environment, are reviewed. Detailed sections are provided dealing with the choice of degrees of freedom, the efficiencies of MC algorithms and algorithmic peculiarities, as well as the optimization of simple movesets. The issue of introducing correlations into elementary MC moves, and the applicability of such methods to simulations of biomacromolecules, is discussed. A brief discussion of multicanonical methods and an overview of recent simulation work highlighting the potential of MC methods are also provided. It is argued that MC simulations, while underutilized by the biomacromolecular simulation community, hold promise for simulations of complex systems and phenomena that span multiple length scales, especially when used in conjunction with implicit solvation models or other coarse-graining strategies. PMID:20428473
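The elementary canonical-ensemble move underlying many of the reviewed methods can be shown in a few lines; this is a generic Metropolis step with a toy potential, not any specific biomacromolecular moveset:

```python
# Hedged sketch of a single-coordinate Metropolis move in the canonical
# ensemble; the harmonic energy is a stand-in for a real force field.
import math, random

def metropolis_step(x, energy, beta, step=0.2):
    """Propose a symmetric displacement and accept with min(1, exp(-beta*dE))."""
    x_new = x + random.uniform(-step, step)
    dE = energy(x_new) - energy(x)
    if dE <= 0 or random.random() < math.exp(-beta * dE):
        return x_new
    return x

energy = lambda x: 0.5 * x * x        # toy harmonic potential
x = 1.0
for _ in range(10_000):
    x = metropolis_step(x, energy, beta=1.0)
```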
STARE CubeSat Communications Testing, Simulation and Analysis
2012-09-01
[List of figures excerpt] Figure 24: STK MC3 Ground Station Locations. [List of acronyms excerpt] STK: Satellite Tool Kit; VPN: Virtual Private Network. [Abstract excerpt] ...the radio itself. Using a signal attenuator to decrease signal strength by 10 dB increments, and a spectrum analyzer to see a visual representation...
Ling, Cheng; Hamada, Tsuyoshi; Gao, Jingyang; Zhao, Guoguang; Sun, Donghong; Shi, Weifeng
2016-01-01
MrBayes is a widespread phylogenetic inference tool harnessing empirical evolutionary models and Bayesian statistics. However, the computational cost of likelihood estimation is very high, resulting in undesirably long execution times. Although a number of multi-threaded optimizations have been proposed to speed up MrBayes, there are bottlenecks that severely limit the GPU thread-level parallelism of likelihood estimations. This study proposes a high-performance and resource-efficient method for GPU-oriented parallelization of likelihood estimations. Instead of having to rely on empirical programming, the proposed novel decomposition storage model implements high-performance data transfers implicitly. In terms of performance improvement, a speedup factor of up to 178 can be achieved on the analysis of simulated datasets by four Tesla K40 cards. In comparison to the other publicly available GPU-oriented MrBayes implementations, the tgMC3++ method (proposed herein) outperforms the tgMC3 (v1.0), nMC3 (v2.1.1) and oMC3 (v1.00) methods by speedup factors of up to 1.6, 1.9 and 2.9, respectively. Moreover, tgMC3++ supports more evolutionary models and gamma categories, which previous GPU-oriented methods could not handle.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chesneau, H; Lazaro, D; Blideanu, V
Purpose: The intensive use of cone-beam computed tomography (CBCT) during radiotherapy treatments raises questions about the dose to healthy tissues delivered during image acquisitions. We hence developed a Monte Carlo (MC)-based tool to predict doses to organs delivered by the Elekta XVI kV-CBCT. This work aims at assessing the dosimetric accuracy of the MC tool in all tissue types. Methods: The kV-CBCT MC model was developed using the PENELOPE code. The beam properties were validated against measured lateral and depth dose profiles in water, and energy spectra measured with a CdTe detector. The CBCT simulator accuracy then required verification in clinical conditions. For this, we compared calculated and experimental dose values obtained with OSL nanoDots and XRQA2 films inserted in CIRS anthropomorphic phantoms (male, female, and 5-year-old child). Measurements were performed at different locations, including bone and lung structures, and for several acquisition protocols: lung, head-and-neck, and pelvis. OSL and film measurements were corrected when possible for energy dependence, by taking into account spectral variations between calibration and measurement conditions. Results: Comparisons between measured and MC dose values are summarized in Table 1. A mean difference of 8.6% was achieved for OSLs when the energy correction was applied, and 89.3% of the 84 dose points were within uncertainty intervals, including those in bones and lungs. Results with XRQA2 are not as good, because incomplete information about electronic equilibrium in film layers hampered the application of a simple energy correction procedure. Furthermore, measured and calculated doses (Fig. 1) are in agreement with the literature. Conclusion: The MC-based tool developed was validated with an extensive set of measurements and enables organ dose calculation with good accuracy. It can now be used to compute and report doses to organs for clinical cases, and also to drive strategies to optimize imaging protocols.
TU-EF-304-03: 4D Monte Carlo Robustness Test for Proton Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Souris, K; Sterpin, E; Lee, J
Purpose: Breathing motion and approximate dose calculation engines may increase proton range uncertainties. We address these two issues using a comprehensive 4D robustness evaluation tool based on an efficient Monte Carlo (MC) engine, which can simulate breathing with no significant increase in computation time. Methods: To assess the robustness of the treatment plan, multiple scenarios of uncertainties are simulated, taking into account systematic and random setup errors, range uncertainties, and organ motion. Our fast MC dose engine, called MCsquare, implements optimized models on a massively parallel computation architecture and allows us to accurately simulate a scenario in less than one minute. The deviations of the uncertainty scenarios are then reported on a DVH band and compared to the nominal plan. The robustness evaluation tool is illustrated in a lung case by comparing three 60 Gy treatment plans. First, a plan is optimized on a PTV obtained by extending the CTV with an 8 mm margin, in order to take into account systematic geometrical uncertainties, as in our current practice in radiotherapy. No specific strategy is employed to correct for tumor and organ motion. The second plan involves a PTV generated from the ITV, which encompasses the tumor volume in all breathing phases. The last plan results from robust optimization performed on the ITV, with robustness parameters of 3% for tissue density and 8 mm for positioning errors. Results: The robustness test revealed that the first two plans could not properly cover the target in the presence of uncertainties. CTV coverage (D95) in the three plans ranged between 39.4–55.5 Gy, 50.2–57.5 Gy, and 55.1–58.6 Gy, respectively. Conclusion: A realistic robustness verification tool based on a fast MC dose engine has been developed. This test is essential to assess the quality of a proton therapy plan and very useful to study various planning strategies for mobile tumors. This work is partly funded by IBA (Louvain-la-Neuve, Belgium).
Design of a digital phantom population for myocardial perfusion SPECT imaging research.
Ghaly, Michael; Du, Yong; Fung, George S K; Tsui, Benjamin M W; Links, Jonathan M; Frey, Eric
2014-06-21
Digital phantoms and Monte Carlo (MC) simulations have become important tools for optimizing and evaluating instrumentation, acquisition and processing methods for myocardial perfusion SPECT (MPS). In this work, we designed a new adult digital phantom population and generated corresponding Tc-99m and Tl-201 projections for use in MPS research. The population is based on the three-dimensional XCAT phantom with organ parameters sampled from the Emory PET Torso Model Database. Phantoms included three variations each in body size, heart size, and subcutaneous adipose tissue level, for a total of 27 phantoms of each gender. The SimSET MC code and angular response functions were used to model interactions in the body and the collimator-detector system, respectively. We divided each phantom into seven organs, each simulated separately, allowing use of post-simulation summing to efficiently model uptake variations. Also, we adapted and used a criterion based on the relative Poisson effective count level to determine the required number of simulated photons for each simulated organ. This technique provided a quantitative estimate of the true noise in the simulated projection data, including residual MC simulation noise. Projections were generated in 1 keV wide energy windows from 48-184 keV assuming perfect energy resolution to permit study of the effects of window width, energy resolution, and crosstalk in the context of dual isotope MPS. We have developed a comprehensive method for efficiently simulating realistic projections for a realistic population of phantoms in the context of MPS imaging. The new phantom population and realistic database of simulated projections will be useful in performing mathematical and human observer studies to evaluate various acquisition and processing methods such as optimizing the energy window width, investigating the effect of energy resolution on image quality and evaluating compensation methods for degrading factors such as crosstalk in the context of single and dual isotope MPS.
Fiorini, Francesca; Schreuder, Niek; Van den Heuvel, Frank
2018-02-01
Cyclotron-based pencil beam scanning (PBS) proton machines nowadays represent the most common and most affordable choice for proton therapy facilities; however, their representation in Monte Carlo (MC) codes is more complex than that of passively scattered proton systems or synchrotron-based PBS machines. This is because degraders are used to decrease the energy from the cyclotron maximum energy to the desired energy, resulting in a unique spot size, divergence, and energy spread depending on the amount of degradation. This manuscript outlines a generalized methodology to characterize a cyclotron-based PBS machine in a general-purpose MC code. The code can then be used to generate clinically relevant plans starting from commercial TPS plans. The described beam is produced at the Provision Proton Therapy Center (Knoxville, TN, USA) using cyclotron-based IBA Proteus Plus equipment. We characterized the Provision beam in the FLUKA MC code using the experimental commissioning data. The code was then validated using experimental data in water phantoms for single pencil beams and larger irregular fields. Comparisons with RayStation TPS plans are also presented. Comparisons of experimental, simulated, and planned dose depositions in water show that the same doses are calculated by both programs inside the target areas, while penumbra differences are found at the field edges. These differences are lower for the MC, with a γ(3%-3 mm) index never below 95%. Extensive explanations of how MC codes can be adapted to simulate cyclotron-based scanning proton machines are given, with the aim of using the MC as a TPS verification tool to check and improve clinical plans. For all the tested cases, we showed that dose differences with experimental data are lower for the MC than for the TPS, implying that the created FLUKA beam model is better able to describe the experimental beam. © 2017 The Authors. Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
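The γ(3%-3 mm) comparison used above can be illustrated for a 1D profile; this is a brute-force global-gamma sketch with toy profiles, not a clinical tool:

```python
# Hedged sketch of a 1D global gamma(3%, 3 mm) evaluation.
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dta_mm=3.0, dd_frac=0.03):
    """Gamma value at each reference point (global dose criterion)."""
    d_norm = dd_frac * d_ref.max()
    gammas = np.empty_like(d_ref)
    for i, (xr, dr) in enumerate(zip(x_ref, d_ref)):
        dist2 = ((x_eval - xr) / dta_mm) ** 2
        dose2 = ((d_eval - dr) / d_norm) ** 2
        gammas[i] = np.sqrt((dist2 + dose2).min())
    return gammas

x = np.linspace(0, 100, 201)                 # mm
ref = np.exp(-((x - 50) / 20) ** 2)          # toy measured profile
ev = np.exp(-((x - 51) / 20) ** 2)           # toy simulated profile
print(f"pass rate: {100.0 * (gamma_1d(x, ref, x, ev) <= 1).mean():.1f}%")
```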
Morikami, Kenji; Itezono, Yoshiko; Nishimoto, Masahiro; Ohta, Masateru
2014-01-01
Compounds with a medium-sized flexible ring often show atropisomerism, which is caused by the high energy barriers between long-lived conformers that can be isolated and often have different biological properties from each other. In this study, the frequency of the transition between the two stable conformers, aS and aR, of thienotriazolodiazepine compounds with flexible 7-membered rings was estimated computationally by Monte Carlo (MC) simulations and validated experimentally by NMR experiments. To estimate the energy barriers for transitions as precisely as possible, the potential energy (PE) surfaces used in the MC simulations were calculated by molecular orbital (MO) methods. To accomplish the MC simulations with the MO-based PE surfaces in a practical central processing unit (CPU) time, the MO-based PE of each conformer was pre-calculated and stored before the MC simulations, and then only referred to during the MC simulations. The activation energies for transitions calculated by the MC simulations agreed well with the experimental ΔG determined by the NMR experiments. The analysis of the transition trajectories of the MC simulations revealed that the transition occurred not only through the transition states, but also through many different transition paths. Our computational method gave us quantitative estimates of the atropisomerism of the thienotriazolodiazepine compounds in a practical period of time, and the method could be applicable to other slow-dynamics phenomena that cannot be investigated by other atomistic simulations.
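The table-lookup strategy can be sketched in one dimension; the double-well surface below stands in for the stored MO energies, and all values are invented:

```python
# Hedged sketch of MC with a pre-computed potential energy table: the MC run
# only interpolates stored values and never calls the MO code.
import math, random
import numpy as np

grid = np.linspace(-math.pi, math.pi, 361)
pe_table = 5.0 * (1.0 - np.cos(2.0 * grid))   # toy double well (kcal/mol)

def pe_lookup(phi):
    return float(np.interp(phi, grid, pe_table))

beta = 1.0 / 0.593                 # 1/kT near 298 K in kcal/mol
phi, visits_positive = 0.0, 0
for _ in range(200_000):
    trial = (phi + random.gauss(0.0, 0.3) + math.pi) % (2 * math.pi) - math.pi
    if random.random() < math.exp(-beta * (pe_lookup(trial) - pe_lookup(phi))):
        phi = trial
    visits_positive += phi > 0
print("fraction of steps in the second well:", visits_positive / 200_000)
```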
Optimisation of 12 MeV electron beam simulation using variance reduction technique
NASA Astrophysics Data System (ADS)
Jayamani, J.; Termizi, N. A. S. Mohd; Kamarulzaman, F. N. Mohd; Aziz, M. Z. Abdul
2017-05-01
Monte Carlo (MC) simulation for electron beam radiotherapy consumes a long computation time. Variance reduction techniques (VRTs) were implemented in the MC to shorten this duration. This work focused on the optimisation of VRT parameters, namely electron range rejection and particle history count. The EGSnrc MC source code was used to simulate (BEAMnrc code) and validate (DOSXYZnrc code) the Siemens Primus linear accelerator model without VRT parameters. The validated MC model simulation was then repeated applying the VRT parameter (electron range rejection), controlled by a global electron cut-off energy of 1, 2 and 5 MeV, using 20 × 10⁷ particle histories. The 5 MeV range rejection produced the fastest MC simulation, with a 50% reduction in computation time compared to the non-VRT simulation. The 5 MeV electron range rejection was therefore used in the particle-history analysis, which ranged from 7.5 × 10⁷ to 20 × 10⁷ histories. In this study, with the 5 MeV electron cut-off and 10 × 10⁷ particle histories, the simulation was four times faster than the non-VRT calculation, with 1% deviation. Proper understanding and use of VRTs can significantly reduce MC electron beam calculation time while preserving accuracy.
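The range rejection idea reads simply in code; this hedged sketch uses a crude water CSDA range rule of thumb, not the EGSnrc implementation:

```python
# Hedged sketch of electron range rejection: terminate an electron below the
# cut-off whose residual range cannot reach the nearest region boundary.
def range_rejection(e_mev, dist_to_boundary_cm, e_cutoff_mev=5.0):
    """True if the history can be terminated (energy is deposited locally)."""
    if e_mev >= e_cutoff_mev:
        return False                    # above cut-off: always transported
    residual_range_cm = 0.5 * e_mev     # crude water CSDA range, ~0.5 cm/MeV
    return residual_range_cm < dist_to_boundary_cm

# A 2 MeV electron 3 cm from the nearest boundary can be discarded:
print(range_rejection(2.0, 3.0))        # True
```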
TU-AB-BRC-12: Optimized Parallel MonteCarlo Dose Calculations for Secondary MU Checks
DOE Office of Scientific and Technical Information (OSTI.GOV)
French, S; Nazareth, D; Bellor, M
Purpose: Secondary MU checks are an important tool used during a physics review of a treatment plan. Commercial software packages offer varying degrees of theoretical dose calculation accuracy, depending on the modality involved. Dose calculations of VMAT plans are especially prone to error due to the large approximations involved. Monte Carlo (MC) methods are not commonly used due to their long run times. We investigated two methods to increase the computational efficiency of MC dose simulations with the BEAMnrc code. Distributed computing resources, along with optimized code compilation, allow for accurate and efficient VMAT dose calculations. Methods: The BEAMnrc package was installed on a high-performance computing cluster accessible to our clinic. MATLAB and PYTHON scripts were developed to convert a clinical VMAT DICOM plan into BEAMnrc input files. The BEAMnrc installation was optimized by running the VMAT simulations through profiling tools which indicated the behavior of the constituent routines in the code, e.g. the bremsstrahlung splitting routine and the specified random number generator. This information aided in determining the most efficient parallel compilation configuration for the specific CPUs available on our cluster, resulting in the fastest VMAT simulation times. Our method was evaluated with calculations involving 10⁸–10⁹ particle histories, which are sufficient to verify patient dose using VMAT. Results: Parallelization allowed the calculation of patient dose in 10–15 hours with 100 parallel jobs. Due to the compiler optimization process, further speed increases of 23% were achieved when compared with the open-source compiler BEAMnrc packages. Conclusion: Analysis of the BEAMnrc code allowed us to optimize the compiler configuration for VMAT dose calculations. In future work, the optimized MC code, in conjunction with the parallel processing capabilities of BEAMnrc, will be applied to provide accurate and efficient secondary MU checks.
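The parallelization step can be sketched generically; this is not the actual cluster scripts, and the seed scheme is only one common choice:

```python
# Hedged sketch of splitting an MC run into independent jobs with distinct
# RNG seeds, to be combined after completion.
def split_histories(total_histories, n_jobs, base_seed=12345):
    per_job, rem = divmod(total_histories, n_jobs)
    return [{"job": i,
             "histories": per_job + (1 if i < rem else 0),
             "seed": base_seed + i}          # distinct, independent streams
            for i in range(n_jobs)]

for job in split_histories(10**9, 100)[:3]:
    print(job)
```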
Monte Carlo simulation of particle-induced bit upsets
NASA Astrophysics Data System (ADS)
Wrobel, Frédéric; Touboul, Antoine; Vaillé, Jean-Roch; Boch, Jérôme; Saigné, Frédéric
2017-09-01
We investigate the issue of radiation-induced failures in electronic devices by developing a Monte Carlo tool called MC-Oracle. It is able to transport particles through a device, calculate the energy deposited in the sensitive region of the device, and calculate the transient current induced by the primary particle and the secondary particles produced during nuclear reactions. We compare our simulation results with experiments on SRAMs irradiated with neutrons, protons and ions. The agreement is very good and shows that it is possible to predict the soft error rate (SER) for a given device in a given environment.
Predictive process simulation of cryogenic implants for leading edge transistor design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gossmann, Hans-Joachim; Zographos, Nikolas; Park, Hugh
2012-11-06
Two cryogenic implant TCAD modules have been developed: (i) A continuum-based compact model targeted towards a TCAD production environment, calibrated against an extensive data set for all common dopants. Ion-specific calibration parameters related to damage generation and dynamic annealing were used and resulted in excellent fits to the calibration data set. (ii) A kinetic Monte Carlo (kMC) model including the full time dependence of the ion exposure that a particular spot on the wafer experiences, as well as the resulting temperature-versus-time profile of this spot. It was calibrated by adjusting damage generation and dynamic annealing parameters. The kMC simulations clearly demonstrate the importance of the time structure of the beam for the amorphization process: assuming an average dose rate does not capture all of the physics and may lead to incorrect conclusions. The model enables optimization of the amorphization process through tool parameters such as scan speed or beam height.
NASA Astrophysics Data System (ADS)
Achtor, T. H.; Rink, T.
2010-12-01
The University of Wisconsin's Space Science and Engineering Center (SSEC) has been at the forefront in developing data analysis and visualization tools for environmental satellites and other geophysical data. The fifth generation of the Man-computer Interactive Data Access System (McIDAS-V) is Java-based, open-source, freely available software that operates on Linux, Macintosh and Windows systems. The software tools provide powerful new data manipulation and visualization capabilities that work with geophysical data in research, operational and educational environments. McIDAS-V provides unique capabilities to support innovative techniques for evaluating research results, teaching and training. McIDAS-V is based on three powerful software elements. VisAD is a Java library for building interactive, collaborative, 4-dimensional visualization and analysis tools. The Integrated Data Viewer (IDV) is a reference application based on the VisAD system and developed by the Unidata program that demonstrates the flexibility that is needed in this evolving environment, using a modern, object-oriented software design approach. The third tool, HYDRA, allows users to build, display and interrogate multi- and hyperspectral environmental satellite data in powerful ways. The McIDAS-V software is being used for training and education in several settings. The McIDAS User Group provides training workshops at its annual meeting. Numerous online tutorials with training data sets have been developed to aid users in learning simple and more complex operations in McIDAS-V; all are available online. In a University of Wisconsin-Madison undergraduate course in Radar and Satellite Meteorology, McIDAS-V is used to create and deliver laboratory exercises using case study and real-time data. At the high school level, McIDAS-V is used in several exercises in our annual Summer Workshop in Earth and Atmospheric Sciences to provide young scientists the opportunity to examine data with friendly and powerful tools. This presentation will describe the McIDAS-V software and demonstrate some of its capabilities to analyze and display many types of global data. The presentation will also focus on describing how McIDAS-V can be used as an educational window to examine global geophysical data, such as consecutive polar-orbiting passes of NASA MODIS and CALIPSO observations.
Cellular dosimetry calculations for Strontium-90 using Monte Carlo code PENELOPE.
Hocine, Nora; Farlay, Delphine; Boivin, Georges; Franck, Didier; Agarande, Michelle
2014-11-01
To improve risk assessments associated with chronic exposure to Strontium-90 (Sr-90), for both the environment and human health, it is necessary to know the energy distribution in specific cells or tissues. Monte Carlo (MC) simulation codes are extremely useful tools for calculating deposited energy. The present work was focused on the validation of the MC code PENetration and Energy LOss of Positrons and Electrons (PENELOPE) and the assessment of the dose distribution to bone marrow cells from a point Sr-90 source localized within the cortical bone. S-value (absorbed dose per unit cumulated activity) calculations were performed using PENELOPE and Monte Carlo N-Particle eXtended (MCNPX). The cytoplasm, nucleus, cell surface, mouse femur bone and Sr-90 radiation source were simulated. Cells are assumed to be spherical, with the radii of the cell and cell nucleus ranging from 2-10 μm. The Sr-90 source is assumed to be uniformly distributed in the cell nucleus, cytoplasm and cell surface. S-values calculated with PENELOPE agreed very well with MCNPX results and the Medical Internal Radiation Dose (MIRD) values, with relative deviations of less than 4.5%. The dose distribution to mouse bone marrow cells showed that the cells localized near the cortical part received the maximum dose. The MC code PENELOPE may prove useful for cellular dosimetry involving radiation transport through materials other than water, or for complex distributions of radionuclides and geometries.
Interfacing MCNPX and McStas for simulation of neutron transport
NASA Astrophysics Data System (ADS)
Klinkby, Esben; Lauritzen, Bent; Nonbøl, Erik; Kjær Willendrup, Peter; Filges, Uwe; Wohlmuther, Michael; Gallmeier, Franz X.
2013-02-01
Simulations of the target-moderator-reflector system at spallation sources are conventionally carried out using Monte Carlo codes such as MCNPX (Waters et al., 2007 [1]) or FLUKA (Battistoni et al., 2007; Ferrari et al., 2005 [2,3]), whereas simulations of neutron transport from the moderator and of the instrument response are performed by neutron ray-tracing codes such as McStas (Lefmann and Nielsen, 1999; Willendrup et al., 2004, 2011a,b [4-7]). The coupling between the two simulation suites typically consists of providing analytical fits of MCNPX neutron spectra to McStas. This method is generally successful but has limitations, as it does not, for example, allow for re-entry of neutrons into the MCNPX regime. Previous work to resolve such shortcomings includes the introduction of McStas-inspired supermirrors in MCNPX. In the present paper different approaches to interfacing MCNPX and McStas are presented and applied to a simple test case. The direct coupling between MCNPX and McStas allows for more accurate simulations of e.g. complex moderator geometries, backgrounds, interference between beam-lines, as well as shielding requirements along the neutron guides.
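The event-based flavour of such a coupling can be sketched generically. The file layouts below are hypothetical placeholders; the real MCNPX surface-source and McStas event formats are code-specific, so this is only the shape of the conversion, not a working interface:

```python
# Hedged sketch: rewrite a neutron event list (one state vector per row) into
# the column order assumed by a downstream ray-tracing code. Formats invented.
import numpy as np

def convert_events(infile, outfile):
    """Read rows of (x, y, z, vx, vy, vz, t, weight); re-emit reordered."""
    events = np.loadtxt(infile)
    x, y, z, vx, vy, vz, t, w = events.T
    out = np.column_stack([w, x, y, z, vx, vy, vz, t])  # assumed target order
    np.savetxt(outfile, out, header="w x y z vx vy vz t")

# convert_events("mcnpx_surface_events.txt", "mcstas_input_events.txt")
```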
DOE Office of Scientific and Technical Information (OSTI.GOV)
A, Popescu I; Lobo, J; Sawkey, D
2014-06-15
Purpose: To simulate and measure radiation backscattered into the monitor chamber of a TrueBeam linac; to establish a rigorous framework for absolute dose calculations for TrueBeam Monte Carlo (MC) simulations through a novel approach, taking into account the backscattered radiation and the actual machine output during beam delivery; and to improve agreement between measured and simulated relative output factors. Methods: The 'monitor backscatter factor' is an essential ingredient of a well-established MC absolute dose formalism (the MC equivalent of the TG-51 protocol). This quantity was determined for the 6 MV, 6X FFF, and 10X FFF beams by two independent methods: (1) MC simulations in the monitor chamber of the TrueBeam linac; (2) linac-generated beam record data for target current, logged for each beam delivery. Upper head MC simulations used a freely available manufacturer-provided interface to a cloud-based platform, allowing use of the same head model as that used to generate the publicly available TrueBeam phase spaces, without revealing the upper head design. The MC absolute dose formalism was expanded to allow direct use of target current data. Results: The relation between backscatter, number of electrons incident on the target for one monitor unit, and MC absolute dose was analyzed for open fields, as well as a jaw-tracking VMAT plan. The agreement between the two methods was better than 0.15%. It was demonstrated that the agreement between measured and simulated relative output factors improves across all field sizes when backscatter is taken into account. Conclusion: For the first time, simulated monitor chamber dose and measured target current for an actual TrueBeam linac were incorporated in the MC absolute dose formalism. In conjunction with the use of MC inputs generated from post-delivery trajectory-log files, the present method allows accurate MC dose calculations, without resorting to any of the simplifying assumptions previously made in the TrueBeam MC literature. This work has been partially funded by Varian Medical Systems.
NASA Astrophysics Data System (ADS)
Dal Molin, J. P.; Caliri, A.
2018-01-01
Here we focus on the conformational search for the native structure when it is ruled by the hydrophobic effect and steric specificities coming from amino acids. Our main tool of investigation is a 3D lattice model provided with a ten-letter alphabet, the stereochemical model. This minimalist model was conceived for Monte Carlo (MC) simulations with the kinetic behavior of protein-like chains in solution in mind. We have three central goals here. The first one is to characterize the folding time (τ) by two distinct sampling methods, so we present two sets of 10³ MC simulations for a fast protein-like sequence. The resulting sets of characteristic folding times, τ and τq, were obtained by the application of the standard Metropolis algorithm (MA), as well as by an enhanced algorithm (MqA). The finding for τq shows two things: (i) the chain-solvent hydrophobic interactions {hk} plus a set of inter-residue steric constraints {ci,j} are able to emulate the conformational search for the native structure; for each one of the 10³ MC simulations performed, the target is always found within a finite time window; (ii) the ratio τq/τ ≅ 1/10 suggests that the effect of local thermal fluctuations, encompassed by the Tsallis weight, provides the chain with an innate efficiency to escape from energetic and steric traps. We performed additional MC simulations with variations of our design rule to attest to this first result; both algorithms, the MA and the MqA, were applied to a restricted set of targets, and a physical insight is provided. Our second finding was obtained by a set of 600 independent MC simulations, performed only with the MqA and applied to an extended set of 200 representative targets, our native structures. The results show how structural patterns modulate τq, which covers four orders of magnitude; this finding is our second goal. The third, and last, result was obtained with a special kind of simulation performed with the purpose of exploring a possible connection between the hydrophobic component of protein stability and the native structural topology. We simulated those same 200 targets again with the MqA only. However, this time we evaluated the relative frequency {φq} with which each target visits its corresponding native structure along an appropriate simulation time. Due to the presence of the hydrophobic effect in our approach we obtained a strong correlation between stability and folding rate (R = 0.85): the faster a sequence finds its target, the larger the hydrophobic component of its stability. The strong correlation fulfills our last goal. This final finding suggests that the hydrophobic effect could not be a general stabilizing factor for proteins.
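The contrast between the MA and the Tsallis-weighted MqA can be sketched at the level of the acceptance rule. The generalized weight below follows the common generalized-ensemble choice; the exact form used in the paper may differ:

```python
# Hedged sketch: standard Metropolis acceptance vs. a Tsallis-type weight,
# w = [1 - (1-q)*beta*dE]^(1/(1-q)), which reduces to exp(-beta*dE) as q -> 1
# and whose heavier tail helps escape energetic and steric traps.
import math, random

def accept_metropolis(dE, beta):
    return dE <= 0 or random.random() < math.exp(-beta * dE)

def accept_tsallis(dE, beta, q=1.1):
    if dE <= 0:
        return True
    base = 1.0 - (1.0 - q) * beta * dE
    w = base ** (1.0 / (1.0 - q)) if base > 0 else 0.0
    return random.random() < w

print(accept_metropolis(0.5, beta=1.0), accept_tsallis(0.5, beta=1.0))
```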
A Practical Tutorial on Modified Condition/Decision Coverage
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J.; Veerhusen, Dan S.; Chilenski, John J.; Rierson, Leanna K.
2001-01-01
This tutorial provides a practical approach to assessing modified condition/decision coverage (MC/DC) for aviation software products that must comply with regulatory guidance for DO-178B level A software. The tutorial's approach to MC/DC is a 5-step process that allows a certification authority or verification analyst to evaluate MC/DC claims without the aid of a coverage tool. In addition to the MC/DC approach, the tutorial addresses factors to consider in selecting and qualifying a structural coverage analysis tool, tips for reviewing life cycle data related to MC/DC, and pitfalls common to structural coverage analysis.
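As an illustration of what the 5-step evaluation checks for, the short sketch below brute-forces MC/DC independence pairs for the decision A and (B or C): for each condition it finds pairs of test cases that differ only in that condition and flip the decision outcome. This example is not from the tutorial:

```python
# Hedged illustration: find MC/DC independence pairs for A and (B or C).
from itertools import product

decision = lambda a, b, c: a and (b or c)
cases = list(product([False, True], repeat=3))

for i in range(3):                      # condition index: A, B, C
    pairs = [(u, v) for u in cases for v in cases
             if u[i] < v[i]                                  # flip condition i
             and all(u[j] == v[j] for j in range(3) if j != i)
             and decision(*u) != decision(*v)]               # decision flips
    print("condition", "ABC"[i], "independence pairs:", pairs)
```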
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Z; Gao, M
Purpose: Monte Carlo simulation plays an important role in the proton pencil beam scanning (PBS) technique. However, MC simulation demands high computing power and is limited to a few large proton centers that can afford a computer cluster. We study the feasibility of utilizing cloud computing in the MC simulation of PBS beams. Methods: A GATE/GEANT4-based MC simulation software was installed on a commercial cloud computing virtual machine (Linux 64-bit, Amazon EC2). Single-spot integral depth dose (IDD) curves and in-air transverse profiles were used to tune the source parameters to simulate an IBA machine. With the use of the StarCluster software developed at MIT, a Linux cluster with 2-100 nodes can be conveniently launched in the cloud. A proton PBS plan was then exported to the cloud where the MC simulation was run. Results: The simulated PBS plan has a field size of 10×10 cm², 20 cm range, 10 cm modulation, and contains over 10,000 beam spots. EC2 instance type m1.medium was selected considering the CPU/memory requirement, and 40 instances were used to form a Linux cluster. To minimize cost, the master node was created as an on-demand instance and worker nodes were created as spot instances. The hourly cost for the 40-node cluster was $0.63 and the projected cost for a 100-node cluster was $1.41. Ten million events were simulated to plot the PDD and profile, with each job containing 500k events. The simulation completed within 1 hour and an overall statistical uncertainty of <2% was achieved. Good agreement between MC simulation and measurement was observed. Conclusion: Cloud computing is a cost-effective and easy-to-maintain platform to run proton PBS MC simulations. When proton MC packages such as GATE and TOPAS are combined with cloud computing, it will greatly facilitate the pursuit of PBS MC studies, especially for newly established proton centers or individual researchers.
NASA Astrophysics Data System (ADS)
McGurk, Ross; Seco, Joao; Riboldi, Marco; Wolfgang, John; Segars, Paul; Paganetti, Harald
2010-03-01
The purpose of this work was to create a computational platform for studying motion in intensity modulated radiotherapy (IMRT). Specifically, the non-uniform rational B-spline (NURB) cardiac and torso (NCAT) phantom was modified for use in a four-dimensional Monte Carlo (4D-MC) simulation system to investigate the effect of respiratory-induced intra-fraction organ motion on IMRT dose distributions as a function of diaphragm motion, lesion size and lung density. Treatment plans for four clinical scenarios were designed: diaphragm peak-to-peak amplitudes of 1 cm and 3 cm, and two lesion sizes (2 cm and 4 cm diameter) placed in the lower lobe of the right lung. Lung density was changed for each phase using a conservation-of-mass calculation. Further, a new heterogeneous lung model was implemented and tested. Each lesion had an internal target volume (ITV) subsequently expanded by 15 mm isotropically to give the planning target volume (PTV). The PTV was prescribed to receive 72 Gy in 40 fractions. The MLC leaf sequence defined by the planning system for each patient was exported and used as input into the MC system. MC simulations using the dose planning method (DPM) code together with deformable image registration based on the NCAT deformation field were used to find a composite dose distribution for each phantom. These composite distributions were subsequently analyzed using information from the dose volume histograms (DVH). Lesion motion amplitude has the largest effect on the dose distribution. Tumor size was found to have a smaller effect, which can be mitigated by ensuring the planning constraints are optimized for the tumor size. The use of a dynamic or heterogeneous lung density model over a respiratory cycle does not appear to be an important factor, with a ≤0.6% change in the mean dose received by the ITV, PTV and right lung. The heterogeneous model increases the realism of the NCAT phantom and may provide more accurate simulations in radiation therapy investigations that use the phantom. This work further evaluates the NCAT phantom for use as a tool in radiation therapy research, in addition to its extensive use in diagnostic imaging and nuclear medicine research. Our results indicate that the NCAT phantom, combined with 4D-MC simulations, is a useful tool in radiation therapy investigations and may allow the study of relative effects in many clinically relevant situations.
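The composite-dose step can be sketched as follows; nearest-neighbour gathering through an integer-valued deformation map is used for brevity, whereas real systems interpolate, so this is a hedged illustration rather than the DPM workflow:

```python
# Hedged sketch of 4D dose accumulation: map each phase's dose back to the
# reference phase through a deformation field and sum.
import numpy as np

def accumulate_dose(phase_doses, deformation_fields):
    """phase_doses: 3D dose arrays, one per phase. deformation_fields: per
    phase, integer index arrays (ix, iy, iz) giving, for every reference
    voxel, its location in that phase's grid."""
    ref = np.zeros_like(phase_doses[0])
    for dose, (ix, iy, iz) in zip(phase_doses, deformation_fields):
        ref += dose[ix, iy, iz]
    return ref

# Toy example: two phases on a 4x4x4 grid with an identity "deformation".
shape = (4, 4, 4)
ix, iy, iz = np.indices(shape)
total = accumulate_dose([np.ones(shape), 2 * np.ones(shape)],
                        [(ix, iy, iz)] * 2)
print(total[0, 0, 0])   # 3.0
```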
spMC: an R-package for 3D lithological reconstructions based on spatial Markov chains
NASA Astrophysics Data System (ADS)
Sartore, Luca; Fabbri, Paolo; Gaetan, Carlo
2016-09-01
The paper presents the spatial Markov Chains (spMC) R package and a case study of subsoil simulation/prediction at a plain site in Northeastern Italy. spMC is a fairly complete collection of advanced methods for data inspection; in addition, spMC implements Markov chain models to estimate experimental transition probabilities of categorical lithological data. Simulation methods based on the best-known prediction methods (such as indicator kriging and co-kriging) are also implemented in the spMC package, and further advanced methods are available for simulation, e.g. path methods and Bayesian procedures that exploit the maximum entropy. Since the spMC package was developed for intensive geostatistical computations, part of the code is implemented for parallel computation via OpenMP constructs. A final analysis of computational efficiency compares the simulation/prediction algorithms using different numbers of CPU cores, considering the example data set of the case study included in the package.
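The basic quantity behind such spatial Markov chain models, the one-step transition probability matrix between lithological categories, can be sketched in a few lines (shown in Python rather than R for consistency with the other examples here; the borehole log is invented):

```python
import numpy as np

# Toy estimate of one-step vertical transition probabilities between
# lithological categories, the basic quantity behind spatial Markov
# chain models. The borehole log below is an invented example.
log = ["clay", "clay", "silt", "sand", "sand", "silt", "clay", "sand"]

cats = sorted(set(log))
idx = {c: i for i, c in enumerate(cats)}
counts = np.zeros((len(cats), len(cats)))

for a, b in zip(log[:-1], log[1:]):       # count observed transitions
    counts[idx[a], idx[b]] += 1

probs = counts / counts.sum(axis=1, keepdims=True)  # row-normalize
print(cats)
print(probs)
```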
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reed, J; Micka, J; Culberson, W
Purpose: To determine the in-air azimuthal anisotropy and in-water dose distribution for the 1 cm length of the CivaString ¹⁰³Pd brachytherapy source through measurements and Monte Carlo (MC) simulations. American Association of Physicists in Medicine Task Group No. 43 (TG-43) dosimetry parameters were also determined for this source. Methods: The in-air azimuthal anisotropy of the source was measured with a NaI scintillation detector and simulated with the MCNP5 radiation transport code. Measured and simulated results were normalized to their respective mean values and compared. The TG-43 dose-rate constant, line-source radial dose function, and 2D anisotropy function for this source were determined from LiF:Mg,Ti thermoluminescent dosimeter (TLD) measurements and MC simulations. The impact of ¹⁰³Pd well-loading variability on the in-water dose distribution was investigated using MC simulations by comparing the dose distribution for a source model with four wells of equal strength to that for a source model with strengths increased by 1% for two of the four wells. Results: NaI scintillation detector measurements and MC simulations of the in-air azimuthal anisotropy showed that ≥95% of the normalized data were within 1.2% of the mean value. TLD measurements and MC simulations of the TG-43 dose-rate constant, line-source radial dose function, and 2D anisotropy function agreed to within the experimental TLD uncertainties (k=2). MC simulations showed that a 1% variability in ¹⁰³Pd well-loading resulted in changes of <0.1%, <0.1%, and <0.3% in the TG-43 dose-rate constant, radial dose distribution, and polar dose distribution, respectively. Conclusion: The CivaString source has a high degree of azimuthal symmetry, as indicated by the NaI scintillation detector measurements and MC simulations of the in-air azimuthal anisotropy. TG-43 dosimetry parameters for this source were determined from TLD measurements and MC simulations. ¹⁰³Pd well-loading variability results in minimal variations in the in-water dose distribution according to MC simulations. This work was partially supported by CivaTech Oncology, Inc. through an educational grant for Joshua Reed, John Micka, Wesley Culberson, and Larry DeWerd and through research support for Mark Rivard.
Qin, Nan; Botas, Pablo; Giantsoudi, Drosoula; Schuemann, Jan; Tian, Zhen; Jiang, Steve B.; Paganetti, Harald; Jia, Xun
2016-01-01
Monte Carlo (MC) simulation is commonly considered the most accurate dose calculation method for proton therapy. Aiming at achieving fast MC dose calculations for clinical applications, we previously developed a GPU-based MC tool, gPMC. In this paper, we report our recent updates on gPMC in terms of its accuracy, portability, and functionality, as well as comprehensive tests of this tool. The new version, gPMC v2.0, was developed under the OpenCL environment to enable portability across different computational platforms. Physics models of nuclear interactions were refined to improve calculation accuracy. The scoring functions of gPMC were expanded to enable tallying particle fluence, dose deposited by different particle types, and dose-averaged linear energy transfer (LETd). A multiple-counter approach was employed to improve efficiency by reducing the frequency of memory-writing conflicts at scoring. For dose calculation, accuracy improvements over gPMC v1.0 were observed in both water phantom cases and a patient case. For a prostate cancer case planned using high-energy proton beams, dose discrepancies in the beam entrance and target region seen in gPMC v1.0 with respect to the gold standard tool for proton Monte Carlo simulations (TOPAS) were substantially reduced, and the gamma test passing rate (1%/1 mm) was improved from 82.7% to 93.1%. The average relative difference in LETd between gPMC and TOPAS was 1.7%. Average relative differences in dose deposited by primary, secondary, and other heavier particles were within 2.3%, 0.4%, and 0.2%, respectively. Depending on source proton energy and phantom complexity, it took 8 to 17 seconds on an AMD Radeon R9 290x GPU to simulate 10⁷ source protons, achieving less than 1% average statistical uncertainty. As the beam size was reduced from 10×10 cm² to 1×1 cm², the time spent on scoring increased by only 4.8% with eight counters, in contrast to a 40% increase using only one counter. With the OpenCL environment, the portability of gPMC v2.0 was enhanced; it was successfully executed on different CPUs and GPUs, and its performance on different devices varied depending on processing power and hardware structure. PMID: 27694712
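The multiple-counter idea can be sketched schematically: instead of every thread accumulating into a single dose array (serializing conflicting writes), each of K tally copies receives a share of the deposits and the copies are merged once at the end. The sketch below illustrates only the bookkeeping, not an actual OpenCL kernel:

```python
import numpy as np

# Schematic illustration of multiple-counter scoring: deposits are
# spread across K independent copies of the tally grid and merged at
# the end, the strategy gPMC v2.0 uses to reduce memory-write
# conflicts when many GPU threads score into a small region.
rng = np.random.default_rng(0)

K = 8                       # number of counter copies
n_voxels = 100
counters = np.zeros((K, n_voxels))

deposits = rng.integers(0, n_voxels, size=10_000)   # hit voxel indices
energies = rng.exponential(1.0, size=10_000)        # deposited energy

for i, (v, e) in enumerate(zip(deposits, energies)):
    counters[i % K, v] += e     # each "thread" writes to its own copy

dose = counters.sum(axis=0)     # merge copies once at the end
print(dose[:5])
```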
2012-07-01
…of the modelling and simulation world and to provide it with implementation guidelines; and to provide … definition; relationship to standards; specification of an MC management procedure; specification of MC artefacts. Important considerations … using the present guideline as a reference. • The VV&A (verification, validation and acceptance) of MC must form an integral part of the
Monte Carlo simulations in Nuclear Medicine
NASA Astrophysics Data System (ADS)
Loudos, George K.
2007-11-01
Molecular imaging technologies provide unique abilities to localise signs of disease before symptoms appear, assist in drug testing, optimize and personalize therapy, and assess the efficacy of treatment regimes for different types of cancer. Monte Carlo simulation packages are an important tool for the optimal design of detector systems, and they have also demonstrated potential to improve image quality and acquisition protocols. Many general-purpose (MCNP, Geant4, etc.) or dedicated (SimSET, etc.) codes have been developed with the aim of providing accurate and fast results. Special emphasis is given to the GATE toolkit. The GATE code, currently under development by the OpenGATE collaboration, is the most accurate and promising code for performing realistic simulations. The purpose of this article is to introduce the non-expert reader to the current status of MC simulations in nuclear medicine, briefly provide examples of currently simulated systems, and present future challenges, including the simulation of clinical studies and dosimetry applications.
Molecular dynamics and dynamic Monte-Carlo simulation of irradiation damage with focused ion beams
NASA Astrophysics Data System (ADS)
Ohya, Kaoru
2017-03-01
The focused ion beam (FIB) has become an important tool for micro- and nanostructuring tasks such as milling, deposition and imaging. However, these processes damage the surface on the nanometer scale through implanted projectile ions and recoiled material atoms, so it is important to investigate each kind of damage quantitatively. We present a dynamic Monte-Carlo (MC) simulation code to simulate the morphological and compositional changes of a multilayered sample under ion irradiation, and a molecular dynamics (MD) simulation code to simulate dose-dependent changes in the backscattered-ion (BSI)/secondary-electron (SE) yields of a crystalline sample. Recent progress in applying the codes to simulate the surface morphology and Mo/Si layer intermixing in an EUV lithography mask irradiated with FIBs, and the crystalline-orientation effect on BSI and SE yields relating to the channeling contrast in scanning ion microscopes, is also presented.
Taguchi, Katsuyuki; Polster, Christoph; Lee, Okkyun; Stierstorfer, Karl; Kappler, Steffen
2016-12-01
An x-ray photon interacts with photon counting detectors (PCDs) and generates an electron charge cloud or multiple clouds. The clouds (and thus the photon energy) may be split between two adjacent PCD pixels when the interaction occurs near a pixel boundary, producing a count at both pixels. This is called double-counting with charge sharing. (A photoelectric effect with K-shell fluorescence x-ray emission would result in double-counting as well.) As a result, PCD data are spatially and energetically correlated, although the output of individual PCD pixels is Poisson distributed. Major problems include the lack of a detector noise model for the spatio-energetic cross talk and the lack of a computationally efficient simulation tool for generating correlated Poisson data. A Monte Carlo (MC) simulation can accurately simulate these phenomena and produce noisy data; however, it is not computationally efficient. In this study, the authors developed a new detector model and implemented it in an efficient software simulator that uses a Poisson random number generator to produce correlated noisy integer counts. The detector model takes the following effects into account: (1) detection efficiency; (2) incomplete charge collection and ballistic effect; (3) interaction with PCDs via the photoelectric effect (with or without K-shell fluorescence x-ray emission, which may escape from the PCDs or be reabsorbed); and (4) electronic noise. The correlation was modeled using two simplifying assumptions: energy conservation and mutual exclusiveness, the latter meaning that no more than two pixels measure energy from one photon. The effect of the model parameters was studied and results were compared with MC simulations. The agreement with respect to the spectrum was evaluated using the reduced χ² statistic, χ²_red (≥1), a weighted sum of squared errors for which χ²_red = 1 indicates a perfect fit. The model produced spectra with flat-field irradiation that qualitatively agree with previous studies. The spectra generated with different model and geometry parameters allowed for understanding the effect of the parameters on the spectrum and the correlation of data. The agreement between the model and MC data was very strong. The mean spectra at 90 keV and 140 kVp agreed exceptionally well: χ²_red values were 1.049 with the 90 keV data and 1.007 with the 140 kVp data. The degrees of cross talk (in terms of the relative increase from single-pixel irradiation to flat-field irradiation) were 22% at 90 keV and 19% at 140 kVp for the MC simulations, versus 21% and 17%, respectively, for the model. The covariance was in strong qualitative agreement, although it was overestimated. The noisy data generation was very efficient, taking less than a CPU minute as opposed to CPU hours for MC simulators. The authors have developed a novel, computationally efficient PCD model that takes into account double-counting and the resulting spatio-energetic correlation between PCD pixels. The MC simulation validated its accuracy.
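A minimal sketch of how correlated integer counts can arise under the two stated assumptions (this toy example is not the authors' full model, which also includes detection efficiency, fluorescence escape and electronic noise):

```python
import numpy as np

# Toy generator of spatio-energetically correlated counts for two
# adjacent photon-counting pixels. Boundary photons share their energy
# between exactly two pixels (mutual exclusiveness) and the shared
# fractions sum to the photon energy (energy conservation).
rng = np.random.default_rng(1)

n_photons = rng.poisson(5000)          # photons landing near the boundary
e_photon = 70.0                        # keV, monoenergetic toy beam
split = rng.uniform(0.0, 1.0, n_photons)   # fraction collected by pixel A

e_a = e_photon * split                 # energy recorded in pixel A
e_b = e_photon * (1.0 - split)         # remainder recorded in pixel B

threshold = 20.0                       # keV counting threshold
count_a = int(np.sum(e_a > threshold))
count_b = int(np.sum(e_b > threshold))
print(count_a, count_b)                # double-counting when both exceed it
```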
A clinical study of lung cancer dose calculation accuracy with Monte Carlo simulation.
Zhao, Yanqun; Qi, Guohai; Yin, Gang; Wang, Xianliang; Wang, Pei; Li, Jian; Xiao, Mingyong; Li, Jie; Kang, Shengwei; Liao, Xiongfei
2014-12-16
The accuracy of dose calculation is crucial to the quality of treatment planning and, consequently, to the dose delivered to patients undergoing radiation therapy. Current general calculation algorithms such as Pencil Beam Convolution (PBC) and Collapsed Cone Convolution (CCC) have shortcomings in regard to severe inhomogeneities, particularly in those regions where charged particle equilibrium does not hold. The aim of this study was to evaluate the accuracy of the PBC and CCC algorithms in lung cancer radiotherapy using Monte Carlo (MC) technology. Four treatment plans were designed using the Oncentra Masterplan TPS for each patient: two intensity-modulated radiation therapy (IMRT) plans developed using the PBC and CCC algorithms, and two three-dimensional conformal therapy (3DCRT) plans developed using the PBC and CCC algorithms. The DICOM-RT files of the treatment plans were exported to the Monte Carlo system for recalculation, and the dose distributions of the GTV, PTV and ipsilateral lung calculated by the TPS and MC were compared. For both 3DCRT and IMRT plans, the mean dose differences for the GTV between CCC and MC increased with decreasing GTV volume; for IMRT, the mean dose differences were higher than those for 3DCRT. The CCC algorithm overestimated the GTV mean dose by approximately 3% for IMRT. For 3DCRT plans, when the volume of the GTV was greater than 100 cm³, the mean doses calculated by CCC and MC showed almost no difference, whereas PBC showed large deviations from MC. For the dose to the ipsilateral lung, the CCC algorithm overestimated the dose to the entire lung, and the PBC algorithm overestimated V20 but underestimated V5; the difference in V10 was not statistically significant. PBC substantially overestimates the dose to the tumour, but CCC is similar to the MC simulation. It is recommended that treatment plans for lung cancer be developed using an advanced dose calculation algorithm other than PBC. MC can accurately calculate the dose distribution in lung cancer and provides a notably effective tool for benchmarking the performance of other dose calculation algorithms within patients.
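The lung metrics compared in this study (Vx, the percentage of organ volume receiving at least x Gy, and the mean dose) are straightforward to compute from a voxel dose array, assuming equal voxel volumes; a minimal sketch with invented dose values:

```python
import numpy as np

# Toy computation of the lung DVH metrics compared in the study:
# Vx = percentage of organ volume receiving at least x Gy, plus the
# mean dose. The dose array below is random placeholder data.
rng = np.random.default_rng(2)
lung_dose = rng.gamma(shape=1.5, scale=6.0, size=50_000)  # Gy per voxel

def v_x(dose, x):
    """Percent of voxels (equal volume assumed) receiving >= x Gy."""
    return 100.0 * np.mean(dose >= x)

print(f"mean = {lung_dose.mean():.1f} Gy")
for x in (5, 10, 20):
    print(f"V{x} = {v_x(lung_dose, x):.1f}%")
```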
Automated SEM and TEM sample preparation applied to copper/low k materials
NASA Astrophysics Data System (ADS)
Reyes, R.; Shaapur, F.; Griffiths, D.; Diebold, A. C.; Foran, B.; Raz, E.
2001-01-01
We describe the use of automated microcleaving for the preparation of both SEM and TEM samples using SELA's new MC500 and TEMstation tools. The MC500 is an automated microcleaving tool capable of producing cleaves with 0.25 μm accuracy, resulting in SEM-ready samples. The TEMstation can take a sample output from the MC500 (or from SELA's earlier MC200 tool) and produce a FIB-ready slice of 25±5 μm, mounted on a TEM washer and ready for FIB thinning to electron transparency for TEM analysis. The materials selected for the tool-set evaluation mainly included the Cu/TaN/HOSP low-k system. The paper is divided into three sections: experimental approach, SEM preparation and analysis of HOSP low-k, and TEM preparation and analysis of Cu/TaN/HOSP low-k samples. For the samples discussed, data are presented to show the quality of preparation provided by these new automated tools.
Binding, Thermodynamics, and Selectivity of a Non-peptide Antagonist to the Melanocortin-4 Receptor
Saleh, Noureldin; Kleinau, Gunnar; Heyder, Nicolas; Clark, Timothy; Hildebrand, Peter W.; Scheerer, Patrick
2018-01-01
The melanocortin-4 receptor (MC4R) is a potential drug target for the treatment of obesity, anxiety, depression, and sexual dysfunction. Crystal structures of MC4R are not yet available, which has hindered successful structure-based drug design. Using microsecond-scale molecular-dynamics simulations, we have investigated selective binding of the non-peptide antagonist MCL0129 to a homology model of human MC4R (hMC4R). This approach revealed that, at the end of a multi-step binding process, MCL0129 spontaneously adopts a binding mode in which it blocks the agonist-binding site. This binding mode was confirmed in subsequent metadynamics simulations, which gave an affinity for hMC4R that matches the experimentally determined value. Extending our simulations of MCL0129 binding to hMC1R and hMC3R, we find that receptor subtype selectivity for hMC4R depends on a few amino acids located in various structural elements of the receptor. These insights may support rational drug design targeting the melanocortin systems.
Ustinov, E A; Do, D D
2012-08-21
We present, for the first time in the literature, a new scheme of the kinetic Monte Carlo method applied to a grand canonical ensemble, which we call hereafter GC-kMC. It was shown recently that the kinetic Monte Carlo (kMC) scheme is a very effective tool for the analysis of equilibrium systems. It has been applied in a canonical ensemble to describe the vapor-liquid equilibrium of argon over a wide range of temperatures, and gas adsorption on an open graphite surface and in graphitic slit pores. However, in spite of the conformity of canonical and grand canonical ensembles, the latter is more appropriate for the correct description of open systems; for example, the hysteresis loop observed in adsorption of gases in pores under sub-critical conditions can only be described with a grand canonical ensemble. Therefore, the present paper is aimed at an extension of kMC to open systems. The developed GC-kMC was shown to be consistent with the results obtained with the canonical kMC (C-kMC) for argon adsorption on a graphite surface at 77 K and in graphitic slit pores at 87.3 K. We show that in slit micropores the hexagonal packing in the layers adjacent to the pore walls is observed at high loadings, even at temperatures above the triple point of the bulk phase. The potential and applicability of GC-kMC are further shown with the correct description of the heat of adsorption and the pressure tensor of the adsorbed phase.
NASA Astrophysics Data System (ADS)
Alexander, A.; DeBlois, F.; Stroian, G.; Al-Yahya, K.; Heath, E.; Seuntjens, J.
2007-07-01
Radiotherapy research lacks a flexible computational research environment for Monte Carlo (MC) and patient-specific treatment planning. The purpose of this study was to develop a flexible software package on low-cost hardware with the aim of integrating new patient-specific treatment planning with MC dose calculations suitable for large-scale prospective and retrospective treatment planning studies. We designed the software package 'McGill Monte Carlo treatment planning' (MMCTP) for the research development of MC and patient-specific treatment planning. The MMCTP design consists of a graphical user interface (GUI), which runs on a simple workstation connected through the standard secure-shell protocol to a cluster for lengthy MC calculations. Treatment planning information (e.g., images, structures, beam geometry properties and dose distributions) is converted into a convenient MMCTP local file storage format designated the McGill RT format. MMCTP features include (a) DICOM_RT, RTOG and CADPlan CART format imports; (b) 2D and 3D visualization views for images, structure contours, and dose distributions; (c) contouring tools; (d) DVH analysis and dose matrix comparison tools; (e) external beam editing; and (f) MC transport calculation from beam source to patient geometry for photon and electron beams. The MC input files, which are prepared from the beam geometry properties and patient information (e.g., images and structure contours), are uploaded and run on a cluster using shell commands controlled from the MMCTP GUI. The visualization, dose matrix operation and DVH tools offer extensive options for plan analysis and for comparison between MC plans and plans imported from commercial treatment planning systems. The MMCTP GUI provides a flexible research platform for the development of patient-specific MC treatment planning for photon and electron external beam radiation therapy. The impact of this tool lies in the fact that it allows for systematic, platform-independent, large-scale MC treatment planning for different treatment sites. Patient recalculations were performed to validate the software and ensure proper functionality.
The DoE method as an efficient tool for modeling the behavior of monocrystalline Si-PV module
NASA Astrophysics Data System (ADS)
Kessaissia, Fatma Zohra; Zegaoui, Abdallah; Boutoubat, Mohamed; Allouache, Hadj; Aillerie, Michel; Charles, Jean-Pierre
2018-05-01
The objective of this paper is to apply the Design of Experiments (DoE) method to study and obtain a predictive model of any marketed monocrystalline photovoltaic (mc-PV) module. This technique yields a mathematical model that represents the predicted responses as a function of the input factors and experimental data, so a DoE model for characterizing and modeling mc-PV module behavior can be obtained by performing just a set of experimental trials. The DoE model of the mc-PV panel evaluates the predicted maximum power as a function of irradiation and temperature in a bounded domain of study for the inputs. For the mc-PV panel, predictive models at both one level and two levels were developed, taking into account both the main effects and the interaction effects of the considered factors. The DoE method was then implemented in code developed under Matlab software, which allows us to simulate, characterize, and validate the predictive model of the mc-PV panel. The calculated results were compared to the experimental data, errors were estimated, and the predictive models were validated via the response surface. Finally, we conclude that the predictive models reproduce the experimental trials with good accuracy.
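As an illustration of the approach (written in Python rather than the authors' Matlab code, with invented measurements), a two-level, two-factor DoE model with main effects and an interaction term can be fitted by least squares:

```python
import numpy as np

# Sketch of a two-level, two-factor DoE model for panel power:
# P = b0 + b1*x1 + b2*x2 + b12*x1*x2, where x1, x2 are the coded
# (-1/+1) levels of irradiance and temperature. The four measured
# powers below are invented placeholders.
x1 = np.array([-1, +1, -1, +1])      # coded irradiance level
x2 = np.array([-1, -1, +1, +1])      # coded temperature level
p  = np.array([42.0, 78.0, 38.0, 70.0])   # W, toy measurements

X = np.column_stack([np.ones(4), x1, x2, x1 * x2])
b, *_ = np.linalg.lstsq(X, p, rcond=None)
print(dict(zip(["b0", "b1", "b2", "b12"], np.round(b, 2))))

# Predicted response anywhere in the coded domain:
predict = lambda u1, u2: b @ np.array([1.0, u1, u2, u1 * u2])
print(f"P(0.5, -0.2) = {predict(0.5, -0.2):.1f} W")
```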
Hoefling, Martin; Lima, Nicola; Haenni, Dominik; Seidel, Claus A. M.; Schuler, Benjamin; Grubmüller, Helmut
2011-01-01
Förster Resonance Energy Transfer (FRET) experiments probe molecular distances via distance-dependent energy transfer from an excited donor dye to an acceptor dye. Single-molecule experiments probe not only average distances but also distance distributions or even fluctuations, and thus provide a powerful tool to study biomolecular structure and dynamics. However, the measured energy transfer efficiency depends not only on the distance between the dyes but also on their mutual orientation, which is typically inaccessible to experiments. Thus, assumptions on the orientation distributions and averages are usually made, limiting the accuracy of the distance distributions extracted from FRET experiments. Here, we demonstrate that combining single-molecule FRET experiments with the mutual dye orientation statistics obtained from Molecular Dynamics (MD) simulations yields improved estimates of distances and distributions. From the simulated time-dependent mutual orientations, FRET efficiencies are calculated, and the full statistics of individual photon absorption, energy transfer, and photon emission events is obtained from subsequent Monte Carlo (MC) simulations of the FRET kinetics. All recorded emission events are collected into bursts, from which efficiency distributions are calculated in close resemblance to the actual FRET experiment, taking shot noise fully into account. Using polyproline chains with attached Alexa 488 and Alexa 594 dyes as a test system, we demonstrate the feasibility of this approach by direct comparison to experimental data. We identified cis-isomers and different static local environments as sources of the experimentally observed heterogeneity. Reconstructions of distance distributions from experimental data at different levels of theory demonstrate how the respective underlying assumptions and approximations affect the obtained accuracy. Our results show that dye fluctuations obtained from MD simulations, combined with MC single-photon kinetics, provide a versatile tool to improve the accuracy of distance distributions that can be extracted from measured single-molecule FRET efficiencies. PMID: 21629703
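A greatly simplified sketch of the MC photon-kinetics step (it ignores the orientation dynamics that the paper obtains from MD, and the Förster radius and distance trace are invented):

```python
import numpy as np

# Simplified MC of FRET photon kinetics: each excitation transfers to
# the acceptor with probability E(r) = 1 / (1 + (r/R0)^6); detected
# photons are grouped into bursts, giving a shot-noise-limited
# efficiency histogram. R0 and the distance samples are invented.
rng = np.random.default_rng(3)

R0 = 5.4                                # nm, assumed Foerster radius
r = rng.normal(5.0, 0.4, size=200)      # nm, one distance per burst

def burst_efficiency(dist, photons_per_burst=50):
    e = 1.0 / (1.0 + (dist / R0) ** 6)               # transfer probability
    n_acceptor = rng.binomial(photons_per_burst, e)  # transferred photons
    return n_acceptor / photons_per_burst            # measured efficiency

eff = np.array([burst_efficiency(d) for d in r])
hist, edges = np.histogram(eff, bins=10, range=(0, 1))
print(hist)
```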
Beigi, Manije; Afarande, Fatemeh; Ghiasi, Hosein
2016-01-01
The aim of this study was to compare two bunkers, one designed using only protocol recommendations and the other using Monte Carlo (MC) based data derived for an 18 MV Varian 2100 Clinac accelerator. High-energy radiation therapy is associated with fast and thermal photoneutrons, and adequate shielding against the contaminant neutrons is recommended by the new IAEA and NCRP protocols. The latest protocols released by the IAEA (Safety Report No. 47) and NCRP Report No. 151 were used for the bunker design calculations, and MC-based data were also derived. Two bunkers, designed using the protocols and the MC-based data respectively, are presented and discussed. Regarding the designed door thickness, the doors designed by MC simulation and by the Wu-McGinley analytical method were closer in both BPE and lead thickness. In the case of the primary and secondary barriers, MC simulation resulted in 440.11 mm for the ordinary concrete, and a total concrete thickness of 1709 mm was required; calculating the same parameters with the recommended analytical methods resulted in a required thickness of 1762 mm, using 445 mm as the TVL recommended for concrete. Additionally, for the secondary barrier a thickness of 752.05 mm was obtained. Our results showed that the MC simulation and the protocol recommendations are in good agreement in the contamination dose calculation. The differences between the analytical and MC simulation methods revealed that the application of only one method for bunker design may lead to underestimation or overestimation in dose and shielding calculations.
NASA Astrophysics Data System (ADS)
Sawada, A.; Faniel, S.; Mineshige, S.; Kawabata, S.; Saito, K.; Kobayashi, K.; Sekine, Y.; Sugiyama, H.; Koga, T.
2018-05-01
We report an approach for examining electron properties using information about the shape and size of a nanostructure as a measurement reference. This approach quantifies the spin precession angle per unit length directly by considering the time-reversal interference of chaotic return trajectories within mesoscopic ring arrays (MRAs). Experimentally, we fabricated MRAs by nanolithography in InGaAs quantum wells with a gate-controllable spin-orbit interaction (SOI). As a result, we observed an Onsager symmetry related to relativistic magnetic fields, which provided indispensable information for the semiclassical billiard-ball simulation. Our simulations, developed from the real-space formalism of the weak localization/antilocalization effect including the electron-spin degree of freedom, reproduced the experimental magnetoconductivity (MC) curves with high fidelity. The values of five distinct electron parameters (Fermi wavelength, spin precession angles per unit length for two different SOIs, impurity scattering length, and phase coherence length) were thereby extracted from a single MC curve. The methodology developed here is applicable to wide ranges of nanomaterials and devices, providing a diagnostic tool for exotic properties of two-dimensional electron systems.
NASA Astrophysics Data System (ADS)
Santos, William S.; Neves, Lucio P.; Perini, Ana P.; Caldas, Linda V. E.; Maia, Ana F.
2015-12-01
Cerebral angiography exams may provide valuable diagnostic information for patients with suspected cerebral disease, but they may also deliver high doses of radiation to the patients and medical staff. In order to evaluate the medical and occupational exposures under different irradiation conditions, Monte Carlo (MC) simulations were employed. Virtual anthropomorphic phantoms (MASH) were used to represent the patient and the physician inside a typical fluoroscopy room, itself simulated in detail, incorporated in the MCNPX 2.7.0 MC code. The evaluation was carried out by means of dose conversion coefficients (CCs) for equivalent (H) and effective (E) doses normalized by the air kerma-area product (KAP). The CCs for the entrance surface dose (ESD) of the patient and the equivalent dose to the eyes of the medical staff were determined, because cerebral angiography exams present higher risks for those organs. The tube voltage was 80 kVp, and Al filters with thicknesses of 2.5 mm, 3.5 mm and 4.0 mm were positioned in the beams. Two projections were simulated: posterior-anterior (PA) and right-lateral (RLAT). In all situations the CC values increased with increasing Al filtration. The highest dose was obtained for the RLAT projection with a 4.0 mm Al filter; in this projection, the ESD/KAP and E/KAP values for the patient were 11 (14%) mGy/Gy·cm² and 0.12 (0.1%) mSv/Gy·cm², respectively. For the physician, the use of a suspended lead glass shield and a lead curtain attached to the surgical table resulted in a significant reduction of the CCs. MC simulation proved to be a very important tool in radiation protection dosimetry; specifically, in this study several parameters could be evaluated that would not be accessible experimentally.
Efficiency in nonequilibrium molecular dynamics Monte Carlo simulations
Radak, Brian K.; Roux, Benoît
2016-10-07
Hybrid algorithms combining nonequilibrium molecular dynamics and Monte Carlo (neMD/MC) offer a powerful avenue for improving the sampling efficiency of computer simulations of complex systems. These neMD/MC algorithms are also increasingly finding use in applications where conventional approaches are impractical, such as constant-pH simulations with explicit solvent. However, selecting an optimal nonequilibrium protocol for maximum efficiency often represents a non-trivial challenge. This work evaluates the efficiency of a broad class of neMD/MC algorithms and protocols within the theoretical framework of linear response theory. The approximations are validated against constant-pH MD simulations and shown to provide accurate predictions of neMD/MC performance. An assessment of a large set of protocols confirms (both theoretically and empirically) that a linear work protocol gives the best neMD/MC performance. Lastly, a well-defined criterion for optimizing the time parameters of the protocol is proposed and demonstrated with an adaptive algorithm that improves the performance on-the-fly with minimal cost.
Integration of OpenMC methods into MAMMOTH and Serpent
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerby, Leslie; DeHart, Mark; Tumulak, Aaron
OpenMC, a Monte Carlo particle transport simulation code focused on neutron criticality calculations, contains several methods we wish to emulate in MAMMOTH and Serpent. First, research coupling OpenMC and the Multiphysics Object-Oriented Simulation Environment (MOOSE) has shown promising results. Second, the utilization of Functional Expansion Tallies (FETs) allows for more efficient passing of multiphysics data between OpenMC and MOOSE. Both of these capabilities have been preliminarily implemented in Serpent. Results are discussed and future work is recommended.
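The FET idea can be illustrated in a few lines: instead of binning scores into spatial cells, each MC sample contributes to the coefficients of a basis expansion (Legendre polynomials here), from which a smooth field can be reconstructed anywhere; a minimal sketch with a made-up score field:

```python
import numpy as np
from numpy.polynomial import legendre

# Sketch of a Functional Expansion Tally (FET): MC samples scored on
# x in [-1, 1] contribute to Legendre coefficients instead of spatial
# bins, giving a continuous reconstruction for multiphysics coupling.
rng = np.random.default_rng(4)
x = rng.uniform(-1, 1, 100_000)            # sampled positions
w = 1.0 + 0.5 * x - 0.3 * x**2             # toy per-sample scores

order = 4
# c_n = (2n+1)/2 * integral of f(x) P_n(x) dx, estimated from samples.
coeffs = np.array([
    (2 * n + 1) / 2.0 * np.mean(w * legendre.legval(x, np.eye(order + 1)[n]))
    for n in range(order + 1)
]) * 2.0   # factor 2 = interval length, converts mean to integral

print(np.round(coeffs, 3))                 # expect ~ [0.9, 0.5, -0.2, 0, 0]
recon = legendre.legval(0.25, coeffs)      # reconstructed value at x = 0.25
print(round(float(recon), 3))
```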
Monte Carlo simulation of secondary electron images for gold nanorods on the silicon substrate
NASA Astrophysics Data System (ADS)
Zhang, P.
2018-06-01
Recently, gold nanorods (Au NRs) have attracted much attention because their photoelectric response differs characteristically from that of other Au nanomaterials of various shapes. Accurate measurement of the aspect ratio is therefore of high value for establishing the optical properties of Au NRs. Monte Carlo (MC) simulation is regarded as the most accurate tool for performing such size measurements, by extracting structure parameters from the simulated scanning electron microscopy (SEM) image that best matches the experimental one. In this article, a series of MC-simulated secondary electron (SE) images has been generated for Au NRs on a silicon substrate. It has already been observed that the two ends of Au NRs in the experimental SEM image are brighter than the middle part, which seriously affects the accuracy of size measurement for Au NRs. The purpose of this work is to understand the mechanism underlying this phenomenon through systematic analysis. It was found that the cetyltrimethylammonium bromide (CTAB) covering the Au NRs can indeed alter the contrast of the Au NRs compared to the case without CTAB covering; however, SEs emitted from the CTAB are not the reason for the abnormal brightness at the two ends of the NRs. This work reveals that the charging effect might be the leading cause of this phenomenon.
Loudos, George K; Papadimitroulas, Panagiotis G; Kagadis, George C
2014-01-01
Monte Carlo (MC) simulations play a crucial role in nuclear medical imaging since they can provide the ground truth for clinical acquisitions by integrating and quantifying all the physical parameters that affect image quality. Over the last decade, a number of realistic computational anthropomorphic models have been developed to serve imaging as well as other biomedical engineering applications. The combination of MC techniques with realistic computational phantoms can provide a powerful tool for pre- and post-processing in imaging, data analysis and dosimetry. This work aims to create a global database of simulated Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) exams; the methodology, as well as its first elements, is presented. Simulations are performed using the well-validated GATE open-source toolkit, standard anthropomorphic phantoms, and activity distributions of various radiopharmaceuticals derived from the literature. The resulting images, projections and sinograms of each study are provided in the database and can be further exploited to evaluate processing and reconstruction algorithms. Patient studies with different characteristics are included in the database, and different computational phantoms were tested for the same acquisitions. These include the XCAT, Zubal and Virtual Family phantoms, some of which are used for the first time in nuclear imaging. The database will be freely available, and our current work is directed towards its extension by simulating additional clinical pathologies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sakabe, D; Ohno, T; Araki, F
Purpose: The purpose of this study was to evaluate the combined organ dose of digital subtraction angiography (DSA) and computed tomography (CT) using a Monte Carlo (MC) simulation of abdominal interventions. Methods: The organ doses for DSA and CT were obtained with MC simulation and with actual measurements using fluorescent-glass dosimeters at 7 abdominal locations in an Alderson-Rando phantom. DSA was performed from three directions: posterior-anterior (PA), right anterior oblique (RAO), and left anterior oblique (LAO). The organ doses from the MC simulation were compared with the actual dose measurements. Calculations for the MC simulation were carried out with the GMctdospp (IMPS, Germany) software based on the EGSnrc MC code. Finally, the combined organ dose for DSA and CT was calculated from the MC simulation using the x-ray conditions of a patient with a diagnosis of hepatocellular carcinoma. Results: For DSA from the PA direction, the organ doses for the actual measurements and the MC simulation were 2.2 and 2.4 mGy/100 mAs at the liver, respectively, and 3.0 and 3.1 mGy/100 mAs at the spinal cord, while for CT the organ doses were 15.2 and 15.1 mGy/100 mAs at the liver, and 14.6 and 13.5 mGy/100 mAs at the spinal cord. The maximum difference in organ dose between the actual measurements and the MC simulation was 11.0% for the spleen at PA, 8.2% for the spinal cord at RAO, and 6.1% for the left kidney at LAO with DSA, and 9.3% for the stomach with CT. The combined organ dose (4 DSAs and 6 CT scans) under actual patient conditions was found to be 197.4 mGy for the liver and 205.1 mGy for the spinal cord. Conclusion: Our method makes it possible to accurately assess the organ dose to patients for abdominal interventions with combined DSA and CT.
McIDAS-V: A Data Analysis and Visualization Tool for Global Satellite Data
NASA Astrophysics Data System (ADS)
Achtor, T. H.; Rink, T. D.
2011-12-01
The Man-computer Interactive Data Access System (McIDAS-V) is a Java-based, open-source, freely available system for scientists, researchers and algorithm developers working with atmospheric data. The McIDAS-V software tools provide powerful new data manipulation and visualization capabilities, including 4-dimensional displays, an abstract data model with integrated metadata, user-defined computation, and a powerful scripting capability. As such, McIDAS-V is a valuable tool for scientists and researchers within the GEO and GEOSS domains. The advancing polar and geostationary environmental satellite missions conducted by several countries will carry advanced instrumentation and systems to collect and distribute land, ocean, and atmosphere data. These systems provide atmospheric and sea-surface temperatures, humidity soundings, cloud and aerosol properties, and numerous other environmental products. This presentation will demonstrate some of the capabilities of McIDAS-V for analyzing and displaying high temporal- and spectral-resolution data, using examples from international environmental satellites.
NASA Astrophysics Data System (ADS)
Zhou, Abel; White, Graeme L.; Davidson, Rob
2018-02-01
Anti-scatter grids are commonly used in x-ray imaging systems to reduce the scatter radiation reaching the image receptor. Anti-scatter grid performance can be simulated and validated using Monte Carlo (MC) methods. Our recently reported work modified existing MC codes, resulting in improved performance when simulating x-ray imaging. The aim of this work is to validate the transmission of x-ray photons through grids computed with the recently reported new MC codes against experimental results and results previously reported in the literature. The results show that the scatter-to-primary ratio (SPR) and the transmissions of primary (T_p), scatter (T_s), and total (T_t) radiation determined using the new MC code system agree strongly with the experimental results and with the results reported in the literature; the T_p, T_s, T_t, and SPR determined with this new MC simulation code system are valid. The results also show that the interference effect on Rayleigh scattering should not be neglected in the evaluation of both mammographic and general grids. Our new MC simulation code system has been shown to be valid and can be used for analysing and evaluating grid designs.
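The validated quantities are simple ratios of tallies recorded with and without the grid in place; a toy sketch with invented photon counts:

```python
# Toy evaluation of anti-scatter grid figures of merit from tallied
# photon counts with and without the grid in place. All tallies are
# invented placeholders for illustration.
primary_no_grid, scatter_no_grid = 1.00e6, 8.0e5
primary_grid,    scatter_grid    = 7.1e5, 1.1e5

t_p = primary_grid / primary_no_grid                  # primary transmission
t_s = scatter_grid / scatter_no_grid                  # scatter transmission
t_t = (primary_grid + scatter_grid) / (primary_no_grid + scatter_no_grid)
spr = scatter_grid / primary_grid                     # SPR behind the grid

print(f"T_p={t_p:.2f}  T_s={t_s:.2f}  T_t={t_t:.2f}  SPR={spr:.2f}")
```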
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mein, S; Gunasingha, R; Nolan, M
Purpose: X-PACT is an experimental cancer therapy in which kV x-rays are used to photo-activate anti-cancer therapeutics through phosphor intermediaries (phosphors that absorb x-rays and re-radiate as UV light). Clinical trials in pet dogs are currently underway (NC State College of Veterinary Medicine), and an essential component is the ability to model the kV dose in these dogs. Here we report the commissioning and characterization of a Monte Carlo (MC) treatment planning simulation tool to calculate X-PACT radiation doses in canine trials. Methods: The FLUKA multi-particle MC simulation package was used to simulate a standard X-PACT radiation treatment beam of 80 kVp with the Varian OBI x-ray source geometry. The beam quality was verified by comparing measured and simulated attenuation of the beam by various thicknesses of aluminum (2-4.6 mm) under narrow-beam conditions (HVL). The beam parameters at commissioning were then corroborated using MC, characterized and verified with empirically collected commissioning data, including percent depth dose curves (PDD), back-scatter factors (BSF), collimator scatter factors, the heel effect, etc. All simulations were conducted for N=30M histories over M=100 iterations. Results: HVL and PDD simulation data agreed with measurement with average percent errors of 2.42%±0.33 and 6.03%±1.58, respectively. The mean square error (MSE) values for HVL and PDD (0.07% and 0.50%) were low, as expected; however, longer simulations are required to validate convergence to the expected values. Qualitatively, pre- and post-filtration source spectra matched well with 80 kVp references generated via the SPEKTR software. Further validation of the commissioning data simulation is underway in preparation for first-time 3D dose calculations with canine CBCT data. Conclusion: We have prepared a Monte Carlo simulation capable of accurate dose calculation for use in ongoing X-PACT canine clinical trials. Preliminary results show good agreement with measured data and hold promise for accurate quantification of dose for this novel psoralen x-ray therapy. Funding Support, Disclosures, & Conflict of Interest: The Monte Carlo simulation work was not funded; Drs. Adamson & Oldham have received funding from Immunolight LLC for X-PACT research.
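The HVL determination from the aluminum attenuation measurements can be sketched as follows (invented transmission values and a single effective attenuation coefficient, whereas a real 80 kVp beam hardens with depth):

```python
import numpy as np

# Toy HVL extraction: fit ln(transmission) vs aluminum thickness to a
# line and report HVL = ln(2)/mu_eff. The transmission values are
# invented; a real 80 kVp beam hardens, so this is only a first-order
# estimate over a narrow thickness range.
thickness = np.array([2.0, 2.6, 3.2, 3.8, 4.6])    # mm Al
transmission = np.array([0.62, 0.55, 0.49, 0.44, 0.38])

slope, ln_t0 = np.polyfit(thickness, np.log(transmission), 1)
mu_eff = -slope                                    # 1/mm
hvl = np.log(2.0) / mu_eff
print(f"mu_eff = {mu_eff:.3f} /mm, HVL = {hvl:.2f} mm Al")
```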
Accelerated GPU based SPECT Monte Carlo simulations.
Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris
2016-06-07
Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT), as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPUs) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and was used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU- and GPU-based Geant4 MC codes in order to assess the best strategy for each configuration. Three isotopes were considered: ⁹⁹ᵐTc, ¹¹¹In and ¹³¹I, using a low-energy high-resolution (LEHR) collimator, a medium-energy general-purpose (MEGP) collimator and a high-energy general-purpose (HEGP) collimator, respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE Infinia II 3/8" gamma camera. Both simulation platforms yielded similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between the GATE and GGEMS platforms derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. Moreover, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor of up to 71. The good agreement with the reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving the computational efficiency of SPECT imaging simulations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Hoyoung; Korea Institute of Materials Science, 797 Changwon-daero, Seongsan-gu, Changwon, Gyeongnam 642-831; Kang, Jun-Yun, E-mail: firice@kims.re.kr
This study aimed to present the complete history of carbide evolution in a cold-work tool steel along its full processing route for fabrication and application. A sequence of processes from casting to the final hardening heat treatment was conducted on an 8% Cr steel to reproduce a typical commercial processing route on a small scale. The carbides found at each process step were then identified by electron diffraction with energy dispersive spectroscopy in a scanning or transmission electron microscope. After solidification, MC, M₇C₃ and M₂C carbides were identified, and the last dissolved during hot compression at 1180 °C. In a subsequent annealing at 870 °C followed by slow cooling, M₆C and M₂₃C₆ were added, while they dissolved in the following austenitization at 1030 °C. After the final tempering at 520 °C, fine M₂₃C₆ precipitated again; thus the final microstructure was tempered martensite with MC, M₇C₃ and M₂₃C₆ carbides. The transient M₂C and M₆C originated from the segregation of Mo and finally disappeared due to attenuated segregation and the consequent thermodynamic instability. Highlights: • The full processing route of a cold-work tool steel was simulated on a small scale. • The carbides in the tool steel were identified by chemical-crystallographic analyses. • MC, M₇C₃, M₂C, M₆C and M₂₃C₆ carbides were found during the processing of the steel. • M₂C and M₆C finally disappeared due to thermodynamic instability.
Towards real-time photon Monte Carlo dose calculation in the cloud
NASA Astrophysics Data System (ADS)
Ziegenhein, Peter; Kozin, Igor N.; Kamerling, Cornelis Ph; Oelfke, Uwe
2017-06-01
Near real-time application of Monte Carlo (MC) dose calculation in clinic and research is hindered by the long computational runtimes of established software. Currently, fast MC software solutions are available utilising accelerators such as graphical processing units (GPUs) or clusters based on central processing units (CPUs). Both platforms are expensive in terms of purchase costs and maintenance and, in the case of the GPU, provide only limited scalability. In this work we propose a cloud-based MC solution, which offers high scalability of accurate photon dose calculations. The MC simulations run on a private virtual supercomputer that is formed in the cloud. Computational resources can be provisioned dynamically at low cost without upfront investment in expensive hardware. A client-server software solution has been developed which controls the simulations and transports data to and from the cloud efficiently and securely. The client application integrates seamlessly into a treatment planning system. It runs the MC simulation workflow automatically and securely exchanges simulation data with the server-side application that controls the virtual supercomputer. Advanced encryption standards were used to add an additional security layer, which encrypts and decrypts patient data on-the-fly at the processor register level. We could show that our cloud-based MC framework enables near real-time dose computation. It delivers excellent linear scaling for high-resolution datasets, with absolute runtimes of 1.1 to 10.9 seconds for simulating a clinical prostate case and a liver case to within 1% statistical uncertainty. The computation runtimes include the transportation of data to and from the cloud as well as process scheduling and synchronisation overhead. Cloud-based MC simulations offer a fast, affordable and easily accessible alternative for near real-time accurate dose calculations to the currently used GPU or cluster solutions.
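The encryption layer can be illustrated with a short sketch using the Python cryptography package's AES-GCM primitive; this shows the general client-side idea only, not the authors' register-level implementation:

```python
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
import os

# Illustration of encrypting patient data before shipping it to a
# cloud cluster, using authenticated AES-GCM. This sketches the
# general client-side idea only; the paper's framework encrypts and
# decrypts on-the-fly at the processor register level.
key = AESGCM.generate_key(bit_length=256)   # kept on the client
aes = AESGCM(key)

ct_volume = b"...binary CT voxel data..."   # placeholder payload
nonce = os.urandom(12)                      # unique per message

ciphertext = aes.encrypt(nonce, ct_volume, b"case-42")
plaintext = aes.decrypt(nonce, ciphertext, b"case-42")
assert plaintext == ct_volume
```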
Simulation-based driver and vehicle crew training: applications, efficacy and future directions.
Goode, Natassia; Salmon, Paul M; Lenné, Michael G
2013-05-01
Simulation is widely used as a training tool in many domains, and more recently the use of vehicle simulation as a tool for driver and vehicle crew training has become popular (de Winter et al., 2009; Pradhan et al., 2009). This paper presents an overview of how vehicle simulations are currently used to train driving-related procedural and higher-order cognitive skills, as well as team-based procedural and non-technical teamwork skills for vehicle crews, and evaluates whether there is evidence that these training programs are effective. Efficacy was evaluated in terms of whether training achieves its learning objectives and whether the attainment of those objectives enhances real-world performance on target tasks. It was concluded that while some higher-order cognitive skills training programs have been shown to be effective, in general the adoption of simulation technology has far outstripped the pace of empirical research in this area. The paper concludes with a discussion of the issues that require consideration when developing and evaluating vehicle simulations for training purposes, based not only on what is known from the vehicle domain but also on what can be inferred from other domains in which simulation is an established training approach, such as aviation (e.g. Jentsch et al., 2011) and medicine (e.g. McGaghie et al., 2010). STATEMENT OF RELEVANCE: Simulation has become a popular tool for driver and vehicle crew training in civilian and military settings. This review considers whether there is evidence that this training method leads to learning and the transfer of skills to real-world performance. Evidence from other domains, such as aviation and medicine, is drawn upon to inform the design and evaluation of future vehicle simulation training systems. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Veselsky, T; Novotny, J; Pastykova, V; Koniarova, I
2017-12-01
The aim of this study was to determine small-field correction factors for a synthetic single-crystal diamond detector (PTW microDiamond) for routine use in clinical dosimetric measurements. Correction factors following the small-field Alfonso formalism were calculated by comparing the PTW microDiamond measured ratio M_Qclin^fclin/M_Qmsr^fmsr with Monte Carlo (MC) based field output factors Ω_Qclin,Qmsr^fclin,fmsr determined using a Dosimetry Diode E or with MC simulation itself. Diode measurements were used for the CyberKnife and the Varian Clinac 2100C/D linear accelerator; PTW microDiamond correction factors for the Leksell Gamma Knife (LGK) were derived using MC-simulated reference values from the manufacturer. PTW microDiamond correction factors for CyberKnife field sizes of 25 mm down to 5 mm were mostly smaller than 1% (except for 2.9% for the 5 mm Iris field and 1.4% for the 7.5 mm fixed cone field). Corrections of 0.1% and 2.0% needed to be applied to PTW microDiamond measurements for the 8 mm and 4 mm collimators of the LGK Perfexion, respectively. Finally, the PTW microDiamond M_Qclin^fclin/M_Qmsr^fmsr for the linear accelerator varied from the MC-corrected Dosimetry Diode data by less than 0.5% (except for the 1 × 1 cm² field size, with a 1.3% deviation). Given the small resulting correction factor values, the PTW microDiamond detector may be considered an almost ideal tool for relative small-field dosimetry in a large variety of stereotactic and radiosurgery treatment devices. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malkov, Victor N.; Rogers, David W.O.
The coupling of MRI and radiation treatment systems for magnetic resonance guided radiation therapy necessitates a reliable magnetic-field-capable Monte Carlo (MC) code. In addition to the influence of the magnetic field on dose distributions, the question of proper calibration has arisen due to the several-percent variation of ion chamber and solid-state detector responses in magnetic fields when compared to the 0 T case (Reynolds et al., Med Phys, 2013). In the absence of a magnetic field, EGSnrc has been shown to pass the Fano cavity test (a rigorous benchmarking tool for MC codes) at the 0.1% level (Kawrakow, Med Phys, 2000), and similar results should be required of magnetic-field-capable MC algorithms. To properly test such developing MC codes, the Fano cavity theorem has been adapted to function in a magnetic field (Bouchard et al., PMB, 2015). In this work, the Fano cavity test is applied in slab and ion-chamber-like geometries to test the transport options of a magnetic field algorithm implemented in EGSnrc. Results show that the deviation of the MC dose from the expected Fano cavity theory value is highly sensitive to the choice of geometry, and the ion chamber geometry appears to pass the test more easily than larger slab geometries. As magnetic field MC codes begin to be used for dose simulations and correction factor calculations, care must be taken to apply the most rigorous Fano test geometries to ensure the reliability of such algorithms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moskvin, V; Pirlepesov, F; Tsiamas, P
Purpose: This study provides an overview of the design and commissioning of the Monte Carlo (MC) model of the spot-scanning proton therapy nozzle and its implementation for patient plan simulation. Methods: The Hitachi PROBEAT V scanning nozzle was simulated based on vendor specifications using the TOPAS extension of the Geant4 code. FLUKA MC simulation was also utilized to provide supporting data for the main simulation. Validation of the MC model was performed using vendor-provided data and measurements collected during acceptance/commissioning of the proton therapy machine. Actual patient plans using CT-based treatment geometry were simulated and compared to the dose distributions produced by the treatment planning system (Varian Eclipse 13.6) and to patient quality assurance measurements. In-house MATLAB scripts were used for converting DICOM data into TOPAS input files. Results: Comparison analysis of integrated depth doses (IDDs), therapeutic ranges (R90), and spot shapes/sizes at different distances from the isocenter indicates good agreement between MC and measurements. R90 agreement is within 0.15 mm across all energy tunes. IDD and spot shape/size differences are within the statistical error of the simulation (less than 1.5%). The MC-simulated data, validated with physical measurements, were used for the commissioning of the treatment planning system. Patient geometry simulations were conducted based on the Eclipse-produced DICOM plans. Conclusion: The treatment nozzle and standard option beam model were implemented in the TOPAS framework to simulate a highly conformal discrete spot-scanning proton beam system.
NASA Astrophysics Data System (ADS)
Wilson, Robert H.; Vishwanath, Karthik; Mycek, Mary-Ann
2009-02-01
Monte Carlo (MC) simulations are considered the "gold standard" for mathematical description of photon transport in tissue, but they can require large computation times. Therefore, it is important to develop simple and efficient methods for accelerating MC simulations, especially when a large "library" of related simulations is needed. A semi-analytical method involving MC simulations and a path-integral (PI) based scaling technique generated time-resolved reflectance curves from layered tissue models. First, a zero-absorption MC simulation was run for a tissue model with fixed scattering properties in each layer. Then, a closed-form expression for the average classical path of a photon in tissue was used to determine the percentage of time that the photon spent in each layer, to create a weighted Beer-Lambert factor to scale the time-resolved reflectance of the simulated zero-absorption tissue model. This method is a unique alternative to other scaling techniques in that it does not require the path length or number of collisions of each photon to be stored during the initial simulation. Effects of various layer thicknesses and absorption and scattering coefficients on the accuracy of the method will be discussed.
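The scaling step amounts to multiplying the zero-absorption reflectance by a weighted Beer-Lambert factor whose exponent weights each layer's absorption coefficient by the fraction of time the average classical photon path spends in that layer. A minimal sketch under that assumption (the per-layer time fractions, which the paper derives from a closed-form path expression, are supplied directly here, and all numbers are illustrative):

```python
import numpy as np

def scale_reflectance(t, R0, mu_a, frac, n=1.4):
    """Scale a zero-absorption time-resolved reflectance R0(t) to absorbing
    layers via a path-weighted Beer-Lambert factor.
    t: times [s]; R0: zero-absorption curve; mu_a: per-layer absorption
    coefficients [1/cm]; frac: per-layer fractions of path time (sum to 1)."""
    v = 2.998e10 / n                       # photon speed in tissue [cm/s]
    mu_eff = np.dot(frac, mu_a)            # path-weighted absorption [1/cm]
    return R0 * np.exp(-mu_eff * v * t)    # Beer-Lambert attenuation

# Two-layer example: 70% of the path time in layer 1, 30% in layer 2
t = np.linspace(0.0, 2e-9, 100)
R0 = np.exp(-t / 5e-10)                    # stand-in zero-absorption curve
R = scale_reflectance(t, R0, mu_a=[0.1, 0.3], frac=[0.7, 0.3])
```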
MO-E-18C-02: Hands-On Monte Carlo Project Assignment as a Method to Teach Radiation Physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pater, P; Vallieres, M; Seuntjens, J
2014-06-15
Purpose: To present a hands-on project on Monte Carlo (MC) methods recently added to the curriculum and to discuss the students' appreciation. Methods: Since 2012, a 1.5 hour lecture dedicated to MC fundamentals follows the detailed presentation of photon and electron interactions. Students also program all sampling steps (interaction length and type, scattering angle, energy deposit) of a MC photon transport code. A handout structured in a step-by-step fashion guides students in conducting consistency checks. For extra points, students can code a fully working MC simulation that computes a dose distribution for 50 keV photons; a kerma approximation to dose deposition is assumed. A survey was conducted to which 10 out of the 14 attending students responded. It compared MC knowledge prior to and after the project, questioned the usefulness of teaching radiation physics through MC and surveyed possible project improvements. Results: According to the survey, 76% of students had no or only basic knowledge of MC methods before the class and 65% estimated having a good to very good understanding of MC methods after attending the class. 80% of students felt that the MC project helped them significantly to understand simulations of dose distributions. On average, students dedicated 12.5 hours to the project and appreciated the balance between hand-holding and questions/implications. Conclusion: A lecture on MC methods with a hands-on MC programming project requiring about 14 hours has been part of the graduate study curriculum since 2012. MC methods produce "gold standard" dose distributions and are slowly entering routine clinical work, so a fundamental understanding of MC methods should be a requirement for future students. Overall, the lecture and project helped students relate cross-sections to dose depositions and presented the numerical sampling methods behind the simulation of these dose distributions. Research funding from the governments of Canada and Quebec. PP acknowledges partial support by the CREATE Medical Physics Research Training Network grant of the Natural Sciences and Engineering Research Council (Grant number: 432290)
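The first two sampling steps listed above have standard textbook forms: an exponential free-path length s = -ln(ξ)/μ and an interaction type chosen with probability proportional to its partial attenuation coefficient. A minimal sketch of those two steps (the coefficient values are rough illustrative assumptions, not tabulated data, and scattering angle and energy deposit are omitted):

```python
import math, random

def sample_step(mu_total):
    """Free-path length s = -ln(xi) / mu_total, with xi uniform in (0, 1]."""
    return -math.log(1.0 - random.random()) / mu_total

def sample_interaction(mu_pe, mu_compton):
    """Pick the interaction type with probability proportional to its
    partial attenuation coefficient (two-channel example)."""
    xi = random.random() * (mu_pe + mu_compton)
    return "photoelectric" if xi < mu_pe else "compton"

# 50 keV photon; the coefficients below are illustrative values [1/cm]
mu_pe, mu_compton = 0.05, 0.18
s = sample_step(mu_pe + mu_compton)
kind = sample_interaction(mu_pe, mu_compton)
print(f"free path {s:.2f} cm, interaction: {kind}")
```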
Chetty, Indrin J; Curran, Bruce; Cygler, Joanna E; DeMarco, John J; Ezzell, Gary; Faddegon, Bruce A; Kawrakow, Iwan; Keall, Paul J; Liu, Helen; Ma, C M Charlie; Rogers, D W O; Seuntjens, Jan; Sheikh-Bagheri, Daryoush; Siebers, Jeffrey V
2007-12-01
The Monte Carlo (MC) method has been shown through many research studies to calculate accurate dose distributions for clinical radiotherapy, particularly in heterogeneous patient tissues where the effects of electron transport cannot be accurately handled with conventional, deterministic dose algorithms. Despite its proven accuracy and the potential for improved dose distributions to influence treatment outcomes, the long calculation times previously associated with MC simulation rendered this method impractical for routine clinical treatment planning. However, the development of faster codes optimized for radiotherapy calculations and improvements in computer processor technology have substantially reduced calculation times to, in some instances, within minutes on a single processor. These advances have motivated several major treatment planning system vendors to embark upon the path of MC techniques. Several commercial vendors have already released or are currently in the process of releasing MC algorithms for photon and/or electron beam treatment planning. Consequently, the accessibility and use of MC treatment planning algorithms may well become widespread in the radiotherapy community. With MC simulation, dose is computed stochastically using first principles; this method is therefore quite different from conventional dose algorithms. Issues such as statistical uncertainties, the use of variance reduction techniques, the ability to account for geometric details in the accelerator treatment head simulation, and other features, are all unique components of a MC treatment planning algorithm. Successful implementation by the clinical physicist of such a system will require an understanding of the basic principles of MC techniques. The purpose of this report, while providing education and review on the use of MC simulation in radiotherapy planning, is to set out, for both users and developers, the salient issues associated with clinical implementation and experimental verification of MC dose algorithms. As the MC method is an emerging technology, this report is not meant to be prescriptive. Rather, it is intended as a preliminary report to review the tenets of the MC method and to provide the framework upon which to build a comprehensive program for commissioning and routine quality assurance of MC-based treatment planning systems.
The McDonaldization of Higher Education.
ERIC Educational Resources Information Center
Hayes, Dennis, Ed.; Wynyard, Robin, Ed.
The essays in this collection discuss the future of the university in the context of the "McDonaldization" of society and of academia. The idea of McDonaldization, a term coined by G. Ritzer (1998), provides a tool for looking at the university and its inevitable changes. The chapters are: (1) "Enchanting McUniversity: Toward a…
Quality Surveillance Project No. 38-- Mk 9 MC-97 and MC-98 Fuze Setting-Torque Test
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sandia Corporation Surveillance Department
1954-08-01
To determine the effects of stockpile storage on the torque required to set the functioning time on the MC-97 (Mechanical T-215 Time Fuze) and the MC-98 (Mechanical T-220 Time Fuze). There were reports that the tool for setting the fuze time was breaking.
Tools for the Conceptual Design and Engineering Analysis of Micro Air Vehicles
2009-03-01
problem with two DC motors with propellers, mounted on each wing tip and oriented such that the thrust vectors had an angular separation of 180°… ElectriCalc or MotoCalc Database • Script Program (MC). In determination of the components to be integrated into MC, the R/C world was explored since the tools… Excel, ProE, QuickWrap and Script. Importing outside applications can be achieved by direct interaction with MC or through analysis server connections [11]
NASA Technical Reports Server (NTRS)
Khambatta, Cyrus F.
2007-01-01
A technique for automated development of scenarios for use in the Multi-Center Traffic Management Advisor (McTMA) software simulations is described. The resulting software is designed and implemented to automate the generation of simulation scenarios with the intent of reducing the time it currently takes using an observational approach. The software program is effective in achieving this goal. The scenarios created for use in the McTMA simulations are based on data files from the McTMA system, and were manually edited before incorporation into the simulations to ensure accuracy. Despite the software's overall favorable performance, several key software issues are identified. Proposed solutions to these issues are discussed. Future enhancements to the scenario generator software may address the limitations identified in this paper.
A virtual source model for Monte Carlo simulation of helical tomotherapy.
Yuan, Jiankui; Rong, Yi; Chen, Quan
2015-01-08
The purpose of this study was to present a Monte Carlo (MC) simulation method based on a virtual source, jaw, and MLC model to calculate the dose in the patient for helical tomotherapy without the need to calculate phase-space files (PSFs). Current studies on tomotherapy MC simulation adopt a full MC model, which includes extensive modeling of the radiation source, primary and secondary jaws, and multileaf collimator (MLC). In the full MC model, PSFs need to be created at different scoring planes to facilitate the patient dose calculations. In the present work, the virtual source model (VSM) we established was based on the gold standard beam data of a tomotherapy unit, which can be exported from the treatment planning station (TPS). The TPS-generated sinograms were extracted from the archived patient XML (eXtensible Markup Language) files. The fluence map for the MC sampling was created by combining the percentage leaf open time (LOT) extracted from the sinogram files with the leaf filter, jaw penumbra, and leaf latency. The VSM was validated for various geometry setups and clinical situations involving heterogeneous media and delivery quality assurance (DQA) cases. An agreement of <1% was obtained between the measured and simulated results for percent depth doses (PDDs) and open beam profiles for all three jaw settings in the VSM commissioning. The accuracy of the VSM leaf filter model was verified by comparing the measured and simulated results for a Picket Fence pattern. An agreement of <2% was achieved between the presented VSM and a published full MC model for heterogeneous phantoms. For complex clinical head and neck (HN) cases, the VSM-based MC simulation of DQA plans agreed with the film measurements, with 98% of planar dose pixels passing the 2%/2 mm gamma criteria. For patient treatment plans, results showed comparable dose-volume histograms (DVHs) for planning target volumes (PTVs) and organs at risk (OARs). Deviations observed in this study were consistent with the literature. The VSM-based MC simulation approach can feasibly be built from the gold standard beam model of a tomotherapy unit. The accuracy of the VSM was validated against measurements in homogeneous media, as well as against a published full MC model in heterogeneous media.
FF12MC: A revised AMBER forcefield and new protein simulation protocol
2016-01-01
Specialized to simulate proteins in molecular dynamics (MD) simulations with explicit solvation, FF12MC is a combination of a new protein simulation protocol employing atomic masses uniformly reduced by tenfold and a revised AMBER forcefield FF99 with (i) shortened C-H bonds, (ii) removal of torsions involving a nonperipheral sp3 atom, and (iii) reduced 1-4 interaction scaling factors of torsions ϕ and ψ. This article reports that in multiple, distinct, independent, unrestricted, unbiased, isobaric-isothermal, classical MD simulations FF12MC can (i) simulate the experimentally observed flipping between left- and right-handed configurations for C14-C38 of BPTI in solution, (ii) autonomously fold chignolin, CLN025, and Trp-cage with folding times that agree with the experimental values, (iii) simulate subsequent unfolding and refolding of these miniproteins, and (iv) achieve a robust Z score of 1.33 for refining protein models TMR01, TMR04, and TMR07. By comparison, the latest general-purpose AMBER forcefield FF14SB locks the C14-C38 bond to the right-handed configuration in solution under the same protein simulation conditions. Statistical survival analysis shows that FF12MC folds chignolin and CLN025 in isobaric-isothermal MD simulations 2-4 times faster than FF14SB under the same protein simulation conditions. These results suggest that FF12MC may be used for protein simulations to study the kinetics and thermodynamics of miniprotein folding, as well as protein structure and dynamics. Proteins 2016; 84:1490-1516. © 2016 The Authors Proteins: Structure, Function, and Bioinformatics Published by Wiley Periodicals, Inc. PMID:27348292
Zhao, Chao; Li, Dawei; Feng, Chuanping; Zhang, Zhenya; Sugiura, Norio; Yang, Yingnan
2015-01-01
A series of advanced WO3-based photocatalysts, including CuO/WO3, Pd/WO3, and Pt/WO3, was synthesized for the photocatalytic removal of microcystin-LR (MC-LR) under simulated solar light. In the present study, Pt/WO3 exhibited the best performance for the photocatalytic degradation of MC-LR. The MC-LR degradation can be described by a pseudo-first-order kinetic model. Chloride ions (Cl−) at an appropriate concentration could enhance the MC-LR degradation. The presence of metal cations (Cu2+ and Fe3+) improved the photocatalytic degradation of MC-LR. This study suggests that Pt/WO3 photocatalytic oxidation under solar light is a promising option for the purification of water containing MC-LR. PMID:25884038
The GENIE Neutrino Monte Carlo Generator: Physics and User Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andreopoulos, Costas; Barry, Christopher; Dytman, Steve
2015-10-20
GENIE is a suite of products for the experimental neutrino physics community. This suite includes i) a modern software framework for implementing neutrino event generators, a state-of-the-art comprehensive physics model and tools to support neutrino interaction simulation for realistic experimental setups (the Generator product), ii) extensive archives of neutrino, charged-lepton and hadron scattering data and software to produce a comprehensive set of data/MC comparisons (the Comparisons product), and iii) a generator tuning framework and fitting applications (the Tuning product). This book provides the definitive guide for the GENIE Generator: it presents the software architecture and a detailed description of its physics model and official tunes. In addition, it provides a rich set of data/MC comparisons that characterise the physics performance of GENIE. Detailed step-by-step instructions on how to install and configure the Generator, run its applications and analyze its outputs are also included.
NASA Technical Reports Server (NTRS)
Sud, Y. C.; Walker, G. K.
1998-01-01
A prognostic cloud scheme named McRAS (Microphysics of clouds with Relaxed Arakawa-Schubert Scheme) was developed with the aim of improving cloud microphysics and cloud-radiation interactions in GCMs. McRAS distinguishes convective, stratiform, and boundary-layer clouds. The convective clouds merge into stratiform clouds on an hourly time-scale, while the boundary-layer clouds do so instantly. The cloud condensate transforms into precipitation following the auto-conversion relations of Sundqvist, which contain a parametric adaptation for the Bergeron-Findeisen process of ice crystal growth and collection of cloud condensate by precipitation. All clouds convect, advect, and diffuse both horizontally and vertically with fully active cloud microphysics throughout their life-cycles, while the optical properties of clouds are derived from the statistical distribution of hydrometeors and idealized cloud geometry. An evaluation of McRAS in a single column model (SCM) with the GATE Phase III data has shown that McRAS can simulate the observed temperature, humidity, and precipitation without discernible systematic errors. An evaluation with the ARM-CART SCM data in a cloud model intercomparison exercise shows a reasonable but not outstandingly accurate simulation. Such a discrepancy is common to almost all models and is related, in part, to the input data quality. McRAS was implemented in the GEOS II GCM. A 50-month integration, initialized with the ECMWF analysis of observations for January 1, 1987 and forced with the observed sea-surface temperatures, sea-ice distribution, and vegetation properties (biomes and soils), with prognostic soil moisture, snow cover, and hydrology, showed a very realistic simulation of cloud processes, in-cloud water and ice, and cloud-radiative forcing (CRF). The simulated ITCZ showed a realistic time-mean structure and seasonal cycle, while the simulated CRF showed sensitivity to the vertical distribution of cloud water, which can be easily altered by the choice of the time constant and in-cloud critical cloud water amount regulators for auto-conversion. The CRF and its feedbacks also have a profound effect on the ITCZ. Even though somewhat weaker than observed, the McRAS-GCM simulation produces robust 30-60 day oscillations in the 200 hPa velocity potential. Two ensembles of 4-summer (July, August, September) simulations, one each for 1987 and 1988, show that the McRAS-GCM simulates realistic and statistically significant precipitation differences over India, Central America, and tropical Africa. Several seasonal simulations were performed with the McRAS-GEOS II GCM for the summer (June-July-August) and winter (December-January-February) periods to determine how the simulated clouds and CRFs would be affected by: (i) advection of clouds; (ii) cloud-top entrainment instability; (iii) cloud water inhomogeneity correction; and (iv) cloud production and dissipation in different cloud processes. The results show that each of these processes contributes to the simulated cloud fraction and CRF.
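The Sundqvist-type auto-conversion mentioned above has a compact standard form in which the conversion rate ramps up once the condensate mixing ratio exceeds a critical in-cloud value. A minimal sketch of that relation (the constants are illustrative assumptions, not McRAS's tuned values, and the Bergeron-Findeisen and collection enhancements are omitted):

```python
import numpy as np

def sundqvist_autoconversion(m, m_crit=5e-4, c0=1e-4):
    """Sundqvist-type auto-conversion of cloud condensate to precipitation:
    P = c0 * m * (1 - exp(-(m / m_crit)**2)).
    m: condensate mixing ratio [kg/kg]; m_crit: critical in-cloud condensate
    regulator [kg/kg]; c0: inverse time constant [1/s]."""
    return c0 * m * (1.0 - np.exp(-(m / m_crit) ** 2))

m = np.array([1e-4, 5e-4, 2e-3])
print(sundqvist_autoconversion(m))  # rate rises steeply once m exceeds m_crit
```

The abstract's remark that the CRF can be "easily altered by the choice of time constant and in-cloud critical cloud water amount regulators" corresponds to the c0 and m_crit parameters in this form.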
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abouelnasr, MKF; Smit, B
2012-01-01
The self- and collective-diffusion behaviors of adsorbed methane, helium, and isobutane in zeolite frameworks LTA, MFI, AFI, and SAS were examined at various concentrations using a range of molecular simulation techniques including Molecular Dynamics (MD), Monte Carlo (MC), Bennett-Chandler (BC), and kinetic Monte Carlo (kMC). This paper has three main results. (1) A novel model for the process of adsorbate movement between two large cages was created, allowing the formulation of a mixing rule for the re-crossing coefficient between two cages of unequal loading. The predictions from this mixing rule were found to agree quantitatively with explicit simulations. (2) A new approach to the dynamically corrected Transition State Theory method to analytically calculate self-diffusion properties was developed, explicitly accounting for nanoscale fluctuations in concentration. This approach was demonstrated to quantitatively agree with previous methods, but is uniquely suited to be adapted to a kMC simulation that can simulate the collective-diffusion behavior. (3) While at low and moderate loadings the self- and collective-diffusion behaviors in LTA are observed to coincide, at higher concentrations they diverge. A change in the adsorbate packing scheme was shown to cause this divergence, a trait which is replicated in a kMC simulation that explicitly models this behavior. These phenomena were further investigated for isobutane in zeolite MFI, where MD results showed a separation in self- and collective-diffusion behavior that was reproduced with kMC simulations.
Canopy polarized BRDF simulation based on non-stationary Monte Carlo 3-D vector RT modeling
NASA Astrophysics Data System (ADS)
Kallel, Abdelaziz; Gastellu-Etchegorry, Jean Philippe
2017-03-01
Vector radiative transfer (VRT) has been widely used to simulate the polarized reflectance of the atmosphere and ocean. However, it has not yet been properly applied to describe the polarized reflectance of vegetation cover. In this study, we propose a 3-D VRT model based on a modified Monte Carlo (MC) forward ray-tracing simulation to analyze vegetation canopy reflectance. Two kinds of leaf scattering are taken into account: (i) Lambertian diffuse reflectance and transmittance and (ii) specular reflection. A new method to estimate the condition on leaf orientation that produces reflection is proposed, and its probability of occurrence, P_l,max, is computed. It is then shown that P_l,max is low, but when reflection happens, the corresponding radiance Stokes vector, I_o, is very high. Such a phenomenon dramatically increases the MC variance and yields an irregular reflectance distribution function. For better regularization, we propose a non-stationary MC approach that simulates reflection for each sunlit leaf, assuming that its orientation is randomly chosen according to its angular distribution. It is shown in this case that the average canopy reflection is proportional to P_l,max · I_o, which produces a smooth distribution. Two experiments are conducted: (i) assuming leaf light polarization is due only to Fresnel reflection and (ii) the general polarization case. In the former experiment, our results confirm that the canopy polarizes light horizontally in the forward direction. In addition, they show that diagonal polarization can be observed in inclined forward directions. In the latter experiment, polarization is produced in all orientations. It is particularly pointed out that specular polarization explains only a part of the forward polarization. Diffuse scattering polarizes light horizontally and vertically in the forward and backward directions, respectively. A weak circular polarization signal is also observed near the backscattering direction. Finally, validation of the non-polarized reflectance using the ROMC tool is performed, and our model shows good agreement with the ROMC reference.
Comparison of Fluka-2006 Monte Carlo Simulation and Flight Data for the ATIC Detector
NASA Technical Reports Server (NTRS)
Gunasingha, R.M.; Fazely, A.R.; Adams, J.H.; Ahn, H.S.; Bashindzhagyan, G.L.; Chang, J.; Christl, M.; Ganel, O.; Guzik, T.G.; Isbert, J.;
2007-01-01
We have performed a detailed Monte Carlo (MC) simulation for the Advanced Thin Ionization Calorimeter (ATIC) detector using the MC code FLUKA-2006, which is capable of simulating particles up to 10 PeV. The ATIC detector has completed two successful balloon flights from McMurdo, Antarctica, lasting a total of more than 35 days. ATIC is designed as a multiple, long-duration balloon flight investigation of the cosmic ray spectra from below 50 GeV to near 100 TeV total energy, using a fully active Bismuth Germanate (BGO) calorimeter. It is equipped with a large mosaic of silicon detector pixels capable of charge identification, and, for particle tracking, three projective layers of x-y scintillator hodoscopes, located above, in the middle of, and below a 0.75 nuclear interaction length graphite target. Our simulations are part of an analysis package of both nuclear (A) and energy dependences for different nuclei interacting in the ATIC detector. The MC simulates the response of different components of the detector, such as the Si-matrix, the scintillator hodoscopes, and the BGO calorimeter, to various nuclei. We present comparisons of the FLUKA-2006 MC calculations with GEANT calculations and with the ATIC CERN data and ATIC flight data.
Song, Sangha; Elgezua, Inko; Kobayashi, Yo; Fujie, Masakatsu G
2013-01-01
In biomedicine, Monte Carlo (MC) simulation is commonly used to model light diffusion in tissue. However, most previous studies did not consider a radial-beam LED as the light source. Therefore, we considered the characteristics of a radial-beam LED and applied them in MC simulation as the light source. In this paper, we consider three characteristics of a radial-beam LED. The first is the initial launch area of the photons. The second is the incident angle of a photon at the initial photon-launching area. The third is the refraction effect according to the contact area between the LED and a turbid medium. For verification of the MC simulation, we compared simulation and experimental results. The average correlation coefficient between simulation and experimental results is 0.9954. Through this study, we show an effective method to simulate light diffusion in tissue with the characteristics of a radial-beam LED based on MC simulation.
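The three LED characteristics map onto three sampling steps per launched photon: a position drawn uniformly over the emitting disk, an incident angle drawn from the LED's angular emission profile, and refraction at the LED-medium contact. A minimal sketch assuming a Lambertian emission profile (the source radius and refractive indices are illustrative assumptions, not the paper's values):

```python
import math, random

def launch_photon(r_led=0.25, n_led_side=1.0, n_tissue=1.4):
    """Sample one photon launch for a radial-beam LED source.
    r_led: radius of the emitting area [cm]."""
    # 1) uniform position on the circular emitting area
    rho = r_led * math.sqrt(random.random())
    phi = 2.0 * math.pi * random.random()
    x, y = rho * math.cos(phi), rho * math.sin(phi)
    # 2) Lambertian incident angle: pdf ~ cos(theta)*sin(theta),
    #    inverted CDF gives theta = asin(sqrt(xi))
    theta_i = math.asin(math.sqrt(random.random()))
    # 3) refraction into the turbid medium via Snell's law
    theta_t = math.asin(min(1.0, n_led_side * math.sin(theta_i) / n_tissue))
    return (x, y), theta_i, theta_t

pos, ti, tt = launch_photon()
print(pos, math.degrees(ti), math.degrees(tt))
```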
Metal-Insulator-Metal Diode Process Development for Energy Harvesting Applications
2010-04-01
Sputter Tool Dep Method: Sputtering (DC Magnetron). Recipe: MC_Pt 1640A_TiO2 1000A_Ti 2000A_500C_1a; MC_Pt 1640A_TiO2 1000A_Ti 2000A_300C_1a; MC_Pt… thin films were sputtered onto silicon substrates with silicon dioxide overlayers. I-V measurements were taken using an electrical characterization… deposition of the entire MIM material stack to be done without breaking the vacuum within a multi-material-system DC sputtering tool. A CAD layout of a MIM
ALPHACAL: A new user-friendly tool for the calibration of alpha-particle sources.
Timón, A Fernández; Vargas, M Jurado; Gallardo, P Álvarez; Sánchez-Oro, J; Peralta, L
2018-05-01
In this work, we present and describe the program ALPHACAL, specifically developed for the calibration of alpha-particle sources. It is therefore more user-friendly and less time-consuming than multipurpose codes developed for a wide range of applications. The program is based on the recently developed code AlfaMC, which specifically simulates the transport of alpha particles. Both cylindrical and point sources mounted on the surface of polished backings can be simulated, as is the convention in experimental measurements of alpha-particle sources. In addition to the efficiency calculation and the determination of the backscattering coefficient, some additional tools are available to the user, such as visualization of the energy spectrum, use of an energy cut-off, or low-energy tail corrections. ALPHACAL has been implemented in C++ using the Qt library, so it is available for Windows, MacOS and Linux platforms. It is free and can be provided upon request to the authors. Copyright © 2018 Elsevier Ltd. All rights reserved.
Characterization of protein-folding pathways by reduced-space modeling.
Kmiecik, Sebastian; Kolinski, Andrzej
2007-07-24
Ab initio simulations of folding pathways are currently limited to very small proteins. For larger proteins, some approximations or simplifications in the protein models need to be introduced. Protein folding and unfolding are among the basic processes in the cell and are very difficult to characterize in detail by experiment or simulation. Chymotrypsin inhibitor 2 (CI2) and barnase are probably the best experimentally characterized proteins in this respect. For these model systems, initial folding stages were simulated by using CA-CB-side chain (CABS), a reduced-space protein-modeling tool. CABS employs knowledge-based potentials that have proved very successful in protein structure prediction. With the use of isothermal Monte Carlo (MC) dynamics, initiation sites with a residual structure and weak tertiary interactions were identified. Such structures are essential for the initiation of the folding process through a sequential reduction of the protein conformational space, overcoming the Levinthal paradox in this manner. Furthermore, nucleation sites that initiate a network of tertiary interactions were located. The MC simulations correspond perfectly to the results of experimental and theoretical research and bring insights into the CI2 folding mechanism: an unambiguous sequence of folding events was reported, as well as cooperative substructures compatible with those obtained in recent molecular dynamics unfolding studies. The correspondence between simulation and experiment shows that knowledge-based potentials are not only useful in protein structure prediction but are also capable of reproducing folding pathways. Thus, the results of this work significantly extend the applicability range of reduced models in the theoretical study of proteins.
Cho, Nathan; Tsiamas, Panagiotis; Velarde, Esteban; Tryggestad, Erik; Jacques, Robert; Berbeco, Ross; McNutt, Todd; Kazanzides, Peter; Wong, John
2018-05-01
The Small Animal Radiation Research Platform (SARRP) has been developed for conformal microirradiation with on-board cone beam CT (CBCT) guidance. The graphics processing unit (GPU)-accelerated Superposition-Convolution (SC) method for dose computation has been integrated into the treatment planning system (TPS) for the SARRP. This paper describes the validation of the SC method for kilovoltage energies by comparison with EBT2 film measurements and Monte Carlo (MC) simulations. MC data were simulated with the EGSnrc code using 3 × 10^8 to 1.5 × 10^9 histories, while 21 photon energy bins were used to model the 220 kVp x-rays in the SC method. Various types of phantoms including plastic water, cork, graphite, and aluminum were used to encompass the range of densities of mouse organs. For the comparison, percentage depth doses (PDDs) from SC, MC, and film measurements were analyzed. Cross-beam (x,y) dosimetric profiles of SC and film measurements are also presented. Correction factors (CF_z) to convert SC to MC dose-to-medium are derived from the SC and MC simulations in homogeneous phantoms of aluminum and graphite to improve the estimation. The SC method produces dose values that are within 5% of film measurements and MC simulations in the flat regions of the profile. The dose is less accurate at the edges, due to factors such as geometric uncertainties of film placement and differences in dose calculation grids. The GPU-accelerated Superposition-Convolution dose computation method was successfully validated with EBT2 film measurements and MC calculations. The SC method offers much faster computation than MC and provides calculations of both dose-to-water in medium and dose-to-medium in medium. © 2018 American Association of Physicists in Medicine.
NASA Astrophysics Data System (ADS)
Aburas, Maher Milad; Ho, Yuek Ming; Ramli, Mohammad Firuz; Ash'aari, Zulfa Hanan
2017-07-01
The creation of an accurate simulation of future urban growth is considered one of the most important challenges in urban studies that involve spatial modeling. The purpose of this study is to improve the simulation capability of an integrated CA-Markov Chain (CA-MC) model using CA-MC based on the Analytical Hierarchy Process (AHP) and CA-MC based on Frequency Ratio (FR), both applied in Seremban, Malaysia, as well as to compare the performance and accuracy of the traditional and hybrid models. Various physical, socio-economic, utilities, and environmental criteria were used as predictors, including elevation, slope, soil texture, population density, distance to commercial area, distance to educational area, distance to residential area, distance to industrial area, distance to roads, distance to highway, distance to railway, distance to power line, distance to stream, and land cover. For calibration, the three models were applied to simulate urban growth trends in 2010; the actual data of 2010 were used for model validation utilizing the Relative Operating Characteristic (ROC) and Kappa coefficient methods. Consequently, future urban growth maps for 2020 and 2030 were created. The validation findings confirm that integrating the CA-MC model with the FR model and employing the significant driving forces of urban growth in the simulation process has improved the simulation capability of the CA-MC model. This study has provided a novel approach for improving the CA-MC model based on FR, which will provide powerful support to planners and decision-makers in the development of future sustainable urban planning.
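The Frequency Ratio itself is a simple ratio of shares: for each class of a driving factor, the proportion of urban cells falling in that class divided by the proportion of all cells in that class, with FR > 1 flagging growth-prone classes. A minimal sketch on a toy raster (the data are invented for illustration):

```python
import numpy as np

def frequency_ratio(factor_class, urban):
    """FR per class: (share of urban cells in the class) /
    (share of all cells in the class).
    factor_class: integer class id per cell; urban: boolean per cell."""
    fr = {}
    n_cells, n_urban = factor_class.size, urban.sum()
    for c in np.unique(factor_class):
        in_class = factor_class == c
        pct_urban = urban[in_class].sum() / n_urban
        pct_class = in_class.sum() / n_cells
        fr[int(c)] = float(pct_urban / pct_class)
    return fr

# toy raster: 3 distance-to-road classes; urban cells cluster in class 1
classes = np.array([1, 1, 1, 2, 2, 3, 3, 3, 3, 3])
urban = np.array([1, 1, 0, 1, 0, 0, 0, 0, 0, 0], dtype=bool)
print(frequency_ratio(classes, urban))  # class 1 gets FR > 1
```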
Atomistic Monte Carlo Simulation of Lipid Membranes
Wüstner, Daniel; Sklenar, Heinz
2014-01-01
Biological membranes are complex assemblies of many different molecules, whose analysis demands a variety of experimental and computational approaches. In this article, we explain the challenges and advantages of atomistic Monte Carlo (MC) simulation of lipid membranes. We provide an introduction to the various move sets that are implemented in current MC methods for efficient conformational sampling of lipids and other molecules. In the second part, we demonstrate, for a concrete example, how an atomistic local-move set can be implemented for MC simulations of phospholipid monomers and bilayer patches. We use our recently devised chain breakage/closure (CBC) local move set in the bond-/torsion-angle space with the constant-bond-length approximation (CBLA) for the phospholipid dipalmitoylphosphatidylcholine (DPPC). We demonstrate rapid conformational equilibration for a single DPPC molecule, as assessed by calculation of molecular energies and entropies. We also show the transition from a crystalline-like to a fluid DPPC bilayer by the CBC local-move MC method, as indicated by the electron density profile, head group orientation, area per lipid, and whole-lipid displacements. We discuss the potential of local-move MC methods in combination with molecular dynamics simulations, for example, for studying multi-component lipid membranes containing cholesterol. PMID:24469314
Application of the MCNPX-McStas interface for shielding calculations and guide design at ESS
NASA Astrophysics Data System (ADS)
Klinkby, E. B.; Knudsen, E. B.; Willendrup, P. K.; Lauritzen, B.; Nonbøl, E.; Bentley, P.; Filges, U.
2014-07-01
Recently, an interface between the Monte Carlo code MCNPX and the neutron ray-tracing code McStas was developed [1, 2]. Based on the expected neutronic performance and guide geometries relevant for the ESS, the combined MCNPX-McStas code is used to calculate dose rates along neutron beam guides. The generation and moderation of neutrons is simulated using a full-scale MCNPX model of the ESS target monolith. Upon entering the neutron beam extraction region, the individual neutron states are handed to McStas via the MCNPX-McStas interface. McStas transports the neutrons through the beam guide, and by using newly developed event-logging capability, the neutron state parameters corresponding to un-reflected neutrons are recorded at each scattering. This information is handed back to MCNPX, where it serves as the neutron source input for a second MCNPX simulation. This simulation enables calculation of dose rates in the vicinity of the guide. In addition, the logging mechanism is employed to record the scatterings along the guides, which is exploited to estimate the supermirror quality requirements (i.e., m-values) needed at different positions along the beam guide to transport neutrons in the same guide/source setup.
Lu, Zeqin; Jhoja, Jaspreet; Klein, Jackson; Wang, Xu; Liu, Amy; Flueckiger, Jonas; Pond, James; Chrostowski, Lukas
2017-05-01
This work develops an enhanced Monte Carlo (MC) simulation methodology to predict the impacts of layout-dependent correlated manufacturing variations on the performance of photonic integrated circuits (PICs). First, to enable such performance prediction, we demonstrate a simple method with sub-nanometer accuracy to characterize photonics manufacturing variations, where the width and height of a fabricated waveguide can be extracted from the spectral response of a racetrack resonator. By measuring the spectral responses of a large number of identical resonators spread over a wafer, statistical results for the variations of waveguide width and height can be obtained. Second, we develop models for the layout-dependent enhanced MC simulation. Our models use netlist extraction to transfer physical layouts into circuit simulators. Spatially correlated physical variations across the PICs are simulated on a discrete grid and are mapped to each circuit component, so that the performance of each component can be updated according to its obtained variations; circuit simulations therefore take the correlated variations between components into account. The simulation flow and theoretical models for our layout-dependent enhanced MC simulation are detailed in this paper. As examples, several ring-resonator filter circuits are studied using the developed enhanced MC simulation, and statistical results from the simulations can predict both common-mode and differential-mode variations of the circuit performance.
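One way to realize the "spatially correlated variations simulated on a discrete grid" step is to smooth white noise with a Gaussian kernel and then sample the resulting field at each component's layout position. A minimal sketch under that assumption (the grid size, correlation length, standard deviation, and ring positions are hypothetical; this is not the authors' implementation):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def correlated_variation_map(shape, sigma_nm, corr_px, seed=0):
    """Spatially correlated waveguide-width variation on a discrete wafer
    grid: smooth white noise with a Gaussian kernel, then rescale so the
    field has the measured standard deviation. corr_px sets the correlation
    length in grid cells."""
    rng = np.random.default_rng(seed)
    field = gaussian_filter(rng.normal(size=shape), sigma=corr_px)
    field *= sigma_nm / field.std()   # restore target sigma after smoothing
    return field

# e.g. width variations with 5 nm std-dev; nearby components get similar
# offsets, which is what couples their performance in the circuit simulation
dw = correlated_variation_map((200, 200), sigma_nm=5.0, corr_px=20)
ring_a, ring_b = dw[50, 60], dw[52, 63]
print(ring_a, ring_b)
```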
OpenMC In Situ Source Convergence Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aldrich, Garrett Allen; Dutta, Soumya; Woodring, Jonathan Lee
2016-05-07
We designed and implemented an in situ version of particle source convergence detection for the OpenMC particle transport simulator. OpenMC is a Monte Carlo-based particle simulator for neutron criticality calculations. For the transport simulation to be accurate, source particles must converge on a spatial distribution. Typically, convergence is obtained by iterating the simulation for a user-settable, fixed number of steps, and it is assumed that convergence is achieved. We instead implement a method to detect convergence, using the stochastic oscillator for identifying convergence of source particles based on their accumulated Shannon entropy. Using our in situ convergence detection, we are able to detect and begin tallying results for the full simulation once the proper source distribution has been confirmed. Our method ensures that the simulation is not started too early, through a user setting overly optimistic parameters, or too late, through overly conservative ones.
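Shannon entropy of the binned source distribution is the standard diagnostic here: H = -Σ p_i log2(p_i) stabilizes once the spatial source distribution stops evolving. A minimal sketch with a naive plateau detector standing in for the paper's stochastic-oscillator criterion (the window, tolerance, and stand-in tallies are assumptions, and this does not use the OpenMC API):

```python
import numpy as np

def shannon_entropy(counts):
    """Shannon entropy H = -sum p_i * log2(p_i) over a mesh of
    source-site counts."""
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def looks_converged(history, window=10, tol=0.02):
    """Naive plateau test: converged when the last `window` batch entropies
    stay within `tol` of their mean."""
    if len(history) < window:
        return False
    tail = np.asarray(history[-window:])
    return float(np.abs(tail - tail.mean()).max()) < tol

entropies = []
rng = np.random.default_rng(1)
for batch in range(50):
    counts = rng.poisson(100, size=64) + batch  # stand-in source tallies
    entropies.append(shannon_entropy(counts))
    if looks_converged(entropies):
        print(f"source converged at batch {batch}; start tallying")
        break
```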
Calculated X-ray Intensities Using Monte Carlo Algorithms: A Comparison to Experimental EPMA Data
NASA Technical Reports Server (NTRS)
Carpenter, P. K.
2005-01-01
Monte Carlo (MC) modeling has been used extensively to simulate electron scattering and x-ray emission from complex geometries. Presented here are comparisons between MC results and experimental electron-probe microanalysis (EPMA) measurements, as well as φ(ρz) correction algorithms. Experimental EPMA measurements made on NIST SRM 481 (AgAu) and 482 (CuAu) alloys, at a range of accelerating potentials and instrument take-off angles, represent a formal microanalysis data set that has been widely used to develop φ(ρz) correction algorithms. X-ray intensity data produced by MC simulations represent an independent test of both experimental and φ(ρz) correction algorithms. The α-factor method has previously been used to evaluate systematic errors in the analysis of semiconductors and silicate minerals, and is used here to compare the accuracy of experimental and MC-calculated x-ray data. X-ray intensities calculated by MC are used to generate α-factors using the certified compositions in the CuAu binary relative to pure Cu and Au standards. MC simulations are obtained using the NIST, WinCasino, and WinXray algorithms; the derived x-ray intensities have a built-in atomic number correction, and are further corrected for absorption and characteristic fluorescence using the PAP φ(ρz) correction algorithm. The Penelope code additionally simulates both characteristic and continuum x-ray fluorescence and thus requires no further correction for use in calculating α-factors.
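For a binary system, the α-factor follows from the Ziebold-Ogilvie relation, which maps a weight fraction C and a measured (or MC-calculated) k-ratio K to a single factor that should stay nearly constant across compositions if the correction is well behaved. A minimal sketch (the C, K pairs below are invented for illustration, not the SRM 481/482 certificate values):

```python
def alpha_factor(C, K):
    """Ziebold-Ogilvie alpha-factor for a binary:
    alpha = [C * (1 - K)] / [K * (1 - C)],
    where C is the weight fraction and K the k-ratio relative to the
    pure-element standard."""
    return (C * (1.0 - K)) / (K * (1.0 - C))

# hypothetical Cu-Au data; a near-constant alpha indicates consistency
for C, K in [(0.2, 0.17), (0.4, 0.35), (0.6, 0.55)]:
    print(f"C={C:.2f}  K={K:.2f}  alpha={alpha_factor(C, K):.3f}")
```

Systematic differences between α-factors derived from measured and MC-calculated intensities then expose errors in either the experiment or the correction algorithm.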
NASA Astrophysics Data System (ADS)
Cai, Han-Jie; Zhang, Zhi-Lei; Fu, Fen; Li, Jian-Yang; Zhang, Xun-Chao; Zhang, Ya-Ling; Yan, Xue-Song; Lin, Ping; Xv, Jian-Ya; Yang, Lei
2018-02-01
The dense granular-flow spallation target is a new target concept chosen for the Accelerator-Driven Subcritical (ADS) project in China. For the R&D of this target concept, a dedicated Monte Carlo (MC) program named GMT was developed to perform simulation studies of the beam-target interaction. Owing to the complexity of the target geometry, the MC simulation of particle tracks is computationally expensive. Thus, improving computational efficiency is essential for detailed MC simulation studies of the dense granular target. Here we present the special design of the GMT program and its high-efficiency performance. In addition, the speedup potential of the GPU-accelerated spallation models is discussed.
NASA Astrophysics Data System (ADS)
Keller, Tobias; Katz, Richard F.
2015-04-01
Laboratory experiments indicate that even small concentrations of volatiles (H2O or CO2) in the upper mantle significantly affect the silicate melting behavior [HK96, DH06]. The presence of volatiles stabilizes volatile-rich melt at high pressure, thus vastly increasing the volume of the upper mantle expected to be partially molten [H10, DH10]. These small-degree melts have important consequences for chemical differentiation and could affect the dynamics of mantle flow. We have developed theory and a numerical implementation to simulate thermo-chemically coupled magma/mantle dynamics in terms of a two-phase (rock+melt), three-component (dunite+MORB+volatilized MORB) physical model. The fluid dynamics is based on McKenzie's equations [McK84], while the thermo-chemical formulation of the system is represented by a novel disequilibrium multi-component melting model based on thermodynamic theory [RBS11]. This physical model is implemented as a parallel, two-dimensional, finite-volume code that leverages tools from the PETSc toolkit. Application of this simulation code to a mid-ocean ridge system suggests that the methodology captures the leading-order features of both hydrated and carbonated mantle melting, including deep, low-degree, volatile-rich melt formation. Melt segregation leads to continuous dynamic thermo-chemical disequilibration, while phenomenological reaction rates are applied to continually move the system towards re-equilibration. The simulations will be used first to characterize volatile extraction from the MOR system assuming a chemically homogeneous mantle. Subsequently, simulations will be extended to investigate the consequences of heterogeneity in lithology [KW12] and volatile content. These studies will advance our understanding of the role of volatiles in the dynamic and chemical evolution of the upper mantle. Moreover, they will help to gauge the significance of the coupling between the deep carbon cycle and the ocean/atmosphere system. References: HK96: Hirth & Kohlstedt (1996), Earth Planet. Sci. Lett.; DH06: Dasgupta & Hirschmann (2006), doi:10.1038/nature04612; H10: Hirschmann (2010), doi:10.1016/j.pepi.2009.12.003; DH10: Dasgupta & Hirschmann (2010), doi:10.1016/j.epsl.2010.06.039; McK84: McKenzie (1984), J. Pet.; KW12: Katz & Weatherley (2012), doi:10.1016/j.epsl.2012.04.042; RBS11: Rudge, Bercovici & Spiegelman (2011), doi:10.1111/j.1365-246X.2010.04870.x.
NASA Astrophysics Data System (ADS)
Patwari, Puneet; Choudhury, Subhrojyoti R.; Banerjee, Amar; Swaminathan, N.; Pandey, Shreya
2016-07-01
Model Driven Engineering (MDE) as a key driver to reduce development cost of M&C systems is beginning to find acceptance across scientific instruments such as Radio Telescopes and Nuclear Reactors. Such projects are adopting it to reduce time to integrate, test and simulate their individual controllers and increase reusability and traceability in the process. The creation and maintenance of models is still a significant challenge to realizing MDE benefits. Creating domain-specific modelling environments reduces the barriers, and we have been working along these lines, creating a domain-specific language and environment based on an M&C knowledge model. However, large projects involve several such domains, and there is still a need to interconnect the domain models, in order to ensure modelling completeness. This paper presents a knowledge-centric approach to doing that, by creating a generic system model that underlies the individual domain knowledge models. We present our vision for M&C Domain Map Maker, a set of processes and tools that enables explication of domain knowledge in terms of domain models with mutual consistency relationships to aid MDE.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bazalova-Carter, Magdalena; Liu, Michael; Palma, Bianey
2015-04-15
Purpose: To measure radiation dose in a water-equivalent medium from very high-energy electron (VHEE) beams and make comparisons to Monte Carlo (MC) simulation results. Methods: Dose in a polystyrene phantom delivered by an experimental VHEE beam line was measured with Gafchromic films for three 50 MeV and two 70 MeV Gaussian beams of 4.0-6.9 mm FWHM and compared to corresponding MC-simulated dose distributions. MC dose in the polystyrene phantom was calculated with the EGSnrc/BEAMnrc and DOSXYZnrc codes based on the experimental setup. Additionally, the effect of a 2% beam energy measurement uncertainty and possible non-zero beam angular spread on MC dose distributions was evaluated. Results: MC-simulated percentage depth dose (PDD) curves agreed with measurements within 4% for all beam sizes at both 50 and 70 MeV. Central axis PDD at 8 cm depth ranged from 14% to 19% for the 5.4-6.9 mm 50 MeV beams and from 14% to 18% for the 4.0-4.5 mm 70 MeV beams. MC-simulated relative beam profiles of regularly shaped Gaussian beams evaluated at depths of 0.64 to 7.46 cm agreed with measurements to within 5%. A 2% beam energy uncertainty and a 0.286° beam angular spread corresponded to a maximum 3.0% and 3.8% difference in depth dose curves of the 50 and 70 MeV electron beams, respectively. Absolute dose differences between MC simulations and film measurements of regularly shaped Gaussian beams were between 10% and 42%. Conclusions: The authors demonstrate that relative dose distributions for VHEE beams of 50-70 MeV can be measured with Gafchromic films and modeled with Monte Carlo simulations to an accuracy of 5%. The reported absolute dose differences, likely caused by imperfect beam steering and subsequent charge loss, reveal the importance of accurate VHEE beam control and diagnostics.
Kim, Sangroh; Yoshizumi, Terry; Toncheva, Greta; Yoo, Sua; Yin, Fang-Fang; Frush, Donald
2010-05-01
To address the lack of an accurate dose estimation method in cone beam computed tomography (CBCT), we performed point dose metal oxide semiconductor field-effect transistor (MOSFET) measurements and Monte Carlo (MC) simulations. A Varian On-Board Imager (OBI) was employed to measure point doses in polymethyl methacrylate (PMMA) CT phantoms with MOSFETs for the standard and low dose modes. A MC model of the OBI x-ray tube was developed using the BEAMnrc/EGSnrc MC system and validated by the half value layer, the x-ray spectrum, and lateral and depth dose profiles. We compared the weighted computed tomography dose index (CTDIw) between the MOSFET measurements and the MC simulations. From the MOSFET measurements, the CTDIw was found to be 8.39 cGy for the head scan and 4.58 cGy for the body scan in standard dose mode, and 1.89 cGy for the head and 1.11 cGy for the body in low dose mode. The CTDIw from MC agreed with the MOSFET measurements to within 5%. In conclusion, a MC model for Varian CBCT has been established, and this approach may be easily extended from the CBCT geometry to multi-detector CT geometry.
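The weighted index itself is a fixed combination of the center and periphery point doses in the CTDI phantom, CTDIw = (1/3)·CTDI_center + (2/3)·CTDI_periphery. A minimal sketch of that arithmetic (the input doses are illustrative assumptions, chosen only to land near the reported head-scan value):

```python
def ctdi_w(ctdi_center, ctdi_periphery):
    """Weighted CT dose index from point doses at the center and periphery
    of a CTDI phantom: CTDIw = (1/3)*center + (2/3)*periphery."""
    return ctdi_center / 3.0 + 2.0 * ctdi_periphery / 3.0

# hypothetical head-scan point doses [cGy]
print(f"CTDIw = {ctdi_w(8.0, 8.6):.2f} cGy")  # ~8.40 cGy
```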
Ojala, J; Hyödynmaa, S; Barańczyk, R; Góra, E; Waligórski, M P R
2014-03-01
Electron radiotherapy is applied to treat the chest wall close to the mediastinum. The performance of the GGPB and eMC algorithms implemented in the Varian Eclipse treatment planning system (TPS) was studied in this region for 9 and 16 MeV beams, against Monte Carlo (MC) simulations, point dosimetry in a water phantom and dose distributions calculated in virtual phantoms. For the 16 MeV beam, the accuracy of these algorithms was also compared over the lung-mediastinum interface region of an anthropomorphic phantom, against MC calculations and thermoluminescence dosimetry (TLD). In the phantom with a lung-equivalent slab the results were generally congruent, the eMC results for the 9 MeV beam slightly overestimating the lung dose, and the GGPB results for the 16 MeV beam underestimating the lung dose. Over the lung-mediastinum interface, for 9 and 16 MeV beams, the GGPB code underestimated the lung dose and overestimated the dose in water close to the lung, compared to the congruent eMC and MC results. In the anthropomorphic phantom, results of TLD measurements and MC and eMC calculations agreed, while the GGPB code underestimated the lung dose. Good agreement between TLD measurements and MC calculations attests to the accuracy of "full" MC simulations as a reference for benchmarking TPS codes. Application of the GGPB code in chest wall radiotherapy may result in significant underestimation of the lung dose and overestimation of dose to the mediastinum, affecting plan optimization over volumes close to the lung-mediastinum interface, such as the lung or heart. Copyright © 2013 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Santander, Julian E; Tsapatsis, Michael; Auerbach, Scott M
2013-04-16
We have constructed and applied an algorithm to simulate the behavior of zeolite frameworks during liquid adsorption. We applied this approach to compute the adsorption isotherms of furfural-water and hydroxymethyl furfural (HMF)-water mixtures adsorbing in silicalite zeolite at 300 K for comparison with experimental data. We modeled these adsorption processes under two different statistical mechanical ensembles: the grand canonical (V-N_z-μ_g-T, or GC) ensemble, keeping the volume fixed, and the P-N_z-μ_g-T (osmotic) ensemble, allowing the volume to fluctuate. To optimize accuracy and efficiency, we compared pure Monte Carlo (MC) sampling to hybrid MC-molecular dynamics (MD) simulations. For the external furfural-water and HMF-water phases, we assumed the ideal solution approximation and employed a combination of tabulated data and extended ensemble simulations for computing solvation free energies. We found that MC sampling in the V-N_z-μ_g-T ensemble (i.e., standard GCMC) does a poor job of reproducing both the Henry's law regime and the saturation loadings of these systems. Hybrid MC-MD sampling of the V-N_z-μ_g-T ensemble, which includes framework vibrations at fixed total volume, provides better results in the Henry's law region, but this approach still does not reproduce experimental saturation loadings. Pure MC sampling of the osmotic ensemble was found to approach experimental saturation loadings more closely, whereas hybrid MC-MD sampling of the osmotic ensemble quantitatively reproduces such loadings, because the MC-MD approach naturally allows for locally anisotropic volume changes wherein some pores expand whereas others contract.
Simulating x-ray telescopes with McXtrace: a case study of ATHENA's optics
NASA Astrophysics Data System (ADS)
Ferreira, Desiree D. M.; Knudsen, Erik B.; Westergaard, Niels J.; Christensen, Finn E.; Massahi, Sonny; Shortt, Brian; Spiga, Daniele; Solstad, Mathias; Lefmann, Kim
2016-07-01
We use the X-ray ray-tracing package McXtrace to simulate the performance of X-ray telescopes based on Silicon Pore Optics (SPO) technologies. We use as reference the design of the optics of the planned X-ray mission Advanced Telescope for High ENergy Astrophysics (ATHENA), which is designed as a single X-ray telescope populated with stacked SPO substrates forming mirror modules to focus X-ray photons. We show that it is possible to simulate the SPO pores in detail and qualify the use of McXtrace for in-depth analysis of in-orbit performance and laboratory X-ray test results.
NASA Astrophysics Data System (ADS)
Yu, Leiming; Nina-Paravecino, Fanny; Kaeli, David; Fang, Qianqian
2018-01-01
We present a highly scalable Monte Carlo (MC) three-dimensional photon transport simulation platform designed for heterogeneous computing systems. Through the development of a massively parallel MC algorithm using the Open Computing Language framework, this research extends our existing graphics processing unit (GPU)-accelerated MC technique to a highly scalable vendor-independent heterogeneous computing environment, achieving significantly improved performance and software portability. A number of parallel computing techniques are investigated to achieve portable performance over a wide range of computing hardware. Furthermore, multiple thread-level and device-level load-balancing strategies are developed to obtain efficient simulations using multiple central processing units and GPUs.
Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure.
Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei
2011-09-07
Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented here to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results are aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by the single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed-up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed.
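A minimal local sketch of the master/worker pattern described above, with Python's multiprocessing module standing in for the virtual cluster and its message passing layer; `run_batch` is a hypothetical placeholder for an EGS5 run, and the tally it returns is a toy value.

```python
import multiprocessing as mp
import random

def run_batch(args):
    """Worker: run a batch of particle histories with its own RNG seed.
    Stand-in for invoking the EGS5 binaries; each 'history' here just
    deposits a random toy tally."""
    seed, n_histories = args
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(n_histories))

def simulate(total_histories=1_000_000, n_workers=8):
    """Master: scatter independent batches to workers, gather the tallies."""
    per_worker = total_histories // n_workers
    jobs = [(seed, per_worker) for seed in range(n_workers)]  # distinct seeds
    with mp.Pool(n_workers) as pool:
        partials = pool.map(run_batch, jobs)   # scatter to the workers
    return sum(partials)                       # gather on the "master"

if __name__ == "__main__":
    print("aggregated tally:", simulate())
```

Because MC histories are statistically independent, the only requirements are distinct random seeds per worker and a final aggregation step, which is why the method parallelizes so cleanly.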
Sampling Enrichment toward Target Structures Using Hybrid Molecular Dynamics-Monte Carlo Simulations
Yang, Kecheng; Różycki, Bartosz; Cui, Fengchao; Shi, Ce; Chen, Wenduo; Li, Yunqi
2016-01-01
Sampling enrichment toward a target state, an analogue of the improvement of sampling efficiency (SE), is critical in both the refinement of protein structures and the generation of near-native structure ensembles for the exploration of structure-function relationships. We developed a hybrid molecular dynamics (MD)-Monte Carlo (MC) approach to enrich the sampling toward the target structures. In this approach, the higher SE is achieved by perturbing the conventional MD simulations with an MC structure-acceptance judgment, which is based on the coincidence degree of small-angle x-ray scattering (SAXS) intensity profiles between the simulation structures and the target structure. We found that the hybrid simulations could significantly improve SE by making the top-ranked models much closer to the target structures in both the secondary and tertiary structures. Specifically, for the 20 mono-residue peptides, when the initial structures had a root-mean-squared deviation (RMSD) from the target structure smaller than 7 Å, the hybrid MD-MC simulations afforded, on average, 0.83 Å and 1.73 Å in RMSD closer to the target than the parallel MD simulations at 310 K and 370 K, respectively. Meanwhile, the average SE values also increased by 13.2% and 15.7%. The enrichment of sampling becomes more significant when the target states are gradually detectable in the MD-MC simulations in comparison with the parallel MD simulations, providing >200% improvement in SE. We also tested the hybrid MD-MC approach on real protein systems; the results showed that SE improved for 3 out of 5 real proteins. Overall, this work presents an efficient way of utilizing solution SAXS to improve protein structure prediction and refinement, as well as the generation of near-native structures for function annotation.
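The MC structure-acceptance judgment can be sketched generically. The fragment below scores conformations with the standard Debye formula and accepts or rejects on a squared-discrepancy score; the paper's exact "coincidence degree" metric is not given in the abstract, so the score and the stiffness parameter `kappa` here are assumptions.

```python
import math, random

def debye_intensity(coords, q_values):
    """SAXS intensity for identical point scatterers via the Debye formula:
    I(q) = sum_ij sin(q r_ij) / (q r_ij)."""
    out = []
    for q in q_values:
        s = 0.0
        for i in range(len(coords)):
            for j in range(len(coords)):
                r = math.dist(coords[i], coords[j])
                s += 1.0 if q * r == 0.0 else math.sin(q * r) / (q * r)
        out.append(s)
    return out

def saxs_score(i_model, i_target):
    """Squared discrepancy between intensity profiles (lower = closer)."""
    return sum((a - b) ** 2 for a, b in zip(i_model, i_target))

def mc_accept(score_new, score_old, kappa=100.0, rng=random):
    """Metropolis-style structure-acceptance judgment on the SAXS score:
    always accept moves toward the target profile, occasionally accept
    moves away from it (kappa sets how strict the judgment is)."""
    if score_new <= score_old:
        return True
    return rng.random() < math.exp(-kappa * (score_new - score_old))

# Usage: a bent 4-bead chain scored against a straight target chain.
qs = [0.1 * k for k in range(1, 11)]
target = [(float(i), 0.0, 0.0) for i in range(4)]
model = [(float(i), 0.3 * i, 0.0) for i in range(4)]
s = saxs_score(debye_intensity(model, qs), debye_intensity(target, qs))
print("downhill move accepted:", mc_accept(s, s + 0.5))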
In-simulator training of driving abilities in a person with a traumatic brain injury.
Gamache, Pierre-Luc; Lavallière, Martin; Tremblay, Mathieu; Simoneau, Martin; Teasdale, Normand
2011-01-01
This study reports the case of a 23-year-old woman (MC) who sustained a severe traumatic brain injury in 2004. After her accident, her driving license was revoked. Despite recovering normal neuropsychological functions in the following years, MC was unable to renew her license, failing four on-road evaluations assessing her fitness to drive. In hope of an eventual license renewal, MC went through an in-simulator training programme in the laboratory in 2009. The training programme aimed at improving features of MC's driving behaviour that were identified as problematic in prior on-road evaluations. To do so, proper driving behaviour was reinforced via driving-specific feedback provided during the training sessions. After 25 sessions in the simulator (over a period of 4 months), MC significantly improved various components of her driving. Notably, compared to early sessions, later ones were associated with a reduced cognitive load, less jerky speed profiles when stopping at intersections, and better vehicle control and positioning. A 1-year retention test showed most of these improvements were maintained. The learning principles underlying well-conducted simulator-based education programmes have a strong scientific basis. A simulator training programme like this one represents a promising avenue for driving rehabilitation. It allows individuals without a driving license to practice and improve their skills in a safe and realistic environment.
Deviation from equilibrium conditions in molecular dynamic simulations of homogeneous nucleation.
Halonen, Roope; Zapadinsky, Evgeni; Vehkamäki, Hanna
2018-04-28
We present a comparison between Monte Carlo (MC) results for homogeneous vapour-liquid nucleation of Lennard-Jones clusters and previously published values from molecular dynamics (MD) simulations. Both the MC and MD methods sample real cluster configuration distributions. In MD simulations, the extent of the temperature fluctuation is usually controlled with an artificial thermostat rather than with a more realistic carrier gas. In this study, we consider not only the primarily used velocity-scaling thermostat but also the Nosé-Hoover, Berendsen, and stochastic Langevin thermostat methods. The nucleation rates based on a kinetic scheme and the canonical MC calculation serve as a point of reference, since they by definition describe an equilibrated system. The studied temperature range is from T = 0.3 to 0.65 ϵ/k. The kinetic scheme reproduces well the isothermal nucleation rates obtained by Wedekind et al. [J. Chem. Phys. 127, 064501 (2007)] using MD simulations with a carrier gas. The nucleation rates obtained by artificially thermostatted MD simulations are consistently lower than the reference nucleation rates based on MC calculations. The discrepancy increases up to several orders of magnitude when the density of the nucleating vapour decreases. At low temperatures, the difference from the MC-based reference nucleation rates in some cases exceeds the maximal nonisothermal effect predicted by the classical theory of Feder et al. [Adv. Phys. 15, 111 (1966)].
Availability Estimation for Facilities in Extreme Geographical Locations
NASA Technical Reports Server (NTRS)
Fischer, Gerd M.; Omotoso, Oluseun; Chen, Guangming; Evans, John W.
2012-01-01
A value-added analysis of the Reliability, Availability and Maintainability of McMurdo Ground Station was developed, which will be a useful tool for system managers in sparing, maintenance planning and determining vital performance metrics needed for readiness assessment of the upgrades to the McMurdo System. Output of this study can also be used as inputs and recommendations for the application of Reliability Centered Maintenance (RCM) for the system. ReliaSoft's BlockSim, a commercial Reliability Analysis software package, has been used to model the availability of the system upgrade to the National Aeronautics and Space Administration (NASA) Near Earth Network (NEN) Ground Station at McMurdo Station in Antarctica. The logistics challenges due to the closure of access to McMurdo Station during the Antarctic winter were modeled using a weighted composite of four Weibull distributions, one of the possible choices for statistical distributions throughout the software program and usually used to account for failure rates of components supplied by different manufacturers. The inaccessibility of the antenna site on a hill outside McMurdo Station throughout one year due to severe weather was modeled with a Weibull distribution for the repair crew availability. The Weibull distribution is based on an analysis of the available weather data for the antenna site for 2007 in combination with the rules for travel restrictions due to severe weather imposed by the administrating agency, the National Science Foundation (NSF). The simulations resulted in an upper bound for the system availability and allowed for identification of components that would improve availability based on a higher on-site spare count than initially planned.
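A toy version of this kind of availability estimate fits in a few lines: draw times-to-failure from a weighted composite of four Weibull distributions and accumulate up- and down-time over a yearly horizon. All numbers below (weights, shapes, scales, a fixed 72 h repair time) are hypothetical placeholders rather than the BlockSim inputs used in the study.

```python
import random

# Hypothetical mixture parameters: (weight, shape, scale_hours) per Weibull
# mode, e.g. one mode per component class; the weights must sum to 1.
MIXTURE = [(0.4, 1.2, 800.0), (0.3, 0.9, 1500.0),
           (0.2, 2.0, 400.0), (0.1, 1.5, 2500.0)]

def sample_time_to_failure(rng):
    """Draw from the weighted composite of four Weibull distributions."""
    u, acc = rng.random(), 0.0
    for weight, shape, scale in MIXTURE:
        acc += weight
        if u <= acc:
            break
    return rng.weibullvariate(scale, shape)   # (alpha=scale, beta=shape)

def availability(mttr_hours=72.0, horizon=8760.0, n_runs=10_000, seed=7):
    """MC point estimate of availability over one year of operation,
    assuming a fixed repair time (a placeholder; the study modeled
    repair-crew access with its own Weibull distribution)."""
    rng = random.Random(seed)
    up_total = down_total = 0.0
    for _ in range(n_runs):
        t = 0.0
        while t < horizon:
            ttf = sample_time_to_failure(rng)
            up_total += min(ttf, horizon - t)
            t += ttf
            if t < horizon:                   # failure inside the horizon
                down_total += min(mttr_hours, horizon - t)
                t += mttr_hours
    return up_total / (up_total + down_total)

if __name__ == "__main__":
    print(f"estimated availability: {availability():.4f}")
```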
NASA Astrophysics Data System (ADS)
Saini, Jatinder; Maes, Dominic; Egan, Alexander; Bowen, Stephen R.; St. James, Sara; Janson, Martin; Wong, Tony; Bloch, Charles
2017-10-01
RaySearch Americas Inc. (NY) has introduced a commercial Monte Carlo dose algorithm (RS-MC) for routine clinical use in proton spot scanning. In this report, we provide a validation of this algorithm against phantom measurements and simulations in the GATE software package. We also compared the performance of the RayStation analytical algorithm (RS-PBA) against the RS-MC algorithm. A beam model (G-MC) for a spot scanning gantry at our proton center was implemented in the GATE software package. The model was validated against measurements in a water phantom and was used for benchmarking the RS-MC. Validation of the RS-MC was performed in a water phantom by measuring depth doses and profiles for three spread-out Bragg peak (SOBP) beams with normal incidence, an SOBP with oblique incidence, and an SOBP with a range shifter and large air gap. The RS-MC was also validated against measurements and simulations in heterogeneous phantoms created by placing lung or bone slabs in a water phantom. Lateral dose profiles near the distal end of the beam were measured with a microDiamond detector and compared to the G-MC simulations, RS-MC and RS-PBA. Finally, the RS-MC and RS-PBA were validated against measured dose distributions in an Alderson-Rando (AR) phantom. Measurements were made using Gafchromic film in the AR phantom and compared to doses using the RS-PBA and RS-MC algorithms. For SOBP depth doses in a water phantom, all three algorithms matched the measurements to within ±3% at all points, with range agreement within 1 mm. The RS-PBA algorithm showed up to a 10% difference in dose at the entrance for the beam with a range shifter and >30 cm air gap, while the RS-MC and G-MC were always within 3% of the measurement. For an oblique beam incident at 45°, the RS-PBA algorithm showed up to 6% local dose differences and broadening of distal fall-off by 5 mm. Both the RS-MC and G-MC accurately predicted the depth dose to within ±3% and distal fall-off to within 2 mm. In an anthropomorphic phantom, the gamma index (dose tolerance = 3%, distance-to-agreement = 3 mm) was greater than 90% for six out of seven planes using the RS-MC, and three out of seven for the RS-PBA. The RS-MC algorithm demonstrated improved dosimetric accuracy over the RS-PBA in the presence of homogeneous, heterogeneous and anthropomorphic phantoms. The computation performance of the RS-MC was similar to the RS-PBA algorithm. For complex disease sites like breast, head and neck, and lung cancer, the RS-MC algorithm will provide significantly more accurate treatment planning.
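The gamma index quoted above combines a dose tolerance with a distance-to-agreement criterion. A brute-force 1D version is easy to state, as sketched below; clinical tools interpolate the evaluated distribution much more finely, so this shows only the bare criterion.

```python
import math

def gamma_pass_rate_1d(ref_pos, ref_dose, eval_pos, eval_dose,
                       dose_tol=0.03, dta_mm=3.0):
    """Global 1D gamma analysis: dose tolerance as a fraction of the
    reference maximum, distance-to-agreement in mm. Returns the fraction
    of reference points with gamma <= 1."""
    d_norm = dose_tol * max(ref_dose)
    passed = 0
    for xr, dr in zip(ref_pos, ref_dose):
        gamma_sq = min(((xe - xr) / dta_mm) ** 2 + ((de - dr) / d_norm) ** 2
                       for xe, de in zip(eval_pos, eval_dose))
        passed += gamma_sq <= 1.0
    return passed / len(ref_pos)

# Example: an evaluated profile shifted 1 mm against a Gaussian reference.
xs = [0.5 * i for i in range(161)]                       # 0..80 mm grid
ref = [math.exp(-((x - 40.0) / 15.0) ** 2) for x in xs]
ev = [math.exp(-((x - 41.0) / 15.0) ** 2) for x in xs]
print(f"gamma pass rate (3%/3 mm): {gamma_pass_rate_1d(xs, ref, xs, ev):.3f}")
```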
Lens implementation on the GATE Monte Carlo toolkit for optical imaging simulation
NASA Astrophysics Data System (ADS)
Kang, Han Gyu; Song, Seong Hyun; Han, Young Been; Kim, Kyeong Min; Hong, Seong Jong
2018-02-01
Optical imaging techniques are widely used for in vivo preclinical studies, and it is well known that the Geant4 Application for Emission Tomography (GATE) can be employed for the Monte Carlo (MC) modeling of light transport inside heterogeneous tissues. However, the GATE MC toolkit is limited in that it does not yet include optical lens implementation, even though this is required for a more realistic optical imaging simulation. We describe our implementation of a biconvex lens in the GATE MC toolkit to improve both the sensitivity and spatial resolution for optical imaging simulation. The lens implemented in GATE was validated against ZEMAX optical simulation using a US Air Force 1951 resolution target. The ray diagrams and the charge-coupled device images of the GATE optical simulation agreed with the ZEMAX optical simulation results. In conclusion, the use of a lens in the GATE optical simulation could significantly improve the image quality of bioluminescence and fluorescence imaging as compared with pinhole optics.
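At the heart of any lens model is refraction at a curved surface. The sketch below gives Snell's law in vector form plus a lensmaker's-equation sanity check for a biconvex lens; it illustrates the underlying optics only, not GATE's actual implementation.

```python
import math

def refract(d, n, n1, n2):
    """Snell's law in vector form for a unit ray direction d hitting a
    surface with unit normal n (pointing toward the incoming ray):
      t = eta*d + (eta*cos_i - cos_t)*n,  eta = n1/n2.
    Returns None on total internal reflection."""
    eta = n1 / n2
    cos_i = -(d[0] * n[0] + d[1] * n[1])
    sin_t2 = eta * eta * (1.0 - cos_i * cos_i)
    if sin_t2 > 1.0:
        return None
    cos_t = math.sqrt(1.0 - sin_t2)
    return (eta * d[0] + (eta * cos_i - cos_t) * n[0],
            eta * d[1] + (eta * cos_i - cos_t) * n[1])

# Ray at 30 degrees onto a flat air-glass interface (n = 1.5):
d = (math.sin(math.radians(30)), -math.cos(math.radians(30)))
print("refracted direction:", refract(d, (0.0, 1.0), 1.0, 1.5))

# Paraxial sanity check for a thin biconvex lens in air, R1 = -R2 = 50 mm:
# the lensmaker's equation 1/f = (n - 1)(1/R1 - 1/R2) gives f = 50 mm.
n_glass, R1, R2 = 1.5, 50.0, -50.0
print("thin-lens focal length:",
      1.0 / ((n_glass - 1.0) * (1.0 / R1 - 1.0 / R2)), "mm")
```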
Monte Carlo Simulations: Number of Iterations and Accuracy
2015-07-01
This report addresses estimating the number of Monte Carlo (MC) iterations required for a given result accuracy. Two estimation methods, referred to as the WM and the WSM, are compared; the WM is recommended for a priori estimates of the number of MC iterations because of the WSM's added complexity, although both methods have generally proven useful in estimating the number of MC iterations and in addressing the accuracy of MC results. [Only fragments of the record survive; the remaining table-of-contents entries cover: A Priori Estimate of Number of MC Iterations; MC Result Accuracy; Using Percentage Error of the Mean to Estimate Number of MC Iterations.]
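The last listed topic, using the percentage error of the mean to estimate the number of MC iterations, has a standard normal-approximation form worth stating explicitly. The sketch below uses that textbook estimate from a pilot run; it is not the report's WM or WSM formula, which the fragment does not preserve.

```python
import math, random, statistics

def required_iterations(pilot, pct_err=1.0, z=1.96):
    """A priori estimate of the number of MC iterations N such that the
    half-width of the ~95% confidence interval of the mean is pct_err
    percent of the mean: N >= (100 * z * s / (pct_err * xbar))**2,
    with s and xbar taken from a pilot run."""
    xbar = statistics.fmean(pilot)
    s = statistics.stdev(pilot)
    return math.ceil((100.0 * z * s / (pct_err * xbar)) ** 2)

rng = random.Random(0)
pilot = [rng.expovariate(1.0) for _ in range(1000)]    # pilot model outputs
print("iterations needed for 1% error of the mean:", required_iterations(pilot))
```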
Automated Concurrent Blackboard System Generation in C++
NASA Technical Reports Server (NTRS)
Kaplan, J. A.; McManus, J. W.; Bynum, W. L.
1999-01-01
In his 1992 Ph.D. thesis, "Design and Analysis Techniques for Concurrent Blackboard Systems", John McManus defined several performance metrics for concurrent blackboard systems and developed a suite of tools for creating and analyzing such systems. These tools allow a user to analyze a concurrent blackboard system design and predict the performance of the system before any code is written. The design can be modified until simulated performance is satisfactory. Then, the code generator can be invoked to generate automatically all of the code required for the concurrent blackboard system except for the code implementing the functionality of each knowledge source. We have completed the port of the source code generator and a simulator for a concurrent blackboard system. The source code generator generates the necessary C++ source code to implement the concurrent blackboard system using Parallel Virtual Machine (PVM) running on a heterogeneous network of UNIX™ workstations. The concurrent blackboard simulator uses the blackboard specification file to predict the performance of the concurrent blackboard design. The only part of the source code for the concurrent blackboard system that the user must supply is the code implementing the functionality of the knowledge sources.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Setiani, Tia Dwi; Suprijadi
Monte Carlo (MC) is one of the most powerful techniques for simulation in x-ray imaging. The MC method can simulate radiation transport within matter with high accuracy and provides a natural way to simulate radiation transport in complex systems. One of the MC-based codes widely used for radiographic image simulation is MC-GPU, a code developed by Andreu Badal. This study investigated the computation time of x-ray imaging simulation on a GPU (Graphics Processing Unit) compared to a standard CPU (Central Processing Unit). Furthermore, the effect of physical parameters on the quality of radiographic images and the comparison of image quality resulting from simulation on the GPU and CPU are evaluated in this paper. The simulations were run on a CPU in serial mode and on two GPUs with 384 and 2304 cores. In the GPU simulations, each core calculates one photon, so a large number of photons are calculated simultaneously. Results show that the simulations on the GPU were significantly accelerated compared to the CPU. The simulations on the 2304-core GPU were performed about 64-114 times faster than on the CPU, while the simulations on the 384-core GPU were performed about 20-31 times faster than on a single CPU core. Another result shows that optimum image quality was obtained starting from 10^8 histories and energies from 60 keV to 90 keV. Analyzed by a statistical approach, the quality of the GPU and CPU images is essentially the same.
A comparison of Monte-Carlo simulations using RESTRAX and McSTAS with experiment on IN14
NASA Astrophysics Data System (ADS)
Wildes, A. R.; S̆aroun, J.; Farhi, E.; Anderson, I.; Høghøj, P.; Brochier, A.
2000-03-01
Monte-Carlo simulations of a focusing supermirror guide after the monochromator on the IN14 cold neutron three-axis spectrometer, I.L.L., were carried out using the instrument simulation programs RESTRAX and McSTAS. The simulations were compared to experiment to check their accuracy. The flux ratios over both a 100 and a 1600 mm² area at the sample position compare well, and there is very close agreement between simulation and experiment for the energy spread of the incident beam.
Kim, K B; Shanyfelt, L M; Hahn, D W
2006-01-01
Dense-medium scattering is explored in the context of providing a quantitative measurement of turbidity, with specific application to corneal haze. A multiple-wavelength scattering technique is proposed to make use of two-color scattering response ratios, thereby providing a means for data normalization. A combination of measurements and simulations is reported to assess this technique, including light-scattering experiments for a range of polystyrene suspensions. Monte Carlo (MC) simulations were performed using a multiple-scattering algorithm based on full Mie scattering theory. The simulations were in excellent agreement with the polystyrene suspension experiments, thereby validating the MC model. The MC model was then used to simulate multiwavelength scattering in a corneal tissue model. Overall, the proposed multiwavelength scattering technique appears to be a feasible approach to quantify dense-medium scattering such as the manifestation of corneal haze, although more complex modeling of keratocyte scattering, and animal studies, are necessary.
Solar Proton Transport within an ICRU Sphere Surrounded by a Complex Shield: Combinatorial Geometry
NASA Technical Reports Server (NTRS)
Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.
2015-01-01
The 3DHZETRN code, with improved neutron and light ion (Z ≤ 2) transport procedures, was recently developed and compared to Monte Carlo (MC) simulations using simplified spherical geometries. It was shown that 3DHZETRN agrees with the MC codes to the extent they agree with each other. In the present report, the 3DHZETRN code is extended to enable analysis in general combinatorial geometry. A more complex shielding structure with internal parts surrounding a tissue sphere is considered and compared against MC simulations. It is shown that even in the more complex geometry, 3DHZETRN agrees well with the MC codes and maintains a high degree of computational efficiency.
OneSAF as an In-Stride Mission Command Asset
2014-06-01
To provide greater interoperability and integration within Mission Command (MC) systems, the One Semi-Automated Forces (OneSAF) entity-level simulation is evolving from a tightly coupled client-server implementation approach. While DARPA began with a funded project to complete the capability as a "big bang" approach, the approach here is based on reuse. Keywords: Mission Command (MC), Modeling and Simulation (M&S), Distributed Interactive Simulation (DIS).
Computer Simulation of Electron Thermalization in CsI and CsI(Tl)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Zhiguo; Xie, YuLong; Cannon, Bret D.
2011-09-15
A Monte Carlo (MC) model was developed and implemented to simulate the thermalization of electrons in inorganic scintillator materials. The model incorporates electron scattering with both longitudinal optical and acoustic phonons. In this paper, the MC model was applied to simulate electron thermalization in CsI, both pure and doped with a range of thallium concentrations. The inclusion of internal electric fields was shown to increase the fraction of recombined electron-hole pairs and to broaden the thermalization distance and thermalization time distributions. The MC simulations indicate that electron thermalization, following γ-ray excitation, takes place within approximately 10 ps in CsI and that electrons can travel distances up to several hundreds of nanometers. Electron thermalization was studied for a range of incident γ-ray energies using electron-hole pair spatial distributions generated by the MC code NWEGRIM (NorthWest Electron and Gamma Ray Interaction in Matter). These simulations revealed that the partition of thermalized electrons between different species (e.g., recombined with self-trapped holes or trapped at thallium sites) varies with the incident energy. Implications for the phenomenon of nonlinearity in scintillator light yield are discussed.
Chen, Yunjie; Roux, Benoît
2014-09-21
Hybrid schemes combining the strength of molecular dynamics (MD) and Metropolis Monte Carlo (MC) offer a promising avenue to improve the sampling efficiency of computer simulations of complex systems. A number of recently proposed hybrid methods consider new configurations generated by driving the system via a non-equilibrium MD (neMD) trajectory, which are subsequently treated as putative candidates for Metropolis MC acceptance or rejection. To obey microscopic detailed balance, it is necessary to alter the momentum of the system at the beginning and/or the end of the neMD trajectory. This strict rule then guarantees that the random walk in configurational space generated by such hybrid neMD-MC algorithm will yield the proper equilibrium Boltzmann distribution. While a number of different constructs are possible, the most commonly used prescription has been to simply reverse the momenta of all the particles at the end of the neMD trajectory ("one-end momentum reversal"). Surprisingly, it is shown here that the choice of momentum reversal prescription can have a considerable effect on the rate of convergence of the hybrid neMD-MC algorithm, with the simple one-end momentum reversal encountering particularly acute problems. In these neMD-MC simulations, different regions of configurational space end up being essentially isolated from one another due to a very small transition rate between regions. In the worst-case scenario, it is almost as if the configurational space does not constitute a single communicating class that can be sampled efficiently by the algorithm, and extremely long neMD-MC simulations are needed to obtain proper equilibrium probability distributions. To address this issue, a novel momentum reversal prescription, symmetrized with respect to both the beginning and the end of the neMD trajectory ("symmetric two-ends momentum reversal"), is introduced. Illustrative simulations demonstrate that the hybrid neMD-MC algorithm robustly yields a correct equilibrium probability distribution with this prescription.
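The scheme is easiest to see in its equilibrium limit, where the neMD segment reduces to ordinary MD and the method becomes hybrid MD-MC with persistent momenta. The sketch below implements that limit on a 1D double well with momentum reversal on rejection, the detailed-balance construct whose convergence behavior the paper analyzes; the symmetric two-ends prescription modifies where the reversals are applied, and the periodic momentum rethermalization here is only an ergodicity aid, not part of the authors' algorithm.

```python
import math, random

def U(x):                 # double-well potential U(x) = (x^2 - 1)^2
    return (x * x - 1.0) ** 2

def grad_U(x):
    return 4.0 * x * (x * x - 1.0)

def hybrid_step(x, p, beta=1.0, dt=0.05, n_steps=20, rng=random):
    """One hybrid MD-MC cycle: velocity-Verlet trajectory, Metropolis
    acceptance on the total-energy change, momentum reversal on rejection."""
    h_old = U(x) + 0.5 * p * p
    xn, pn = x, p
    for _ in range(n_steps):                   # velocity Verlet, unit mass
        pn -= 0.5 * dt * grad_U(xn)
        xn += dt * pn
        pn -= 0.5 * dt * grad_U(xn)
    dh = U(xn) + 0.5 * pn * pn - h_old
    if rng.random() < math.exp(min(0.0, -beta * dh)):
        return xn, pn                          # accept the trajectory end point
    return x, -p                               # reject: reverse the momentum

rng = random.Random(42)
x, p, samples = 1.0, 0.0, []
for i in range(20_000):
    if i % 10 == 0:                            # occasional rethermalization
        p = rng.gauss(0.0, 1.0)                # (ergodicity aid, beta = 1)
    x, p = hybrid_step(x, p, rng=rng)
    samples.append(x)
print("fraction of samples in the x < 0 well:",
      sum(s < 0.0 for s in samples) / len(samples))
```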
Physical Processes and Applications of the Monte Carlo Radiative Energy Deposition (MRED) Code
NASA Astrophysics Data System (ADS)
Reed, Robert A.; Weller, Robert A.; Mendenhall, Marcus H.; Fleetwood, Daniel M.; Warren, Kevin M.; Sierawski, Brian D.; King, Michael P.; Schrimpf, Ronald D.; Auden, Elizabeth C.
2015-08-01
MRED is a Python-language scriptable computer application that simulates radiation transport. It is the computational engine for the on-line tool CRÈME-MC. MRED is based on C++ code from Geant4 with additional Fortran components to simulate electron transport and nuclear reactions with high precision. We provide a detailed description of the structure of MRED and the implementation of the simulation of physical processes used to simulate radiation effects in electronic devices and circuits. Extensive discussion and references are provided that illustrate the validation of models used to implement specific simulations of relevant physical processes. Several applications of MRED are summarized that demonstrate its ability to predict and describe basic physical phenomena associated with irradiation of electronic circuits and devices. These include effects from single particle radiation (including both direct ionization and indirect ionization effects), dose enhancement effects, and displacement damage effects. MRED simulations have also helped to identify new single event upset mechanisms not previously observed by experiment, but since confirmed, including upsets due to muons and energetic electrons.
Kinetic Monte Carlo (kMC) simulation of carbon co-implant on pre-amorphization process.
Park, Soonyeol; Cho, Bumgoo; Yang, Seungsu; Won, Taeyoung
2010-05-01
We report our kinetic Monte Carlo (kMC) study of the effect of carbon co-implant on the pre-amorphization implant (PAI) process. We employed the BCA (Binary Collision Approximation) approach for the acquisition of the initial as-implanted dopant profile and the kMC method for the simulation of diffusion during the annealing process. The simulation results imply that the carbon co-implant suppresses boron diffusion through recombination with interstitials. We also compared boron diffusion with carbon diffusion by calculating the reaction of carbon with interstitials, and found that boron diffusion is affected by the carbon co-implant energy, which enhances the trapping of interstitials that would otherwise pair with boron.
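The event-selection core of a kMC code of this kind is the residence-time (BKL/Gillespie) algorithm: pick an event with probability proportional to its rate, then advance the clock by an exponentially distributed increment. A minimal sketch, with illustrative rates that are not parameters from this work:

```python
import math, random

def kmc_step(rates, rng=random):
    """One kinetic Monte Carlo step (residence-time / BKL algorithm):
    choose event i with probability rate_i / R and advance the clock by
    dt = -ln(u) / R, where R is the total rate."""
    R = sum(rates)
    u, acc, i = rng.random() * R, 0.0, 0
    for i, r in enumerate(rates):
        acc += r
        if u <= acc:
            break
    dt = -math.log(rng.random()) / R
    return i, dt

# Three competing toy events, e.g. boron-interstitial pairing,
# carbon-interstitial trapping, interstitial recombination.
rates = [1.0e3, 5.0e3, 2.0e2]
rng = random.Random(3)
t, counts = 0.0, [0, 0, 0]
for _ in range(100_000):
    i, dt = kmc_step(rates, rng)
    counts[i] += 1
    t += dt
total = sum(counts)
print("event frequencies:", [c / total for c in counts], "elapsed time:", t)
```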
Wen, Jiayi; Zhou, Shenggao; Xu, Zhenli; Li, Bo
2012-04-01
Competitive adsorption of counterions of multiple species to charged surfaces is studied by a size-effect-included mean-field theory and Monte Carlo (MC) simulations. The mean-field electrostatic free-energy functional of ionic concentrations, constrained by Poisson's equation, is numerically minimized by an augmented Lagrangian multiplier method. Unrestricted primitive models and canonical ensemble MC simulations with the Metropolis criterion are used to predict the ionic distributions around a charged surface. It is found that, for a low surface charge density, the adsorption of ions with a higher valence is preferable, agreeing with existing studies. For a highly charged surface, both the mean-field theory and the MC simulations demonstrate that the counterions bind tightly around the charged surface, resulting in a stratification of counterions of different species. The competition between mixed entropy and electrostatic energetics leads to a compromise that the ionic species with a higher valence-to-volume ratio has a larger probability to form the first layer of stratification. In particular, the MC simulations confirm the crucial role of ionic valence-to-volume ratios in the competitive adsorption to charged surfaces that had been previously predicted by the mean-field theory. The charge inversion for ionic systems with salt is predicted by the MC simulations but not by the mean-field theory. This work provides a better understanding of competitive adsorption of counterions to charged surfaces and calls for further studies on the ionic size effect with application to large-scale biomolecular modeling.
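The MC side of such a study rests on a simple move kernel: the ions are charged hard spheres, so overlapping trial moves are rejected outright and the remainder are accepted by the Metropolis criterion. The sketch below isolates that kernel; the cartoon energy function (bare Coulomb pairs plus a linear wall term, no periodic images or Ewald summation) and all parameter values are stand-ins for a production primitive-model code.

```python
import math, random

def metropolis_ion_move(positions, i, trial, radii, beta_energy, rng=random):
    """One canonical-ensemble Metropolis move for a primitive-model ion
    system: hard-sphere overlaps are rejected outright; otherwise accept
    with min(1, exp(-dU)), with dU already in units of kT."""
    for j, pos in enumerate(positions):
        if j != i and math.dist(trial, pos) < radii[i] + radii[j]:
            return False                        # hard-sphere overlap
    old = positions[i]
    u_old = beta_energy(positions)
    positions[i] = trial
    dU = beta_energy(positions) - u_old
    if dU <= 0.0 or rng.random() < math.exp(-dU):
        return True                             # keep the move
    positions[i] = old                          # roll back
    return False

# Cartoon energy: bare Coulomb pairs (Bjerrum length L_B, in kT units) plus
# a linear term from a uniformly charged plane at z = 0 (charge density SIGMA).
L_B, SIGMA = 0.7, -1.0
def beta_energy_factory(charges):
    def beta_energy(positions):
        u = sum(-2.0 * math.pi * L_B * SIGMA * q * p[2]
                for p, q in zip(positions, charges))
        for a in range(len(positions)):
            for b in range(a + 1, len(positions)):
                u += (L_B * charges[a] * charges[b]
                      / math.dist(positions[a], positions[b]))
        return u
    return beta_energy

rng = random.Random(0)
charges, radii = [1, 1, 2], [0.15, 0.15, 0.30]
pos = [[1.0, 1.0, 1.0], [2.0, 2.0, 1.5], [3.0, 1.0, 0.5]]
bE = beta_energy_factory(charges)
moved = metropolis_ion_move(pos, 0, [1.1, 1.0, 0.8], radii, bE, rng)
print("move accepted:", moved, "| z of ion 0:", pos[0][2])
```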
CloudMC: a cloud computing application for Monte Carlo simulation.
Miras, H; Jiménez, R; Miras, C; Gomà, C
2013-04-21
This work presents CloudMC, a cloud computing application, developed in Windows Azure® (the platform of the Microsoft® cloud), for the parallelization of Monte Carlo simulations in a dynamic virtual cluster. CloudMC is a web application designed to be independent of the Monte Carlo code on which the simulations are based; the simulations just need to be of the form: input files → executable → output files. To study the performance of CloudMC in Windows Azure®, Monte Carlo simulations with penelope were performed on different instance (virtual machine) sizes and for different numbers of instances. The instance size was found to have no effect on the simulation runtime. It was also found that the decrease in time with the number of instances followed Amdahl's law, with a slight deviation due to the increase in the fraction of non-parallelizable time with increasing number of instances. A simulation that would have required 30 h of CPU on a single instance was completed in 48.6 min when executed on 64 instances in parallel (a speedup of 37×). Furthermore, the use of cloud computing for parallel computing offers some advantages over conventional clusters: high accessibility, scalability and pay-per-usage. Therefore, it is strongly believed that cloud computing will play an important role in making Monte Carlo dose calculation a reality in future clinical practice.
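Amdahl's law, cited above, is worth making explicit: with a parallelizable fraction p, the speedup on n instances is S(n) = 1/((1 - p) + p/n). A parallel fraction of roughly 0.988 reproduces the reported 37x on 64 instances (a back-of-envelope fit, not a figure from the paper, which additionally observes that the serial fraction itself grows with the number of instances):

```python
def amdahl_speedup(n, p):
    """Amdahl's law: speedup on n instances when a fraction p of the
    runtime is parallelizable, S(n) = 1 / ((1 - p) + p / n)."""
    return 1.0 / ((1.0 - p) + p / n)

# p ~ 0.988 roughly reproduces the reported 37x speedup on 64 instances.
for n in (1, 4, 16, 64, 256):
    print(f"{n:4d} instances -> speedup {amdahl_speedup(n, 0.988):5.1f}")
```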
Hagos, Samson M.; Zhang, Chidong; Feng, Zhe; ...
2016-09-19
Influences of the diurnal cycle of convection on the propagation of the Madden-Julian Oscillation (MJO) across the Maritime Continent (MC) are examined using cloud-permitting regional model simulations and observations. A pair of ensembles of control (CONTROL) and no-diurnal cycle (NODC) simulations of the November 2011 MJO episode are performed. In the CONTROL simulations, the MJO signal is weakened as it propagates across the MC, with much of the convection stalling over the large islands of Sumatra and Borneo. In the NODC simulations, where the incoming shortwave radiation at the top of the atmosphere is maintained at its daily mean value, the MJO signal propagating across the MC is enhanced. Examination of the surface energy fluxes in the simulations indicates that in the presence of the diurnal cycle, surface downwelling shortwave radiation in CONTROL simulations is larger because clouds preferentially form in the afternoon. Furthermore, the diurnal co-variability of surface wind speed and skin temperature results in a larger sensible heat flux and a cooler land surface in CONTROL compared to NODC simulations. An analysis of observations indicates that the modulation of the downwelling shortwave radiation at the surface by the diurnal cycle of cloudiness negatively projects on the MJO intraseasonal cycle and therefore disrupts the propagation of the MJO across the MC.
Mocking the weak lensing universe: The LensTools Python computing package
NASA Astrophysics Data System (ADS)
Petri, A.
2016-10-01
We present a newly developed software package which implements a wide range of routines frequently used in Weak Gravitational Lensing (WL). With the continuously increasing size of the WL scientific community, we feel that easy-to-use Application Program Interfaces (APIs) for common calculations are a necessity to ensure efficiency and coordination across different working groups. Coupled with existing open source codes, such as CAMB (Lewis et al., 2000) and Gadget2 (Springel, 2005), LensTools brings together a cosmic shear simulation pipeline which, complemented with a variety of WL feature measurement tools and parameter sampling routines, provides easy access to the numerics for theoretical studies of WL as well as for experiment forecasts. Being implemented in PYTHON (van Rossum, 1995), LensTools takes full advantage of a range of state-of-the-art techniques developed by the large and growing open-source software community (Jones et al., 2001; McKinney, 2010; Astropy Collaboration, 2013; Pedregosa et al., 2011; Foreman-Mackey et al., 2013). We made the LensTools code available on the Python Package Index and published its documentation at http://lenstools.readthedocs.io.
Picking the Best from the All-Resources Menu: Advanced Tools for Resource Planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmintier, Bryan S
Introduces the wide range of electric power systems modeling types and associated questions they can help answer. The presentation focuses on modeling needs for high levels of Distributed Energy Resources (DERs), renewables, and inverter-based technologies as alternatives to traditional centralized power systems. Covers Dynamics, Production Cost/QSTS, Metric Assessment, Resource Planning, and Integrated Simulations, with examples drawn from NREL's past and on-going projects. Presented at the McKnight Foundation workshop on 'An All-Resources Approach to Planning for a More Dynamic, Low-Carbon Grid' exploring grid modernization options to replace retiring coal plants in Minnesota.
Monte Carlo-based treatment planning system calculation engine for microbeam radiation therapy.
Martinez-Rovira, I; Sempau, J; Prezado, Y
2012-05-01
Microbeam radiation therapy (MRT) is a synchrotron radiotherapy technique that explores the limits of the dose-volume effect. Preclinical studies have shown that MRT irradiations (arrays of 25-75-μm-wide microbeams spaced by 200-400 μm) are able to eradicate highly aggressive animal tumor models while healthy tissue is preserved. These promising results have provided the basis for the forthcoming clinical trials at the ID17 Biomedical Beamline of the European Synchrotron Radiation Facility (ESRF). The first step includes irradiation of pets (cats and dogs) as a milestone before treatment of human patients. Within this context, accurate dose calculations are required. The distinct features of both beam generation and irradiation geometry in MRT with respect to conventional techniques require the development of a specific MRT treatment planning system (TPS). In particular, a Monte Carlo (MC)-based calculation engine for the MRT TPS has been developed in this work. Experimental verification in heterogeneous phantoms and optimization of the computation time have also been performed. The penelope/penEasy MC code was used to compute dose distributions from a realistic beam source model. Experimental verification was carried out by means of radiochromic films placed within heterogeneous slab-phantoms. Once validation was completed, dose computations in a virtual model of a patient, reconstructed from computed tomography (CT) images, were performed. To this end, decoupling of the CT image voxel grid (a few cubic millimeter volume) to the dose bin grid, which has micrometer dimensions in the transversal direction of the microbeams, was performed. Optimization of the simulation parameters, the use of variance-reduction (VR) techniques, and other methods, such as the parallelization of the simulations, were applied in order to speed up the dose computation. Good agreement between MC simulations and experimental results was achieved, even at the interfaces between two different media. Optimization of the simulation parameters and the use of VR techniques saved a significant amount of computation time. Finally, parallelization of the simulations improved even further the calculation time, which reached 1 day for a typical irradiation case envisaged in the forthcoming clinical trials in MRT. An example of MRT treatment in a dog's head is presented, showing the performance of the calculation engine. The development of the first MC-based calculation engine for the future TPS devoted to MRT has been accomplished. This will constitute an essential tool for the future clinical trials on pets at the ESRF. The MC engine is able to calculate dose distributions in micrometer-sized bins in complex voxelized CT structures in a reasonable amount of time. Minimization of the computation time by using several approaches has led to timings that are adequate for pet radiotherapy at synchrotron facilities. The next step will consist in its integration into a user-friendly graphical front-end.
Fast CPU-based Monte Carlo simulation for radiotherapy dose calculation.
Ziegenhein, Peter; Pirner, Sven; Ph Kamerling, Cornelis; Oelfke, Uwe
2015-08-07
Monte-Carlo (MC) simulations are considered to be the most accurate method for calculating dose distributions in radiotherapy. Their clinical application, however, is still limited by the long runtimes conventional implementations of MC algorithms require to deliver sufficiently accurate results on high resolution imaging data. In order to overcome this obstacle we developed the software package PhiMC, which is capable of computing precise dose distributions in a sub-minute time frame by leveraging the potential of modern many- and multi-core CPU-based computers. PhiMC is based on the well verified dose planning method (DPM). We could demonstrate that PhiMC delivers dose distributions which are in excellent agreement with DPM. The multi-core implementation of PhiMC scales well between different computer architectures and achieves a speed-up of up to 37× compared to the original DPM code executed on a modern system. Furthermore, we could show that our CPU-based implementation on a modern workstation is between 1.25× and 1.95× faster than a well-known GPU implementation of the same simulation method on a NVIDIA Tesla C2050. Since CPUs can work with several hundreds of GB of RAM, the typical GPU memory limitation does not apply to our implementation, and high resolution clinical plans can be calculated.
Yamada, Tomonori; Shimura, Takaya; Ebi, Masahide; Hirata, Yoshikazu; Nishiwaki, Hirotaka; Mizushima, Takashi; Asukai, Koki; Togawa, Shozo; Takahashi, Satoru; Joh, Takashi
2015-01-01
Our recent prospective study found equivalent accuracy of magnifying chromoendoscopy (MC) and endoscopic ultrasonography (EUS) for diagnosing the invasion depth of colorectal cancer (CRC); however, whether these tools show diagnostic differences in categories such as tumor size and morphology remains unclear. Hence, we conducted detailed subset analysis of the prospective data. In this multicenter, prospective, comparative trial, a total of 70 patients with early, flat CRC were enrolled from February 2011 to December 2012, and the results of 66 lesions were finally analyzed. Patients were randomly allocated to primary MC followed by EUS or to primary EUS followed by MC. Diagnoses of invasion depth by each tool were divided into intramucosal to slight submucosal invasion (invasion depth <1000 μm) and deep submucosal invasion (invasion depth ≥1000 μm), and then compared with the final pathological diagnosis by an independent pathologist blinded to clinical data. To standardize diagnoses among examiners, this trial was started after achievement of a mean κ value of ≥0.6 which was calculated from the average of κ values between each pair of participating endoscopists. Both MC and EUS showed similar diagnostic outcomes, with no significant differences in prediction of invasion depth in subset analyses according to tumor size, location, and morphology. Lesions that were consistently diagnosed as Tis/T1-SMS or ≥T1-SMD with both tools revealed accuracy of 76-78%. Accuracy was low in borderline lesions with irregular pit pattern in MC and distorted findings of the third layer in EUS (MC, 58.5%; EUS, 50.0%). MC and EUS showed the same limited accuracy for predicting invasion depth in all categories of early CRC. Since the irregular pit pattern in MC, distorted findings to the third layer in EUS and inconsistent diagnosis between both tools were associated with low accuracy, further refinements or even novel methods are still needed for such lesions. University hospital Medical Information Network Clinical Trials Registry UMIN 000005085.
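The κ threshold used to standardize diagnoses is Cohen's kappa, which corrects the raw agreement between two raters for chance agreement. A small sketch with hypothetical ratings of ten lesions:

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two raters over the same items:
    kappa = (p_o - p_e) / (1 - p_e), with p_o the observed agreement and
    p_e the agreement expected by chance from the marginal frequencies."""
    n = len(a)
    cats = set(a) | set(b)
    p_o = sum(x == y for x, y in zip(a, b)) / n
    p_e = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)
    return (p_o - p_e) / (1.0 - p_e)

# Two endoscopists classifying ten lesions (invasion depth < or >= 1000 um):
r1 = ['<', '<', '>=', '<', '>=', '>=', '<', '<', '>=', '<']
r2 = ['<', '<', '>=', '<', '<', '>=', '<', '>=', '>=', '<']
print(f"kappa = {cohens_kappa(r1, r2):.2f}")   # 0.58, just below 0.6
```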
NASA Astrophysics Data System (ADS)
Isobe, Masaharu
Hard sphere/disk systems are among the simplest models and have been used to address numerous fundamental problems in the field of statistical physics. The pioneering numerical works on the solid-fluid phase transition based on Monte Carlo (MC) and molecular dynamics (MD) methods published in 1957 represent historical milestones, which have had a significant influence on the development of computer algorithms and novel tools to obtain physical insights. This chapter addresses Alder's breakthrough works on hard sphere/disk simulation: (i) event-driven molecular dynamics, (ii) the long-time tail, (iii) the molasses tail, and (iv) two-dimensional melting/crystallization. From a numerical viewpoint, there are serious issues that must be overcome for further breakthroughs. Here, we present a brief review of recent progress in this area.
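The kernel of Alder-style event-driven MD is the pair collision time: the smallest positive root of |dr + dv t| = sigma, computed for every candidate pair and fed into an event queue. A sketch of that kernel:

```python
import math

def pair_collision_time(r1, v1, r2, v2, sigma):
    """Time until two hard spheres with contact distance sigma collide,
    or math.inf if they never do: smallest positive root of
    |dr + dv*t| = sigma."""
    dr = [a - b for a, b in zip(r1, r2)]
    dv = [a - b for a, b in zip(v1, v2)]
    b = sum(x * y for x, y in zip(dr, dv))
    if b >= 0.0:
        return math.inf                      # pair is moving apart
    v2m = sum(x * x for x in dv)
    r2m = sum(x * x for x in dr)
    disc = b * b - v2m * (r2m - sigma * sigma)
    if disc < 0.0:
        return math.inf                      # closest approach misses contact
    return (-b - math.sqrt(disc)) / v2m

# Head-on pair: centers 4 apart, closing speed 2, contact at distance 1 -> 1.5
print(pair_collision_time((0, 0, 0), (1, 0, 0), (4, 0, 0), (-1, 0, 0), 1.0))
```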
NASA Astrophysics Data System (ADS)
Ong, J. S. L.; Charin, C.; Leong, J. H.
2017-12-01
Avalanche photodiodes (APDs) with steep electric field gradients generally have low excess noise that arises from carrier multiplication within the internal gain of the devices, and the Monte Carlo (MC) method is among popular device simulation tools for such devices. However, there are few articles relating to carrier trajectory modeling in MC models for such devices. In this work, a set of electric-field-gradient-dependent carrier trajectory tracking equations are developed and used to update the positions of carriers along the path during Simple-band Monte Carlo (SMC) simulations of APDs with non-uniform electric fields. The mean gain and excess noise results obtained from the SMC model employing these equations show good agreement with the results reported for a series of silicon diodes, including a p+n diode with steep electric field gradients. These results confirm the validity and demonstrate the feasibility of the trajectory tracking equations applied in SMC models for simulating mean gain and excess noise in APDs with non-uniform electric fields. Also, the simulation results of mean gain, excess noise, and carrier ionization positions obtained from the SMC model of this work agree well with those of the conventional SMC model employing the concept of a uniform electric field within a carrier free-flight. These results demonstrate that the electric field variation within a carrier free-flight has an insignificant effect on the predicted mean gain and excess noise results. Therefore, both the SMC model of this work and the conventional SMC model can be used to predict the mean gain and excess noise in APDs with highly non-uniform electric fields.
NASA Astrophysics Data System (ADS)
Hong, Qi-Jun; Liu, Zhi-Pan
2010-10-01
Understanding and controlling catalytic processes over composite materials has been a long-standing goal of chemists. To provide deeper insight into complex interfacial catalysis under experimental conditions, we performed an extensive analysis of CO2 hydrogenation over a Cu/ZrO2 model catalyst by employing density functional theory (DFT) calculations and kinetic Monte Carlo (kMC) simulations based on the continuous stirred tank model. The free energy profiles are determined for the reaction at the oxygen-rich Cu/m-ZrO2(2̅12) interface, where all interfacial Zr atoms are six-coordinated since the interface accumulates oxidative species under reaction conditions. We show that not only methanol but also CO is produced predominantly through the formate pathway, whilst the reverse water-gas-shift (RWGS) channel makes only a minor contribution. H2CO is a key intermediate species in the reaction pathway, the hydrogenation of which dictates the high temperature required for CO2 hydrogenation. The kinetics simulation shows that the CO2 conversion is 1.20% and the selectivity towards methanol is 68% at 500 K, and that the activation energies for methanol and CO formation are 0.79 and 1.79 eV, respectively. Secondary reactions due to product readsorption lower the overall turnover frequency (TOF) but increase the selectivity towards methanol by 16%. We also show that kMC is a more reliable tool for simulating heterogeneous catalytic processes than the microkinetics approach.
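A minimal Gillespie-type kMC loop of the kind used to propagate species populations in time is sketched below; the toy two-step mechanism and rate constants are placeholders, not the Cu/ZrO2 reaction network of this work:

```python
# Gillespie stochastic simulation: sample an exponential waiting time from
# the total propensity, then pick the event proportionally to its rate.
import math, random

def kmc(reactions, pops, t_end):
    """reactions: list of (k, reactant, product); pops: dict of counts."""
    t = 0.0
    while t < t_end:
        props = [k * pops[r] for k, r, p in reactions]   # propensities
        a0 = sum(props)
        if a0 == 0.0:
            break                                        # nothing can fire
        t += -math.log(random.random()) / a0             # waiting time
        x = random.random() * a0                         # event selection
        for (k, r, p), a in zip(reactions, props):
            x -= a
            if x <= 0.0:
                pops[r] -= 1
                pops[p] += 1
                break
    return t, pops

# Placeholder chain: CO2* -> HCOO* -> CH3OH*, arbitrary rates.
print(kmc([(1.0, "CO2", "HCOO"), (0.2, "HCOO", "MeOH")],
          {"CO2": 1000, "HCOO": 0, "MeOH": 0}, 50.0))
```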
A Coarse Grained Model for Methylcellulose: Spontaneous Ring Formation at Elevated Temperature
NASA Astrophysics Data System (ADS)
Huang, Wenjun; Larson, Ronald
Methylcellulose (MC) is widely used as a food additive and in pharmaceutical applications, where its thermo-reversible gelation behavior plays an important role. To date, the gelation mechanism is not well understood and therefore attracts great research interest. In this study, we adopted coarse-grained (CG) molecular dynamics simulations to model MC chains, including homopolymers and random copolymers that model commercial METHOCEL A, in an implicit water environment, with each MC monomer modeled as a single bead. The simulations are carried out using the LAMMPS program. We parameterized our CG model using the radial distribution functions from atomistic simulations of short MC oligomers, extrapolating the results to long chains. We used the dissociation free energy to validate our CG model against the atomistic model. The CG model captured the effects of monomer substitution type and temperature observed in the atomistic simulations. We applied this CG model to simulate single chains up to 1000 monomers long and obtained persistence lengths close to those determined from experiment. We observed a chain collapse transition for a random copolymer 600 monomers long at 50 °C. The chain collapsed into a stable ring structure with an outer diameter of around 14 nm, which appears to be a precursor to the fibril structure observed in methylcellulose gels by Lodge et al. in recent studies. Our CG model can be extended to other MC derivatives for studying the interaction between these polymers and small molecules, such as hydrophobic drugs.
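One quantity mentioned above, the persistence length, is commonly extracted from CG trajectories via the decay of bond-bond orientational correlations. The sketch below assumes the exponential form ⟨u(i)·u(i+s)⟩ ≈ exp(-s·b/lp) and uses a synthetic semiflexible chain in place of real simulation coordinates:

```python
# Persistence length from the decay of bond-vector correlations.
import numpy as np

def persistence_length(positions, smax=10, bond=1.0):
    u = np.diff(positions, axis=0)
    u /= np.linalg.norm(u, axis=1, keepdims=True)
    s = np.arange(1, smax)
    corr = np.array([np.mean(np.sum(u[:-k] * u[k:], axis=1)) for k in s])
    ok = corr > 0                       # fit only the positive, decaying part
    if ok.sum() < 2:
        return float("nan")
    slope = np.polyfit(s[ok], np.log(corr[ok]), 1)[0]
    return -bond / slope                # lp in units of the bond length

# Synthetic stiff chain standing in for a CG trajectory.
rng = np.random.default_rng(1)
u = np.empty((2000, 3)); u[0] = (1.0, 0.0, 0.0)
for i in range(1, 2000):
    v = u[i - 1] + 0.3 * rng.normal(size=3)
    u[i] = v / np.linalg.norm(v)
positions = np.vstack([np.zeros(3), np.cumsum(u, axis=0)])
print(persistence_length(positions))
```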
Monte Carlo isotopic inventory analysis for complex nuclear systems
NASA Astrophysics Data System (ADS)
Phruksarojanakun, Phiphat
Monte Carlo Inventory Simulation Engine (MCise) is a newly developed method for calculating the isotopic inventory of materials. It offers the promise of modeling materials with complex processes and irradiation histories, which pose challenges for current deterministic tools, and has strong analogies to Monte Carlo (MC) neutral particle transport. The analog method, including considerations for simple, complex, and loop flows, is fully developed. In addition, six variance reduction tools give MCise unique capabilities to improve the statistical precision of MC simulations. Forced Reaction forces an atom to undergo a desired number of reactions in a given irradiation environment. Biased Reaction Branching primarily focuses on improving statistical results for isotopes produced from rare reaction pathways. Biased Source Sampling aims at increasing the frequency with which rare initial isotopes are sampled as starting particles. Reaction Path Splitting increases the population by splitting the atom at each reaction point, creating one new atom for each decay or transmutation product. Delta Tracking is recommended for high-frequency pulsing to reduce the computing time. Lastly, Weight Window is introduced as a strategy to decrease large deviations of weight due to the use of variance reduction techniques. A figure of merit is necessary to compare the efficiency of different variance reduction techniques. A number of possibilities for the figure of merit are explored, two of which are robust and subsequently used: one is based on the relative error of a known target isotope (1/R_T^2) and the other on the overall detection limit corrected by the relative error (1/(D_k R_T^2)). An automated Adaptive Variance-reduction Adjustment (AVA) tool is developed to iteratively define parameters for some variance reduction techniques in a problem with a target isotope. Sample problems demonstrate that AVA improves both the precision and accuracy of a target result in an efficient manner. Potential applications of MCise include molten salt fueled reactors and liquid breeders in fusion blankets. As an example, the inventory analysis of a liquid actinide fuel in the In-Zinerator, a sub-critical power reactor driven by a fusion source, is examined. The results confirm that MCise is a reliable tool for inventory analysis of complex nuclear systems.
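The analog method has a simple core: follow one atom at a time through its decay/transmutation chain and tally the end state. A minimal sketch, with a hypothetical two-step chain and arbitrary decay constants (not an MCise problem):

```python
# Analog MC inventory: sample exponential waiting times along a chain
# A -> B -> C and record each atom's identity at t_end.
import math, random

CHAIN = {"A": ("B", 0.10), "B": ("C", 0.01)}   # parent: (daughter, lambda)

def follow_atom(start, t_end):
    iso, t = start, 0.0
    while iso in CHAIN:
        daughter, lam = CHAIN[iso]
        t += -math.log(random.random()) / lam   # time to the next decay
        if t > t_end:
            break                               # atom survives as-is
        iso = daughter
    return iso

tally = {}
for _ in range(100000):
    iso = follow_atom("A", t_end=30.0)
    tally[iso] = tally.get(iso, 0) + 1
print(tally)    # statistical estimate of the inventory at t_end
```

Variance reduction techniques such as Reaction Path Splitting modify this loop so that rare end states are reached by more histories, each carrying an adjusted statistical weight.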
Diagnosing Undersampling in Monte Carlo Eigenvalue and Flux Tally Estimates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perfetti, Christopher M; Rearden, Bradley T
2015-01-01
This study explored the impact of undersampling on the accuracy of tally estimates in Monte Carlo (MC) calculations. Steady-state MC simulations were performed for models of several critical systems with varying degrees of spatial and isotopic complexity, and the impact of undersampling on eigenvalue and fuel pin flux/fission estimates was examined. This study observed biases in MC eigenvalue estimates as large as several percent and biases in fuel pin flux/fission tally estimates that exceeded tens, and in some cases hundreds, of percent. This study also investigated five statistical metrics for predicting the occurrence of undersampling biases in MC simulations. Three of the metrics (the Heidelberger-Welch RHW, the Geweke Z-Score, and the Gelman-Rubin diagnostics) are commonly used for diagnosing the convergence of Markov chains, and two of the methods (the Contributing Particles per Generation and Tally Entropy) are new convergence metrics developed in the course of this study. These metrics were implemented in the KENO MC code within the SCALE code system and were evaluated for their reliability at predicting the onset and magnitude of undersampling biases in MC eigenvalue and flux tally estimates in two of the critical models. Of the five methods investigated, the Heidelberger-Welch RHW, the Gelman-Rubin diagnostics, and Tally Entropy produced test metrics that correlated strongly to the size of the observed undersampling biases, indicating their potential to effectively predict the size and prevalence of undersampling biases in MC simulations.
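As an illustration of the Markov-chain diagnostics named above, a simplified Geweke Z-score for a generation-wise tally sequence might look as follows; a production version would use spectral variance estimates to account for autocorrelation:

```python
# Geweke diagnostic: compare the mean of an early segment of the chain
# with the mean of a late segment; |z| >> 2 suggests non-convergence.
import numpy as np

def geweke_z(x, first=0.1, last=0.5):
    a = x[: int(first * len(x))]
    b = x[-int(last * len(x)):]
    return (a.mean() - b.mean()) / np.sqrt(a.var(ddof=1) / len(a)
                                           + b.var(ddof=1) / len(b))

tallies = np.random.normal(1.0, 0.05, size=500)   # placeholder k-eff history
print(geweke_z(tallies))
```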
Lens implementation on the GATE Monte Carlo toolkit for optical imaging simulation.
Kang, Han Gyu; Song, Seong Hyun; Han, Young Been; Kim, Kyeong Min; Hong, Seong Jong
2018-02-01
Optical imaging techniques are widely used for in vivo preclinical studies, and it is well known that the Geant4 Application for Emission Tomography (GATE) can be employed for the Monte Carlo (MC) modeling of light transport inside heterogeneous tissues. However, the GATE MC toolkit is limited in that it does not yet include optical lens implementation, even though this is required for a more realistic optical imaging simulation. We describe our implementation of a biconvex lens in the GATE MC toolkit to improve both the sensitivity and spatial resolution of optical imaging simulation. The lens implemented in GATE was validated against ZEMAX optical simulation using a US Air Force 1951 resolution target. The ray diagrams and the charge-coupled device images of the GATE optical simulation agreed with the ZEMAX optical simulation results. In conclusion, the use of a lens in GATE optical simulation could improve the image quality of bioluminescence and fluorescence imaging significantly compared with pinhole optics.
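A paraxial sanity check of the kind such a lens model must satisfy can be written with ray-transfer (ABCD) matrices; the focal length and distances below are illustrative, not the lens prescription implemented in GATE:

```python
# Thin-lens ABCD check: all rays from one object point land at the same
# image height, independent of their launch angle.
import numpy as np

def thin_lens(f):   return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])
def free_space(d):  return np.array([[1.0, d], [0.0, 1.0]])

f, s_obj = 50.0, 75.0                       # mm; object 75 mm in front
s_img = 1.0 / (1.0 / f - 1.0 / s_obj)       # thin-lens equation -> 150 mm
ray = np.array([1.0, 0.02])                 # height (mm), angle (rad)
out = free_space(s_img) @ thin_lens(f) @ free_space(s_obj) @ ray
print(s_img, out)                           # height -2 mm for any angle (m = -2)
```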
The Monte Carlo simulation of the Borexino detector
NASA Astrophysics Data System (ADS)
Agostini, M.; Altenmüller, K.; Appel, S.; Atroshchenko, V.; Bagdasarian, Z.; Basilico, D.; Bellini, G.; Benziger, J.; Bick, D.; Bonfini, G.; Borodikhina, L.; Bravo, D.; Caccianiga, B.; Calaprice, F.; Caminata, A.; Canepa, M.; Caprioli, S.; Carlini, M.; Cavalcante, P.; Chepurnov, A.; Choi, K.; D'Angelo, D.; Davini, S.; Derbin, A.; Ding, X. F.; Di Noto, L.; Drachnev, I.; Fomenko, K.; Formozov, A.; Franco, D.; Froborg, F.; Gabriele, F.; Galbiati, C.; Ghiano, C.; Giammarchi, M.; Goeger-Neff, M.; Goretti, A.; Gromov, M.; Hagner, C.; Houdy, T.; Hungerford, E.; Ianni, Aldo; Ianni, Andrea; Jany, A.; Jeschke, D.; Kobychev, V.; Korablev, D.; Korga, G.; Kryn, D.; Laubenstein, M.; Litvinovich, E.; Lombardi, F.; Lombardi, P.; Ludhova, L.; Lukyanchenko, G.; Machulin, I.; Magnozzi, M.; Manuzio, G.; Marcocci, S.; Martyn, J.; Meroni, E.; Meyer, M.; Miramonti, L.; Misiaszek, M.; Muratova, V.; Neumair, B.; Oberauer, L.; Opitz, B.; Ortica, F.; Pallavicini, M.; Papp, L.; Pocar, A.; Ranucci, G.; Razeto, A.; Re, A.; Romani, A.; Roncin, R.; Rossi, N.; Schönert, S.; Semenov, D.; Shakina, P.; Skorokhvatov, M.; Smirnov, O.; Sotnikov, A.; Stokes, L. F. F.; Suvorov, Y.; Tartaglia, R.; Testera, G.; Thurn, J.; Toropova, M.; Unzhakov, E.; Vishneva, A.; Vogelaar, R. B.; von Feilitzsch, F.; Wang, H.; Weinz, S.; Wojcik, M.; Wurm, M.; Yokley, Z.; Zaimidoroga, O.; Zavatarelli, S.; Zuber, K.; Zuzel, G.
2018-01-01
We describe the Monte Carlo (MC) simulation of the Borexino detector and the agreement of its output with data. The Borexino MC "ab initio" simulates the energy loss of particles in all detector components and generates the resulting scintillation photons and their propagation within the liquid scintillator volume. The simulation accounts for absorption, reemission, and scattering of the optical photons and tracks them until they either are absorbed or reach the photocathode of one of the photomultiplier tubes. Photon detection is followed by a comprehensive simulation of the readout electronics response. The MC is tuned using data collected with radioactive calibration sources deployed inside and around the scintillator volume. The simulation reproduces the energy response of the detector, its uniformity within the fiducial scintillator volume relevant to neutrino physics, and the time distribution of detected photons to better than 1% between 100 keV and several MeV. The techniques developed to simulate the Borexino detector and their level of refinement are of possible interest to the neutrino community, especially for current and future large-volume liquid scintillator experiments such as KamLAND-Zen, SNO+, and JUNO.
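The core of such optical-photon tracking, reduced to absorption versus scattering with exponential free paths, can be sketched as follows; the attenuation lengths are placeholders, not Borexino scintillator values:

```python
# Photon random walk: at each interaction, absorb with probability
# mu_a/(mu_a + mu_s), otherwise scatter and continue.
import math, random

L_ABS, L_SCAT = 8.0, 10.0                      # placeholder lengths (m)
MU_A, MU_S = 1.0 / L_ABS, 1.0 / L_SCAT

def path_to_absorption():
    """Total path length one photon travels before it is absorbed."""
    path = 0.0
    while True:
        path += -math.log(random.random()) / (MU_A + MU_S)
        if random.random() < MU_A / (MU_A + MU_S):
            return path
        # else: scattered; in a full MC the direction would be resampled

paths = [path_to_absorption() for _ in range(100000)]
print(sum(paths) / len(paths))   # converges to L_ABS = 8 m, a useful check
```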
Solar proton exposure of an ICRU sphere within a complex structure Part I: Combinatorial geometry.
Wilson, John W; Slaba, Tony C; Badavi, Francis F; Reddell, Brandon D; Bahadori, Amir A
2016-06-01
The 3DHZETRN code, with improved neutron and light ion (Z≤2) transport procedures, was recently developed and compared to Monte Carlo (MC) simulations using simplified spherical geometries. It was shown that 3DHZETRN agrees with the MC codes to the extent they agree with each other. In the present report, the 3DHZETRN code is extended to enable analysis in general combinatorial geometry. A more complex shielding structure with internal parts surrounding a tissue sphere is considered and compared against MC simulations. It is shown that even in the more complex geometry, 3DHZETRN agrees well with the MC codes and maintains a high degree of computational efficiency.
A GPU OpenCL based cross-platform Monte Carlo dose calculation engine (goMC)
NASA Astrophysics Data System (ADS)
Tian, Zhen; Shi, Feng; Folkerts, Michael; Qin, Nan; Jiang, Steve B.; Jia, Xun
2015-09-01
Monte Carlo (MC) simulation has been recognized as the most accurate dose calculation method for radiotherapy. However, the extremely long computation time impedes its clinical application. Recently, much effort has been made to realize fast MC dose calculation on graphics processing units (GPUs). However, most of the GPU-based MC dose engines have been developed under NVidia’s CUDA environment. This limits the code portability to other platforms, hindering the introduction of GPU-based MC simulations to clinical practice. The objective of this paper is to develop a GPU OpenCL based cross-platform MC dose engine named goMC with coupled photon-electron simulation for external photon and electron radiotherapy in the MeV energy range. Compared to our previously developed GPU-based MC code named gDPM (Jia et al 2012 Phys. Med. Biol. 57 7783-97), goMC has two major differences. First, it was developed under the OpenCL environment for high code portability and hence could be run not only on different GPU cards but also on CPU platforms. Second, we adopted the electron transport model used in the EGSnrc MC package and PENELOPE’s random hinge method in our new dose engine, instead of the dose planning method employed in gDPM. Dose distributions were calculated for a 15 MeV electron beam and a 6 MV photon beam in a homogeneous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. Satisfactory agreement between the two MC dose engines goMC and gDPM was observed in all cases. The average dose differences in the regions that received a dose higher than 10% of the maximum dose were 0.48-0.53% for the electron beam cases and 0.15-0.17% for the photon beam cases. In terms of efficiency, goMC was ~4-16% slower than gDPM when running on the same NVidia TITAN card for all the cases we tested, due to both the different electron transport models and the different development environments. The code portability of our new dose engine goMC was validated by successfully running it on a variety of different computing devices including an NVidia GPU card, two AMD GPU cards and an Intel CPU processor. Computational efficiency among these platforms was compared.
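A common building block of GPU MC transport in heterogeneous media is Woodcock (delta) tracking, which sidesteps explicit ray-boundary bookkeeping. The toy slab-phantom sketch below illustrates the idea only; it is not goMC's transport model, and its geometry and coefficients are invented:

```python
# Woodcock tracking: step with a majorant cross-section, then accept the
# collision as real with probability mu(z)/mu_max, else continue.
import math, random

LAYERS = [(0.0, 5.0, 0.05), (5.0, 7.0, 0.12), (7.0, 20.0, 0.05)]  # z0, z1, mu (1/cm)
MU_MAX = max(mu for _, _, mu in LAYERS)

def mu_at(z):
    for z0, z1, mu in LAYERS:
        if z0 <= z < z1:
            return mu
    return 0.0                                     # outside the phantom

def interaction_depth():
    """Depth of the first real interaction of one photon, or None."""
    z = 0.0
    while z < 20.0:
        z += -math.log(random.random()) / MU_MAX   # majorant free path
        if random.random() < mu_at(z) / MU_MAX:    # real vs. virtual collision
            return z
    return None

depths = [d for d in (interaction_depth() for _ in range(100000)) if d]
print(len(depths) / 100000.0)   # fraction of photons interacting in the slab
```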
Obiero, Walter; Young, Marisa R; Bailey, Robert C
2013-01-01
Male circumcision (MC) reduces the risk of heterosexual HIV acquisition in men by approximately 60%. MC programs for HIV prevention are currently being scaled-up in fourteen countries in sub-Saharan Africa. The current standard surgical technique for MC in many sub-Saharan African countries is the forceps-guided male circumcision (FGMC) method. The PrePex male circumcision (PMC) method could replace FGMC and potentially reduce MC programming costs. We compared the potential costs of introducing the PrePex device into MC programming to the cost of the forceps-guided method. Data were obtained from the Nyanza Reproductive Health Society (NRHS), an MC service delivery organization in Kenya, and from the Kenya Ministry of Health. Analyses are based on 48,265 MC procedures performed in four Districts in western Kenya from 2009 through 2011. Data were entered into the WHO/UNAIDS Decision Makers Program Planning Tool. The tool assesses direct and indirect costs of MC programming. Various sensitivity analyses were performed. Costs were discounted at an annual rate of 6% and are presented in United States Dollars. Not including the costs of the PrePex device or referral costs for men with phimosis/tight foreskin, the costs of one MC surgery were $44.54-$49.02 and $54.52-$55.29 for PMC and FGMC, respectively. The PrePex device is unlikely to result in significant cost-savings in comparison to the forceps-guided method. MC programmers should target other aspects of the male circumcision minimum package for improved cost efficiency.
NASA Astrophysics Data System (ADS)
Korayem, A. H.; Abdi, M.; Korayem, M. H.
2018-06-01
Nanoscale surface topography is one of the most important applications of AFM. Analysis of the vibration behavior of piezoelectric microcantilevers is essential to improve AFM performance. To this end, one appropriate method of simulating the dynamic behavior of a microcantilever (MC) is numerical solution with FEM in 3D modeling using the COMSOL software. The present study aims to simulate different geometries of four-layered AFM piezoelectric MCs in 2D and 3D modeling in a liquid medium using COMSOL. The 3D simulation was done in a spherical container using the FSI domain in COMSOL. In the 2D modeling, the governing equation of motion was derived by applying Hamilton's principle based on Euler-Bernoulli beam theory and discretized with FEM. In this mode, the hydrodynamic force was modeled as a string of spheres. The effect of this force, along with the squeezed-film force, was included in the MC equations. The effect of fluid density and viscosity on the vibrations of MCs immersed in different glycerin solutions was investigated in 2D and 3D modes, and the results were compared with experimental results. The frequencies and time responses of the MC close to the surface were obtained considering tip-sample forces. The surface topography obtained with different MC geometries was compared in the liquid medium, in both tapping and non-contact modes. Various types of surface roughness were considered in the topography for the different MC geometries. Also, the effect of the geometric dimensions on the surface topography was investigated. In a liquid medium, the MC is mounted at an oblique angle to avoid damage due to the squeezed-film force in the vicinity of the MC surface. Finally, the effect of the MC's angle on the surface topography and the time response of the system was investigated.
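A quick analytical cross-check often used alongside such FEM models is the flexural resonance of a uniform Euler-Bernoulli cantilever in vacuum; the dimensions and material constants below are generic, not those of the simulated MCs:

```python
# f_n = (lambda_n^2 / (2*pi*L^2)) * sqrt(E*I / (rho*A)) for a uniform
# rectangular cantilever; fluid loading lowers these values substantially.
import math

E, RHO = 169e9, 2330.0                 # silicon, Pa and kg/m^3 (generic)
L, W, T = 200e-6, 30e-6, 3e-6          # length, width, thickness (m)
I, A = W * T ** 3 / 12.0, W * T        # area moment and cross-section

for lam in (1.8751, 4.6941):           # eigenvalues of the first two modes
    f = lam ** 2 / (2 * math.pi * L ** 2) * math.sqrt(E * I / (RHO * A))
    print(round(f / 1e3, 1), "kHz")
```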
NASA Astrophysics Data System (ADS)
Manganaro, L.; Russo, G.; Bourhaleb, F.; Fausti, F.; Giordanengo, S.; Monaco, V.; Sacchi, R.; Vignati, A.; Cirio, R.; Attili, A.
2018-04-01
One major rationale for the application of heavy ion beams in tumour therapy is their increased relative biological effectiveness (RBE). The complex dependencies of the RBE on dose, biological endpoint, position in the field, etc. require the use of biophysical models in treatment planning and clinical analysis. This study aims to introduce a new software package, named ‘Survival’, to facilitate the radiobiological computations needed in ion therapy. The simulation toolkit was written in C++ and was developed with a modular architecture in order to easily incorporate different radiobiological models. The following models were successfully implemented: the local effect model (LEM, versions I, II and III) and variants of the microdosimetric-kinetic model (MKM). Different numerical evaluation approaches were also implemented: Monte Carlo (MC) numerical methods and a set of faster analytical approximations. Among the possible applications, the toolkit was used to reproduce the RBE versus LET for different ions (protons, He, C, O, Ne) and different cell lines (CHO, HSG). Intercomparisons between different models (LEM and MKM) and computational approaches (MC and fast approximations) were performed. The developed software could represent an important tool for the evaluation of the biological effectiveness of charged particles in ion beam therapy, in particular when coupled with treatment simulations. Its modular architecture facilitates benchmarking and inter-comparison between different models and evaluation approaches. The code is open source (GPL2 license) and available at https://github.com/batuff/Survival.
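The end products of such models are typically alpha/beta parameters of the linear-quadratic (LQ) survival curve, from which survival fractions and RBE values follow directly. A small helper with illustrative parameter values, not outputs of the 'Survival' toolkit:

```python
# LQ survival S = exp(-(alpha*D + beta*D^2)) and the iso-survival RBE:
# the reference dose giving the same survival, divided by the ion dose.
import math

def surviving_fraction(dose, alpha, beta):
    return math.exp(-(alpha * dose + beta * dose ** 2))

def rbe(dose_ion, alpha_i, beta_i, alpha_x, beta_x):
    s = alpha_i * dose_ion + beta_i * dose_ion ** 2      # -ln(S) for the ion
    d_x = (-alpha_x + math.sqrt(alpha_x ** 2 + 4 * beta_x * s)) / (2 * beta_x)
    return d_x / dose_ion

print(rbe(2.0, alpha_i=0.9, beta_i=0.05, alpha_x=0.2, beta_x=0.05))
```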
Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure
NASA Astrophysics Data System (ADS)
Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei
2011-09-01
Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. This work was presented in part at the 2010 Annual Meeting of the American Association of Physicists in Medicine (AAPM), Philadelphia, PA.
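The master/worker pattern described above can be miniaturized with local processes standing in for cloud nodes; the "physics" below is a placeholder pi estimate rather than EGS5 transport, but the seed-per-worker batching and master-side aggregation follow the same scheme:

```python
# Embarrassingly parallel MC: independent seeded batches, aggregated sums.
import random
from multiprocessing import Pool

def run_batch(args):
    seed, n = args
    rng = random.Random(seed)              # independent stream per worker
    return sum(rng.random() ** 2 + rng.random() ** 2 < 1.0 for _ in range(n))

if __name__ == "__main__":
    with Pool(4) as pool:
        hits = sum(pool.map(run_batch, [(seed, 250000) for seed in range(4)]))
    print(4.0 * hits / 1000000)            # aggregated estimate on the master
```

As in the cloud deployment, the wall-clock time scales inversely with the number of workers as long as each batch is large compared with the startup overhead.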
STS 51-L crewmembers during training session in flight deck simulation
NASA Technical Reports Server (NTRS)
1985-01-01
Shuttle mission simulator (SMS) scene of Astronauts Michael J. Smith, Ellison S. Onizuka, Judith A. Resnik, and Francis R. (Dick) Scobee in their launch and entry positions on the flight deck (46207); Left to right, Backup payload specialist Barbara R. Morgan, Teacher in Space Payload specialist Christa McAuliffe, Hughes Payload specialist Gregory B. Jarvis, and Mission Specialist Ronald E. McNair in the middeck portion of the Shuttle Mission Simulator at JSC (46208).
Improved Detection of Vowel Envelope Frequency Following Responses Using Hotelling's T2 Analysis.
Vanheusden, Frederique J; Bell, Steven L; Chesnaye, Michael A; Simpson, David M
2018-05-11
Objective detection of brainstem responses to natural speech stimuli is an important tool for the evaluation of hearing aid fitting, especially in people who may not be able to respond reliably in behavioral tests. Of particular interest is the envelope frequency following response (eFFR), which refers to the EEG response at the stimulus' fundamental frequency (and its harmonics), and here in particular to the response to natural spoken vowel sounds. This article introduces the frequency-domain Hotelling's T2 (HT2) method for eFFR detection. This method was compared, in terms of sensitivity in detecting eFFRs at the fundamental frequency (HT2_F0), to two different single-channel frequency-domain methods (the F test on Fourier analyzer (FA) amplitude spectra [FA-F-Test] and magnitude-squared coherence [MSC]) for envelope following responses to natural vowel stimuli in simulated data and in EEG data from normal-hearing subjects. Sensitivity was assessed based on the number of detections and the time needed to detect a response for a false-positive rate of 5%. The study also explored whether a single-channel, multifrequency HT2 (HT2_3F) and a multichannel, multifrequency HT2 (HT2_MC) could further improve response detection. Four repeated words were presented sequentially at 70 dB SPL LAeq through ER-2 insert earphones. The stimuli consisted of a prolonged vowel in a /hVd/ structure (where V represents different vowel sounds). Each stimulus was presented over 440 sweeps (220 condensation and 220 rarefaction). EEG data were collected from 12 normal-hearing adult participants. After preprocessing and artifact removal, eFFR detection was compared between the algorithms. For the simulation study, simulated EEG signals were generated by adding random noise at multiple signal-to-noise ratios (SNRs; 0 to -60 dB) to the auditory stimuli as well as to a single sinusoid at the fluctuating and flattened fundamental frequency (f0). For each SNR, 1000 sets of 440 simulated epochs were generated. Performance of the algorithms was assessed based on the number of sets for which a response could be detected at each SNR. In simulation studies, HT2_3F significantly outperformed the other algorithms when detecting a vowel stimulus in noise. For simulations containing responses only at a single frequency, HT2_3F performed worse than the other approaches applied in this study, as the additional frequencies included contain no additional information. For recorded EEG data, HT2_MC showed a significantly higher response detection rate compared with MSC and FA-F-Test. Both HT2_MC and HT2_F0 also showed a significant reduction in detection time compared with the FA-F-Test algorithm. Comparisons between different electrode locations confirmed a higher number of detections for electrodes close to Cz compared to more peripheral locations. The HT2 method is more sensitive than FA-F-Test and MSC in detecting responses to complex stimuli because it allows detection at multiple frequencies (HT2_3F) and in multiple EEG channels (HT2_MC) simultaneously. This effect was shown in simulation studies for HT2_3F and in EEG data for the HT2_MC algorithm. The spread in detection time across subjects is also lower for the HT2 algorithm, with a decision on the presence of an eFFR possible within 5 min.
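The essence of the one-sample frequency-domain HT2 test can be sketched as follows: the Fourier coefficient of each sweep at f0 yields a bivariate (real, imaginary) sample whose mean is tested against zero. The data, sampling rate, and signal amplitude below are synthetic:

```python
# One-sample Hotelling's T2 on per-sweep Fourier coefficients at f0.
import numpy as np
from scipy.stats import f

def hotelling_t2_detect(sweeps, fs, f0):
    n, nsamp = sweeps.shape
    k = int(round(f0 * nsamp / fs))                  # FFT bin of f0
    c = np.fft.rfft(sweeps, axis=1)[:, k]
    x = np.column_stack([c.real, c.imag])            # n x 2 sample
    mean = x.mean(axis=0)
    cov = np.cov(x, rowvar=False)
    t2 = n * mean @ np.linalg.solve(cov, mean)
    fstat = (n - 2) / (2 * (n - 1)) * t2             # p = 2 variables
    return 1.0 - f.cdf(fstat, 2, n - 2)              # p-value

rng = np.random.default_rng(0)
sweeps = rng.normal(size=(440, 2048))                # 440 noisy sweeps
t = np.arange(2048) / 1024.0                         # fs = 1024 Hz
sweeps += 0.05 * np.sin(2 * np.pi * 100.0 * t)       # weak 100 Hz "eFFR"
print(hotelling_t2_detect(sweeps, fs=1024.0, f0=100.0))
```

The multifrequency and multichannel variants stack additional Fourier coefficients (harmonics, electrodes) into the sample vector, raising p above 2.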
A Detailed FLUKA-2005 Monte Carlo Simulation for the ATIC Detector
NASA Technical Reports Server (NTRS)
Gunasingha, R. M.; Fazely, A. R.; Adams, J. H.; Ahn, H. S.; Bashindzhagyan, G. L.; Batkov, K. E.; Chang, J.; Christl, M.; Ganel, O.; Guzik, T. G.
2006-01-01
We have performed a detailed Monte Carlo (MC) calculation for the Advanced Thin Ionization Calorimeter (ATIC) detector using the MC code FLUKA-2005, which is capable of simulating particles up to 10 PeV. The ATIC detector has completed two successful balloon flights from McMurdo, Antarctica, lasting a total of more than 35 days. ATIC is designed as a multiple, long-duration balloon flight investigation of the cosmic ray spectra from below 50 GeV to near 100 TeV total energy, using a fully active bismuth germanate (BGO) calorimeter. It is equipped with a large mosaic of silicon detector pixels capable of charge identification, and, as a particle tracking system, three projective layers of x-y scintillator hodoscopes are employed above, in the middle of, and below a 0.75 nuclear-interaction-length graphite target. Our calculations are part of an analysis package addressing both the A- and energy-dependence of different nuclei interacting with the ATIC detector. The MC simulates the responses of different components of the detector, such as the Si matrix, the scintillator hodoscopes, and the BGO calorimeter, to various nuclei. We also show comparisons of the FLUKA-2005 MC calculations with a GEANT calculation and with data for protons, He, and CNO.
Molecular simulation of simple fluids and polymers in nanoconfinement
NASA Astrophysics Data System (ADS)
Rasmussen, Christopher John
Prediction of the phase behavior and transport properties of simple fluids and polymers confined to nanoscale pores is important to a wide range of chemical and biochemical engineering processes. A practical approach to investigating nanoscale systems is molecular simulation, specifically Monte Carlo (MC) methods. One of the most challenging problems is the need to calculate chemical potentials in simulated phases. Through the seminal work of Widom, practitioners have a powerful method for calculating chemical potentials. Yet this method fails for dense and inhomogeneous systems, as well as for complex molecules such as polymers. In this dissertation, the gauge cell MC method, which had previously been successfully applied to confined simple fluids, was employed and extended to investigate nanoscale fluids in several key areas. Firstly, the process of cavitation (the formation and growth of bubbles) during desorption of fluids from nanopores was investigated. The dependence of cavitation pressure on pore size was determined with gauge cell MC calculations of the nucleation barriers, correlated with experimental data. Additional computational studies elucidated the role of surface defects and pore connectivity in the formation of cavitation bubbles. Secondly, the gauge cell method was extended to polymers. The method was verified against literature results and found to be significantly more efficient. It was used to examine adsorption of polymers in nanopores. These results were applied to model the dynamics of translocation, the act of a polymer threading through a small opening, which is implicated in drug packaging and delivery, and in DNA sequencing. Translocation dynamics was studied as diffusion along the free energy landscape. Thirdly, we show how computer simulation of polymer adsorption can shed light on the specifics of polymer chromatography, which is a key tool for the analysis and purification of polymers. The quality of separation depends on the physico-chemical mechanisms of polymer/pore interaction. We considered liquid chromatography at critical conditions and calculated the dependence of the partition coefficient on chain length. Finally, solvent-gradient chromatography was modeled using a statistical model of polymer adsorption. A model for predicting the separation of complex polymers (with functional groups, or copolymers) was developed for practical use in chromatographic separations.
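For contrast with the gauge cell approach, the Widom method it is designed to supersede reduces to mu_ex = -kT ln⟨exp(-dU/kT)⟩ over random ghost insertions. A sketch for a Lennard-Jones configuration in reduced units, with an arbitrary placeholder configuration:

```python
# Widom test-particle insertion for the excess chemical potential.
import numpy as np

def widom_mu_ex(pos, box, n_insert=2000, kT=1.0):
    boltz = []
    for _ in range(n_insert):
        trial = np.random.rand(3) * box             # random ghost position
        d = pos - trial
        d -= box * np.round(d / box)                # minimum-image convention
        r2 = np.sum(d * d, axis=1)
        r6 = (1.0 / r2) ** 3
        du = np.sum(4.0 * (r6 * r6 - r6))           # LJ energy of the ghost
        boltz.append(np.exp(-du / kT))
    return -kT * np.log(np.mean(boltz))

pos = np.random.rand(100, 3) * 8.0                  # placeholder configuration
print(widom_mu_ex(pos, box=8.0))
```

As the abstract notes, the exponential average is dominated by rare insertions in dense phases, which is exactly where this estimator loses efficiency.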
NEURAL NETWORK MODELLING OF CARDIAC DOSE CONVERSION COEFFICIENT FOR ARBITRARY X-RAY SPECTRA.
Kadri, O; Manai, K
2016-12-01
In this article, an approach to computing dose conversion coefficients (DCCs) is described for the computational voxel phantom 'High-Definition Reference Korean-Man' (HDRK-Man) using artificial neural networks (ANNs). For this purpose, the voxel phantom was implemented in the Monte Carlo (MC) transport toolkit GEANT4, and the DCCs for more than 30 tissues and organs, due to a broad parallel beam of monoenergetic photons with energies ranging from 15 to 150 keV in steps of 5 keV, were calculated. To study the influence of patient size on DCC values, the DCC calculation was performed for a representative body-size population using five different sizes covering the range of 80-120% magnification of the original HDRK-Man. The focus of the present study was the computation of the DCC for the human heart. ANN calculations and MC simulation results were compared, and good agreement was observed, showing that ANNs can be used as an efficient tool for modelling DCCs for the computational voxel phantom. The ANN approach appears to be a significant advance over time-consuming MC methods for DCC calculation.
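Conceptually, the ANN step amounts to fitting a smooth map from photon energy to DCC and interpolating between the MC grid points. A miniature version with a synthetic curve standing in for the HDRK-Man data:

```python
# Small MLP regression from energy to a synthetic DCC-like curve.
import numpy as np
from sklearn.neural_network import MLPRegressor

energies = np.arange(15.0, 151.0, 5.0)               # 15-150 keV MC grid
dcc = np.log(energies) / energies                     # stand-in DCC shape

x = (energies / 150.0).reshape(-1, 1)                 # scale input to ~[0, 1]
model = MLPRegressor(hidden_layer_sizes=(32, 32), solver="lbfgs",
                     max_iter=5000, random_state=0).fit(x, dcc)
print(model.predict([[62.5 / 150.0]]))                # off-grid DCC estimate
```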
NASA Astrophysics Data System (ADS)
Ustinov, E. A.
2017-01-01
The paper aims at a comparison of techniques based on the kinetic Monte Carlo (kMC) and the conventional Metropolis Monte Carlo (MC) methods as applied to the hard-sphere (HS) fluid and solid. In the case of the kMC, an alternative representation of the chemical potential is explored [E. A. Ustinov and D. D. Do, J. Colloid Interface Sci. 366, 216 (2012)], which does not require any external procedure like the Widom test particle insertion method. A direct evaluation of the chemical potential of the fluid and solid without thermodynamic integration is achieved by molecular simulation in an elongated box with an external potential imposed on the system in order to reduce the particle density in the vicinity of the box ends. The existence of rarefied zones allows one to determine the chemical potential of the crystalline phase and substantially increases its accuracy for the disordered dense phase in the central zone of the simulation box. This method is applicable to both the Metropolis MC and the kMC, but in the latter case the chemical potential is determined with higher accuracy under the same conditions and number of MC steps. Thermodynamic functions of the disordered fluid and the crystalline face-centered cubic (FCC) phase of the hard-sphere system have been evaluated with the kinetic MC and the standard MC coupled with the Widom procedure over a wide range of density. The melting transition parameters have been determined from the point of intersection of the pressure-chemical potential curves for the disordered HS fluid and the FCC crystal, using the Gibbs-Duhem equation as a constraint. A detailed thermodynamic analysis of the hard-sphere fluid has provided a rigorous verification of the approach, which can be extended to more complex systems.
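In the Metropolis MC referred to above, the hard-sphere acceptance rule degenerates to a pure overlap test: any displacement creating no overlap is accepted, all others rejected. A minimal sketch for a cubic periodic box in reduced units:

```python
# Metropolis trial move for hard spheres of diameter sigma.
import numpy as np

rng = np.random.default_rng(0)

def try_move(pos, i, box, dmax, sigma=1.0):
    trial = (pos[i] + (rng.random(3) - 0.5) * dmax) % box
    d = np.delete(pos, i, axis=0) - trial
    d -= box * np.round(d / box)                # minimum-image convention
    if np.all(np.sum(d * d, axis=1) >= sigma ** 2):
        pos[i] = trial                          # accept: no overlap created
        return True
    return False                                # reject: overlap found
```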
NASA Astrophysics Data System (ADS)
Drukker, Karen; Hammes-Schiffer, Sharon
1997-07-01
This paper presents an analytical derivation of a multiconfigurational self-consistent-field (MC-SCF) solution of the time-independent Schrödinger equation for nuclear motion (i.e. vibrational modes). This variational MC-SCF method is designed for the mixed quantum/classical molecular dynamics simulation of multiple proton transfer reactions, where the transferring protons are treated quantum mechanically while the remaining degrees of freedom are treated classically. This paper presents a proof that the Hellmann-Feynman forces on the classical degrees of freedom are identical to the exact forces (i.e. the Pulay corrections vanish) when this MC-SCF method is used with an appropriate choice of basis functions. This new MC-SCF method is applied to multiple proton transfer in a protonated chain of three hydrogen-bonded water molecules. The ground state and the first three excited state energies and the ground state forces agree well with full configuration interaction calculations. Sample trajectories are obtained using adiabatic molecular dynamics methods, and nonadiabatic effects are found to be insignificant for these sample trajectories. The accuracy of the excited states will enable this MC-SCF method to be used in conjunction with nonadiabatic molecular dynamics methods. This application differs from previous work in that it is a real-time quantum dynamical nonequilibrium simulation of multiple proton transfer in a chain of water molecules.
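The identity behind the vanishing Pulay corrections can be summarized schematically as follows (our notation, not the paper's derivation), for a normalized, variationally optimized wavefunction expanded in basis functions that do not depend on the classical coordinate R:

```latex
\begin{align}
F_R \;=\; -\frac{\partial}{\partial R}\,
      \langle \Psi | \hat{H} | \Psi \rangle
   \;=\; -\,\Big\langle \Psi \Big| \frac{\partial \hat{H}}{\partial R}
      \Big| \Psi \Big\rangle
   \;-\; 2\,\mathrm{Re}\,
      \Big\langle \frac{\partial \Psi}{\partial R}
      \Big| \hat{H} \Big| \Psi \Big\rangle ,
\end{align}
```

where the last (Pulay) term vanishes when Psi is fully variational within an R-independent basis, leaving exactly the Hellmann-Feynman force used for the classical degrees of freedom.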
A Model to Simulate the Radiative Transfer of Fluorescence in a Leaf
NASA Astrophysics Data System (ADS)
Zhao, F.; Ni, Q.
2018-04-01
Light is reflected, transmitted and absorbed by green leaves. Chlorophyll fluorescence (ChlF) is the signal emitted by chlorophyll molecules in the leaf after the absorption of light. ChlF can be used as a direct probe of the functional status of the photosynthetic machinery because of its close relationship with photosynthesis. The scattering, absorbing, and emitting properties of leaves are spectrally dependent, which can be simulated by modeling leaf-level fluorescence. In this paper, we propose a Monte-Carlo (MC) model to simulate the radiative transfer of photons in the leaf. Results show that typical leaf fluorescence spectra can be properly simulated, with two peaks centered at around 685 nm in the red and 740 nm in the far-red regions. By analysing the sensitivity of the input parameters, we found that the MC model captures their influence on the emitted fluorescence well. We also compared results simulated by the MC model with those from the Fluspect model. In general, the two agree well in the far-red region but deviate in the red region.
"First-principles" kinetic Monte Carlo simulations revisited: CO oxidation over RuO2 (110).
Hess, Franziska; Farkas, Attila; Seitsonen, Ari P; Over, Herbert
2012-03-15
First-principles-based kinetic Monte Carlo (kMC) simulations are performed for the CO oxidation on RuO2(110) under steady-state reaction conditions. The simulations include a set of elementary reaction steps with activation energies taken from three different ab initio density functional theory studies. Critical comparison of the simulation results reveals that even small variations in the activation energies lead to distinctly different reaction scenarios on the surface, even to the point where the dominating elementary reaction step is substituted by another one. For a critical assessment of the chosen energy parameters, it is not sufficient to compare kMC simulations only to the experimental turnover frequency (TOF) as a function of the reactant feed ratio. More appropriate benchmarks for kMC simulations are the actual distribution of reactants on the catalyst's surface during the steady-state reaction, as determined by in situ infrared spectroscopy and in situ scanning tunneling microscopy, and the temperature dependence of the TOF in the form of Arrhenius plots.
TH-A-18C-09: Ultra-Fast Monte Carlo Simulation for Cone Beam CT Imaging of Brain Trauma
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sisniega, A; Zbijewski, W; Stayman, J
Purpose: Application of cone-beam CT (CBCT) to low-contrast soft tissue imaging, such as in detection of traumatic brain injury, is challenged by high levels of scatter. A fast, accurate scatter correction method based on Monte Carlo (MC) estimation is developed for application in high-quality CBCT imaging of acute brain injury. Methods: The correction involves MC scatter estimation executed on an NVIDIA GTX 780 GPU (MC-GPU), with baseline simulation speed of ~1e7 photons/sec. MC-GPU is accelerated by a novel, GPU-optimized implementation of variance reduction (VR) techniques (forced detection and photon splitting). The number of simulated tracks and projections is reduced for additional speed-up. Residual noise is removed and the missing scatter projections are estimated via kernel smoothing (KS) in the projection plane and across gantry angles. The method is assessed using CBCT images of a head phantom presenting a realistic simulation of fresh intracranial hemorrhage (100 kVp, 180 mAs, 720 projections, source-detector distance 700 mm, source-axis distance 480 mm). Results: For a fixed run-time of ~1 sec/projection, GPU-optimized VR reduces the noise in MC-GPU scatter estimates by a factor of 4. For scatter correction, MC-GPU with VR is executed with 4-fold angular downsampling and 1e5 photons/projection, yielding a 3.5 minute run-time per scan, and de-noised with optimized KS. Corrected CBCT images demonstrate uniformity improvement of 18 HU and contrast improvement of 26 HU compared to no correction, and a 52% increase in contrast-to-noise ratio in simulated hemorrhage compared to “oracle” constant fraction correction. Conclusion: Acceleration of MC-GPU achieved through GPU-optimized variance reduction and kernel smoothing yields an efficient (<5 min/scan) and accurate scatter correction that does not rely on additional hardware or simplifying assumptions about the scatter distribution. The method is undergoing implementation in a novel CBCT system dedicated to brain trauma imaging at the point of care in sports and military applications. Research grant from Carestream Health. JY is an employee of Carestream Health.
NASA Astrophysics Data System (ADS)
Xu, Jingjiang; Song, Shaozhen; Li, Yuandong; Wang, Ruikang K.
2018-01-01
Optical coherence tomography angiography (OCTA) is increasingly becoming a popular inspection tool for biomedical imaging applications. By exploring the amplitude, phase and complex information available in OCT signals, numerous algorithms have been proposed that contrast functional vessel networks within microcirculatory tissue beds. However, it is not clear which algorithm delivers optimal imaging performance. Here, we investigate systematically how amplitude and phase information have an impact on the OCTA imaging performance, to establish the relationship of amplitude and phase stability with OCT signal-to-noise ratio (SNR), time interval and particle dynamics. With either repeated A-scan or repeated B-scan imaging protocols, the amplitude noise increases with the increase of OCT SNR; however, the phase noise does the opposite, i.e. it increases with the decrease of OCT SNR. Coupled with experimental measurements, we utilize a simple Monte Carlo (MC) model to simulate the performance of amplitude-, phase- and complex-based algorithms for OCTA imaging, the results of which suggest that complex-based algorithms deliver the best performance when the phase noise is < ~40 mrad. We also conduct a series of in vivo vascular imaging in animal models and human retina to verify the findings from the MC model through assessing the OCTA performance metrics of vessel connectivity, image SNR and contrast-to-noise ratio. We show that for all the metrics assessed, the complex-based algorithm delivers better performance than either the amplitude- or phase-based algorithms for both the repeated A-scan and the B-scan imaging protocols, which agrees well with the conclusion drawn from the MC simulations.
Marker Configuration Model-Based Roentgen Fluoroscopic Analysis.
Garling, Eric H; Kaptein, Bart L; Geleijns, Koos; Nelissen, Rob G H H; Valstar, Edward R
2005-04-01
It remains unknown if and how the polyethylene bearing in mobile bearing knees moves during dynamic activities with respect to the tibial base plate. Marker Configuration Model-based Roentgen Fluoroscopic Analysis (MCM-based RFA) uses a marker configuration model of inserted tantalum markers in order to accurately estimate the pose of an implant or bone using single-plane Roentgen images or fluoroscopic images. The goal of this study is to assess the accuracy of MCM-based RFA in a standard fluoroscopic set-up using phantom experiments and to determine the error propagation with computer simulations. The experimental set-up of the phantom study was calibrated using a calibration box equipped with 600 tantalum markers, which corrected for image distortion and determined the focus position. In the computer simulation study, the influence of image distortion, MC-model accuracy, focus position, the relative distance between MC-models, and MC-model configuration on the accuracy of MCM-based RFA was assessed. The phantom study established that the in-plane accuracy of MCM-based RFA is 0.1 mm and the out-of-plane accuracy is 0.9 mm. The rotational accuracy is 0.1 degrees. A ninth-order polynomial model was used to correct for image distortion. MCM-based RFA was estimated to have, in a worst-case scenario, an in vivo translational accuracy of 0.14 mm (x-axis), 0.17 mm (y-axis) and 1.9 mm (z-axis), and a rotational accuracy of 0.3 degrees. When using fluoroscopy to study kinematics, image distortion and the accuracy of models are important factors which influence the accuracy of the measurements. MCM-based RFA has the potential to be an accurate, clinically useful tool for studying kinematics after total joint replacement using standard equipment.
van den Oever, Huub L A; van Dam, Mirja; van 't Riet, Esther; Jansman, Frank G A
2017-02-01
Many patients with intentional drug overdose (IDO) are admitted to a medium care (MC) or intensive care (IC) unit without ever requiring MC/IC-related interventions. The objective of this study was to develop a decision tool, using parameters readily available in the emergency room (ER) for patients with an IDO, to identify patients requiring admission to a monitoring unit. Retrospective cohort study among cases of IDO with drugs having potentially acute effects on neurological, circulatory or ventilatory function, admitted to the MC/IC unit between 2007 and 2013. A decision tool was developed using 6 criteria, representing intubation, breathing, oxygenation, cardiac conduction, blood pressure, and consciousness. Cases were labeled as 'high acuity' if one or more criteria were present. Among 255 cases of IDO that met the inclusion criteria, 197 were identified as 'high acuity'. Only 70 of the 255 cases underwent one or more MC/IC-related interventions, of which 67 were identified as 'high acuity' by the decision tool (sensitivity 95.7%). In a population of patients with intentional drug overdose with agents having potentially acute effects on vital functions, 95.7% of MC/IC interventions could be predicted by clinical assessment, supplemented with electrocardiogram and blood gas analysis, in the ER.
Contrast of Backscattered Electron SEM Images of Nanoparticles on Substrates with Complex Structure.
Kowoll, Thomas; Müller, Erich; Fritsch-Decker, Susanne; Hettler, Simon; Störmer, Heike; Weiss, Carsten; Gerthsen, Dagmar
2017-01-01
This study is concerned with backscattered electron scanning electron microscopy (BSE SEM) contrast of complex nanoscaled samples which consist of SiO2 nanoparticles (NPs) deposited on indium-tin-oxide covered bulk SiO2 and glassy carbon substrates. BSE SEM contrast of NPs is studied as function of the primary electron energy and working distance. Contrast inversions are observed which prevent intuitive interpretation of NP contrast in terms of material contrast. Experimental data is quantitatively compared with Monte-Carlo- (MC-) simulations. Quantitative agreement between experimental data and MC-simulations is obtained if the transmission characteristics of the annular semiconductor detector are taken into account. MC-simulations facilitate the understanding of NP contrast inversions and are helpful to derive conditions for optimum material and topography contrast.
NASA Astrophysics Data System (ADS)
Shin, Wook-Geun; Testa, Mauro; Kim, Hak Soo; Jeong, Jong Hwi; Byeong Lee, Se; Kim, Yeon-Joo; Min, Chul Hee
2017-10-01
For the independent validation of treatment plans, we developed a fully automated Monte Carlo (MC)-based patient dose calculation system with the tool for particle simulation (TOPAS) and the proton therapy machine installed at the National Cancer Center in Korea, to enable routine and automatic dose recalculation for each patient. The proton beam nozzle was modeled with TOPAS to simulate the therapeutic beam, and MC commissioning was performed by comparing percent depth dose with measurements. Beam set-up based on the prescribed beam range and modulation width was automated by modifying the vendor-specific method. The CT phantom was modeled from the DICOM CT files with the TOPAS built-in function, and an in-house-developed C++ code directly imports the CT files for positioning the CT phantom, the RT-plan file for simulating the treatment plan, and the RT-structure file for applying the Hounsfield unit (HU) assignment. The developed system was validated by comparing the dose distributions with those calculated by the treatment planning system (TPS) for a lung phantom and two patient cases (abdomen and internal mammary node). The results of the beam commissioning agreed to within 0.8 mm for the B8 option in both the beam range and the modulation width of the spread-out Bragg peaks. The beam set-up technique can predict the range and modulation width with an accuracy of 0.06% and 0.51%, respectively, with respect to the prescribed range and modulation at arbitrary points of the B5 option (ranges of 128.3, 132.0, and 141.2 mm). The dose distributions showed higher than 99% passing rates for the 3D gamma index (3 mm distance to agreement and 3% dose difference) between the MC simulations and the clinical TPS in the target volume. However, in the normal tissues, less favorable agreement was obtained for the treatment plans with the lung phantom and the internal mammary node case. The discrepancies might come from limitations of the clinical TPS, whose dose calculation algorithm handles scattering inaccurately in the range compensator and in inhomogeneous material. Moreover, the steep slope of the compensator, the conversion of HU values to the human phantom, and the dose calculation algorithm for the HU assignment could also be reasons for the discrepancies. The current system could be used for the independent dose validation of treatment plans involving high inhomogeneity, steep compensators, and risk-sensitive sites such as lung and head & neck cases. According to the treatment policy, the dose discrepancies predicted with MC could be used for the acceptance decision on the original treatment plan.
NASA Astrophysics Data System (ADS)
Matsui, T.; Dolan, B.; Tao, W. K.; Rutledge, S. A.; Iguchi, T.; Barnum, J. I.; Lang, S. E.
2017-12-01
This study presents polarimetric radar characteristics of intense convective cores derived from observations as well as from a polarimetric-radar simulator applied to cloud-resolving model (CRM) simulations, for the Midlatitude Continental Convective Clouds Experiment (MC3E) May 23 case over Oklahoma and a Tropical Warm Pool-International Cloud Experiment (TWP-ICE) Jan 23 case over Darwin, Australia, to highlight the contrast between continental and maritime convection. The POLArimetric Radar Retrieval and Instrument Simulator (POLARRIS) is a state-of-the-art T-matrix/Mueller-matrix-based polarimetric radar simulator that can generate synthetic polarimetric radar signals (reflectivity, differential reflectivity, specific differential phase, co-polar correlation) as well as synthetic radar retrievals (precipitation, hydrometeor type, updraft velocity) through the consistent treatment of cloud microphysics and dynamics from CRMs. The Weather Research and Forecasting (WRF) model is configured to simulate continental and maritime severe storms over the MC3E and TWP-ICE domains with the Goddard bulk 4ICE single-moment microphysics and the HUCM spectral-bin microphysics. Various statistical diagrams of polarimetric radar signals, hydrometeor types, updraft velocity, and precipitation intensity are investigated for convective and stratiform precipitation regimes and directly compared between the MC3E and TWP-ICE cases. The results show that MC3E convection is characterized by very strong reflectivity (up to 60 dBZ), slightly negative differential reflectivity (-0.8 to 0 dB) and near-zero specific differential phase above the freezing level. On the other hand, TWP-ICE convection shows strong reflectivity (up to 50 dBZ), slightly positive differential reflectivity (0 to 1.0 dB) and differential phase (0 to 0.8 dB/km). The hydrometeor identification (HID) algorithm applied to the observations and simulations detects a hail-dominated convective core in MC3E and a graupel-dominated convective core in TWP-ICE. This land-ocean contrast agrees with previous studies using radar and radiometer signals from the TRMM satellite climatology, associated with warm-cloud depths and the vertical structure of buoyancy.
Monte Carlo simulations to replace film dosimetry in IMRT verification.
Goetzfried, Thomas; Rickhey, Mark; Treutwein, Marius; Koelbl, Oliver; Bogner, Ludwig
2011-01-01
Patient-specific verification of intensity-modulated radiation therapy (IMRT) plans can be done by dosimetric measurements or by independent dose or monitor unit calculations. The aim of this study was the clinical evaluation of IMRT verification based on a fast Monte Carlo (MC) program, with regard to possible benefits compared to commonly used film dosimetry. Twenty-five head-and-neck IMRT plans were recalculated by a pencil-beam-based treatment planning system (TPS) using an appropriate quality assurance (QA) phantom. All plans were verified both by film and diode dosimetry and compared to MC simulations. The irradiated films, the results of the diode measurements, and the computed dose distributions were evaluated, and the data were compared on the basis of gamma maps and dose-difference histograms. Average deviations in the high-dose region between diode measurements and point dose calculations performed with the TPS and the MC program were 0.7 ± 2.7% and 1.2 ± 3.1%, respectively. For film measurements, the mean gamma value with 3% dose difference and 3 mm distance-to-agreement was 0.74 ± 0.28 (TPS as reference), with dose deviations up to 10%. The corresponding value was significantly reduced to 0.34 ± 0.09 for the MC dose calculation. The total time needed for the two verification procedures is comparable; however, MC simulation is far less labor intensive. The study showed that independent dose calculation verification of IMRT plans with a fast MC program has the potential to supersede film dosimetry in the near future. The linac-specific QA part will then necessarily become more important. In combination with MC simulations, and owing to their simple set-up, point-dose measurements for dosimetric plausibility checks are recommended, at least during the IMRT introduction phase. Copyright © 2010. Published by Elsevier GmbH.
NASA Astrophysics Data System (ADS)
Sud, Y. C.; Walker, G. K.
1999-09-01
A prognostic cloud scheme named McRAS (Microphysics of Clouds with the Relaxed Arakawa-Schubert Scheme) has been designed and developed with the aim of improving moist processes, cloud microphysics, and cloud-radiation interactions in GCMs. McRAS distinguishes three types of clouds: convective, stratiform, and boundary layer. Convective clouds transform and merge into stratiform clouds on an hourly timescale, while boundary-layer clouds merge into the stratiform clouds instantly. Cloud condensate converts into precipitation following the autoconversion equations of Sundqvist, which contain a parametric adaptation for the Bergeron-Findeisen process of ice-crystal growth and for the collection of cloud condensate by precipitation. All clouds convect, advect, and diffuse both horizontally and vertically, with fully interactive cloud microphysics throughout the cloud life cycle, while the optical properties of clouds are derived from the statistical distribution of hydrometeors and an idealized cloud geometry. An evaluation of McRAS in a single-column model (SCM) with the Global Atmospheric Research Program Atlantic Tropical Experiment (GATE) Phase III data has shown that, together with the rest of the model physics, McRAS can simulate the observed temperature, humidity, and precipitation without discernible systematic errors. The time history and time-mean in-cloud water and ice distributions, fractional cloudiness, cloud optical thickness, origin of precipitation in the convective anvils and towers, and the convective updraft and downdraft velocities and mass fluxes all behave realistically, although some of these diagnostics are not verifiable with the data on hand. The SCM sensitivity tests show that (i) without clouds the simulated GATE-SCM atmosphere is cooler than observed; (ii) the model's convective scheme, RAS, is an important subparameterization of McRAS; and (iii) advection of cloud water substance helps simulate a better cloud distribution and cloud-radiation interaction. An evaluation of the performance of McRAS in the Goddard Earth Observing System II GCM is given in a companion paper (Part II).
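For readers unfamiliar with the Sundqvist form, a minimal sketch of this type of autoconversion rate follows; the coefficients and the Bergeron-Findeisen enhancement factor are illustrative placeholders, not McRAS's tuned values:

```python
import numpy as np

def sundqvist_autoconversion(m, c0=1e-4, m_crit=5e-4, bf_enhance=1.0):
    """Sundqvist-type autoconversion of cloud condensate to precipitation.

    m          : cloud condensate mixing ratio (kg/kg)
    c0         : inverse conversion timescale (1/s), illustrative value
    m_crit     : critical mixing ratio controlling the onset (kg/kg)
    bf_enhance : factor > 1 below freezing, mimicking the enhanced
                 conversion of the Bergeron-Findeisen process
    Returns the conversion rate dm/dt (kg/kg/s).
    """
    c0_eff = c0 * bf_enhance
    m_crit_eff = m_crit / bf_enhance
    return c0_eff * m * (1.0 - np.exp(-(m / m_crit_eff) ** 2))

# The rate rises steeply once m exceeds the (effective) critical value:
for m in (1e-4, 5e-4, 2e-3):
    print(m, sundqvist_autoconversion(m, bf_enhance=2.0))
```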
Orio, Patricio; Soudry, Daniel
2012-01-01
Background: The phenomena that emerge from the interaction of the stochastic opening and closing of ion channels (channel noise) with nonlinear neural dynamics are essential to our understanding of the operation of the nervous system. The effects that channel noise can have on neural dynamics are generally studied using numerical simulations of stochastic models. Algorithms based on discrete Markov chains (MC) seem to be the most reliable, but even optimized algorithms come with a non-negligible computational cost. Diffusion approximation (DA) methods use stochastic differential equations (SDEs) to approximate the behavior of a number of MCs, considerably speeding up simulation times. However, model comparisons have suggested that DA methods do not lead to the same results as MC modeling in terms of channel-noise statistics and effects on excitability. Recently, it was shown that the difference arose because the MCs were modeled with coupled gating particles, while the DA was modeled with uncoupled gating particles. Implementations of DA with coupled particles, in the context of a specific kinetic scheme, yielded results similar to MC. However, it remained unclear how to generalize these implementations to different kinetic schemes, or whether they were faster than MC algorithms. Additionally, a steady-state approximation was used for the stochastic terms, which, as we show here, can introduce significant inaccuracies. Main contributions: We derived the SDE explicitly for any given ion-channel kinetic scheme. The resulting generic equations were surprisingly simple and interpretable, allowing an easy, transparent and efficient DA implementation that avoids unnecessary approximations. The algorithm was tested in a voltage-clamp simulation and in two different current-clamp simulations, yielding the same results as MC modeling. The simulation efficiency of this DA method was considerably superior to MC methods, except when short time steps or low channel numbers were used. PMID:22629320
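As background for the DA idea, here is a simplified classical scheme for a two-state channel with uncoupled gating (a Fox-Lu-style Langevin sketch, not the generalized coupled-particle equations derived in the paper; all parameter values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

def da_two_state(alpha, beta, n_channels, x0, dt, n_steps):
    """Euler-Maruyama diffusion approximation of a two-state ion channel.

    Open fraction x obeys:
      dx = (alpha*(1-x) - beta*x) dt + sqrt((alpha*(1-x) + beta*x)/N) dW,
    i.e. the Langevin approximation of the underlying Markov chain.
    """
    x = np.empty(n_steps + 1)
    x[0] = x0
    for i in range(n_steps):
        drift = alpha * (1 - x[i]) - beta * x[i]
        diff = np.sqrt(max(alpha * (1 - x[i]) + beta * x[i], 0.0) / n_channels)
        x[i + 1] = x[i] + drift * dt + diff * np.sqrt(dt) * rng.standard_normal()
        x[i + 1] = min(max(x[i + 1], 0.0), 1.0)   # keep the fraction in [0, 1]
    return x

traj = da_two_state(alpha=5.0, beta=2.0, n_channels=1000, x0=0.5,
                    dt=1e-3, n_steps=5000)
print(traj.mean())  # fluctuates around alpha/(alpha+beta) ~ 0.714
```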
SU-E-T-29: A Web Application for GPU-Based Monte Carlo IMRT/VMAT QA with Delivered Dose Verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Folkerts, M; University of California, San Diego, La Jolla, CA; Graves, Y
Purpose: To enable an existing web application for GPU-based Monte Carlo (MC) 3D dosimetry quality assurance (QA) to compute "delivered dose" from linac logfile data. Methods: We added significant features to an IMRT/VMAT QA web application based on existing technologies (HTML5, Python, and Django). The tool interfaces with Python, C-code libraries, and command-line-based GPU applications to perform MC-based IMRT/VMAT QA. The web app automates many complicated aspects of interfacing clinical DICOM and logfile data with cutting-edge GPU software to run an MC dose calculation. The resultant web app is powerful, easy to use, and able to re-compute both plan dose (from DICOM data) and delivered dose (from logfile data). Both dynalog and trajectory log file formats are supported. Users upload zipped DICOM RP, CT, and RD data and set the expected statistical uncertainty for the MC dose calculation. A 3D gamma index map, 3D dose distribution, gamma histogram, dosimetric statistics, and DVH curves are displayed to the user. Additionally, the user may upload delivery logfile data from the linac to compute a "delivered dose" calculation and the corresponding gamma tests. A comprehensive PDF QA report summarizing the results can also be downloaded. Results: We successfully improved a web app for a GPU-based QA tool that performs logfile parsing, fluence map generation, CT image processing, GPU-based MC dose calculation, gamma index calculation, and DVH calculation. The result is an IMRT and VMAT QA tool that conducts an independent dose calculation for a given treatment plan and delivery log file. The system takes both DICOM data and logfile data to compute plan dose and delivered dose, respectively. Conclusion: We successfully improved a GPU-based MC QA tool to allow for logfile dose calculation. The high efficiency and accessibility will greatly facilitate IMRT and VMAT QA.
The McGill simulator for endoscopic sinus surgery (MSESS): a validation study.
Varshney, Rickul; Frenkiel, Saul; Nguyen, Lily H P; Young, Meredith; Del Maestro, Rolando; Zeitouni, Anthony; Saad, Elias; Funnell, W Robert J; Tewfik, Marc A
2014-10-24
Endoscopic sinus surgery (ESS) is a technically challenging procedure, associated with a significant risk of complications. Virtual reality simulation has demonstrated benefit in many disciplines as an important educational tool for surgical training. Within the field of rhinology, there is a lack of ESS simulators with appropriate validity evidence supporting their integration into residency education. The objectives of this study are to evaluate the acceptability, perceived realism and benefit of the McGill Simulator for Endoscopic Sinus Surgery (MSESS) among medical students, otolaryngology residents and faculty, and to present evidence supporting its ability to differentiate users based on their level of training through performance metrics. Ten medical students, 10 junior residents, 10 senior residents and 3 expert sinus surgeons performed anterior ethmoidectomies, posterior ethmoidectomies and wide sphenoidotomies on the MSESS. Performance metrics related to quality (e.g. percentage of tissue removed), efficiency (e.g. time, path length, bimanual dexterity) and safety (e.g. contact with no-go zones, maximum applied force) were calculated. All users completed a post-simulation questionnaire on the realism, usefulness and perceived benefits of training on the MSESS. The MSESS was found to be realistic and useful for training surgical skills, with scores of 7.97 ± 0.29 and 8.57 ± 0.69, respectively, on a 10-point rating scale. Most students and residents (29/30) believed that it should be incorporated into their curriculum. There were significant differences between novice surgeons (10 medical students and 10 junior residents) and senior surgeons (10 senior residents and 3 sinus surgeons) in performance metrics related to quality (p < 0.05), efficiency (p < 0.01) and safety (p < 0.05). The MSESS demonstrated initial evidence supporting its use for residency education. This simulator may be a potential resource to help fill the void in endoscopic sinus surgery training.
NASA Astrophysics Data System (ADS)
Liu, Tianyu; Du, Xining; Ji, Wei; Xu, X. George; Brown, Forrest B.
2014-06-01
For nuclear reactor analysis, such as neutron eigenvalue calculations, time-consuming Monte Carlo (MC) simulations can be accelerated by using graphics processing units (GPUs). However, traditional MC methods are often history-based, and their performance on GPUs is affected significantly by the thread-divergence problem. In this paper we describe the development of a newly designed event-based vectorized MC algorithm for solving the neutron eigenvalue problem. The code was implemented using NVIDIA's Compute Unified Device Architecture (CUDA) and tested on an NVIDIA Tesla M2090 GPU card. We found that although the vectorized MC algorithm greatly reduces the occurrence of thread divergence, thus enhancing warp execution efficiency, the overall simulation is roughly ten times slower than the history-based MC code on GPUs. Profiling results suggest that the low speed is probably due to memory access latency caused by the large number of global memory transactions. Possible solutions to improve the code efficiency are discussed.
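To illustrate the history-based/event-based distinction, here is a toy vectorized sketch in NumPy (not the authors' CUDA implementation; the physics and all parameters are placeholders): instead of following one particle's full history per thread, an event-based code keeps a bank of particles and processes all particles awaiting the same event type together, removing per-particle branching at the cost of extra bookkeeping.

```python
import numpy as np

rng = np.random.default_rng(1)

def event_based_attenuation(n, p_absorb=0.3, e_loss=0.8, e_cut=0.01):
    """Toy event-based MC: particles lose energy in scattering events and
    are removed on absorption. Each sweep applies ONE event type to the
    whole bank at once (vectorized), instead of branching per particle."""
    energy = np.ones(n)
    alive = np.ones(n, dtype=bool)
    collisions = 0
    while alive.any():
        idx = np.flatnonzero(alive)
        # Event selection for every live particle in one vectorized draw
        absorbed = rng.random(idx.size) < p_absorb
        # Process the 'absorption' event group ...
        alive[idx[absorbed]] = False
        # ... then the 'scattering' event group, again with no branching
        scat = idx[~absorbed]
        energy[scat] *= e_loss
        below = scat[energy[scat] < e_cut]
        alive[below] = False
        collisions += idx.size
    return collisions

print(event_based_attenuation(100000))
```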
Use NU-WRF and GCE Model to Simulate the Precipitation Processes During MC3E Campaign
NASA Technical Reports Server (NTRS)
Tao, Wei-Kuo; Wu, Di; Matsui, Toshi; Li, Xiaowen; Zeng, Xiping; Peter-Lidard, Christa; Hou, Arthur
2012-01-01
One of the major CRM approaches to studying precipitation processes is sometimes referred to as "cloud ensemble modeling". This approach allows many clouds of various sizes and stages of their life cycles to be present at any given simulation time. Large-scale effects derived from observations are imposed on the CRMs as forcing, and cyclic lateral boundaries are used. The advantage of this approach is that model results in terms of rainfall, Q1 and Q2 are usually in good agreement with observations. In addition, the model results provide cloud statistics that represent different types of clouds/cloud systems over their life cycles. The large-scale forcing derived from MC3E will be used to drive GCE model simulations, and the model-simulated results will be compared with observations from MC3E. These GCE model-simulated datasets are especially valuable for LH algorithm developers. In addition, a regional-scale model with very high resolution, the NASA Unified WRF (NU-WRF), was also used for real-time forecasting during the MC3E campaign, to ensure that precipitation and other meteorological forecasts were available to the flight planning team and to interpret the forecast results in terms of proposed flight scenarios. Post-mission simulations are conducted to examine the sensitivity of cloud and precipitation processes and rainfall to initial and lateral boundary conditions. We will compare model results in terms of precipitation and surface rainfall from the GCE model and NU-WRF.
Mission Options Scoping Tool for Mars Orbiters: Mass Cost Calculator (MC2)
NASA Technical Reports Server (NTRS)
Sturm, Eric J., II; Deutsch, Marie-Jose; Harmon, Corey; Nakagawa, Roy; Kinsey, Robert; Lopez, Nino; Kudrle, Paul; Evans, Alex
2007-01-01
Prior to developing the details of an advanced mission study, the mission architecture trade space is typically explored to assess the scope of feasible options. This paper describes the main features of an Excel-based tool, called the Mass-Cost-Calculator (MC2), which is used to perform rapid, high-level mass and cost options analyses of Mars orbiter missions. MC2 combines databases, analytical solutions, and parametric relationships to enable quick evaluation of new mission concepts and comparison of multiple architecture options. The tool's outputs provide program management and planning teams with answers to "what if" queries, as well as an understanding of the driving mission elements, during the pre-project planning phase. These outputs have been validated against outputs generated by the Advanced Projects Design Team (Team X) at NASA's Jet Propulsion Laboratory (JPL). The architecture of the tool allows for future expansion to orbiters beyond Mars and to non-orbiter missions, such as those involving fly-by spacecraft, probes, landers, rovers, or other mission elements.
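Parametric tools of this kind typically rest on cost estimating relationships (CERs) that scale cost with mass. A purely illustrative sketch of the general idea (the power-law form and coefficients are placeholders, not MC2's actual relationships):

```python
def parametric_cost(dry_mass_kg, a=2.5, b=0.7):
    """Generic power-law cost estimating relationship (CER):
    cost [$M] = a * (dry mass [kg]) ** b.
    Coefficients here are illustrative, not calibrated values."""
    return a * dry_mass_kg ** b

# Quick 'what if': how does cost respond to a 20% mass growth?
base = parametric_cost(500.0)
grown = parametric_cost(600.0)
print(f"base ${base:.0f}M, +20% mass -> ${grown:.0f}M ({grown/base - 1:+.1%})")
```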
Astronaut William S. McArthur in training for contingency EVA in WETF
NASA Technical Reports Server (NTRS)
1993-01-01
Astronaut William S. McArthur, mission specialist, participates in training for contingency extravehicular activity (EVA) for the STS-58 mission. He is wearing the extravehicular mobility unit (EMU) minus his helmet. For simulation purposes, McArthur was about to be submerged to a point of neutral buoyancy in the JSC Weightless Environment Training Facility (WETF).
Poster — Thur Eve — 47: Monte Carlo Simulation of Scp, Sc and Sp
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhan, Lixin; Jiang, Runqing; Osei, Ernest K.
The in-water output ratio (Scp), in-air output ratio (Sc), and phantom scattering factor (Sp) are important parameters for radiotherapy dose calculation. Experimentally, Scp is obtained by measuring the dose-rate ratio in a water phantom, and Sc the water kerma-rate ratio in air; there is no method that allows direct measurement of Sp. The Monte Carlo (MC) method has been used in the literature to simulate Scp and Sc in analogy to the experimental setup, but to the best of our knowledge no direct MC simulation of Sp is available yet. We propose in this report a method for performing direct MC simulation of Sp. Starting from the definition, we derived that the Sp of a clinical photon beam can be approximated by the ratio of the dose rates contributed by the primary beam for a given field size and for the reference field size. Since only the primary beam is used, any linac head scattering should be excluded from the simulation, which can be realized by using the incident electron as the scoring parameter for MU. We performed MC simulations of Scp, Sc and Sp. Scp matches the golden beam data well, and Sp obtained by the proposed method agrees well with that obtained using the traditional relation Sp = Scp/Sc. Since the smaller the field size, the more the primary beam dominates, our Sp simulation method is accurate for small fields; by analyzing the calculated data, we found that it can also be used without problems for large fields, where the difference it introduces is clinically insignificant.
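The traditional relation used above for comparison takes only a few lines; a minimal sketch with made-up example values (Scp and Sc are normalized to 1.0 at the 10 x 10 cm² reference field):

```python
import numpy as np

field_cm = np.array([4, 6, 10, 20, 30])              # square field side (cm)
scp = np.array([0.932, 0.962, 1.000, 1.046, 1.062])  # illustrative values
sc  = np.array([0.962, 0.978, 1.000, 1.022, 1.030])  # illustrative values

sp = scp / sc  # phantom scatter factor via the traditional Sp = Scp / Sc
for f, s in zip(field_cm, sp):
    print(f"{f:2d} x {f:2d} cm^2: Sp = {s:.3f}")
```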
NASA Astrophysics Data System (ADS)
Zhang, Guannan; Del-Castillo-Negrete, Diego
2017-10-01
Kinetic descriptions of runaway electrons (RE) are usually based on the bounce-averaged Fokker-Planck model that determines the PDFs of RE. Despite the simplifications involved, the Fokker-Planck equation can rarely be solved analytically, and direct numerical approaches (e.g., continuum and particle-based Monte Carlo (MC)) can be time consuming, especially in the computation of asymptotic-type observables, including the runaway probability, the slowing-down and runaway mean times, and the energy-limit probability. Here we present a novel backward MC approach to these problems based on backward stochastic differential equations (BSDEs). The BSDE model can simultaneously describe the PDF of RE and the runaway probabilities by means of the well-known Feynman-Kac theory. The key ingredient of the backward MC algorithm is to place all the particles in a runaway state and simulate them backward from the terminal time to the initial time. As such, our approach can provide much faster convergence than brute-force MC methods, significantly reducing the number of particles required to achieve a prescribed accuracy. Moreover, our algorithm can be parallelized as easily as a direct MC code, which paves the way for conducting large-scale RE simulations. This work is supported by DOE FES and ASCR under Contract Numbers ERKJ320 and ERAT377.
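For context on the Feynman-Kac link, the runaway probability of a diffusion, u(t, x) = E[1{X_T in runaway region} | X_t = x], can be estimated by brute-force forward sampling. The toy sketch below is that baseline estimator, not the paper's backward BSDE algorithm (which computes u for all initial states at once); the drift and threshold are invented:

```python
import numpy as np

rng = np.random.default_rng(2)

def runaway_probability(x0, t, T, drift, sigma, x_run, n_paths=20000, dt=1e-2):
    """Forward MC estimator of u(t, x0) = P(X_T >= x_run | X_t = x0)
    for dX = drift(X) dt + sigma dW, integrated with Euler-Maruyama."""
    n_steps = int(round((T - t) / dt))
    x = np.full(n_paths, x0, dtype=float)
    for _ in range(n_steps):
        x += drift(x) * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
    return np.mean(x >= x_run)

# Ornstein-Uhlenbeck-like toy dynamics with an escape threshold
p = runaway_probability(x0=0.5, t=0.0, T=2.0,
                        drift=lambda x: -0.5 * x, sigma=1.0, x_run=2.0)
print(f"estimated runaway probability: {p:.3f}")
```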
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bootsma, G. J., E-mail: Gregory.Bootsma@rmp.uhn.on.ca; Verhaegen, F.; Medical Physics Unit, Department of Oncology, McGill University, Montreal, Quebec H3G 1A4
2015-01-15
Purpose: X-ray scatter is a significant impediment to image-quality improvements in cone-beam CT (CBCT). The authors present and demonstrate a novel scatter correction algorithm using a scatter estimation method that simultaneously combines multiple Monte Carlo (MC) CBCT simulations through a concurrently evaluated fitting function, referred to as concurrent MC fitting (CMCF). Methods: The CMCF method uses concurrently run MC CBCT scatter projection simulations at a subset of the projection angles used in the projection set, P, to be corrected. The scattered photons reaching the detector in each MC simulation are simultaneously aggregated by an algorithm which computes the scatter detector response, S_MC. S_MC is fit to a function, S_F, and if the fit of S_F is within a specified goodness of fit (GOF), the simulations are terminated. The fit, S_F, is then used to interpolate the scatter distribution over all pixel locations for every projection angle in the set P. The CMCF algorithm was tested using a frequency-limited sum of sines and cosines as the fitting function on both simulated and measured data. The simulated data consisted of an anthropomorphic head and a pelvis phantom created from CT data, simulated with and without the use of a compensator. The measured data were pelvis scans of a phantom and a patient taken on an Elekta Synergy platform. The simulated data were used to evaluate various GOF metrics and to determine a suitable fitness value; they were also used to quantitatively evaluate the image-quality improvements provided by the CMCF method. A qualitative analysis was performed on the measured data by comparing the CMCF scatter-corrected reconstruction to the original uncorrected reconstruction, to a reconstruction corrected with a constant scatter estimate, and to a reconstruction created using a set of projections taken with a small cone angle. Results: Pearson's correlation, r, proved to be a suitable GOF metric, correlating strongly with the actual error of the scatter fit, S_F. Fitting the scatter distribution to a limited sum of sine and cosine functions using a low-pass-filtered fast Fourier transform provided a computationally efficient and accurate fit. The CMCF algorithm reduces the number of photon histories required by over four orders of magnitude. The simulated experiments showed that using a compensator reduced the computational time by a factor of between 1.5 and 1.75. The scatter estimates for the simulated and measured data were computed in 35-93 s and 114-122 s, respectively, using 16 Intel Xeon cores (3.0 GHz). The CMCF scatter correction improved the contrast-to-noise ratio by 10%-50% and reduced the reconstruction error to under 3% for the simulated phantoms. Conclusions: The novel CMCF algorithm significantly reduces the computation time required to estimate the scatter distribution by reducing the statistical noise in the MC scatter estimate and limiting the number of projection angles that must be simulated. Using the scatter estimate provided by the CMCF algorithm to correct both simulated and real projection data showed improved reconstruction image quality.
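The frequency-limited fit described in the Results can be sketched with a low-pass filtered FFT. A toy 1D version follows (the real algorithm fits 2D detector responses and applies its own GOF threshold; the signal and noise model here are invented):

```python
import numpy as np

rng = np.random.default_rng(3)

def lowpass_fit(noisy, n_modes=4):
    """Fit a noisy signal with a frequency-limited sum of sines/cosines by
    zeroing all but the lowest Fourier modes (low-pass filtered FFT)."""
    spec = np.fft.rfft(noisy)
    spec[n_modes + 1:] = 0.0              # keep DC plus n_modes harmonics
    return np.fft.irfft(spec, n=noisy.size)

# Smooth 'scatter profile' plus MC-like statistical noise
x = np.linspace(0, 2 * np.pi, 256)
truth = 1.0 + 0.5 * np.sin(x) + 0.2 * np.cos(2 * x)
noisy = truth + rng.normal(0, 0.15, x.size)

fit = lowpass_fit(noisy)
r = np.corrcoef(fit, truth)[0, 1]         # Pearson's r as a GOF check
print(f"Pearson r vs truth: {r:.4f}")
```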
Surface vacancies concentration of CeO2(1 1 1) using kinetic Monte Carlo simulations
NASA Astrophysics Data System (ADS)
Mattiello, S.; Kolling, S.; Heiliger, C.
2016-01-01
Kinetic Monte Carlo (kMC) simulations are useful tools for investigating the dynamics of surface properties. With this method we investigate the oxygen vacancy concentration of CeO2(1 1 1) under ultra-high-vacuum (UHV) conditions. To obtain first-principles input for the simulations, i.e., the energy barriers of the microscopic processes, we use density functional theory (DFT) results from the literature. We investigate the ad- and desorption of oxygen on ceria as well as the diffusion of oxygen vacancies to and from the subsurface. In particular, we focus on the vacancy surface concentration and on the ratio of the number of subsurface vacancies to the number of vacancies at the surface. The comparison of our dynamically obtained results to the experimental findings reveals several issues. In conclusion, we find a substantial incompatibility between the experimental results and the dynamical calculations based on DFT inputs.
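A rejection-free (Gillespie/BKL-type) kMC loop of the kind used for such surface problems can be sketched in a few lines; the two-process model and barrier values below are illustrative placeholders, not the DFT barriers used in the paper:

```python
import numpy as np

rng = np.random.default_rng(4)
KB = 8.617e-5  # Boltzmann constant (eV/K)

def kmc_vacancy_ratio(t_end, T=700.0, nu=1e13,
                      eb_to_sub=1.1, eb_to_surf=0.9, n_vac=200):
    """Toy kMC of vacancies exchanging between surface and subsurface.

    Two processes per vacancy: surface -> subsurface (barrier eb_to_sub)
    and subsurface -> surface (barrier eb_to_surf), with Arrhenius rates
    r = nu * exp(-Eb / (kB * T)). Returns the subsurface/surface ratio."""
    r_down = nu * np.exp(-eb_to_sub / (KB * T))
    r_up = nu * np.exp(-eb_to_surf / (KB * T))
    n_surf, t = n_vac, 0.0
    while t < t_end:
        n_sub = n_vac - n_surf
        rates = np.array([n_surf * r_down, n_sub * r_up])
        r_tot = rates.sum()
        t += rng.exponential(1.0 / r_tot)      # waiting time to next event
        if rng.random() < rates[0] / r_tot:    # pick an event ~ its rate
            n_surf -= 1
        else:
            n_surf += 1
    return (n_vac - n_surf) / max(n_surf, 1)

print(kmc_vacancy_ratio(t_end=1e-3))  # approaches exp(-(1.1-0.9)/kT) ~ 0.04
```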
Next-generation acceleration and code optimization for light transport in turbid media using GPUs
Alerstam, Erik; Lo, William Chun Yip; Han, Tianyi David; Rose, Jonathan; Andersson-Engels, Stefan; Lilge, Lothar
2010-01-01
A highly optimized Monte Carlo (MC) code package for simulating light transport was developed on the latest graphics processing unit (GPU) built for general-purpose computing from NVIDIA: the Fermi GPU. In biomedical optics, the MC method is the gold-standard approach for simulating light transport in biological tissue, both for its accuracy and for its flexibility in modelling realistic, heterogeneous tissue geometry in 3-D. However, the widespread use of MC simulations in inverse problems, such as treatment planning for photodynamic therapy (PDT), is limited by their long computation time. Despite its parallel nature, optimizing MC code on the GPU has been shown to be a challenge, particularly when the sharing of simulation result matrices among many parallel threads demands the frequent use of atomic instructions to access the slow GPU global memory. This paper proposes an optimization scheme that utilizes the fast shared memory to resolve the performance bottleneck caused by atomic access, and discusses numerous other optimization techniques needed to harness the full potential of the GPU. Using these techniques, a widely accepted MC code package in biophotonics, called MCML, was successfully accelerated on a Fermi GPU by approximately 600x compared to a state-of-the-art Intel Core i7 CPU. A skin model consisting of 7 layers was used as the standard simulation geometry. To demonstrate the possibility of GPU cluster computing, the same GPU code was executed on four GPUs, showing a linear improvement in performance with an increasing number of GPUs. The GPU-based MCML code package, named GPU-MCML, is compatible with a wide range of graphics cards and is released as open-source software in two versions: an optimized version tuned for high performance and a simplified version for beginners (http://code.google.com/p/gpumcml). PMID:21258498
Oxidation of a new Biogenic VOC: Chamber Studies of the Atmospheric Chemistry of Methyl Chavicol
NASA Astrophysics Data System (ADS)
Bloss, William; Alam, Mohammed; Adbul Raheem, Modinah; Rickard, Andrew; Hamilton, Jacqui; Pereira, Kelly; Camredon, Marie; Munoz, Amalia; Vazquez, Monica; Vera, Teresa; Rodenas, Mila
2013-04-01
The oxidation of volatile organic compounds (VOCs) leads to the formation of ozone and secondary organic aerosol (SOA), with consequences for air quality, health, crop yields, atmospheric chemistry and radiative transfer. Recent observations have identified methyl chavicol ("MC"; estragole; 1-allyl-4-methoxybenzene, C10H12O) as a major biogenic VOC above pine forests in the USA and oil palm plantations in Malaysian Borneo. Palm oil cultivation, and hence MC emissions, may be expected to increase with societal food and biofuel demand. We present the results of a series of simulation chamber experiments to assess the atmospheric fate of MC. Experiments were performed in the EUPHORE facility, monitoring stable product species, radical intermediates, and aerosol production and composition. We determine rate constants for the reactions of MC with OH and O3, and ozonolysis radical yields. Stable product measurements (FTIR, PTR-MS, GC-SPME) are used to determine the yields of stable products formed from OH- and O3-initiated oxidation, and to develop an understanding of the initial stages of the MC degradation chemistry. A surrogate mechanism approach is used to simulate MC degradation within the Master Chemical Mechanism (MCM), evaluated in terms of the ozone production measured in the chamber experiments, and applied to quantify the role of MC in the real atmosphere.
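Chamber rate constants of this kind are commonly obtained with the relative-rate technique: the target VOC and a reference compound of known rate constant decay together, and a plot of ln([VOC]0/[VOC]t) against ln([ref]0/[ref]t) has slope k_VOC/k_ref. A hedged sketch with synthetic data (the abstract does not state that this exact analysis was used, and both rate constants below are invented):

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic chamber decays: d[X]/dt = -k_X [OH] [X]
k_mc, k_ref = 5.0e-11, 2.6e-11            # cm^3 molec^-1 s^-1 (made up)
oh_exposure = np.linspace(0, 2e11, 12)    # integrated [OH] dt (molec cm^-3 s)
mc = np.exp(-k_mc * oh_exposure) * (1 + rng.normal(0, 0.01, 12))
ref = np.exp(-k_ref * oh_exposure) * (1 + rng.normal(0, 0.01, 12))

# Relative-rate plot: slope = k_mc / k_ref
x = np.log(ref[0] / ref)
y = np.log(mc[0] / mc)
slope = np.polyfit(x, y, 1)[0]
print(f"k_mc = {slope * k_ref:.2e} cm^3 molec^-1 s^-1 (true {k_mc:.2e})")
```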
Automated Testability Decision Tool
1991-09-01
Monte Carlo simulation of the resolution volume for the SEQUOIA spectrometer
NASA Astrophysics Data System (ADS)
Granroth, G. E.; Hahn, S. E.
2015-01-01
Monte Carlo ray-tracing simulations of direct-geometry spectrometers have been particularly useful in instrument design and characterization. However, these tools can also be useful for experiment planning and analysis. To this end, the McStas Monte Carlo ray-tracing model of SEQUOIA, the fine-resolution Fermi chopper spectrometer at the Spallation Neutron Source (SNS) of Oak Ridge National Laboratory (ORNL), has been modified to include the time-of-flight resolution sample and detector components. With these components, the resolution ellipsoid can be calculated for any detector pixel and energy bin of the instrument. The simulation is split into two pieces. First, the incident beamline up to the sample is simulated for 1 × 10^11 neutron packets (4 days on 30 cores). This provides a virtual source for the back end, which includes the resolution sample and monitor components. Next, a series of detector and energy pixels are computed in parallel; it takes on the order of 30 s to calculate a single resolution ellipsoid on a single core. Python scripts have been written to transform the ellipsoid into the space of an oriented single crystal and to characterize the ellipsoid in various ways. Though this tool is under development as a planning tool, we have successfully used it to provide the resolution function for convolution with theoretical models. Specifically, theoretical calculations of the spin waves in YFeO3 were compared to measurements taken on SEQUOIA. Though the overall features of the spectra can be explained while neglecting resolution effects, the variation in intensity of the modes is well described once the resolution is included. As this was a single sharp mode, the simulated half-intensity value of the resolution ellipsoid was used to provide the resolution width. A description of the simulation, its use, and paths forward for this technique will be discussed.
A Comprehensive Study of Three Delay Compensation Algorithms for Flight Simulators
NASA Technical Reports Server (NTRS)
Guo, Liwen; Cardullo, Frank M.; Houck, Jacob A.; Kelly, Lon C.; Wolters, Thomas E.
2005-01-01
This paper summarizes a comprehensive study of three predictors used for compensating the transport delay in a flight simulator: the McFarland, Adaptive, and State Space predictors. The paper presents proof that the stochastic approximation algorithm achieves the best compensation among the four adaptive predictors, and investigates in depth the relationship between the state space predictor's compensation quality and its reference model. Piloted simulation tests show that the adaptive predictor and the state space predictor achieve better compensation of transport delay than the McFarland predictor.
NASA Technical Reports Server (NTRS)
Neal, Ralph D.
1996-01-01
This paper looks closely at each of the software metrics generated by the McCabe Object-Oriented Tool(TM) and its ability to convey timely information to developers. The metrics are examined for meaningfulness in terms of the scale assignable to each metric by the rules of measurement theory and the software dimension being measured. Recommendations are made as to the proper use of each metric and its ability to influence development at an early stage. The metrics of the McCabe Object-Oriented Tool(TM) set were selected because of the tool's use in a couple of NASA IV&V projects.
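McCabe's best-known metric, cyclomatic complexity, counts linearly independent paths through a routine: v(G) = E - N + 2P for a control-flow graph with E edges, N nodes, and P connected components, which for a single routine reduces to the number of decision points plus one. A minimal sketch of the decision-counting form for Python source (a simplification, not what the McCabe tool itself computes for object-oriented code):

```python
import ast

def cyclomatic_complexity(source: str) -> int:
    """Approximate McCabe cyclomatic complexity: 1 + number of branch points.
    Counts if/for/while/except handlers and boolean short-circuit operands."""
    tree = ast.parse(source)
    complexity = 1
    for node in ast.walk(tree):
        if isinstance(node, (ast.If, ast.For, ast.While, ast.ExceptHandler)):
            complexity += 1
        elif isinstance(node, ast.BoolOp):     # each extra and/or adds a path
            complexity += len(node.values) - 1
    return complexity

print(cyclomatic_complexity("""
def grade(x):
    if x >= 90 and x <= 100:
        return 'A'
    elif x >= 80:
        return 'B'
    return 'C'
"""))  # 1 + two ifs + one 'and' = 4
```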
Constant-pH Molecular Dynamics Simulations for Large Biomolecular Systems
Radak, Brian K.; Chipot, Christophe; Suh, Donghyuk; ...
2017-11-07
An increasingly important endeavor is to develop computational strategies that enable molecular dynamics (MD) simulations of biomolecular systems with spontaneous changes in protonation states under conditions of constant pH. The present work describes our efforts to implement the powerful constant-pH MD simulation method, based on a hybrid nonequilibrium MD/Monte Carlo (neMD/MC) technique, within the highly scalable program NAMD. The constant-pH hybrid neMD/MC method has several appealing features: it rigorously samples the correct semigrand canonical ensemble, its computational cost increases linearly with the number of titratable sites, and it is applicable to explicit-solvent simulations. The present implementation of the constant-pH hybrid neMD/MC in NAMD is designed to handle a wide range of biomolecular systems with no constraints on the choice of force field. Furthermore, the sampling efficiency can be adaptively improved on the fly by adjusting algorithmic parameters during the simulation. Finally, illustrative examples emphasizing medium- and large-scale applications on next-generation supercomputing architectures are provided.
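The hybrid neMD/MC idea, in outline: a trial protonation change is driven by a short nonequilibrium MD switch that accumulates work W, and the trial is accepted or rejected with a Metropolis criterion so that the correct ensemble is preserved. Below is a schematic sketch of the acceptance step only; the pH-dependent ideal term and the overall bookkeeping are simplified placeholders, not NAMD's exact formulation:

```python
import math
import random

def accept_protonation_switch(work_kcal, pH, pKa, deprotonating, T=300.0):
    """Metropolis test for a constant-pH neMD/MC trial move (schematic).

    work_kcal : nonequilibrium switching work from the MD segment (kcal/mol)
    The solution-phase contribution is modeled via Henderson-Hasselbalch:
    deprotonation is favored when pH > pKa. This composition of terms is
    a simplification of the published acceptance criterion."""
    kT = 0.0019872 * T                       # kcal/mol
    dG_ideal = kT * math.log(10.0) * ((pKa - pH) if deprotonating
                                      else (pH - pKa))
    delta = work_kcal + dG_ideal
    return random.random() < min(1.0, math.exp(-delta / kT))

print(accept_protonation_switch(work_kcal=1.2, pH=7.0, pKa=6.5,
                                deprotonating=True))
```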
Game of Life on the Equal Degree Random Lattice
NASA Astrophysics Data System (ADS)
Shao, Zhi-Gang; Chen, Tao
2010-12-01
An effective matrix method is used to build the equal-degree random (EDR) lattice, and a cellular automaton game of life on the EDR lattice is then studied by Monte Carlo (MC) simulation. The standard mean-field approximation (MFA) is applied, giving a density of live cells of ρ = 0.37017, consistent with the result ρ = 0.37 ± 0.003 from the MC simulation.
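The MFA value can be reproduced with a short fixed-point iteration: under the game-of-life rules, a dead cell becomes live with exactly 3 live neighbors and a live cell survives with 2 or 3, so treating the 8 neighbors as independent Bernoulli(ρ) occupancies gives the mean-field map ρ' = (1-ρ)·B(3,ρ) + ρ·[B(2,ρ) + B(3,ρ)], where B(k,ρ) is the binomial probability of exactly k live neighbors. A minimal sketch consistent with the quoted value:

```python
from math import comb

def life_mfa_density(rho=0.5, iters=100):
    """Mean-field map for the game of life on a degree-8 lattice:
    birth with exactly 3 live neighbors, survival with 2 or 3."""
    def binom(k, p):                 # P(exactly k of 8 neighbors live)
        return comb(8, k) * p**k * (1 - p) ** (8 - k)
    for _ in range(iters):
        rho = (1 - rho) * binom(3, rho) + rho * (binom(2, rho) + binom(3, rho))
    return rho

print(life_mfa_density())   # converges to ~0.37017, the quoted MFA density
```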
Yeo, Sang Chul; Lo, Yu Chieh; Li, Ju; Lee, Hyuck Mo
2014-10-07
Ammonia (NH3) nitridation on an Fe surface was studied by combining density functional theory (DFT) and kinetic Monte Carlo (kMC) calculations. A DFT calculation was performed to obtain the energy barriers (Eb) of the relevant elementary processes. The full mechanism of the reaction path was divided into five steps (adsorption, dissociation, surface migration, penetration, and diffusion) on an Fe(100) surface pre-covered with nitrogen; the energy barriers depended on the N surface coverage. The DFT results were subsequently employed as a database for the kMC simulations, and we then evaluated the NH3 nitridation rate on the N pre-covered Fe surface. To determine the conditions necessary for a rapid NH3 nitridation rate, eight reaction events were considered in the kMC simulations: adsorption, desorption, dissociation, reverse dissociation, surface migration, penetration, reverse penetration, and diffusion. This study provides a real-time-scale simulation of NH3 nitridation influenced by nitrogen surface coverage, which allowed us to theoretically determine a nitrogen coverage (0.56 ML) suitable for rapid NH3 nitridation. In this way, we were able to reveal the coverage dependence of the nitridation reaction using combined DFT and kMC simulations.
Validation of Shielding Analysis Capability of SuperMC with SINBAD
NASA Astrophysics Data System (ADS)
Chen, Chaobin; Yang, Qi; Wu, Bin; Han, Yuncheng; Song, Jing
2017-09-01
The shielding analysis capability of SuperMC was validated against the Shielding Integral Benchmark Archive Database (SINBAD). SINBAD, compiled by RSICC and NEA, includes numerous benchmark experiments performed with the D-T fusion neutron source facilities of OKTAVIAN, FNS, IPPE, etc. The results of the SuperMC simulations were compared with experimental data and MCNP results. Very good agreement, with deviations below 1%, was achieved, suggesting that SuperMC is reliable for shielding calculations.
NASA Astrophysics Data System (ADS)
Rossetto, Rudy; De Filippis, Giovanna; Borsi, Iacopo; Foglia, Laura; Toegl, Anja; Cannata, Massimiliano; Neumann, Jakob; Vazquez-Sune, Enric; Criollo, Rotman
2017-04-01
To achieve sustainable and participatory groundwater management, innovative software built on the integration of numerical models within GIS is an ideal candidate to provide a full characterization of the quantitative and qualitative aspects of ground- and surface-water resources while preserving their spatial and temporal dimensions. The EU H2020 FREEWAT project (FREE and open source software tools for WATer resource management; Rossetto et al., 2015) aims at simplifying the application of EU water-related Directives through an open-source, public-domain, GIS-integrated simulation platform for the planning and management of ground- and surface-water resources. The FREEWAT platform allows simulation of the whole hydrological cycle, coupling the power of GIS geo-processing and post-processing tools for spatial data analysis with that of process-based simulation models. The result is a modeling environment in which large spatial datasets can be stored, managed and visualized, and in which several simulation codes (mainly belonging to the USGS MODFLOW family) are integrated to simulate multiple hydrological, hydrochemical or economic processes. So far, the FREEWAT platform is a large plugin for the QGIS desktop GIS software and integrates the following capabilities: • the AkvaGIS module produces plots and statistics for the analysis and interpretation of hydrochemical and hydrogeological data; • the Observation Analysis Tool facilitates the import, analysis and visualization of time-series data and the use of these data to support model construction and calibration; • groundwater flow in the saturated and unsaturated zones may be simulated using MODFLOW-2005 (Harbaugh, 2005); • multi-species advective-dispersive transport in the saturated zone can be simulated using MT3DMS (Zheng & Wang, 1999), and viscosity- and density-dependent flows can further be simulated through SEAWAT (Langevin et al., 2007); • sustainable management of the combined use of ground- and surface-water resources in rural environments is accomplished by the Farm Process module embedded in MODFLOW-OWHM (Hanson et al., 2014), which dynamically integrates crop water demand and supply from ground- and surface-water; • UCODE_2014 (Poeter et al., 2014) is implemented to perform sensitivity analysis and parameter estimation, improving the model fit through an inverse regression method based on the evaluation of an objective function. By creating a common environment for water researchers/professionals, policy makers and implementers, FREEWAT aims at enhancing science-based, participatory, evidence-based decision making in water resource management, hence producing relevant outcomes for policy implementation. Acknowledgements: This paper is presented within the framework of the FREEWAT project, which has received funding from the European Union's HORIZON 2020 research and innovation programme under Grant Agreement n. 642224. References: Hanson, R.T., Boyce, S.E., Schmid, W., Hughes, J.D., Mehl, S.M., Leake, S.A., Maddock, T. & Niswonger, R.G. (2014) One-Water Hydrologic Flow Model (MODFLOW-OWHM). U.S. Geological Survey, Techniques and Methods 6-A51, 134 p. Harbaugh, A.W. (2005) MODFLOW-2005, The U.S. Geological Survey Modular Ground-Water Model - the Ground-Water Flow Process. U.S. Geological Survey, Techniques and Methods 6-A16, 253 p. Langevin, C.D., Thorne, D.T. Jr., Dausman, A.M., Sukop, M.C. & Guo, W. (2007) SEAWAT Version 4: A Computer Program for Simulation of Multi-Species Solute and Heat Transport. U.S. Geological Survey, Techniques and Methods 6-A22, 39 p. Poeter, E.P., Hill, M.C., Lu, D., Tiedeman, C.R. & Mehl, S. (2014) UCODE_2014, with new capabilities to define parameters unique to predictions, calculate weights using simulated values, estimate parameters with SVD, evaluate uncertainty with MCMC, and more. Integrated Groundwater Modeling Center, Report Number GWMI 2014-02. Rossetto, R., Borsi, I. & Foglia, L. (2015) FREEWAT: FREE and open source software tools for WATer resource management. Rendiconti Online Società Geologica Italiana, 35, 252-255. Zheng, C. & Wang, P.P. (1999) MT3DMS, A modular three-dimensional multi-species transport model for simulation of advection, dispersion and chemical reactions of contaminants in groundwater systems. U.S. Army Engineer Research and Development Center, Contract Report SERDP-99-1, Vicksburg, MS, 202 p.
Influence of photon energy cuts on PET Monte Carlo simulation results.
Mitev, Krasimir; Gerganov, Georgi; Kirov, Assen S; Schmidtlein, C Ross; Madzhunkov, Yordan; Kawrakow, Iwan
2012-07-01
The purpose of this work is to study the influence of photon energy cuts on the results of positron emission tomography (PET) Monte Carlo (MC) simulations. MC simulations of PET scans of a box phantom and the NEMA image-quality phantom were performed for 32 photon energy cut values in the interval 0.3-350 keV using a well-validated numerical model of a PET scanner. The simulations were performed with two MC codes, egs_pet and the GEANT4 Application for Tomographic Emission (GATE). The effect of photon energy cuts on the recorded number of singles and of primary, scattered, random, and total coincidences, as well as on the simulation time and the noise-equivalent count rate, was evaluated by comparing the results for higher cuts to those for a 1 keV cut. To evaluate the effect of cuts on the quality of reconstructed images, MC-generated sinograms of PET scans of the NEMA image-quality phantom were reconstructed with iterative statistical reconstruction, and the effects of photon cuts on the contrast recovery coefficients and on the comparison of images by means of commonly used similarity measures were studied. For the scanner investigated in this study, which uses bismuth germanate crystals, the transport of Bi K x-rays must be simulated in order to obtain unbiased estimates of the number of singles and of true, scattered, and random coincidences, as well as an unbiased estimate of the noise-equivalent count rate. Photon energy cuts higher than 170 keV lead to absorption of Compton-scattered photons and strongly increase the number of recorded coincidences of all types and the noise-equivalent count rate. The effect of photon cuts on the reconstructed images and on the similarity measures used for their comparison is statistically significant for very high cuts (e.g., 350 keV). The simulation of the transport of characteristic x-rays plays an important role if accurate modeling of a PET scanner system is to be achieved. Moreover, the simulation time decreases only slowly with increasing photon cut, which, combined with the accuracy loss at high cuts, means that the use of high photon energy cuts is not recommended for the acceleration of MC simulations.
Thomson, R; Kawrakow, I
2012-06-01
Widely used classical-trajectory Monte Carlo simulations of low-energy electron transport neglect the quantum nature of electrons; however, at sub-1 keV energies quantum effects have the potential to become significant. This work compares quantum and classical simulations within a simplified model of electron transport in water. Electron transport is modeled in water droplets using quantum mechanical (QM) and classical-trajectory Monte Carlo (MC) methods. Water droplets are modeled as collections of point scatterers representing water molecules, from which electrons may be isotropically scattered; the role of inelastic scattering is investigated by introducing absorption. The QM calculations involve numerically solving a system of coupled equations for the electron wavefield incident on each scatterer, and a minimum distance between scatterers is introduced to approximate the structure of water. The average QM water-droplet incoherent cross section is compared with the MC cross section, and a relative error (RE) on the MC results is computed. The RE varies with electron energy, the average and minimum distances between scatterers, and the scattering amplitude; the mean free path is generally the relevant length scale for estimating the RE. The introduction of a minimum distance between scatterers increases the RE substantially (by factors of 5 to 10), suggesting that the structure of water must be modeled for accurate simulations. Inelastic scattering does not improve the agreement between QM and MC simulations: for the same magnitude of elastic scattering, the introduction of inelastic scattering increases the RE. Droplet cross sections are sensitive to droplet size and shape, and considerable variations in the RE are observed as these change. At sub-1 keV energies, quantum effects may thus become non-negligible for electron transport in condensed media; electron transport is strongly affected by the structure of the medium, and inelastic scattering does not improve the agreement between QM and MC simulations of low-energy electron transport in condensed media. © 2012 American Association of Physicists in Medicine.
The Man computer Interactive Data Access System: 25 Years of Interactive Processing.
NASA Astrophysics Data System (ADS)
Lazzara, Matthew A.; Benson, John M.; Fox, Robert J.; Laitsch, Denise J.; Rueden, Joseph P.; Santek, David A.; Wade, Delores M.; Whittaker, Thomas M.; Young, J. T.
1999-02-01
12 October 1998 marked the 25th anniversary of the Man computer Interactive Data Access System (McIDAS). On that date in 1973, McIDAS was first used operationally by scientists as a tool for data analysis. Over the last 25 years, McIDAS has undergone numerous architectural changes in an effort to keep pace with changing technology. In its early years, significant technological breakthroughs were required to achieve the functionality needed by atmospheric scientists. Today McIDAS is challenged by new Internet-based approaches to data access and data display. The history and impact of McIDAS, along with some of the lessons learned, are presented here.
The Multi-Step CADIS method for shutdown dose rate calculations and uncertainty propagation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ibrahim, Ahmad M.; Peplow, Douglas E.; Grove, Robert E.
2015-12-01
Shutdown dose rate (SDDR) analysis requires (a) a neutron transport calculation to estimate neutron flux fields, (b) an activation calculation to compute radionuclide inventories and the associated photon sources, and (c) a photon transport calculation to estimate the final SDDR. In some applications, accurate full-scale Monte Carlo (MC) SDDR simulations are needed for very large systems with massive amounts of shielding material. However, such simulations are impractical because calculating the space- and energy-dependent neutron fluxes throughout the structural materials is needed to estimate the distribution of the radioisotopes causing the SDDR. Biasing the neutron MC calculation using an importance function is not simple, because it is difficult to explicitly express the response function, which depends on subsequent computational steps. Furthermore, typical SDDR calculations do not consider how uncertainties in the MC neutron calculation impact the SDDR uncertainty, even though the MC neutron calculation uncertainties usually dominate the SDDR uncertainty.
Absolute dose calculations for Monte Carlo simulations of radiotherapy beams.
Popescu, I A; Shaw, C P; Zavgorodni, S F; Beckham, W A
2005-07-21
Monte Carlo (MC) simulations have traditionally been used for single field relative comparisons with experimental data or commercial treatment planning systems (TPS). However, clinical treatment plans commonly involve more than one field. Since the contribution of each field must be accurately quantified, multiple field MC simulations are only possible by employing absolute dosimetry. Therefore, we have developed a rigorous calibration method that allows the incorporation of monitor units (MU) in MC simulations. This absolute dosimetry formalism can be easily implemented by any BEAMnrc/DOSXYZnrc user, and applies to any configuration of open and blocked fields, including intensity-modulated radiation therapy (IMRT) plans. Our approach involves the relationship between the dose scored in the monitor ionization chamber of a radiotherapy linear accelerator (linac), the number of initial particles incident on the target, and the field size. We found that for a 10 × 10 cm² field of a 6 MV photon beam, 1 MU corresponds, in our model, to 8.129 × 10^13 ± 1.0% electrons incident on the target and a total dose of 20.87 cGy ± 1.0% in the monitor chambers of the virtual linac. We present an extensive experimental verification of our MC results for open and intensity-modulated fields, including a dynamic 7-field IMRT plan simulated on the CT data sets of a cylindrical phantom and of a Rando anthropomorphic phantom, which were validated by measurements using ionization chambers and thermoluminescent dosimeters (TLD). Our simulation results are in excellent agreement with experiment, with percentage differences of less than 2%, in general, demonstrating the accuracy of our Monte Carlo absolute dose calculations.
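Once the electrons-per-MU calibration is known, the conversion from an MC dose scored per source particle to absolute dose is simple arithmetic. A sketch using the calibration value quoted in the abstract (the dose-per-particle number below is an invented placeholder):

```python
def absolute_dose_gy(dose_per_particle_gy, mu, particles_per_mu=8.129e13):
    """Convert an MC dose scored per incident source particle to absolute
    dose, using the electrons-per-MU calibration quoted in the abstract
    (8.129e13 electrons/MU for a 10 x 10 cm^2, 6 MV field)."""
    return dose_per_particle_gy * particles_per_mu * mu

# e.g. an MC run scoring 2.5e-17 Gy/particle at the point of interest
# (placeholder value), delivered with 100 MU:
print(absolute_dose_gy(2.5e-17, mu=100))  # -> ~0.203 Gy
```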
Shielding evaluation for solar particle events using MCNPX, PHITS and OLTARIS codes
NASA Astrophysics Data System (ADS)
Aghara, S. K.; Sriprisan, S. I.; Singleterry, R. C.; Sato, T.
2015-01-01
Detailed analyses of solar particle events (SPEs) were performed to calculate primary and secondary particle spectra behind aluminum shields at various depths in water. The simulations were based on the Monte Carlo (MC) radiation transport codes MCNPX 2.7.0 and PHITS 2.64, and on the space radiation analysis website OLTARIS (On-Line Tool for the Assessment of Radiation in Space), version 3.4, which uses the deterministic code HZETRN for transport. The study investigates the transport of SPE spectra through 10 or 20 g/cm2 of aluminum shielding followed by a 30 g/cm2 water slab. Four historical SPE events were selected and used as input source spectra; the resulting particle differential spectra for protons, neutrons, and photons are presented, along with the total particle fluence as a function of depth. In addition to particle flux, dose and dose-equivalent values are calculated and compared between the codes and with other published results. Overall, the particle fluence spectra from all three codes show good agreement, with the MC codes agreeing more closely with each other than with the OLTARIS results. The neutron fluence from OLTARIS is lower than that from the MC codes at lower energies (E < 100 MeV). Based on a mean-square-difference analysis, the results from MCNPX and PHITS agree better with each other for fluence, dose, and dose equivalent than with the OLTARIS results.
NASA Astrophysics Data System (ADS)
Magro, G.; Dahle, T. J.; Molinelli, S.; Ciocca, M.; Fossati, P.; Ferrari, A.; Inaniwa, T.; Matsufuji, N.; Ytre-Hauge, K. S.; Mairani, A.
2017-05-01
Particle therapy facilities often require Monte Carlo (MC) simulations to overcome intrinsic limitations of analytical treatment planning systems (TPS) related to the description of the mixed radiation field and of the beam interaction with tissue inhomogeneities. Some of these uncertainties may affect the computation of effective dose distributions; therefore, MC codes dedicated to particle therapy should provide both absorbed and biological doses. Two biophysical models are currently applied clinically in particle therapy: the local effect model (LEM) and the microdosimetric kinetic model (MKM). In this paper, we describe the coupling of the NIRS (National Institute of Radiological Sciences, Japan) clinical dose calculation to the FLUKA MC code. We moved from the implementation of the model itself to its application in clinical cases, following the NIRS approach, in which a scaling factor is introduced to rescale the (carbon-equivalent) biological dose to a clinical dose level. A high level of agreement was found with published data when exploring a range of values for the MKM input parameters, while some differences were registered in forward recalculations of NIRS patient plans, mainly attributable to differences with the analytical TPS dose engine (taken as reference) in the description of the mixed radiation field (lateral spread and fragmentation). We present a tool which is being used at the Italian National Center for Oncological Hadrontherapy to support the comparison study between the NIRS clinical dose level and the LEM dose specification.
NASA Astrophysics Data System (ADS)
Adam, Khaled; Zöllner, Dana; Field, David P.
2018-04-01
Modeling microstructural evolution during recrystallization is a powerful tool for a deeper understanding of alloy behavior and for optimizing engineering properties through annealing. In particular, the mechanical properties of metallic alloys depend strongly on the microstructure and texture evolved during the softening process. In the present work, a Monte Carlo (MC) Potts model was used to model primary recrystallization and grain growth in a cold-rolled single-phase Al alloy. The microstructural representation of two kinds of dislocation densities, statistically stored dislocations and geometrically necessary dislocations, was quantified based on the viscoplastic fast Fourier transform method. This representation was then introduced into the MC Potts model to identify the favorable sites for nucleation, where orientation gradients and entanglements of dislocations are high. Additionally, in situ observations of non-isothermal microstructure evolution in single-phase aluminum alloy 1100 were made to validate the simulation. The influence of texture inhomogeneity is analyzed from a theoretical point of view using an orientation distribution function for the deformed and evolved textures.
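The core of a Potts-model grain growth step is simple: pick a site, propose switching its orientation ("spin") to that of a random neighbor, and accept based on the change in grain boundary energy. A minimal 2D sketch with isotropic boundary energy only (no stored-energy or nucleation terms of the kind used in the paper; all parameter values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(6)

def potts_grain_growth(L=64, q=32, steps=200000, kT=0.3):
    """Minimal 2D MC Potts model: E = sum over neighbor pairs of
    J*(1 - delta(s_i, s_j)) with J = 1; Metropolis updates coarsen grains."""
    s = rng.integers(0, q, size=(L, L))
    for _ in range(steps):
        i, j = rng.integers(0, L, 2)
        ni, nj = (i + rng.integers(-1, 2)) % L, (j + rng.integers(-1, 2)) % L
        new = s[ni, nj]                  # propose a neighbor's orientation
        if new == s[i, j]:
            continue
        nbrs = [s[(i+1) % L, j], s[(i-1) % L, j],
                s[i, (j+1) % L], s[i, (j-1) % L]]
        # Energy change: unlike-neighbor count with new vs. old orientation
        dE = sum(n != new for n in nbrs) - sum(n != s[i, j] for n in nbrs)
        if dE <= 0 or rng.random() < np.exp(-dE / kT):
            s[i, j] = new
    return s

grains = potts_grain_growth()
print("distinct orientations remaining:", np.unique(grains).size)
```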
NASA Astrophysics Data System (ADS)
Underwood, T. S. A.; Sung, W.; McFadden, C. H.; McMahon, S. J.; Hall, D. C.; McNamara, A. L.; Paganetti, H.; Sawakuchi, G. O.; Schuemann, J.
2017-04-01
Whilst Monte Carlo (MC) simulations of proton energy deposition have been well-validated at the macroscopic level, their microscopic validation remains lacking. Equally, no gold standard yet exists for experimental metrology of individual proton tracks. In this work we compare the distributions of stochastic proton interactions simulated using the TOPAS-nBio MC platform against confocal microscope data for Al2O3:C,Mg fluorescent nuclear track detectors (FNTDs). We irradiated 8 × 4 × 0.5 mm³ FNTD chips inside a water phantom, positioned at seven positions along a pristine proton Bragg peak with a range in water of 12 cm. MC simulations were implemented in two stages: (1) using TOPAS to model the beam properties within a water phantom and (2) using TOPAS-nBio with Geant4-DNA physics to score particle interactions through a water surrogate of Al2O3:C,Mg. The measured median track integrated brightness (IB) was observed to be strongly correlated with both (i) the voxelized track-averaged linear energy transfer (LET) and (ii) the frequency-mean microdosimetric lineal energy, ȳ_F, both simulated in pure water. Histograms of FNTD track IB were compared against TOPAS-nBio histograms of the number of terminal electrons per proton, scored in water with mass density scaled to mimic Al2O3:C,Mg. Trends between exposure depths observed in the TOPAS-nBio simulations were experimentally replicated in the study of FNTD track IB. Our results represent an important first step towards the experimental validation of MC simulations on the sub-cellular scale and suggest that FNTDs can enable experimental study of the microdosimetric properties of individual proton tracks.
A Comparison of Experimental EPMA Data and Monte Carlo Simulations
NASA Technical Reports Server (NTRS)
Carpenter, P. K.
2004-01-01
Monte Carlo (MC) modeling shows excellent prospects for simulating electron scattering and x-ray emission from complex geometries, and can be compared to experimental measurements using electron-probe microanalysis (EPMA) and φ(ρz) correction algorithms. Experimental EPMA measurements made on NIST SRM 481 (AgAu) and 482 (CuAu) alloys, at a range of accelerating potentials and instrument take-off angles, represent a formal microanalysis data set that has been used to develop φ(ρz) correction algorithms. The accuracy of MC calculations obtained using the NIST, WinCasino, WinXray, and Penelope MC packages will be evaluated relative to these experimental data. There is additional information contained in the extended abstract.
Measuring Virtual Simulations Value in Training Exercises - USMC Use Case
2015-12-04
NASA Astrophysics Data System (ADS)
Lépinoux, J.; Sigli, C.
2018-01-01
In a recent paper, the authors showed how cluster free energies are constrained by the coagulation probability, and explained various anomalies observed during precipitation kinetics in concentrated alloys. This coagulation probability proved too complex a function to be predicted accurately from the cluster distribution alone in Cluster Dynamics (CD). Using atomistic Monte Carlo (MC) simulations, it is shown that during a transformation at constant temperature, after a short transient regime, the transformation proceeds at quasi-equilibrium. It is proposed to run MC simulations until the system reaches quasi-equilibrium and then to switch to CD, which is mean-field but, unlike MC, not limited by a box size. In this paper, we explain how to exploit the information available before the quasi-equilibrium state to establish guidelines for safely predicting the cluster free energies.
Simulation of computed tomography dose based on voxel phantom
NASA Astrophysics Data System (ADS)
Liu, Chunyu; Lv, Xiangbo; Li, Zhaojun
2017-01-01
Computed Tomography (CT) is one of the preferred and most valuable imaging tools used in diagnostic radiology, providing high-quality cross-sectional images of the body. It nevertheless delivers higher radiation doses to patients than other radiological procedures. The Monte Carlo method is well suited to estimating the radiation dose delivered during CT examinations. A simulation of the Computed Tomography Dose Index (CTDI) phantom was developed in this paper. Under conditions similar to those used in physical measurements, dose profiles were calculated and compared against the reported measured values. The results demonstrate good agreement between the calculated and measured doses. Across the different CT exam simulations using the voxel phantom, the highest absorbed doses were recorded for the lung, the brain, and the bone surface. A comparison between the different scan types shows that the effective dose for a chest scan is the highest, the effective doses for abdomen and pelvis scans are very close, and the head scan yields the lowest effective dose. Although the dose in CT depends on various parameters, such as the tube current, exposure time, beam energy, slice thickness and patient size, this study demonstrates that MC simulation is a useful tool to accurately estimate the dose delivered to specific organs of patients undergoing CT exams, and can also be a valuable technique for the design and optimization of the CT x-ray source.
NASA Astrophysics Data System (ADS)
Katsoulakis, Markos A.; Vlachos, Dionisios G.
2003-11-01
We derive a hierarchy of successively coarse-grained stochastic processes and associated coarse-grained Monte Carlo (CGMC) algorithms directly from the microscopic processes, as approximations at larger length scales, for the case of diffusion of interacting particles on a lattice. This hierarchy of models spans length scales between microscopic and mesoscopic, satisfies detailed balance, and gives self-consistent fluctuation mechanisms whose noise is asymptotically identical to the microscopic MC. Rigorous, detailed asymptotics justify and clarify these connections. Gradient continuous time microscopic MC and CGMC simulations are compared under far-from-equilibrium conditions to illustrate the validity of our theory and delineate the errors obtained by rigorous asymptotics. Information theory estimates are employed for the first time to provide rigorous error estimates between the solutions of microscopic MC and CGMC, describing the loss of information during the coarse-graining process. Simulations under periodic boundary conditions are used to verify the information theory error estimates. It is shown that coarse-graining in space also leads to coarse-graining in time by a factor of q², where q is the level of coarse-graining, and overcomes in part the hydrodynamic slowdown. Operation counting and CGMC simulations demonstrate significant CPU savings in continuous time MC simulations that vary from q³ for short-range potentials to q⁴ for long-range potentials. Finally, connections of the new coarse-grained stochastic processes to stochastic mesoscopic and Cahn-Hilliard-Cook models are made.
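The coarse-grained dynamics can be illustrated with a runnable toy. Below is a minimal sketch assuming a 1D lattice of hard-core particles with no interaction potential (much simpler than the interacting systems analyzed above); it simulates the coarse-grained hop dynamics in continuous time. The cell count, hop rate D0 and coarse-graining level q are illustrative, and q = 1 recovers the microscopic exclusion process.

import numpy as np

rng = np.random.default_rng(0)

def cgmc_diffusion(eta, q, D0=1.0, t_end=50.0):
    """Continuous-time CGMC for diffusion of hard-core particles.

    eta[k] counts particles in coarse cell k (0 <= eta[k] <= q). A particle
    hops from cell k to a neighbour j with rate D0 * eta[k] * (1 - eta[j]/q),
    the mean-field exclusion factor; q = 1 is the microscopic process.
    """
    m, t = len(eta), 0.0
    while t < t_end:
        right = D0 * eta[:-1] * (1.0 - eta[1:] / q)   # hops k -> k+1
        left = D0 * eta[1:] * (1.0 - eta[:-1] / q)    # hops k -> k-1
        rates = np.concatenate([right, left])
        total = rates.sum()
        if total == 0.0:
            break
        t += rng.exponential(1.0 / total)             # Gillespie waiting time
        e = rng.choice(rates.size, p=rates / total)   # pick one hop event
        if e < m - 1:
            eta[e] -= 1; eta[e + 1] += 1
        else:
            k = e - (m - 1) + 1
            eta[k] -= 1; eta[k - 1] += 1
    return eta

q = 10                                # level of coarse-graining
eta = np.array([q] * 8 + [0] * 8)     # step initial profile
print(cgmc_diffusion(eta, q) / q)     # coverage per cell after relaxation

Because each CGMC event moves a particle between cells of q sites rather than between single sites, a coarse trajectory covers the same physical time with far fewer events, which is loosely the origin of the q-dependent CPU savings quoted above.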
Monte Carlo decision curve analysis using aggregate data.
Hozo, Iztok; Tsalatsanis, Athanasios; Djulbegovic, Benjamin
2017-02-01
Decision curve analysis (DCA) is an increasingly used method for evaluating diagnostic tests and predictive models, but its application requires individual patient data. The Monte Carlo (MC) method can be used to simulate probabilities and outcomes of individual patients and offers an attractive option for application of DCA. We constructed a MC decision model to simulate individual probabilities of outcomes of interest. These probabilities were contrasted against the threshold probability at which a decision-maker is indifferent between key management strategies: treat all, treat none, or use the predictive model to guide treatment. We compared the results of DCA with MC-simulated data against the results of DCA based on actual individual patient data for three decision models published in the literature: (i) statins for primary prevention of cardiovascular disease, (ii) hospice referral for terminally ill patients and (iii) prostate cancer surgery. The results of MC DCA and patient data DCA were identical. To the extent that patient data DCA were used to inform decisions about statin use, referral to hospice or prostate surgery, the results indicate that MC DCA could have also been used. As long as the aggregate parameters on the distribution of the probability of outcomes and treatment effects are accurately described in the published reports, MC DCA will generate results indistinguishable from individual patient data DCA. We provide a simple, easy-to-use model, which can facilitate wider use of DCA and better evaluation of diagnostic tests and predictive models that rely only on aggregate data reported in the literature. © 2017 Stichting European Society for Clinical Investigation Journal Foundation.
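A minimal sketch of the approach, assuming for illustration that individual predicted risks follow a Beta distribution summarizing the aggregate report (the distribution choice and its parameters are stand-ins, not the authors' models); the net-benefit expression is the standard DCA one.

import numpy as np

rng = np.random.default_rng(1)
n = 100_000

risk = rng.beta(2, 8, size=n)      # simulated individual risks (illustrative)
outcome = rng.random(n) < risk     # simulated individual outcomes

def net_benefit(treat, outcome, pt):
    """Net benefit of a treatment rule at threshold probability pt."""
    tp = np.mean(treat & outcome)      # true positives per patient
    fp = np.mean(treat & ~outcome)     # false positives per patient
    return tp - fp * pt / (1.0 - pt)

for pt in (0.1, 0.2, 0.3):
    nb_all = net_benefit(np.ones(n, bool), outcome, pt)   # treat all
    nb_model = net_benefit(risk >= pt, outcome, pt)       # model-guided
    print(f"pt={pt:.1f}  all={nb_all:+.3f}  none=+0.000  model={nb_model:+.3f}")

Plotting net benefit against pt for the three strategies yields the decision curve; only the aggregate Beta parameters, not patient-level records, enter the simulation.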
Forward to the Future: Estimating River Discharge with McFLI
NASA Astrophysics Data System (ADS)
Gleason, C. J.; Durand, M. T.; Garambois, P. A.
2016-12-01
The global surface water budget is still poorly understood, and improving our understanding of freshwater budgets requires coordination between in situ observations, models, and remote sensing. The upcoming launch of the NASA/CNES Surface Water and Ocean Topography (SWOT) satellite has generated considerable excitement as a new tool enabling hydrologists to tackle some of the most pressing questions facing their discipline. One problem in particular that SWOT seems well suited to address is river discharge (flow rate) estimation in ungauged basins: SWOT's anticipated measurements of river surface height and area have ushered in a new technique in hydrology, what we are here calling Mass-conserved Flow Law Inversions, or McFLI. McFLI algorithms leverage classic hydraulic flow expressions (e.g. Manning's Equation, hydraulic geometry) within mass-conserved river reaches to construct a simplified but still underconstrained system of equations to be solved for the unknown discharge. Most existing McFLI techniques have been designed to take advantage of SWOT's measurements and Manning's Equation: SWOT will observe changes in cross-sectional area and river surface slope over time, so the McFLI need only solve for baseflow area and Manning's roughness coefficient. Recently published preliminary results have indicated that McFLI can be a viable tool in a global hydrologist's toolbox (discharge errors of less than 30% compared to gauges are possible in most cases). We therefore outline the progress to date for McFLI techniques, and highlight three key areas for future development: 1) maximize the accuracy and robustness of McFLI by incorporating ancillary data from satellites, models, and in situ observations; 2) develop new McFLI techniques using novel or underutilized flow laws; 3) systematically test McFLI to define different inversion classes of rivers with well-defined error budgets based on geography and available data, for use in gauged and ungauged basins alike.
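A minimal sketch of a Manning-based McFLI inversion on synthetic SWOT-like observables (dA, width W, slope S) for two mass-conserved reaches. The flow-law form Q = (1/n)(A0 + dA)^(5/3) W^(-2/3) S^(1/2) and the weak prior on n (which breaks the scale ambiguity of a purely continuity-based fit) are illustrative assumptions, not a published algorithm.

import numpy as np
from scipy.optimize import least_squares

T = 40  # number of satellite overpasses

def manning_q(A0, n, dA, W, S):
    A = np.maximum(A0 + dA, 1.0)   # guard against negative areas during fitting
    return A ** (5.0 / 3.0) * W ** (-2.0 / 3.0) * np.sqrt(S) / n

# Synthetic truth: one discharge series flows through both reaches.
Q_true = 300.0 + 150.0 * np.sin(np.linspace(0, 2 * np.pi, T))
W = np.array([80.0, 95.0]); S = np.array([1.0e-4, 1.2e-4])
A0_true = np.array([400.0, 450.0]); n_true = np.array([0.030, 0.035])
# Invert the flow law to produce the "observed" dA(t) for each reach.
dA = (Q_true[None, :] * n_true[:, None] * W[:, None] ** (2 / 3)
      / np.sqrt(S)[:, None]) ** (3 / 5) - A0_true[:, None]

def residuals(x):
    A0, n = x[:2], x[2:]
    Q = np.stack([manning_q(A0[r], n[r], dA[r], W[r], S[r]) for r in range(2)])
    mass = (Q[0] - Q[1]) / Q.mean()           # continuity between the two reaches
    prior = 0.1 * (np.log(n) - np.log(0.03))  # weak roughness prior
    return np.concatenate([mass, prior])

fit = least_squares(residuals, x0=[300.0, 300.0, 0.03, 0.03],
                    bounds=([1, 1, 0.01, 0.01], [5000, 5000, 0.2, 0.2]))
A0_est, n_est = fit.x[:2], fit.x[2:]
Q_est = manning_q(A0_est[0], n_est[0], dA[0], W[0], S[0])
print("A0:", np.round(A0_est, 1), " n:", np.round(n_est, 4))
print(f"median |Q error|: {np.median(np.abs(Q_est / Q_true - 1)) * 100:.1f}%")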
TH-A-18C-04: Ultrafast Cone-Beam CT Scatter Correction with GPU-Based Monte Carlo Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Y; Southern Medical University, Guangzhou; Bai, T
2014-06-15
Purpose: Scatter artifacts severely degrade image quality of cone-beam CT (CBCT). We present an ultrafast scatter correction framework using GPU-based Monte Carlo (MC) simulation and a prior patient CT image, aiming to finish the whole process, including both scatter correction and reconstruction, automatically within 30 seconds. Methods: The method consists of six steps: 1) FDK reconstruction using raw projection data; 2) rigid registration of the planning CT to the FDK results; 3) MC scatter calculation at sparse view angles using the planning CT; 4) interpolation of the calculated scatter signals to other angles; 5) removal of scatter from the raw projections; 6) FDK reconstruction using the scatter-corrected projections. In addition to using the GPU to accelerate MC photon simulations, we also use a small number of photons and a down-sampled CT image in the simulation to further reduce computation time. A novel denoising algorithm is used to eliminate MC scatter noise caused by low photon numbers. The method is validated on head-and-neck cases with simulated and clinical data. Results: We have studied the impact of photon histories and volume down-sampling factors on the accuracy of scatter estimation. A Fourier analysis was conducted to show that scatter images calculated at 31 angles are sufficient to restore those at all angles with <0.1% error. For the simulated case with a resolution of 512×512×100, we simulated 10M photons per angle. The total computation time is 23.77 seconds on an Nvidia GTX Titan GPU. The scatter-induced shading/cupping artifacts are substantially reduced, and the average HU error of a region-of-interest is reduced from 75.9 to 19.0 HU. Similar results were found for a real patient case. Conclusion: A practical ultrafast MC-based CBCT scatter correction scheme is developed. The whole process of scatter correction and reconstruction is accomplished within 30 seconds. This study is supported in part by NIH (1R01CA154747-01), The Core Technology Research in Strategic Emerging Industry, Guangdong, China (2011A081402003)
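Step 4 of this workflow rests on scatter varying slowly with gantry angle. The runnable toy below, with a synthetic smooth signal standing in for the MC scatter estimates (both the signal and the angular sampling are assumptions, so the printed error will not match the paper's <0.1% figure), shows how estimates at ~30 sparse angles can be interpolated to all 360 views.

import numpy as np

angles = np.linspace(0, 2 * np.pi, 360, endpoint=False)
# Smooth, low-frequency angular dependence standing in for MC scatter:
scatter = 100 + 20 * np.sin(angles) + 5 * np.cos(3 * angles)

sparse_idx = np.arange(0, 360, 12)         # 30 sparse MC view angles
recovered = np.interp(angles, angles[sparse_idx], scatter[sparse_idx],
                      period=2 * np.pi)    # periodic interpolation to all views

err = np.abs(recovered - scatter) / scatter.max()
print(f"max interpolation error: {err.max() * 100:.3f}%")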
Bhaskaran, Abhishek; Barry, M A Tony; Al Raisi, Sara I; Chik, William; Nguyen, Doan Trang; Pouliopoulos, Jim; Nalliah, Chrishan; Hendricks, Roger; Thomas, Stuart; McEwan, Alistair L; Kovoor, Pramesh; Thiagalingam, Aravinda
2015-10-01
Magnetic navigation system (MNS) ablation was suspected to be less effective and unstable in highly mobile cardiac regions compared to radiofrequency (RF) ablations with manual control (MC). The aim of the study was to compare the (1) lesion size and (2) stability of MNS versus MC during irrigated RF ablation with and without simulated mechanical heart wall motion. In a previously validated myocardial phantom, the performance of the Navistar RMT Thermocool catheter (Biosense Webster, CA, USA) guided with MNS was compared to the manually controlled Navistar irrigated Thermocool catheter (Biosense Webster, CA, USA). The lesion dimensions were compared with the catheter in inferior and superior orientation, with and without 6-mm simulated wall motion. All ablations were performed with 40 W power and 30 ml/min irrigation for 60 s. A total of 60 ablations were performed. The mean lesion volumes with MNS and MC were 57.5 ± 7.1 and 58.1 ± 7.1 mm³, respectively, in the inferior catheter orientation (n = 23, p = 0.6), and 62.8 ± 9.9 and 64.6 ± 7.6 mm³, respectively, in the superior catheter orientation (n = 16, p = 0.9). With 6-mm simulated wall motion, the mean lesion volumes with MNS and MC were 60.2 ± 2.7 and 42.8 ± 8.4 mm³, respectively, in the inferior catheter orientation (n = 11, p < 0.01), and 74.1 ± 5.8 and 54.2 ± 3.7 mm³, respectively, in the superior catheter orientation (n = 10, p < 0.01). During 6-mm simulated wall motion, the MC catheter and MNS catheter moved 5.2 ± 0.1 and 0 mm, respectively, in the inferior orientation, and 5.5 ± 0.1 and 0 mm, respectively, in the superior orientation on the ablation surface. The lesion dimensions were larger with MNS compared to MC in the presence of simulated wall motion, consistent with greater catheter stability. However, similar lesion dimensions were observed in the stationary model.
Wan Chan Tseung, H; Ma, J; Beltran, C
2015-06-01
Very fast Monte Carlo (MC) simulations of proton transport have been implemented recently on graphics processing units (GPUs). However, these MCs usually use simplified models for nonelastic proton-nucleus interactions. Our primary goal is to build a GPU-based proton transport MC with detailed modeling of elastic and nonelastic proton-nucleus collisions. Using the CUDA framework, the authors implemented GPU kernels for the following tasks: (1) simulation of beam spots from our possible scanning nozzle configurations, (2) proton propagation through CT geometry, taking into account nuclear elastic scattering, multiple scattering, and energy loss straggling, (3) modeling of the intranuclear cascade stage of nonelastic interactions when they occur, (4) simulation of nuclear evaporation, and (5) statistical error estimates on the dose. To validate our MC, the authors performed (1) secondary particle yield calculations in proton collisions with therapeutically relevant nuclei, (2) dose calculations in homogeneous phantoms, (3) recalculations of complex head and neck treatment plans from a commercially available treatment planning system, and compared with Geant4.9.6p2/TOPAS. Yields, energy, and angular distributions of secondaries from nonelastic collisions on various nuclei are in good agreement with the Geant4.9.6p2 Bertini and Binary cascade models. The 3D-gamma pass rate at 2%-2 mm for treatment plan simulations is typically 98%. The net computational time on a NVIDIA GTX680 card, including all CPU-GPU data transfers, is ∼20 s for 1 × 10⁷ proton histories. Our GPU-based MC is the first of its kind to include a detailed nuclear model to handle nonelastic interactions of protons with any nucleus. Dosimetric calculations are in very good agreement with Geant4.9.6p2/TOPAS. Our MC is being integrated into a framework to perform fast routine clinical QA of pencil-beam based treatment plans, and is being used as the dose calculation engine in a clinically applicable MC-based IMPT treatment planning system. The detailed nuclear modeling will allow us to perform very fast linear energy transfer and neutron dose estimates on the GPU.
A novel Monte Carlo algorithm for simulating crystals with McStas
NASA Astrophysics Data System (ADS)
Alianelli, L.; Sánchez del Río, M.; Felici, R.; Andersen, K. H.; Farhi, E.
2004-07-01
We developed an original Monte Carlo algorithm for the simulation of Bragg diffraction by mosaic, bent and gradient crystals. It has practical applications, as it can be used for simulating imperfect crystals (monochromators, analyzers and perhaps samples) in neutron ray-tracing packages like McStas. The code we describe here provides a detailed description of the particle interaction with the microscopic homogeneous regions composing the crystal; therefore it can also be used for calculating quantities of conceptual interest, such as multiple scattering, or for interpreting experiments aimed at characterizing crystals, such as diffraction topographs.
NASA Astrophysics Data System (ADS)
Kamal Chowdhury, AFM; Lockart, Natalie; Willgoose, Garry; Kuczera, George; Kiem, Anthony; Parana Manage, Nadeeka
2016-04-01
Stochastic simulation of rainfall is often required in the simulation of streamflow and reservoir levels for water security assessment. As reservoir water levels generally vary on monthly to multi-year timescales, it is important that these rainfall series accurately simulate the multi-year variability. However, the underestimation of multi-year variability is a well-known issue in daily rainfall simulation. Focusing on this issue, we developed a hierarchical Markov Chain (MC) model in a traditional two-part MC-Gamma distribution modelling structure, but with a new parameterization technique. We used two parameters of a first-order MC process (transition probabilities of wet-to-wet and dry-to-dry days) to simulate the wet and dry days, and two parameters of a Gamma distribution (mean and standard deviation of wet-day rainfall) to simulate wet-day rainfall depths. We found that use of deterministic Gamma parameter values results in underestimation of the multi-year variability of rainfall depths. Therefore, we calculated the Gamma parameters for each month of each year from the observed data. Then, for each month, we fitted a multivariate normal distribution to the calculated Gamma parameter values. In the model, we stochastically sampled the two Gamma parameters from the multivariate normal distribution for each month of each year and used them to generate rainfall depths on wet days using the Gamma distribution. In an earlier study, Mehrotra and Sharma (2007) proposed a semi-parametric Markov model. They also used a first-order MC process for rainfall occurrence simulation, but the MC parameters were modified by an additional factor to incorporate multi-year variability. Generally, the additional factor is analytically derived from the rainfall over pre-specified past periods (e.g. the last 30, 180, or 360 days). They used a non-parametric kernel density process to simulate the wet-day rainfall depths. In this study, we have compared the performance of our hierarchical MC model with the semi-parametric model in preserving rainfall variability at daily, monthly, and multi-year scales. To calibrate the parameters of both models and assess their ability to preserve observed statistics, we used ground-based data from 15 raingauge stations around Australia, which cover a wide range of climate zones including coastal, monsoonal, and arid climate characteristics. In preliminary results, both models show comparable performance in preserving the multi-year variability of rainfall depth and occurrence. However, the semi-parametric model shows a tendency to overestimate the mean rainfall depth, while our model shows a tendency to overestimate the number of wet days. We will discuss further the relative merits of both models for hydrology simulation in the presentation.
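A minimal sketch of the hierarchical generator described above: a first-order Markov chain for occurrence, with the Gamma parameters of wet-day depths redrawn each simulated month from a multivariate normal fitted to year-to-year parameter estimates. All parameter values, and the choice to sample in log space to keep the Gamma parameters positive, are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(3)

# Illustrative January parameters (in practice estimated per month from data):
p_ww, p_dd = 0.55, 0.80                        # wet-to-wet, dry-to-dry probabilities
mu_log = np.array([np.log(8.0), np.log(9.0)])  # log(mean), log(sd) of wet-day depth
cov_log = np.array([[0.04, 0.03],
                    [0.03, 0.06]])             # year-to-year covariance (log space)

def simulate_month(n_days=31, wet=False):
    # Hierarchical step: draw this month's Gamma parameters.
    mean, sd = np.exp(rng.multivariate_normal(mu_log, cov_log))
    shape, scale = (mean / sd) ** 2, sd ** 2 / mean
    rain = np.zeros(n_days)
    for d in range(n_days):
        wet = rng.random() < (p_ww if wet else 1.0 - p_dd)   # occurrence chain
        if wet:
            rain[d] = rng.gamma(shape, scale)                # wet-day depth
    return rain

# Fifty simulated "Januaries": redrawing the parameters each month inflates the
# year-to-year spread of monthly totals relative to fixed Gamma parameters.
totals = np.array([simulate_month().sum() for _ in range(50)])
print(f"monthly total: mean {totals.mean():.1f} mm, sd {totals.std():.1f} mm")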
Parallel processing implementation for the coupled transport of photons and electrons using OpenMP
NASA Astrophysics Data System (ADS)
Doerner, Edgardo
2016-05-01
In this work the use of OpenMP to implement parallel processing of the Monte Carlo (MC) simulation of coupled photon-electron transport is presented. This implementation was carried out using a modified EGSnrc platform that enables use of the Microsoft Visual Studio 2013 (VS2013) environment, together with the development tools available in Intel Parallel Studio XE 2015 (XE2015). The performance study of this new implementation was carried out on a desktop PC with a multi-core CPU, taking the performance of the original platform as a reference. The results were satisfactory in terms of both scalability and parallelization efficiency.
SU-G-JeP2-15: Proton Beam Behavior in the Presence of Realistic Magnet Fields
DOE Office of Scientific and Technical Information (OSTI.GOV)
Santos, D M; Wachowicz, K; Fallone, B G
2016-06-15
Purpose: To investigate the effects of magnetic fields on proton therapy beams for integration with MRI. Methods: 3D magnetic fields from an open-bore superconducting MRI model (previously developed by our group) and 3D magnetic fields from an in-house gradient coil design were applied to various monoenergetic proton pencil beam (80 MeV to 250 MeV) simulations. In all simulations, the z-axis of the simulation geometry coincided with the direction of the B0 field and the magnet isocentre. In each simulation, the initial beam trajectory was varied. The first set of simulations was based on analytic magnetic force equations (analytic simulations), which could be rapidly calculated yet were limited to propagating proton beams in vacuum. The second set is full Monte Carlo (MC) simulations, which used the GEANT4 MC toolkit. Metrics such as the beam position and dose profiles were extracted. Comparisons between the cases with and without magnetic fields present were made. Results: The analytic simulations served as verification checks for the MC simulations when the same simulation geometries were used. The results of the analytic simulations agreed with the MC simulations performed in vacuum. The presence of the MRI’s static magnetic field causes proton pencil beams to follow a slightly helical trajectory when there are initial off-axis components. The 80 MeV, 150 MeV, and 250 MeV proton beams rotated by 4.9°, 3.6°, and 2.8°, respectively, when they reached z = 0 cm. The deflections caused by the gradient coils’ magnetic fields show spatially invariant patterns with a maximum range of 0.5 mm at z = 0 cm. Conclusion: This investigation reveals that both the MRI’s B0 and gradient magnetic fields can cause small but observable deflections of proton beams at the energies studied. The MRI’s static field caused a rotation of the beam while the gradient coils’ field effects were spatially invariant. Dr. B Gino Fallone is a co-founder and CEO of MagnetTx Oncology Solutions (under discussions to license Alberta bi-planar linac MR for commercialization)
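The helical deflection can be reproduced with a few lines of relativistic particle tracking. The sketch below uses the standard Boris pusher in a uniform 1.5 T field over a 30 cm path; both the field and the path are assumptions for illustration (the paper used realistic 3D MRI and gradient-coil field maps), so the deflection angle printed will not match the values quoted above.

import numpy as np

c = 2.99792458e8        # m/s
m_p = 1.67262192e-27    # kg
q_e = 1.60217663e-19    # C
mc2_mev = 938.272       # proton rest energy, MeV

def boris_track(ke_mev, B, dirn, path_len, dt=1e-12):
    gamma = 1.0 + ke_mev / mc2_mev
    v = c * np.sqrt(1.0 - 1.0 / gamma**2) * dirn / np.linalg.norm(dirn)
    x, s = np.zeros(3), 0.0
    while s < path_len:
        # Boris rotation (E = 0): rotate v about B; |v| and gamma stay fixed.
        t = q_e * B * dt / (2.0 * gamma * m_p)
        v1 = v + np.cross(v, t)
        v = v + 2.0 / (1.0 + t @ t) * np.cross(v1, t)
        x += v * dt
        s += np.linalg.norm(v) * dt
    return x, v

B = np.array([0.0, 0.0, 1.5])                  # uniform B0 along z (assumed)
x, v = boris_track(80.0, B, np.array([0.02, 0.0, 1.0]), 0.30)
phi = np.degrees(np.arctan2(v[1], v[0]))       # rotation of transverse velocity
print(f"exit position {np.round(x * 1e3, 2)} mm, transverse rotation {phi:.1f} deg")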
Three Dimensional Spherical Display Systems and McIDAS: Tools for Science, Education and Outreach
NASA Astrophysics Data System (ADS)
Kohrs, R.; Mooney, M. E.
2010-12-01
The Space Science and Engineering Center (SSEC) and Cooperative Institute for Meteorological Satellite Studies (CIMSS) at the University of Wisconsin are now using a 3D spherical display system and their Man computer Data Access System (McIDAS)-X and McIDAS-V as outreach tools to demonstrate how scientists and forecasters utilize satellite imagery to monitor weather and climate. Our outreach program displays orbits and data coverage of geostationary and polar satellites and demonstrates how each is beneficial for the remote sensing of Earth. Global composites of visible, infrared and water vapor images illustrate how satellite instruments collect data from different bands of the electromagnetic spectrum to monitor global weather patterns 24 hours a day. Captivating animations on spherical display systems are proving to be much more intuitive than traditional 2D displays, enabling audiences to view satellites orbiting above real-time weather systems circulating the entire globe. Complementing the 3D spherical display system are the UNIX-based McIDAS-X and Java-based McIDAS-V software packages. McIDAS is used to composite the real-time global satellite data and create other weather-related derived products. Client and server techniques used by these software packages provide the opportunity to continually update the real-time content on our globe. The enhanced functionality of McIDAS-V extends our outreach program by allowing in-depth interactive 4-dimensional views of the imagery previously viewed on the 3D spherical display system. An important goal of our outreach program is the promotion of remote sensing research and technology at SSEC and CIMSS. The 3D spherical display system has quickly become a popular tool to convey societal benefits of these endeavors. Audiences of all ages instinctively relate to recent weather events, which keeps them engaged in spherical display presentations. McIDAS facilitates further exploration of the science behind the weather phenomena. Audience feedback fuels the collaborative efforts of outreach specialists and computer programmers, which provides continuous evolution of the 3D displays and McIDAS. This iterative presentation strategy is proving to be beneficial to our outreach program as seen by the success of our workshops, educational lectures and temporary exhibits at high-visibility venues such as Madison Children’s Museum, the Milwaukee Public Museum and EAA AirVenture Museum.
3D Spherical Display System and McIDAS-V depiction of Hurricane Wilma
Numerical Simulation on a Possible Formation Mechanism of Interplanetary Magnetic Cloud Boundaries
NASA Astrophysics Data System (ADS)
Fan, Quan-Lin; Wei, Feng-Si; Feng, Xue-Shang
2003-08-01
The formation mechanism of interplanetary magnetic cloud (MC) boundaries is numerically investigated by simulating the interactions between an MC of some initial momentum and a local interplanetary current sheet. The compressible 2.5D MHD equations are solved. Results show that the magnetic reconnection process is a possible formation mechanism when an MC interacts with a surrounding current sheet. A number of interesting features are found. For instance, the front boundary of the MC is a magnetic reconnection boundary that could be caused by driven reconnection ahead of the cloud, and the tail boundary might be caused by the driving of the entrained flow as a result of the Bernoulli principle. Analysis of the magnetic field and plasma data demonstrates that at these two boundaries there appear large values of the plasma parameter β, a clear increase of plasma temperature and density, a distinct decrease of magnetic field magnitude, and a transition of magnetic field direction of about 180 degrees. The outcome of the present simulation agrees qualitatively with the observational results on MC boundaries inferred from IMP-8 and other spacecraft. The project was supported by the National Natural Science Foundation of China under Grant Nos. 40104006, 49925412, and 49990450.
Cho, Jongmin; Gonzalez-Lepera, Carlos; Manohar, Nivedh; Kerr, Matthew; Krishnan, Sunil; Cho, Sang Hyun
2016-03-21
Some investigators have shown tumor cell killing enhancement in vitro and tumor regression in mice associated with the loading of gold nanoparticles (GNPs) before proton treatments. Several Monte Carlo (MC) investigations have also demonstrated GNP-mediated proton dose enhancement. However, further studies need to be done to quantify the individual physical factors that contribute to the dose enhancement or cell-kill enhancement (or radiosensitization). Thus, the current study investigated the contributions of particle-induced x-ray emission (PIXE), particle-induced gamma-ray emission (PIGE), Auger and secondary electrons, and activation products towards the total dose enhancement. Specifically, GNP-mediated dose enhancement was measured using strips of radiochromic film that were inserted into vials of cylindrical GNPs, i.e. gold nanorods (GNRs), dispersed in a saline solution (0.3 mg of GNRs/g or 0.03% of GNRs by weight), as well as vials containing water only, before proton irradiation. MC simulations were also performed with the TOPAS (Tool for Particle Simulation) code using the film measurement setup. Additionally, a high-purity germanium detector system was used to measure the photon spectrum originating from activation products created from the interaction of protons and spherical GNPs present in a saline solution (20 mg of GNPs/g or 2% of GNPs by weight). The dose enhancement due to PIXE/PIGE recorded on the films in the GNR-loaded saline solution was less than the experimental uncertainty of the film dosimetry (<2%). MC simulations showed highly localized dose enhancement (up to a factor of 17) in the immediate vicinity (<100 nm) of GNRs, compared with hypothetical water nanorods (WNRs), mostly due to GNR-originated Auger/secondary electrons; however, the average dose enhancement over the entire GNR-loaded vial was found to be minimal (0.1%). The dose enhancement due to the activation products from GNPs was minimal (<0.1%) as well. In conclusion, under the currently investigated conditions that are considered clinically relevant, PIXE, PIGE, and activation products contribute minimally to GNP/GNR-mediated proton dose enhancement, whereas Auger/secondary electrons contribute significantly but only at short distances (<100 nm) from GNPs/GNRs.
NASA Astrophysics Data System (ADS)
Tian, Jian
With the recently-developed particle-resolved model PartMC-MOSAIC, the mixing state and other physico-chemical properties of individual aerosol particles can be tracked as the particles undergo aerosol aging processes. However, existing PartMC-MOSAIC applications have mainly been based on idealized scenarios, and a link to real atmospheric measurement has not yet been established. In this thesis, we extend the capability of PartMC-MOSAIC and apply the model framework to three distinct scenarios with different environmental conditions to investigate the physical and chemical aging of aerosols in those environments. The first study is to investigate the evolution of particle mixing state and cloud condensation nuclei (CCN) activation properties in a ship plume. Comparisons of our results with observations from the QUANTIFY Study in 2007 in the English channel and the Gulf of Biscay showed that the model was able to reproduce the observed evolution of total number concentration and the vanishing of the nucleation mode consisting of sulfate particles. Further process analysis revealed that during the first hour after emission, dilution reduced the total number concentration by four orders of magnitude, while coagulation reduced it by an additional order of magnitude. Neglecting coagulation resulted in an overprediction of more than one order of magnitude in the number concentration of particles smaller than 40 nm at a plume age of 100 s. Coagulation also significantly altered the mixing state of the particles, leading to a continuum of internal mixtures of sulfate and black carbon. The impact of condensation on CCN concentrations depended on the supersaturation threshold at which CCN activity was evaluated. Nucleation was observed to have a limited impact on the CCN concentration in the ship plume we studied, but was sensitive to formation rates of secondary aerosol. For the second study we adapted PartMC to represent the aerosol evolution in an aerosol chamber, with the intention to use the model as a tool to interpret and guide chamber experiments in the future. We added chamber-specific processes to our model formulation such as wall loss due to particle diffusion and sedimentation, and dilution effects due to sampling. We also implemented a treatment of fractal particles to account for the morphology of agglomerates and its impact on aerosol dynamics. We verified the model with published results of self-similar size distributions, and validated the model using experimental data from an aerosol chamber. To this end we developed a fitting optimization approach to determine the best-estimate values for the wall loss parameters based on minimizing the l2-norm of the model errors of the number distribution. Obtaining the best fit required taking into account the non-spherical structure of the particle agglomerates. Our third study focuses on the implementation of volatility basis set (VBS) framework in PartMC-MOSAIC to investigate the chemical aging of organic aerosols in the atmosphere. The updated PartMC-MOSAIC model framework was used to simulate the evolution of aerosols in air trajectories initialized from CARES field campaign conducted in California in June 2010. The simulation results were compared with aircraft measurement data during the campaign. PartMC-MOSAIC was able to produce gas and aerosol concentrations at similar levels compared to the observation data. Moreover, the simulation with VBS enabled produced consistently more secondary organic aerosols (SOA). 
The investigation of particle mixing state revealed that the impact of the VBS framework on particle mixing state is sensitive to the daylight exposure time.
Radiation Measurements in Simulated Ablation Layers
2010-12-06
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yeo, Sang Chul; Lee, Hyuck Mo, E-mail: hmlee@kaist.ac.kr; Lo, Yu Chieh
2014-10-07
Ammonia (NH₃) nitridation on an Fe surface was studied by combining density functional theory (DFT) and kinetic Monte Carlo (kMC) calculations. A DFT calculation was performed to obtain the energy barriers (E_b) of the relevant elementary processes. The full mechanism of the exact reaction path was divided into five steps (adsorption, dissociation, surface migration, penetration, and diffusion) on an Fe (100) surface pre-covered with nitrogen. The energy barrier (E_b) depended on the N surface coverage. The DFT results were subsequently employed as a database for the kMC simulations. We then evaluated the NH₃ nitridation rate on the N pre-covered Fe surface. To determine the conditions necessary for a rapid NH₃ nitridation rate, eight reaction events were considered in the kMC simulations: adsorption, desorption, dissociation, reverse dissociation, surface migration, penetration, reverse penetration, and diffusion. This study provides a real-time-scale simulation of NH₃ nitridation influenced by nitrogen surface coverage that allowed us to theoretically determine a nitrogen coverage (0.56 ML) suitable for rapid NH₃ nitridation. In this way, we were able to reveal the coverage dependence of the nitridation reaction using the combined DFT and kMC simulations.
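A minimal rejection-free kMC sketch in the spirit of the coupled DFT/kMC scheme above: each event class gets an Arrhenius rate r = ν·exp(−E_b/(kB·T)), one event is chosen with probability proportional to its rate, and the clock advances by an exponential waiting time. The barrier values and prefactor below are placeholders, not the coverage-dependent DFT results of the paper.

import numpy as np

rng = np.random.default_rng(4)
kB = 8.617333e-5   # eV/K
T = 800.0          # K
nu = 1.0e13        # attempt frequency in 1/s (assumed typical prefactor)

# Placeholder barriers (eV) for event classes named in the abstract:
events = {"adsorption": 0.10, "desorption": 1.20, "dissociation": 0.90,
          "migration": 0.45, "penetration": 1.00, "diffusion": 0.70}

names = list(events)
rates = np.array([nu * np.exp(-eb / (kB * T)) for eb in events.values()])
total = rates.sum()
p = rates / total

t, counts = 0.0, dict.fromkeys(names, 0)
for _ in range(100_000):
    i = rng.choice(len(names), p=p)       # select event proportionally to rate
    counts[names[i]] += 1
    t += rng.exponential(1.0 / total)     # advance physical (real-scale) time

print(f"simulated time: {t:.3e} s")
print(counts)

In a full simulation the rates would be recomputed after every event, since the barriers depend on the nitrogen coverage; here they are fixed, which reduces the loop to a demonstration of the event-selection and time-advance machinery.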
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthew Ellis; Derek Gaston; Benoit Forget
In recent years the use of Monte Carlo methods for modeling reactors has become feasible due to the increasing availability of massively parallel computer systems. One of the primary challenges yet to be fully resolved, however, is the efficient and accurate inclusion of multiphysics feedback in Monte Carlo simulations. The research in this paper presents a preliminary coupling of the open source Monte Carlo code OpenMC with the open source Multiphysics Object-Oriented Simulation Environment (MOOSE). The coupling of OpenMC and MOOSE will be used to investigate efficient and accurate numerical methods needed to include multiphysics feedback in Monte Carlo codes. An investigation into the sensitivity of Doppler feedback to fuel temperature approximations using a two-dimensional 17x17 PWR fuel assembly is presented in this paper. The results show a functioning multiphysics coupling between OpenMC and MOOSE. The coupling utilizes Functional Expansion Tallies to accurately and efficiently transfer pin power distributions tallied in OpenMC to the unstructured finite element meshes used in MOOSE. The two-dimensional PWR fuel assembly case also demonstrates that, for a simplified model, the pin-by-pin Doppler feedback can be adequately replicated by scaling a representative pin based on pin relative powers.
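A minimal sketch of the functional expansion tally idea: Legendre coefficients of a normalized power shape are estimated from sampled collision sites, then the continuous shape is evaluated at arbitrary mesh nodes. The "true" shape and the rejection-sampled tally hits are synthetic stand-ins for OpenMC tallies.

import numpy as np
from numpy.polynomial import legendre as L

rng = np.random.default_rng(5)

# Synthetic normalized power density on x in [-1, 1] (integrates to 1):
true_pdf = lambda x: (1.0 + np.cos(np.pi * x / 1.2)) / 2.382
# Rejection-sample "tally hit" positions from it:
xs = []
while len(xs) < 50_000:
    x, u = rng.uniform(-1, 1), rng.uniform(0, 1)
    if u < true_pdf(x) / true_pdf(0):
        xs.append(x)
xs = np.array(xs)

# FET: c_n = (2n + 1)/2 * E[P_n(X)], estimated from the samples.
order = 6
coeffs = np.zeros(order + 1)
for n in range(order + 1):
    basis = np.eye(order + 1)[n]              # selects P_n inside legval
    coeffs[n] = (2 * n + 1) / 2.0 * L.legval(xs, basis).mean()

mesh = np.linspace(-1, 1, 9)                  # stand-in finite element nodes
print("reconstructed:", np.round(L.legval(mesh, coeffs), 3))
print("true:         ", np.round(true_pdf(mesh), 3))

Because the transfer is a handful of coefficients rather than a fine histogram, the expansion keeps statistical noise low while letting the receiving code evaluate the shape anywhere on its unstructured mesh.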
New simulation model of multicomponent crystal growth and inhibition.
Wathen, Brent; Kuiper, Michael; Walker, Virginia; Jia, Zongchao
2004-04-02
We review a novel computational model for the study of crystal structures both on their own and in conjunction with inhibitor molecules. The model advances existing Monte Carlo (MC) simulation techniques by extending them from modeling 3D crystal surface patches to modeling entire 3D crystals, and by including the use of "complex" multicomponent molecules within the simulations. These advances make it possible to incorporate the 3D shape and non-uniform surface properties of inhibitors into simulations, and to study what effect these inhibitor properties have on the growth of whole crystals containing up to tens of millions of molecules. The application of this extended MC model to the study of antifreeze proteins (AFPs) and their effects on ice formation is reported, including the success of the technique in achieving AFP-induced ice-growth inhibition with concurrent changes to ice morphology that mimic experimental results. Simulations of ice-growth inhibition suggest that the degree of inhibition afforded by an AFP is a function of its ice-binding position relative to the underlying anisotropic growth pattern of ice. This extended MC technique is applicable to other crystal and crystal-inhibitor systems, including more complex crystal systems such as clathrates.
An adaptive bias - hybrid MD/kMC algorithm for protein folding and aggregation.
Peter, Emanuel K; Shea, Joan-Emma
2017-07-05
In this paper, we present a novel hybrid Molecular Dynamics/kinetic Monte Carlo (MD/kMC) algorithm and apply it to protein folding and aggregation in explicit solvent. The new algorithm uses a dynamical definition of biases throughout the MD component of the simulation, normalized in relation to the unbiased forces. The algorithm guarantees sampling of the underlying ensemble as a function of one average linear coupling factor ⟨α⟩_τ. We test the validity of the kinetics in simulations of dialanine and compare dihedral transition kinetics with long-time MD simulations. We find that for low ⟨α⟩_τ values, the kinetics are in good quantitative agreement. In folding simulations of TrpCage and TrpZip4 in explicit solvent, we also find good quantitative agreement with experimental results and prior MD/kMC simulations. Finally, we apply our algorithm to study growth of the Alzheimer amyloid Aβ16-22 fibril by monomer addition. We observe two possible binding modes, one at the extremity of the fibril (elongation) and one on the surface of the fibril (lateral growth), on timescales ranging from ns to 8 μs.
A new method for shape and texture classification of orthopedic wear nanoparticles.
Zhang, Dongning; Page, Janet R; Kavanaugh, Aaron E; Billi, Fabrizio
2012-09-27
Detailed morphologic analysis of particles produced during wear of orthopedic implants is important in determining a correlation among material, wear, and biological effects. However, the use of simple shape descriptors is insufficient to categorize the data and to compare the nature of wear particles generated by different implants. An approach based on the Discrete Fourier Transform (DFT) is presented for describing particle shape and surface texture. Four metal-on-metal bearing couples were tested in an orbital wear simulator under standard and adverse (steep-angled cups) wear simulator conditions. Digitized Scanning Electron Microscope (SEM) images of the wear particles were imported into MATLAB to carry out Fourier descriptor calculations via a specifically developed algorithm. The descriptors were then used for studying particle characteristics (shape and texture) as well as for cluster classification. Analysis of the particles demonstrated the validity of the proposed model by showing that steep-angle Co-Cr wear particles were more asymmetric, compressed, extended, triangular, square, and roughened after 3 Mc than after 0.25 Mc. In contrast, particles from standard-angle samples were only more compressed and extended after 3 Mc compared to 0.25 Mc. Cluster analysis revealed that the 0.25 Mc steep-angle particle distribution was a subset of the 3 Mc distribution.
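A minimal sketch of DFT-based shape/texture descriptors for a closed particle outline, using a synthetic contour in place of a segmented SEM boundary (the normalization choices below are the standard ones for Fourier descriptors, not necessarily the exact algorithm of the paper): the boundary is treated as a complex sequence, low-order FFT magnitudes describe gross shape, and high-order magnitudes capture surface texture.

import numpy as np

def fourier_descriptors(x, y):
    z = x + 1j * y                     # closed boundary as a complex sequence
    Z = np.fft.fft(z)
    Z[0] = 0.0                         # drop DC term: translation invariance
    mags = np.abs(Z) / np.abs(Z[1])    # magnitudes + |Z_1| normalization give
                                       # rotation, start-point and scale invariance
    shape = mags[2:10]                 # low-order harmonics: gross shape
    texture = mags[10:len(mags) // 2].sum()   # high-order harmonics: texture
    return shape, texture

theta = np.linspace(0, 2 * np.pi, 256, endpoint=False)
r_smooth = 1.0 + 0.3 * np.cos(2 * theta)           # smooth elongated outline
r_rough = r_smooth + 0.05 * np.cos(25 * theta)     # same shape, rough surface

for name, r in (("smooth", r_smooth), ("rough", r_rough)):
    shape, texture = fourier_descriptors(r * np.cos(theta), r * np.sin(theta))
    print(name, "shape:", np.round(shape, 3), "texture:", round(texture, 3))

The two outlines share nearly identical low-order (shape) descriptors but differ in the high-order (texture) sum, which is the kind of separation the cluster classification relies on.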
Ma, Yunzhi; Lacroix, Fréderic; Lavallée, Marie-Claude; Beaulieu, Luc
2015-01-01
To validate the Advanced Collapsed cone Engine (ACE) dose calculation engine of the Oncentra Brachy (OcB) treatment planning system using an ¹⁹²Ir source. Two levels of validation were performed, conformant to the model-based dose calculation algorithm commissioning guidelines of the American Association of Physicists in Medicine TG-186 report. Level 1 uses all-water phantoms, and the validation is against the TG-43 methodology. Level 2 uses real patient cases, and the validation is against Monte Carlo (MC) simulations. For each case, the ACE and TG-43 calculations were performed in the OcB treatment planning system. The ALGEBRA MC system was used to perform MC simulations. In Level 1, the ray effect depends on both the accuracy mode and the number of dwell positions. The volume fraction with dose error ≥2% quickly reduces from 23% (13%) for a single dwell to 3% (2%) for eight dwell positions in the standard (high) accuracy mode. In Level 2, the 10% and higher isodose lines were observed to overlap between ACE (both standard and high-resolution modes) and MC. Major clinical indices (V100, V150, V200, D90, D50, and D2cc) were investigated and validated against MC. For example, among the Level 2 cases, the maximum deviation in V100 of ACE from MC is 2.75%, but up to ~10% for TG-43. Similarly, the maximum deviation in D90 is 0.14 Gy between ACE and MC, but up to 0.24 Gy for TG-43. ACE demonstrated good agreement with MC in most clinically relevant regions in the cases tested. Departure from MC is significant for specific situations but limited to low-dose (<10% isodose) regions. Copyright © 2015 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
Improved importance sampling technique for efficient simulation of digital communication systems
NASA Technical Reports Server (NTRS)
Lu, Dingqing; Yao, Kung
1988-01-01
A new, improved importance sampling (IIS) approach to simulation is considered. Some basic concepts of IS are introduced, and detailed derivations of simulation estimator variances for Monte Carlo (MC) and IS simulations are given. The general results obtained from these derivations are applied to the specific, previously known conventional importance sampling (CIS) technique and the new IIS technique. The derivation for a linear system with no memory and no signals is considered in some detail. For the CIS technique, the optimum input scaling parameter is found, while for the IIS technique, the optimum translation parameter is found. The results are generalized to a linear system with memory and signals. Specific numerical and simulation results are given which show the advantages of CIS over MC and of IIS over CIS for simulations of digital communication systems.
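A minimal sketch contrasting plain MC with the two IS flavours named above, on the toy problem of estimating the tail probability p = P(X > 4) for X ~ N(0, 1): CIS scales the input variance, IIS translates the mean, and each sample is weighted by the likelihood ratio of the nominal to the biased density. The threshold and biasing parameters are illustrative.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(6)
n, thr = 100_000, 4.0
p_true = norm.sf(thr)

x = rng.standard_normal(n)                      # plain Monte Carlo
mc = (x > thr).astype(float)

sigma = 4.0                                     # CIS: sample from N(0, sigma^2)
x = sigma * rng.standard_normal(n)
cis = (x > thr) * norm.pdf(x) / norm.pdf(x, scale=sigma)

t = thr                                         # IIS: sample from N(t, 1)
x = t + rng.standard_normal(n)
iis = (x > thr) * norm.pdf(x) / norm.pdf(x, loc=t)

for name, est in (("MC ", mc), ("CIS", cis), ("IIS", iis)):
    rel = est.std() / max(est.mean(), 1e-300) / np.sqrt(n)
    print(f"{name}: {est.mean():.3e} (true {p_true:.3e}), rel. error {rel:.2e}")

Translating the sampling density into the rare-event region concentrates samples where the indicator is nonzero, which is why the IIS estimator's relative error falls orders of magnitude below plain MC at equal sample size.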
SU-F-T-50: Evaluation of Monte Carlo Simulations Performance for Pediatric Brachytherapy Dosimetry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chatzipapas, C; Kagadis, G; Papadimitroulas, P
Purpose: Pediatric tumors are generally treated with multi-modal procedures. Brachytherapy can be used with pediatric tumors, especially given that in this patient population low toxicity to normal tissues is critical, as is the suppression of the probability of late malignancies. Our goal is to validate the GATE toolkit on realistic brachytherapy applications, and to evaluate brachytherapy plans on pediatric patients for accurate dosimetry of sensitive and critical organs of interest. Methods: The GATE Monte Carlo (MC) toolkit was used. Two High Dose Rate (HDR) ¹⁹²Ir brachytherapy sources were simulated (Nucletron mHDR-v1 and Varian VS2000) and fully validated using the AAPM and ESTRO protocols. A realistic brachytherapy plan was also simulated using the XCAT anthropomorphic computational model. The simulated data were compared to the clinical dose points. Finally, a 14-year-old girl with vaginal rhabdomyosarcoma was modelled based on clinical procedures for the calculation of the absorbed dose per organ. Results: The MC simulations resulted in accurate dosimetry in terms of the dose rate constant (Λ), radial dose gL(r) and anisotropy function F(r,θ) for both sources. The simulations were executed using ∼10¹⁰ primaries, resulting in statistical uncertainties lower than 2%. The differences between the theoretical values and the simulated ones ranged from 0.01% up to 3.3%, with the largest discrepancy (6%) being observed in the dose rate constant calculation. The simulated DVH using an adult female XCAT model was also compared to a clinical one, resulting in differences smaller than 5%. Finally, a realistic pediatric brachytherapy simulation was performed to evaluate the absorbed dose per organ and to calculate DVHs with respect to the heterogeneities of the human anatomy. Conclusion: GATE is a reliable tool for brachytherapy simulations, both for source modeling and for dosimetry in anthropomorphic voxelized models. Our project aims to evaluate a variety of pediatric brachytherapy schemes using a population of pediatric phantoms for several pathological cases. This study is part of a project that has received funding from the European Union Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 691203. The results published in this study reflect only the authors' view, and the Research Executive Agency (REA) and the European Commission are not responsible for any use that may be made of the information it contains.
NASA Astrophysics Data System (ADS)
Limbu, Dil; Biswas, Parthapratim
We present a simple and efficient Monte Carlo (MC) simulation of iron (Fe) and nickel (Ni) clusters with N = 5-100 and of amorphous silicon (a-Si), starting from random configurations. Using Sutton-Chen and Finnis-Sinclair potentials for Ni (in an fcc lattice) and Fe (in a bcc lattice), respectively, and the Stillinger-Weber potential for a-Si, the total energy of the system is optimized by employing MC moves that combine the stochastic nature of MC simulations with the gradient of the potential function. For both iron and nickel clusters, the energy of the configurations is found to be very close to the values listed in the Cambridge Cluster Database, whereas the maximum force on each cluster is much lower than the corresponding value obtained from the optimized structural configurations reported in the database. An extension of the method to model the amorphous state of Si is presented and the results are compared with experimental data and with those obtained from other simulation methods. The work is partially supported by the NSF under Grant Number DMR 1507166.
Paracousti-UQ: A Stochastic 3-D Acoustic Wave Propagation Algorithm.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Preston, Leiph
Acoustic full waveform algorithms, such as Paracousti, provide deterministic solutions in complex, 3-D variable environments. In reality, environmental and source characteristics are often only known in a statistical sense. Thus, to fully characterize the expected sound levels within an environment, this uncertainty in environmental and source factors should be incorporated into the acoustic simulations. Performing Monte Carlo (MC) simulations is one method of assessing this uncertainty, but it can quickly become computationally intractable for realistic problems. An alternative method, using the technique of stochastic partial differential equations (SPDE), allows computation of the statistical properties of output signals at a fraction of the computational cost of MC. Paracousti-UQ solves the SPDE system of 3-D acoustic wave propagation equations and provides estimates of the uncertainty of the output simulated wave field (e.g., amplitudes, waveforms) based on estimated probability distributions of the input medium and source parameters. This report describes the derivation of the stochastic partial differential equations, their implementation, and comparison of Paracousti-UQ results with MC simulations using simple models.
NASA Astrophysics Data System (ADS)
Yonezawa, Yasushige; Shimoyama, Hiromitsu; Nakamura, Haruki
2011-01-01
Multicanonical molecular-dynamics (McMD) simulation and metadynamics (MetaD) are useful for obtaining free energies and can be mutually complementary. We combined McMD with MetaD and applied the scheme to conformational free energy calculations of a proline dipeptide. First, MetaD was performed along the dihedral angle at the prolyl bond, yielding a coarse biasing potential. After adding the biasing potential to the dihedral-angle potential energy, we conducted McMD with the modified potential energy. Enhanced sampling was achieved for all degrees of freedom, and sampling of the dihedral angle space was facilitated. After reweighting, we obtained an accurate free energy landscape.
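A minimal 1D sketch of the combined scheme, assuming a toy torsional double-well potential in place of the proline dipeptide: a coarse bias built along one "dihedral" coordinate (here simply the negated potential on a grid, standing in for the MetaD stage) is added to the energy, the modified surface is sampled by Metropolis MC, and reweighting recovers the original free energy profile.

import numpy as np

rng = np.random.default_rng(7)
kT = 1.0
V = lambda phi: 4.0 * np.cos(2.0 * phi)        # toy torsional double well

# Stage 1 stand-in: coarse grid bias ~ -V (MetaD would build this from
# deposited Gaussians along phi).
grid = np.linspace(-np.pi, np.pi, 73)
bias_grid = -V(grid)
bias = lambda phi: np.interp(phi, grid, bias_grid, period=2 * np.pi)

# Stage 2: Metropolis MC on the modified potential V + bias.
phi, samples = 0.0, []
for step in range(200_000):
    trial = (phi + rng.normal(0.0, 0.3) + np.pi) % (2 * np.pi) - np.pi
    dE = (V(trial) + bias(trial)) - (V(phi) + bias(phi))
    if dE <= 0.0 or rng.random() < np.exp(-dE / kT):
        phi = trial
    if step % 5 == 0:
        samples.append(phi)
samples = np.array(samples)

# Reweighting: P(phi) is the biased histogram times exp(+bias/kT).
hist, edges = np.histogram(samples, bins=36, range=(-np.pi, np.pi))
centers = 0.5 * (edges[:-1] + edges[1:])
P = hist * np.exp(bias(centers) / kT)
F = -kT * np.log(P / P.sum())                  # free energy up to a constant
print(np.round(F - F.min(), 2))                # should track V(phi) - min V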
Dosimetric quality control of Eclipse treatment planning system using pelvic digital test object
NASA Astrophysics Data System (ADS)
Benhdech, Yassine; Beaumont, Stéphane; Guédon, Jeanpierre; Crespin, Sylvain
2011-03-01
Last year, we demonstrated the feasibility of a new method to perform dosimetric quality control of treatment planning systems in radiotherapy; this method is based on Monte Carlo simulations and uses anatomical Digital Test Objects (DTOs). The pelvic DTO was used to assess this new method on an Eclipse (Varian) Treatment Planning System. Large dose variations were observed, particularly in air- and bone-equivalent materials. In the current work, we discuss the results of the previous paper and provide an explanation for the observed dose differences; the Varian Eclipse Anisotropic Analytical Algorithm (AAA) was investigated. Monte Carlo (MC) simulations were performed with the PENELOPE code (version 2003). To increase the efficiency of the MC simulations, we used our parallelized version based on the standard MPI (Message Passing Interface). The parallel code was run on a 32-processor SGI cluster. The study was carried out using the pelvic DTO for low- and high-energy photon beams (6 and 18 MV) on a Varian 2100CD linear accelerator. A square field (10 × 10 cm²) was used. Taking the MC data as reference, a χ-index analysis was carried out. For this study, the distance to agreement (DTA) was set to 7 mm and the dose difference to 5%, as recommended in TRS-430 and TG-53 (on the beam axis in 3-D inhomogeneities). When using Monte Carlo PENELOPE, the absorbed dose is computed to the medium, whereas the TPS computes dose to water. We used the method described by Siebers et al., based on Bragg-Gray cavity theory, to convert the MC-simulated dose to medium into dose to water. Results show strong consistency between Eclipse and MC calculations on the beam axis.
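The dose conversion mentioned above reduces, in the Bragg-Gray picture of Siebers et al., to multiplying the MC dose by the water-to-medium mean mass collision stopping power ratio. A minimal sketch follows; the ratio values are rough illustrative numbers, not the spectrum-averaged values one would compute for a specific beam.

# Illustrative water-to-medium stopping-power ratios for a megavoltage beam:
SPR_WATER_TO_MEDIUM = {"water": 1.00, "soft tissue": 1.01,
                       "lung": 1.00, "cortical bone": 1.11}

def dose_to_water(dose_to_medium_gy, medium):
    """Convert MC dose-to-medium to dose-to-water: D_w = D_m * s_(w,m)."""
    return dose_to_medium_gy * SPR_WATER_TO_MEDIUM[medium]

print(dose_to_water(2.00, "cortical bone"))   # -> 2.22 Gy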
Absolute dose calculations for Monte Carlo simulations of radiotherapy beams
NASA Astrophysics Data System (ADS)
Popescu, I. A.; Shaw, C. P.; Zavgorodni, S. F.; Beckham, W. A.
2005-07-01
Monte Carlo (MC) simulations have traditionally been used for single field relative comparisons with experimental data or commercial treatment planning systems (TPS). However, clinical treatment plans commonly involve more than one field. Since the contribution of each field must be accurately quantified, multiple field MC simulations are only possible by employing absolute dosimetry. Therefore, we have developed a rigorous calibration method that allows the incorporation of monitor units (MU) in MC simulations. This absolute dosimetry formalism can be easily implemented by any BEAMnrc/DOSXYZnrc user, and applies to any configuration of open and blocked fields, including intensity-modulated radiation therapy (IMRT) plans. Our approach involves the relationship between the dose scored in the monitor ionization chamber of a radiotherapy linear accelerator (linac), the number of initial particles incident on the target, and the field size. We found that for a 10 × 10 cm² field of a 6 MV photon beam, 1 MU corresponds, in our model, to 8.129 × 10¹³ ± 1.0% electrons incident on the target and a total dose of 20.87 cGy ± 1.0% in the monitor chambers of the virtual linac. We present an extensive experimental verification of our MC results for open and intensity-modulated fields, including a dynamic 7-field IMRT plan simulated on the CT data sets of a cylindrical phantom and of a Rando anthropomorphic phantom, which were validated by measurements using ionization chambers and thermoluminescent dosimeters (TLD). Our simulation results are in excellent agreement with experiment, with percentage differences of less than 2%, in general, demonstrating the accuracy of our Monte Carlo absolute dose calculations.
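The bookkeeping this formalism implies is short enough to show directly: a dose scored per incident source electron is converted to dose per MU using the calibration above. The per-particle dose in the example is a made-up illustrative number, not a value from the paper.

N_PER_MU = 8.129e13   # incident electrons per MU (paper's 6 MV, 10 x 10 cm2 calibration)

def dose_per_mu(mc_dose_per_particle_gy):
    """Absolute dose in Gy/MU from an MC dose scored per source particle."""
    return mc_dose_per_particle_gy * N_PER_MU

# e.g. an MC run scoring 1.23e-16 Gy per incident electron in some voxel:
print(f"{dose_per_mu(1.23e-16):.4f} Gy/MU")   # -> 0.0100 Gy/MU = 1.00 cGy/MU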
Textbook-Bundled Metacognitive Tools: A Study of LearnSmart's Efficacy in General Chemistry
ERIC Educational Resources Information Center
Thadani, Vandana; Bouvier-Brown, Nicole C.
2016-01-01
College textbook publishers increasingly bundle sophisticated technology-based study tools with their texts. These tools appear promising, but empirical work on their efficacy is needed. We examined whether LearnSmart, a study tool bundled with McGraw-Hill's textbook "Chemistry" (Chang & Goldsby, 2013), improved learning in an…
Turner, Andrew D; Waack, Julia; Lewis, Adam; Edwards, Christine; Lawton, Linda
2018-02-01
A simple, rapid UHPLC-MS/MS method has been developed and optimised for the quantitation of microcystins and nodularin in a wide variety of sample matrices. The microcystin analogues targeted were MC-LR, MC-RR, MC-LA, MC-LY, MC-LF, MC-LW, MC-YR, MC-WR, [Asp3] MC-LR, [Dha7] MC-LR, MC-HilR and MC-HtyR. Optimisation studies were conducted to develop a simple, quick and efficient extraction protocol without the need for complex pre-analysis concentration procedures, together with a rapid, sub-5-min chromatographic separation of toxins in shellfish and algal supplement tablet powders, as well as water and cyanobacterial bloom samples. Validation studies were undertaken on each matrix-analyte combination against the full method performance characteristics following international guidelines. The method was found to be specific and linear over the full calibration range. Method sensitivity in terms of limits of detection, quantitation and reporting was found to be significantly improved in comparison to LC-UV methods and applicable to the analysis of each of the four matrices. Overall, acceptable recoveries were determined for each of the matrices studied, with associated precision and within-laboratory reproducibility well within expected guidance limits. Results from the formalised ruggedness analysis of all available cyanotoxins showed that the method was robust for all parameters investigated. The results presented here show that the optimised LC-MS/MS method for cyanotoxins is fit for the purpose of detection and quantitation of a range of microcystins and nodularin in shellfish, algal supplement tablet powder, water and cyanobacteria. The method provides a valuable early-warning tool for the rapid, routine extraction and analysis of natural waters, cyanobacterial blooms, algal powders, food supplements and shellfish tissues, enabling monitoring labs to supplement traditional microscopy techniques and report toxicity results within a short timeframe of sample receipt. The new method, now accredited to the ISO17025 standard, is simple, quick, applicable to multiple matrices and highly suitable for use as a routine, high-throughput, fast-turnaround regulatory monitoring tool. Copyright © 2017 Elsevier B.V. All rights reserved.
An energy function for dynamics simulations of polypeptides in torsion angle space
NASA Astrophysics Data System (ADS)
Sartori, F.; Melchers, B.; Böttcher, H.; Knapp, E. W.
1998-05-01
Conventional simulation techniques for modeling the dynamics of proteins in atomic detail are restricted to short time scales. A simplified molecular description, in which high-frequency motions with small amplitudes are ignored, can overcome this problem. In this protein model only the backbone dihedrals φ and ψ and the χi of the side chains serve as degrees of freedom. Bond angles and lengths are fixed at ideal geometry values provided by the standard molecular dynamics (MD) energy function CHARMM. In this work a Monte Carlo (MC) algorithm is used whose elementary moves employ cooperative rotations in a small window of consecutive amide planes, leaving the polypeptide conformation outside of this window invariant. A single such window MC move generates only local conformational changes, but the application of many such moves at different parts of the polypeptide backbone leads to global conformational changes. To account for the lack of flexibility in the protein model employed, the energy function used to evaluate conformational energies is split into sequentially neighbored and sequentially distant contributions. The sequentially neighbored part is represented by an effective (φ,ψ)-torsion potential. It is derived from MD simulations of a flexible model dipeptide using a conventional MD energy function. To avoid exaggeration of hydrogen bonding strengths, the electrostatic interactions involving hydrogen atoms are scaled down at short distances. With these adjustments of the energy function, the rigid polypeptide model exhibits the same equilibrium distributions as obtained by conventional MD simulation with a fully flexible molecular model. The same temperature dependence of the stability and build-up of α helices of 18-alanine as found in MD simulations is also observed using the adapted energy function for MC simulations. Analyses of transition frequencies demonstrate that dynamical aspects of MD trajectories are also faithfully reproduced. Finally, it is demonstrated that even for high-temperature unfolded polypeptides the MC simulation is more efficient than conventional MD simulation by a factor of 10.
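For readers unfamiliar with such local-move samplers, a minimal sketch of a Metropolis step restricted to a window of consecutive dihedrals follows; the move generator and energy function are generic stand-ins, not the paper's cooperative amide-plane rotations.

    import math, random

    def metropolis_window_move(angles, energy, window, delta, beta):
        """Perturb dihedrals inside `window`; accept/reject by Metropolis."""
        trial = list(angles)
        for i in window:                  # touch only a few consecutive dihedrals
            trial[i] += random.uniform(-delta, delta)
        dE = energy(trial) - energy(angles)
        if dE <= 0.0 or random.random() < math.exp(-beta * dE):
            return trial                  # accepted: local conformational change
        return angles                     # rejected: conformation unchanged

    # toy usage: 10 dihedrals, quadratic toy energy, window of 3 angles
    angles = [0.0] * 10
    new = metropolis_window_move(angles, lambda a: sum(x * x for x in a),
                                 window=range(3, 6), delta=0.2, beta=2.0)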
Java-based Graphical User Interface for MAVERIC-II
NASA Technical Reports Server (NTRS)
Seo, Suk Jai
2005-01-01
A computer program entitled "Marshall Aerospace Vehicle Representation in C II (MAVERIC-II)" is a vehicle flight simulation program written primarily in the C programming language. It was written by James W. McCarter at NASA/Marshall Space Flight Center. The goal of the MAVERIC-II development effort is to provide a simulation tool that facilitates the rapid development of high-fidelity flight simulations for launch, orbital, and reentry vehicles of any user-defined configuration for all phases of flight. MAVERIC-II has been found invaluable in performing flight simulations for various Space Transportation Systems. The flexibility provided by MAVERIC-II has allowed several different launch vehicles, including the Saturn V, a Space Launch Initiative Two-Stage-to-Orbit concept and a Shuttle-derived launch vehicle, to be simulated during ascent and portions of on-orbit flight in an extremely efficient manner. It was found that MAVERIC-II provided the high-fidelity vehicle and flight environment models as well as the program modularity to allow efficient integration, modification and testing of advanced guidance and control algorithms. In addition to serving as an analysis tool for technology development, many researchers have found MAVERIC-II to be an efficient, powerful analysis tool for evaluating guidance, navigation, and control designs, vehicle robustness, and requirements. MAVERIC-II is currently designed to execute in a UNIX environment. The input to the program is composed of three segments: 1) the vehicle models, such as propulsion, aerodynamics, and guidance, navigation, and control; 2) the environment models, such as atmosphere and gravity; and 3) a simulation framework which is responsible for executing the vehicle and environment models, propagating the vehicle's states forward in time, and handling user input/output. MAVERIC users prepare data files for the above models and run the simulation program. They can see the output on screen and/or store it in files and examine the output data later. Users can also view the output stored in output files by calling a plotting program such as gnuplot. A typical scenario of the use of MAVERIC consists of three steps: editing existing input data files, running MAVERIC, and plotting output results.
Campbell, Bruce G.; Landmeyer, James E.
2014-01-01
Chesterfield County is located in the northeastern part of South Carolina along the southern border of North Carolina and is primarily underlain by unconsolidated sediments of Late Cretaceous age and younger of the Atlantic Coastal Plain. Approximately 20 percent of Chesterfield County is in the Piedmont Physiographic Province, and this area of the county is not included in this study. These Atlantic Coastal Plain sediments compose two productive aquifers: the Crouch Branch aquifer that is present at land surface across most of the county and the deeper, semi-confined McQueen Branch aquifer. Most of the potable water supplied to residents of Chesterfield County is produced from the Crouch Branch and McQueen Branch aquifers by a well field located near McBee, South Carolina, in the southwestern part of the county. Overall, groundwater availability is good to very good in most of Chesterfield County, especially the area around and to the south of McBee, South Carolina. The eastern part of Chesterfield County does not have as abundant groundwater resources but resources are generally adequate for domestic purposes. The primary purpose of this study was to determine groundwater-flow rates, flow directions, and changes in water budgets over time for the Crouch Branch and McQueen Branch aquifers in the Chesterfield County area. This goal was accomplished by using the U.S. Geological Survey finite-difference MODFLOW groundwater-flow code to construct and calibrate a groundwater-flow model of the Atlantic Coastal Plain of Chesterfield County. The model was created with a uniform grid size of 300 by 300 feet to facilitate a more accurate simulation of groundwater-surface-water interactions. The model consists of 617 rows from north to south extending about 35 miles and 884 columns from west to east extending about 50 miles, yielding a total area of about 1,750 square miles. However, the active part of the modeled area, or the part where groundwater flow is simulated, totaled about 1,117 square miles. Major types of data used as input to the model included groundwater levels, groundwater-use data, and hydrostratigraphic data, along with estimates and measurements of stream base flows made specifically for this study. The groundwater-flow model was calibrated to groundwater-level and stream base-flow conditions from 1900 to 2012 using 39 stress periods. The model was calibrated with an automated parameter-estimation approach using the computer program PEST, and the model used regularized inversion and pilot points. The groundwater-flow model was calibrated using field data that included groundwater levels that had been collected between 1940 and 2012 from 239 wells and base-flow measurements from 44 locations distributed within the study area. To better understand recharge and inter-aquifer interactions, seven wells were equipped with continuous groundwater-level recording equipment during the course of the study, between 2008 and 2012. These water levels were included in the model calibration process. The observed groundwater levels were compared to the simulated ones, and acceptable calibration fits were achieved. Root mean square error for the simulated groundwater levels compared to all observed groundwater levels was 9.3 feet for the Crouch Branch aquifer and 8.6 feet for the McQueen Branch aquifer. The calibrated groundwater-flow model was then used to calculate groundwater budgets for the entire study area and for two sub-areas. 
The sub-areas are the Alligator Rural Water and Sewer Company well field near McBee, South Carolina, and the Carolina Sandhills National Wildlife Refuge acquisition boundary area. For the overall model area, recharge rates vary from 56 to 1,679 million gallons per day (Mgal/d) with a mean of 737 Mgal/d over the simulation period (1900–2012). The simulated water budget for the streams and rivers varies from 653 to 1,127 Mgal/d with a mean of 944 Mgal/d. The simulated “storage-in term” ranges from 0 to 565 Mgal/d with a mean of 276 Mgal/d. The simulated “storage-out term” has a range of 0 to 552 Mgal/d with a mean of 77 Mgal/d. Groundwater budgets for the McBee, South Carolina, area and the Carolina Sandhills National Wildlife Refuge acquisition area had similar results. An analysis of the effects of past and current groundwater withdrawals on base flows in the McBee area indicated a negligible effect of pumping from the Alligator Rural Water and Sewer well field on local stream base flows. Simulated base flows for 2012 for selected streams in and around the McBee area were similar with and without simulated groundwater withdrawals from the well field. Removing all pumping from the model for the entire simulation period (1900–2012) produces a negligible difference in increased base flow for the selected streams. The 2012 flow for Lower Alligator Creek was 5.04 Mgal/d with the wells pumping and 5.08 Mgal/d without the wells pumping; this represents the largest difference in simulated flows for the six streams.
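The reported grid dimensions can be checked with a few lines of arithmetic (values taken directly from the text above):

    # Consistency check of the model dimensions reported above.
    FT_PER_MILE = 5280
    rows, cols, cell_ft = 617, 884, 300

    ns_miles = rows * cell_ft / FT_PER_MILE   # ~35.1 miles north-south
    we_miles = cols * cell_ft / FT_PER_MILE   # ~50.2 miles west-east
    print(round(ns_miles, 1), round(we_miles, 1), round(ns_miles * we_miles))
    # -> 35.1 50.2 1761  (text: about 35 miles, 50 miles, ~1,750 sq mi)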
Mukumoto, Nobutaka; Tsujii, Katsutomo; Saito, Susumu; Yasunaga, Masayoshi; Takegawa, Hideki; Yamamoto, Tokihiro; Numasaki, Hodaka; Teshima, Teruki
2009-10-01
To develop an infrastructure for the integrated Monte Carlo verification system (MCVS) to verify the accuracy of conventional dose calculations, which often fail to accurately predict dose distributions, mainly due to inhomogeneities in the patient's anatomy, for example, in lung and bone. The MCVS consists of a graphical user interface (GUI) based on a computational environment for radiotherapy research (CERR) written in the MATLAB language. The MCVS GUI acts as an interface between the MCVS and a commercial treatment planning system to import the treatment plan, create MC input files, and analyze MC output dose files. The MCVS uses the EGSnrc MC codes, which include EGSnrc/BEAMnrc to simulate the treatment head and EGSnrc/DOSXYZnrc to calculate the dose distributions in the patient/phantom. In order to improve computation time without approximations, an in-house cluster system was constructed. Phase-space data of a 6-MV photon beam from a Varian Clinac unit were generated and used to establish several benchmarks under homogeneous conditions. The MC results agreed with the ionization chamber measurements to within 1%. The MCVS GUI could import and display radiotherapy treatment plans created by the MC method and by various treatment planning systems, in formats such as RTOG and DICOM-RT. Dose distributions could be analyzed by using dose profiles and dose volume histograms and compared on the same platform. With the cluster system, calculation time improved in line with the increase in the number of central processing units (CPUs), at a computation efficiency of more than 98%. Development of the MCVS was successful for performing MC simulations and analyzing dose distributions.
NASA Astrophysics Data System (ADS)
Armaghani, Danial Jahed; Mahdiyar, Amir; Hasanipanah, Mahdi; Faradonbeh, Roohollah Shirani; Khandelwal, Manoj; Amnieh, Hassan Bakhshandeh
2016-09-01
Flyrock is considered one of the main causes of human injury, fatalities, and structural damage among all undesirable environmental impacts of blasting. Therefore, proper prediction/simulation of flyrock is essential, especially in order to determine the blast safety area. If proper control measures are taken, the flyrock distance can be controlled and, in return, the risk of damage can be reduced or eliminated. The first objective of this study was to develop a predictive model for flyrock estimation based on multiple regression (MR) analyses; then, using the developed MR model, the flyrock phenomenon was simulated by the Monte Carlo (MC) approach. To achieve the objectives of this study, 62 blasting operations were investigated in the Ulu Tiram quarry, Malaysia, and several controllable and uncontrollable factors were carefully recorded/calculated. The results of the MC modeling indicated that this approach is capable of simulating flyrock ranges with a good level of accuracy. The mean flyrock distance simulated by MC was 236.3 m, while the measured mean was 238.6 m. Furthermore, a sensitivity analysis was conducted to investigate the effects of the model inputs on the output of the system. The analysis demonstrated that powder factor is the most influential parameter on flyrock among all model inputs. It should be noted that the proposed MR and MC models should be utilized only in the studied area, and their direct use under other conditions is not recommended.
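The MR-plus-MC idea above amounts to propagating input uncertainty through a fitted regression; a conceptual sketch follows, with hypothetical coefficients and input distributions (not the fitted Ulu Tiram model):

    import random

    def flyrock_mr(powder_factor, burden, stemming):
        # hypothetical linear MR model for flyrock distance (m)
        return 120.0 + 180.0 * powder_factor - 55.0 * burden - 8.0 * stemming

    samples = []
    for _ in range(100_000):
        pf = random.uniform(0.4, 1.2)    # powder factor, assumed range
        b = random.gauss(3.0, 0.4)       # burden (m), assumed distribution
        st = random.gauss(2.5, 0.3)      # stemming (m), assumed distribution
        samples.append(flyrock_mr(pf, b, st))

    print(sum(samples) / len(samples))   # MC estimate of the mean distance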
ERIC Educational Resources Information Center
Waters, John K.
2012-01-01
In the case of higher education, the hills are more like mountains of data that "we're accumulating at a ferocious rate," according to Gerry McCartney, CIO of Purdue University (Indiana). "Every higher education institution has this data, but it just sits there like gold in the ground," complains McCartney. Big Data and the new tools people are…
Electronic Thesis Initiative: Pilot Project of McGill University, Montreal
ERIC Educational Resources Information Center
Park, Eun G.; Zou, Qing; McKnight, David
2007-01-01
Purpose: To set up a protocol for electronic thesis and dissertation (ETD) submission for the electronic thesis initiative pilot project at McGill University in Montreal, Canada. Design/methodology/approach: An electronic thesis and dissertation submission protocol was implemented and tested. To test authoring tools, we had 50 students submit…
NASA Technical Reports Server (NTRS)
Erzberger, Heinz
2000-01-01
The FAA's Free Flight Phase 1 Office is in the process of deploying the current generation of CTAS tools, which are the Traffic Management Advisor (TMA) and the passive Final Approach Spacing Tool (pFAST), at selected centers and airports. Research at NASA is now focused on extending the CTAS software and computer-human interfaces to provide more advanced capabilities. The Multi-center TMA (McTMA) is designed to operate at airports where arrival flows originate from two or more centers whose boundaries are in close proximity to the TRACON boundary. McTMA will also include techniques for routing arrival flows away from congested airspace and around airspace reserved for arrivals into other hub airports. NASA is working with the FAA and MITRE to build a prototype McTMA for the Philadelphia airport. The active Final Approach Spacing Tool (aFAST) provides speed and heading advisories to help controllers achieve accurate spacing between aircraft on final approach. These advisories will be integrated with those in the existing pFAST to provide a set of comprehensive advisories for controlling arrival traffic from the TRACON boundary to touchdown at complex, high-capacity airports. A research prototype of aFAST, designed for the Dallas-Fort Worth area, is in an advanced stage of development. The Expedite Departure Path (EDP) and Direct-To tools are designed to help controllers guide departing aircraft out of the TRACON airspace and climb to cruise altitude along the most efficient routes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ding, A; Han, B; Bush, K
Purpose: Dosimetric verification of VMAT/SBRT is currently performed on one or two planes in a phantom with either film or array detectors. A robust and easy-to-use 3D dosimetric tool has been sought since the advent of conformal radiation therapy. Here we present such a strategy for an independent 3D VMAT/SBRT plan verification system based on the combined use of EPID and cloud-based Monte Carlo (MC) dose calculation. Methods: The 3D dosimetric verification proceeds in two steps. First, the plan was delivered with a high-resolution portable EPID mounted on the gantry, and the EPID-captured gantry-angle-resolved VMAT/SBRT field images were converted into fluence by using the EPID pixel response function derived from MC simulations. The fluence was resampled and used as the input for an in-house developed Amazon cloud-based MC software to reconstruct the 3D dose distribution. The accuracy of the developed 3D dosimetric tool was assessed using a Delta4 phantom with various field sizes (square, circular, rectangular, and irregular MLC fields) and different patient cases. The method was applied to validate VMAT/SBRT plans using WFF and FFF photon beams (Varian TrueBeam STX). Results: The proposed method yielded results consistent with the Delta4 measurements. For points on the two detector planes, agreement within 1.5% was found for all the testing fields. Patient VMAT/SBRT plan studies revealed a similar level of accuracy: average γ-index passing rates of 99.2 ± 0.6% (3mm/3%), 97.4 ± 2.4% (2mm/2%), and 72.6 ± 8.4% (1mm/1%). Conclusion: A valuable 3D dosimetric verification strategy has been developed for VMAT/SBRT plan validation. The technique provides a viable solution for a number of intractable dosimetry problems, such as small fields and plans with high dose gradients.
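For context, the γ-index quoted in the results combines a dose-difference tolerance with a distance-to-agreement tolerance; below is a hedged 1D illustration of the metric (clinical tools evaluate it in 3D):

    import math

    def gamma_passing_rate(xs, ref, meas, dd=0.03, dta=3.0):
        """Fraction of reference points with gamma <= 1 (dd: relative dose
        tolerance, dta: distance-to-agreement in mm). Assumes ref doses > 0."""
        passed = 0
        for x_r, d_r in zip(xs, ref):
            g = min(math.sqrt(((x_m - x_r) / dta) ** 2
                              + ((d_m - d_r) / (dd * d_r)) ** 2)
                    for x_m, d_m in zip(xs, meas))
            passed += g <= 1.0
        return passed / len(ref)

    xs = [float(i) for i in range(20)]        # positions (mm), toy profile
    ref = [100.0 - 2.0 * x for x in xs]
    meas = [d * 1.01 for d in ref]            # measurement 1% hot everywhere
    print(gamma_passing_rate(xs, ref, meas))  # -> 1.0 under 3%/3 mm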
NASA Astrophysics Data System (ADS)
Zhang, Shuying; Zhou, Xiaoqing; Qin, Zhuanping; Zhao, Huijuan
2011-02-01
This article aims at the development of a fast inverse Monte Carlo (MC) simulation for the reconstruction of the optical properties (absorption coefficient μa and scattering coefficient μs) of cylindrical tissue, such as a cervix, from frequency-domain measurements of near-infrared diffuse light. Frequency-domain information (amplitude and phase) is extracted from the time-domain MC with a modified method. To shorten the computation time in the reconstruction of optical properties, an efficient and fast forward MC model has to be achieved. To do this, databases of the frequency-domain information over a range of μa and μs were first pre-built by combining MC simulation with the Lambert-Beer law. Then, a double polynomial model was adopted to quickly obtain the frequency-domain information for any optical properties. Based on the fast forward MC, the optical properties can be quickly obtained in a nonlinear optimization scheme. Reconstructions from simulated data showed that the developed inverse MC method has advantages in both reconstruction accuracy and computation time. The relative errors in the reconstruction of μa and μs are less than ±6% and ±12%, respectively, when the other coefficient (μs or μa) is held at a fixed value. When both μa and μs are unknown, the relative errors in the reconstruction of the reduced scattering coefficient and absorption coefficient are mainly less than ±10% in the range 45 < μs < 80 cm⁻¹ and 0.25 < μa < 0.55 cm⁻¹. With the rapid reconstruction strategy developed in this article, the computation time for reconstructing one set of optical properties is less than 0.5 s. Endoscopic measurements on two tubular solid phantoms were also carried out to evaluate the system and the inversion scheme. The results demonstrated that a relative error of less than 20% can be achieved.
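The "database plus double polynomial" strategy is essentially a polynomial surrogate fitted to pre-computed forward solutions and then inverted by nonlinear optimization. A self-contained sketch follows, with invented closed forms standing in for the MC database (all functional forms here are for illustration only):

    import numpy as np
    from scipy.optimize import least_squares

    # Stand-in "database": amplitude and phase on a (mu_a, mu_s) grid.
    mu_a = np.linspace(0.25, 0.55, 7)            # absorption, cm^-1
    mu_s = np.linspace(45.0, 80.0, 8)            # scattering, cm^-1
    A, S = np.meshgrid(mu_a, mu_s)
    db_amp = np.exp(-3.0 * A) / np.sqrt(S)       # invented, replaces MC output
    db_phs = 0.8 * A * np.sqrt(S)                # invented, replaces MC output

    def terms(a, s):
        """Quadratic polynomial basis in (mu_a, mu_s)."""
        return np.stack([np.ones_like(a), a, s, a * a, a * s, s * s], axis=-1)

    M = terms(A.ravel(), S.ravel())
    c_amp = np.linalg.lstsq(M, db_amp.ravel(), rcond=None)[0]
    c_phs = np.linalg.lstsq(M, db_phs.ravel(), rcond=None)[0]

    def forward(p):                              # fast surrogate forward model
        t = terms(np.asarray(p[0]), np.asarray(p[1]))
        return np.array([t @ c_amp, t @ c_phs])

    measured = forward([0.40, 60.0])             # synthetic "measurement"
    fit = least_squares(lambda p: forward(p) - measured, x0=[0.30, 50.0],
                        bounds=([0.25, 45.0], [0.55, 80.0]))
    print(fit.x)                                 # recovered (mu_a, mu_s)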
Clinical Outcome Metrics for Optimization of Robust Training
NASA Technical Reports Server (NTRS)
Ebert, D.; Byrne, V. E.; McGuire, K. M.; Hurst, V. W., IV; Kerstman, E. L.; Cole, R. W.; Sargsyan, A. E.; Garcia, K. M.; Reyes, D.; Young, M.
2016-01-01
Introduction: The emphasis of this research is on the Human Research Program (HRP) Exploration Medical Capability's (ExMC) "Risk of Unacceptable Health and Mission Outcomes Due to Limitations of In-Flight Medical Capabilities." Specifically, this project aims to contribute to the closure of gap ExMC 2.02: We do not know how the inclusion of a physician crew medical officer quantitatively impacts clinical outcomes during exploration missions. The experiments are specifically designed to address clinical outcome differences between physician and non-physician cohorts in both near-term and longer-term (mission-impacting) outcomes. Methods: Medical simulations will systematically compare the success of individual diagnostic and therapeutic procedure simulations performed by physician and non-physician crew medical officer (CMO) analogs using clearly defined short-term (individual procedure) outcome metrics. In the subsequent step of the project, the procedure simulation outcomes will be used as input to a modified version of the NASA Integrated Medical Model (IMM) to analyze the effect of the outcome (degree of success) of individual procedures (including successful, imperfectly performed, and failed procedures) on overall long-term clinical outcomes and the consequent mission impacts. The procedures to be simulated are endotracheal intubation, fundoscopic examination, kidney/urinary ultrasound, ultrasound-guided intravenous catheter insertion, and a differential diagnosis exercise. Multiple assessment techniques will be used, centered on medical procedure simulation studies occurring at 3, 6, and 12 months after initial training. Discussion: Analysis of procedure outcomes in the physician and non-physician groups and their subsets (tested at different elapsed times post training) will allow the team to 1) define differences between physician and non-physician CMOs in terms of both procedure performance (pre-IMM analysis) and overall mitigation of the mission medical impact (IMM analysis); 2) refine the procedure outcome and clinical outcome metrics themselves; 3) refine or develop innovative medical training products and solutions to maximize CMO performance; and 4) validate the methods and products of this experiment for operational use in the planning, execution, and quality assurance of the CMO training process. The team has finalized training protocols and developed a software training/testing tool in collaboration with Butler Graphics (Detroit, MI). In addition to the "hands on" medical procedure modules, the software includes a differential diagnosis exercise (limited clinical decision support tool) to evaluate the diagnostic skills of participants. Human subject testing will occur over the next year.
SUPERNOVA DRIVING. I. THE ORIGIN OF MOLECULAR CLOUD TURBULENCE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Padoan, Paolo; Pan, Liubin; Haugbølle, Troels
2016-05-01
Turbulence is ubiquitous in molecular clouds (MCs), but its origin is still unclear because MCs are usually assumed to live longer than the turbulence dissipation time. Interstellar medium (ISM) turbulence is likely driven by supernova (SN) explosions, but it has never been demonstrated that SN explosions can establish and maintain a turbulent cascade inside MCs consistent with the observations. In this work, we carry out a simulation of SN-driven turbulence in a volume of (250 pc)³, specifically designed to test if SN driving alone can be responsible for the observed turbulence inside MCs. We find that SN driving establishes a velocity scaling consistent with the usual scaling laws of supersonic turbulence, suggesting that previous idealized simulations of MC turbulence, driven with a random, large-scale volume force, were correctly adopted as appropriate models for MC turbulence, despite the artificial driving. We also find that the same scaling laws extend to the interiors of MCs, and that the velocity–size relation of the MCs selected from our simulation is consistent with that of MCs from the Outer Galaxy Survey, the largest MC sample available. The mass–size relation and the mass and size probability distributions also compare successfully with those of the Outer Galaxy Survey. Finally, we show that MC turbulence is super-Alfvénic with respect to both the mean and rms magnetic-field strength. We conclude that MC structure and dynamics are the natural result of SN-driven turbulence.
Amoush, Ahmad; Wilkinson, Douglas A.
2015-01-01
This work is a comparative study of the dosimetry calculated by Plaque Simulator, a treatment planning system for eye plaque brachytherapy, and the dosimetry calculated using Monte Carlo simulation for an Eye Physics model EP917 eye plaque. Monte Carlo (MC) simulation using MCNPX 2.7 was used to calculate the central axis dose in water for an EP917 eye plaque fully loaded with 17 IsoAid Advantage 125I seeds. In addition, the dosimetry parameters Λ, gL(r), and F(r,θ) were calculated for the IsoAid Advantage model IAI‐125 125I seed and benchmarked against published data. Bebig Plaque Simulator (PS) v5.74 was used to calculate the central axis dose based on the AAPM Updated Task Group 43 (TG‐43U1) dose formalism. The calculated central axis doses from MC and PS were then compared. When the MC dosimetry parameters for the IsoAid Advantage 125I seed were compared with the consensus values, Λ agreed with the consensus value to within 2.3%. However, much larger differences were found between the MC-calculated gL(r) and F(r,θ) and the consensus values. The differences between the MC-calculated dosimetry parameters and recently published data are much smaller. The differences between the calculated central axis absolute doses from MC and PS ranged from 5% to 10% for distances between 1 and 12 mm from the outer scleral surface. When the dosimetry parameters for the 125I seed from this study were used in PS, the calculated absolute central axis dose differences were reduced by 2.3% from depths of 4 to 12 mm from the outer scleral surface. We conclude that PS adequately models the central dose profile of this plaque using its defaults for the IsoAid model IAI‐125 at distances of 1 to 7 mm from the outer scleral surface. However, improved dose accuracy can be obtained by using updated dosimetry parameters for the IsoAid model IAI‐125 125I seed. PACS number: 87.55.K‐ PMID:26699577
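For context, the TG-43U1 two-dimensional dose-rate formalism referred to above has the standard form (quoted from the TG-43U1 report for the reader's convenience, not from the abstract itself):

    \dot{D}(r,\theta) = S_K \, \Lambda \, \frac{G_L(r,\theta)}{G_L(r_0,\theta_0)} \, g_L(r) \, F(r,\theta),

where S_K is the air-kerma strength, Λ the dose-rate constant, G_L the line-source geometry function, g_L(r) the radial dose function, F(r,θ) the 2D anisotropy function, and (r_0, θ_0) = (1 cm, 90°) the reference point.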
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Y; Cai, J; Meltsner, S
2016-06-15
Purpose: The Varian tandem and ring applicators are used to deliver HDR Ir-192 brachytherapy for cervical cancer. The source path within the ring is hard to predict due to the larger interior ring lumen. Some studies showed the source could be several millimeters away from planned positions, while other studies demonstrated minimal dosimetric impact. A global shift can be applied to limit the effect of positioning offsets. The purpose of this study was to assess the necessity of implementing a global source shift using Monte Carlo (MC) simulations. Methods: The MCNP5 radiation transport code was used for all MC simulations. To accommodate TG-186 guidelines and eliminate inter-source attenuation, a BrachyVision plan with 10 dwell positions (0.5 cm step sizes) was simulated as the summation of 10 individual sources with equal dwell times for simplification. To simplify the study, the tandem was also excluded from the MC model. Global shifts of ±0.1, ±0.3, and ±0.5 cm were then simulated as distal and proximal from the reference positions. Dose was scored in water for all MC simulations and was normalized to 100% at the normalization point 0.5 cm from the cap in the ring plane. For dose comparison, Point A was 2 cm caudal from the buildup cap and 2 cm lateral on either side of the ring axis. With seventy simulations, 10⁸ photon histories gave statistical uncertainties (k=1) of <2% for (0.1 cm)³ voxels. Results: Compared to no global shift, average Point A doses were 0.0%, 0.4%, and 2.2% higher for distal global shifts, and 0.4%, 2.8%, and 5.1% higher for proximal global shifts, respectively. The MC Point A doses differed by <1% when compared to BrachyVision. Conclusion: Dose variations were not substantial for ±0.3 cm global shifts, which is common in clinical practice.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cawkwell, Marc Jon
2016-09-09
The MC3 code is used to perform Monte Carlo simulations in the isothermal-isobaric ensemble (constant number of particles, temperature, and pressure) on molecular crystals. The molecules within the periodic simulation cell are treated as rigid bodies, alleviating the requirement for a complex interatomic potential. Intermolecular interactions are described using generic, atom-centered pair potentials whose parameterization is taken from the literature [D. E. Williams, J. Comput. Chem., 22, 1154 (2001)] and electrostatic interactions arising from atom-centered, fixed, point partial charges. The primary uses of the MC3 code are the computation of i) the temperature and pressure dependence of lattice parameters and thermal expansion coefficients, ii) tensors of elastic constants and compliances via Parrinello and Rahman's fluctuation formula [M. Parrinello and A. Rahman, J. Chem. Phys., 76, 2662 (1982)], and iii) the investigation of polymorphic phase transformations. The MC3 code is written in Fortran90 and requires the LAPACK and BLAS linear algebra libraries to be linked during compilation. Computationally expensive loops are accelerated using OpenMP.
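A generic flavor of the underlying isothermal-isobaric sampling is the Metropolis volume move; the sketch below is textbook material (see, e.g., Frenkel and Smit), not code taken from MC3:

    import math, random

    def accept_volume_move(dU, P, V_old, V_new, N, beta):
        """NPT acceptance for a trial volume change with N rigid molecules.
        dU: potential-energy change; P: pressure; beta = 1/kT (consistent units)."""
        arg = -beta * (dU + P * (V_new - V_old)) + N * math.log(V_new / V_old)
        return random.random() < math.exp(min(0.0, arg))

    print(accept_volume_move(dU=0.5, P=1.0, V_old=100.0, V_new=101.0,
                             N=64, beta=1.0))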
NASA Astrophysics Data System (ADS)
Chen, Zhe; Kecskes, Laszlo J.; Zhu, Kaigui; Wei, Qiuming
2016-12-01
Uniaxial tensile properties of monocrystalline tungsten (MC-W) and nanocrystalline tungsten (NC-W) with embedded hydrogen and helium atoms have been investigated using molecular dynamics (MD) simulations in the context of radiation damage evolution. Different strain rates were imposed to investigate the strain rate sensitivity (SRS) of the samples. The results show that the plastic deformation processes of MC-W and NC-W are dominated by different mechanisms: dislocation-based activity for MC-W and grain-boundary-based activity for NC-W. For MC-W, the SRS increases and a transition appears in the deformation mechanism with increasing embedded-atom concentration. However, no obvious embedded-atom concentration dependence of the SRS has been observed for NC-W. Instead, in the latter case, the embedded atoms facilitate GB sliding and intergranular fracture. Additionally, strong strain-enhanced He cluster growth has been observed. The corresponding underlying mechanisms are discussed.
Application of MC1 to Wind Cave National Park: Lessons from a small-scale study: Chapter 8
King, David A.; Bachelet, Dominique M.; Symstad, Amy J.
2015-01-01
MC1 was designed for application to large regions that include a wide range in elevation and topography, thereby encompassing a broad range in climates and vegetation types. The authors applied the dynamic global vegetation model MC1 to Wind Cave National Park (WCNP) in the southern Black Hills of South Dakota, USA, on the ecotone between ponderosa pine forest to the northwest and mixed-grass prairie to the southeast. They calibrated MC1 to simulate adequate fire effects in the warmer southeastern parts of the park to ensure grasslands there, while allowing forests to grow to the northwest, and then simulated future vegetation with climate projections from three GCMs. The results suggest that fire frequency, as affected by climate and/or human intervention, may be more important than the direct effects of climate in determining the distribution of ponderosa pine in the Black Hills region, both historically and in the future.
NASA Astrophysics Data System (ADS)
Doronin, Alexander; Meglinski, Igor
2017-02-01
This report considers the development of a unified Monte Carlo (MC)-based computational model for simulating the propagation of Laguerre-Gaussian (LG) beams in turbid tissue-like scattering media. With the primary goal of proving the concept of using complex light for tissue diagnosis, we explore the propagation of LG beams in comparison with Gaussian beams for both linear and circular polarization. MC simulations of radially and azimuthally polarized LG beams in turbid media have been performed; classic phenomena such as preservation of the orbital angular momentum, optical memory, and helicity flip are observed, and a detailed comparison is presented and discussed.
Random number generators for large-scale parallel Monte Carlo simulations on FPGA
NASA Astrophysics Data System (ADS)
Lin, Y.; Wang, F.; Liu, B.
2018-05-01
Through parallelization, field programmable gate arrays (FPGAs) can achieve unprecedented speeds in large-scale parallel Monte Carlo (LPMC) simulations. FPGAs present both new constraints and new opportunities for the implementation of random number generators (RNGs), which are key elements of any Monte Carlo (MC) simulation system. Using empirical and application-based tests, this study evaluates all four RNGs used in previous FPGA-based MC studies, together with newly proposed FPGA implementations of two well-known high-quality RNGs that are suitable for LPMC studies on FPGA. One of the newly proposed FPGA implementations, a parallel version of the additive lagged Fibonacci generator (parallel ALFG), is found to be the best among the evaluated RNGs in fulfilling the needs of LPMC simulations on FPGA.
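For reference, the additive lagged Fibonacci recurrence behind the ALFG, x[n] = (x[n-j] + x[n-k]) mod 2^m, is easy to model in software; the lags (j, k) = (7, 10) and word size m = 32 below are common textbook choices, not necessarily the parameters used in the paper:

    class ALFG:
        """Additive lagged Fibonacci generator: x[n] = (x[n-j] + x[n-k]) mod 2^m."""
        def __init__(self, seed_state, j=7, k=10, m=32):
            assert len(seed_state) == k and 0 < j < k
            self.state = list(seed_state)      # last k outputs (seed: not all even)
            self.j, self.mask = j, (1 << m) - 1

        def next(self):
            x = (self.state[-self.j] + self.state[0]) & self.mask
            self.state = self.state[1:] + [x]  # shift-register update
            return x

    # A parallel FPGA version would instantiate several such generators with
    # decorrelated seed states, one per pipeline / MC stream.
    rng = ALFG([12345 + 2 * i + 1 for i in range(10)])
    print([rng.next() for _ in range(3)])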
Radiation environment at LEO orbits: MC simulation and experimental data.
NASA Astrophysics Data System (ADS)
Zanini, Alba; Borla, Oscar; Damasso, Mario; Falzetta, Giuseppe
The evaluation of the different components of the radiation environment in spacecraft, both in LEO orbits and in deep space, is of great importance because the biological effect on humans and the risk to instrumentation strongly depend on the kind of radiation (high or low LET). This is especially important in view of long-term manned or unmanned space missions (missions to Mars, solar system exploration). The study of the space radiation field is extremely complex and not yet completely solved. Given the complexity of the radiation field, an accurate dose evaluation should be considered an indispensable part of any space mission. Two simulation codes (MCNPX and GEANT4) have been used to assess the secondary radiation inside the FOTON M3 satellite and the ISS. The energy spectra of primary radiation at LEO orbits have been modelled by using various tools (SPENVIS, OMERE, CREME96), considering separately Van Allen protons, GCR protons and GCR alpha particles. These data are used as input for the two MC codes and transported inside the spacecraft. The results of the two calculation methods have been compared. Moreover, some experimental results previously obtained on the FOTON M3 satellite by using TLD, bubble dosimeters and a LIULIN detector are considered to check the performance of the two codes. Finally, the same experimental devices are at present collecting data on the ISS (ASI experiment BIOKIS - nDOSE) and at the end of the mission the results will be compared with the calculations.
Jefferys, Stuart R; Giddings, Morgan C
2011-03-15
Post-translational modifications are vital to the function of proteins, but are hard to study, especially since several modified isoforms of a protein may be present simultaneously. Mass spectrometers are a great tool for investigating modified proteins, but the data they provide are often incomplete, ambiguous and difficult to interpret. Combining data from multiple experimental techniques, especially bottom-up and top-down mass spectrometry, provides complementary information. When integrated with background knowledge, this allows a human expert to interpret what modifications are present and where on a protein they are located. However, the process is arduous and for high-throughput applications needs to be automated. This article explores a data integration methodology based on Markov chain Monte Carlo and simulated annealing. Our software, the Protein Inference Engine (the PIE), applies these algorithms using a modular approach, allowing multiple types of data to be considered simultaneously and new data types to be added as needed. Even for complicated data representing multiple modifications and several isoforms, the PIE generates accurate modification predictions, including location. When applied to experimental data collected on the L7/L12 ribosomal protein, the PIE was able to make predictions consistent with manual interpretation for several different L7/L12 isoforms using a combination of bottom-up data with experimentally identified intact masses. Software, demo projects and source can be downloaded from http://pie.giddingslab.org/
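A bare-bones simulated-annealing skeleton of the kind such a search builds on is sketched below; the PIE's actual moves and scoring over modification placements are far richer, so this is a generic illustration only:

    import math, random

    def anneal(state, score, propose, t0=1.0, cooling=0.999, steps=50_000):
        """Maximize `score` by annealed Metropolis sampling of `propose` moves."""
        cur, cur_s = state, score(state)
        best, best_s, t = cur, cur_s, t0
        for _ in range(steps):
            cand = propose(cur)
            cand_s = score(cand)
            if cand_s >= cur_s or random.random() < math.exp((cand_s - cur_s) / t):
                cur, cur_s = cand, cand_s
                if cur_s > best_s:
                    best, best_s = cur, cur_s
            t *= cooling                  # geometric cooling schedule
        return best, best_s

    # toy usage: find the maximum of a 1-D score function
    print(anneal(0.0, lambda x: -(x - 3.0) ** 2,
                 lambda x: x + random.gauss(0.0, 0.5)))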
2014-05-21
PERSONNEL FROM STANDARD PRIME BEEF OR RED 4F9K4 PROVIDES FOLDED FIBERGLASS MATTING ( FFM ) FOR AIRFIELD DAMAGE REPAIR (ADR). PACKAGE CONSISTS OF THREE FFM ...SETS (54’ X 60’), ONE FFM SUPPORT TOOL KIT, UPPER BUSHINGS, ANCHOR BUSHINGS, ANCHOR BOLTS, AND TWO EA MC-7 AIR COMPRESSORS. EACH UTC WILL BE TASKED TO...OF 7 FOLDED FIBERGLASS MAT SETS (54 FT X 60 FT), 2 FFM SUPPORT TOOL KIT, UPPER BUSHINGS, ANCHOR BUSHINGS, ANCHOR BOLTS AND 4 X MC-7 AIR COMPRESSORS
The effect of linear spring number at side load of McPherson suspension in electric city car
NASA Astrophysics Data System (ADS)
Budi, Sigit Setijo; Suprihadi, Agus; Makhrojan, Agus; Ismail, Rifky; Jamari, J.
2017-01-01
The function of the suspension spring in the McPherson type is to control vehicle stability and increase ride comfort, although this design tends to introduce side loads. The purpose of this study is to obtain simulation results for the McPherson suspension spring in an electric city car using the finite element method and to determine the side load that appears on the suspension spring. The research was conducted in several stages: designing linear spring models with various numbers of spring coils, and modeling the suspension spring in FEM software. The suspension spring is compressed in the vertical direction (z-axis), and the forces arising along the x, y, and z axes at the upper part of the spring are recorded to simulate the side load there. The FEM simulation results indicate that the spring whose side load along the x and y axes is closest to zero is the most stable spring.
NASA Astrophysics Data System (ADS)
Ilyasov, Ildar K.; Prikhodko, Constantin V.; Nevorotin, Alexey J.
1995-01-01
A Monte Carlo (MC) simulation model and a thermoindicative tissue phantom were applied to evaluate the depth of tissue necrosis (DTN) resulting from quasi-cw copper vapor laser (578 nm) irradiation. It has been shown that the incident light focusing angle is essential for the DTN. In particular, there was a significant rise in DTN with elevation of this angle up to +20° and +5° for the MC simulation and tissue phantom models, respectively, with no further increase in the necrosis depth above these angles. It should be noted that the relationship between focusing angles and DTN values was apparently stronger for the real target than for the MC-derived hypothetical one. The extent to which these data are applicable to medical practice can be evaluated in animal models simulating laser-assisted therapy for PWS or related dermatologic lesions with converged 578 nm laser beams.
BCA-kMC Hybrid Simulation for Hydrogen and Helium Implantation in Material under Plasma Irradiation
NASA Astrophysics Data System (ADS)
Kato, Shuichi; Ito, Atsushi; Sasao, Mamiko; Nakamura, Hiroaki; Wada, Motoi
2015-09-01
Ion implantation by plasma irradiation into materials achieves very high impurity concentrations. The high impurity concentration causes deformation and destruction of the material; these phenomena are peculiar to the plasma-material interaction (PMI). The injection process of plasma particles is generally simulated by using the binary collision approximation (BCA) and molecular dynamics (MD), while the diffusion of implanted atoms has traditionally been solved with the diffusion equation, in which the implanted atoms are replaced by a continuous concentration field. However, the diffusion equation has insufficient accuracy in the case of low concentration, and in the case of locally high concentration such as hydrogen blistering and helium bubbles. This problem is overcome by kinetic Monte Carlo (kMC), which represents the diffusion of the implanted atoms as jumps between interstitial sites in a material. In this paper, we propose a new approach, "BCA-kMC hybrid simulation," for hydrogen and helium implantation under plasma irradiation.
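The diffusion stage described above belongs to the residence-time (BKL/Gillespie-type) class of kMC algorithms; a generic single step looks like this (illustrative, not the authors' implementation):

    import math, random

    def kmc_step(rates):
        """Select one event from `rates` (jump rates for all possible hops)
        and draw the elapsed time. Returns (event_index, dt)."""
        total = sum(rates)
        r = random.random() * total
        acc = 0.0
        for idx, rate in enumerate(rates):
            acc += rate
            if r < acc:
                break
        dt = -math.log(1.0 - random.random()) / total   # exponential waiting time
        return idx, dt

    # toy usage: three candidate interstitial hops with different rates
    print(kmc_step([1.0e12, 5.0e11, 2.0e11]))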
Magnetic Levitation of MC3T3 Osteoblast Cells as a Ground-Based Simulation of Microgravity
Kidder, Louis S.; Williams, Philip C.; Xu, Wayne Wenzhong
2009-01-01
Diamagnetic samples placed in a strong magnetic field with a magnetic field gradient experience a magnetic force. Stable magnetic levitation occurs when the magnetic force exactly counterbalances the gravitational force. Under this condition, a diamagnetic sample is in a simulated microgravity environment. The purpose of this study is to explore whether MC3T3-E1 osteoblastic cells can be grown in magnetically simulated hypo-g and hyper-g environments and to determine whether genes are differentially expressed under these conditions. The murine calvarial osteoblastic cell line MC3T3-E1, grown on Cytodex-3 beads, was subjected to net gravitational forces of 0, 1 and 2 g in a 17 T superconducting magnet for 2 days. Microarray analysis of these cells indicated that gravitational stress leads to up- and down-regulation of hundreds of genes. The methodology for sustaining long-term magnetic levitation of biological systems is discussed. PMID:20052306
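In symbols, the balance condition stated above is the standard diamagnetic levitation criterion (quoted as background, not from the paper): levitation occurs where

    \frac{\chi}{\mu_0} \, B \, \frac{dB}{dz} = \rho g,

with χ the (negative) volume magnetic susceptibility, ρ the sample density, and B dB/dz the field-gradient product; for water-like material this requires |B dB/dz| on the order of 1.4 × 10³ T²/m, which is why a 17 T superconducting magnet is used.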
LES of Temporally Evolving Mixing Layers by an Eighth-Order Filter Scheme
NASA Technical Reports Server (NTRS)
Hadjadj, A; Yee, H. C.; Sjogreen, B.
2011-01-01
An eighth-order filter method for a wide range of compressible flow speeds (H.C. Yee and B. Sjogreen, Proceedings of ICOSAHOM09, June 22-26, 2009, Trondheim, Norway) is employed for large eddy simulations (LES) of temporally evolving mixing layers (TML) at different convective Mach numbers (Mc) and Reynolds numbers. The high-order filter method is designed for accurate and efficient simulations of shock-free compressible turbulence, turbulence with shocklets, and turbulence with strong shocks, with minimum tuning of scheme parameters. The values of Mc considered span the TML range from the quasi-incompressible regime to the highly compressible supersonic regime. The three main characteristics of compressible TML (the self-similarity property, compressibility effects, and the presence of large-scale structures with shocklets at high Mc) are considered in the LES study. The LES results, using the same scheme parameters for all studied cases, agree well with the experimental results of Barone et al. (2006) and with the published direct numerical simulation (DNS) work of Rogers & Moser (1994) and Pantano & Sarkar (2002).
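For reference, the convective Mach number used above to parameterize compressibility is conventionally defined, for streams with equal specific-heat ratios, as

    M_c = \frac{U_1 - U_2}{c_1 + c_2},

where U_1, U_2 are the free-stream velocities of the two streams and c_1, c_2 the corresponding speeds of sound.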
Gorshkov, Anton V; Kirillin, Mikhail Yu
2015-08-01
Over two decades, the Monte Carlo technique has become a gold standard in simulation of light propagation in turbid media, including biotissues. Technological solutions provide further advances of this technique. The Intel Xeon Phi coprocessor is a new type of accelerator for highly parallel general purpose computing, which allows execution of a wide range of applications without substantial code modification. We present a technical approach of porting our previously developed Monte Carlo (MC) code for simulation of light transport in tissues to the Intel Xeon Phi coprocessor. We show that employing the accelerator allows reducing computational time of MC simulation and obtaining simulation speed-up comparable to GPU. We demonstrate the performance of the developed code for simulation of light transport in the human head and determination of the measurement volume in near-infrared spectroscopy brain sensing.
NASA Astrophysics Data System (ADS)
Hsu, Hsiao-Ping; Huang, Aiqun; Bhattacharya, Aniket; Binder, Kurt
2015-03-01
In this talk we compare the results obtained from Monte Carlo (MC) and Brownian dynamics (BD) simulation for the universal properties of a semi-flexible chain. Specifically we compare MC results obtained using pruned-enriched Rosenbluth method (PERM) with those obtained from BD simulation. We find that the scaled plot of root-mean-square (RMS) end-to-end distance
Ion-mediated interactions in suspensions of oppositely charged nanoparticles
NASA Astrophysics Data System (ADS)
Dahirel, Vincent; Hansen, Jean Pierre
2009-08-01
The structure of oppositely charged spherical nanoparticles (polyions), dispersed in ionic solutions with continuous solvent (primitive model), is investigated by Monte Carlo (MC) simulations, within explicit and implicit microion representations, over a range of polyion valences and densities, and microion concentrations. Systems with explicit microions are explored by semigrand canonical MC simulations, which allow density-dependent effective polyion pair potentials v_αβ^eff(r) to be extracted from measured partial pair distribution functions. Implicit microion MC simulations are based on pair potentials of mean force v_αβ^(2)(r) computed by explicit microion simulations of two charged polyions, in the low-density limit. In the vicinity of the liquid-gas separation expected for oppositely charged polyions, the implicit microion representation leads to an instability against density fluctuations for polyion valences |Z| significantly below those at which the instability sets in within the exact explicit microion representation. Far from this instability region, the v_αβ^(2)(r) are found to be fairly close to, but consistently more repulsive than, the effective pair potentials v_αβ^eff(r). This is corroborated by additional calculations of three-body forces between polyion triplets, which are repulsive when one polyion is of opposite charge to the other two. The explicit microion MC data were exploited to determine the ratio of salt concentrations c and c_o within the dispersion and the reservoir (Donnan effect). c/c_o is found to first increase before finally decreasing as a function of the polyion packing fraction.
Shielding evaluation for solar particle events using MCNPX, PHITS and OLTARIS codes.
Aghara, S K; Sriprisan, S I; Singleterry, R C; Sato, T
2015-01-01
Detailed analyses of Solar Particle Events (SPE) were performed to calculate primary and secondary particle spectra behind aluminum shielding, at various depths in water. The simulations were based on the Monte Carlo (MC) radiation transport codes MCNPX 2.7.0 and PHITS 2.64, and on the space radiation analysis website OLTARIS (On-Line Tool for the Assessment of Radiation in Space) version 3.4, which uses the deterministic code HZETRN for transport. The study investigates the impact of SPE spectra transported through a 10 or 20 g/cm² Al shield followed by 30 g/cm² of water slab. Four historical SPE events were selected and used as input source spectra; differential particle spectra for protons, neutrons, and photons are presented. The total particle fluence as a function of depth is presented. In addition to particle flux, the dose and dose equivalent values are calculated and compared between the codes and with other published results. Overall, the particle fluence spectra from all three codes show good agreement, with the MC codes showing closer agreement with each other than with the OLTARIS results. The neutron particle fluence from OLTARIS is lower than the results from the MC codes at lower energies (E<100 MeV). Based on a mean-square-difference analysis, the results from MCNPX and PHITS agree more closely with each other for fluence, dose and dose equivalent than with the OLTARIS results. Copyright © 2015 The Committee on Space Research (COSPAR). All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borowik, Piotr, E-mail: pborow@poczta.onet.pl; Thobel, Jean-Luc, E-mail: jean-luc.thobel@iemn.univ-lille1.fr; Adamowicz, Leszek, E-mail: adamo@if.pw.edu.pl
Standard computational methods used to incorporate the Pauli exclusion principle into Monte Carlo (MC) simulations of electron transport in semiconductors may give unphysical results in the low-field regime, where the obtained electron distribution function takes values exceeding unity. Modified algorithms have already been proposed that correctly account for electron scattering on phonons or impurities. The present paper extends this approach and proposes an improved simulation scheme that includes the Pauli exclusion principle for electron–electron (e–e) scattering in MC simulations. Simulations with significantly reduced computational cost recreate correct values of the electron distribution function. The proposed algorithm is applied to study transport properties of degenerate electrons in graphene with e–e interactions. This required adapting the treatment of e–e scattering to the case of a linear band dispersion relation; hence, this part of the simulation algorithm is described in detail.
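The standard device for enforcing the Pauli principle in ensemble MC transport is a rejection step: a proposed final state k' is kept only with probability 1 - f(k'). The sketch below illustrates that idea with a stand-in occupancy function; it is not the authors' scheme for e–e scattering, which must also update f self-consistently:

    import random

    def accept_scattering(k_final, f_of_k):
        """Accept a proposed scattering final state with probability 1 - f(k')."""
        return random.random() < 1.0 - f_of_k(k_final)

    # Example: near-degenerate conditions, f ~ 0.95 -> ~5% of proposals accepted
    print(accept_scattering(0.1, lambda k: 0.95))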
Simulation - McCandless, Bruce (Syncom IV)
1985-04-15
S85-30800 (14 April 1985) --- Astronaut Bruce McCandless II tests one of the possible methods of attempting to activate a switch on the Syncom-IV (LEASAT) satellite released April 13 into space from the Space Shuttle Discovery. The communications spacecraft failed to behave properly upon release and NASA officials and satellite experts are considering possible means of repair. McCandless was using a full scale mockup of the satellite in the Johnson Space Center's (JSC) mockup and integration laboratory.
NASA Astrophysics Data System (ADS)
El Kanawati, W.; Létang, J. M.; Dauvergne, D.; Pinto, M.; Sarrut, D.; Testa, É.; Freud, N.
2015-10-01
A Monte Carlo (MC) variance reduction technique is developed for prompt-γ emitter calculations in proton therapy. Prompt-γ emitted through nuclear fragmentation reactions and exiting the patient during proton therapy could play an important role in helping to monitor the treatment. However, the estimation of the number and energy of emitted prompt-γ per primary proton with MC simulations is a slow process. In order to estimate the local distribution of prompt-γ emission in a volume of interest for a given proton beam of the treatment plan, an MC variance reduction technique based on a specific track length estimator (TLE) has been developed. First, an elemental database of prompt-γ emission spectra is established in the clinical energy range of incident protons for all elements in the composition of human tissues. This database of prompt-γ spectra is built offline with high statistics. Regarding the implementation of the prompt-γ TLE MC tally, each proton deposits along its track the expectation of the prompt-γ spectra from the database according to the proton kinetic energy and the local material composition. A detailed statistical study shows that the relative efficiency mainly depends on the geometrical distribution of the track length. Benchmarking of the proposed prompt-γ TLE MC technique with respect to an analogous MC technique is carried out. A large relative efficiency gain is reported, ca. 10⁵.
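Schematically, a track-length estimator of this kind deposits the pre-tabulated expected emission spectrum along every proton step instead of waiting for rare nuclear events; the sketch below uses an invented toy database and invented binning (illustration only, not the authors' implementation):

    import numpy as np

    # Toy database: expected prompt-gamma spectrum per unit track length,
    # indexed by [material, proton-energy bin, gamma-energy bin]. Invented values.
    n_mat, n_ebin, n_gbin = 2, 5, 10
    SPECTRUM_DB = np.random.default_rng(0).random((n_mat, n_ebin, n_gbin)) * 1e-4
    E_EDGES = np.linspace(0.0, 200.0, n_ebin + 1)   # proton energy bins (MeV)

    def tle_score(track_steps, scores):
        """track_steps: iterable of (voxel, material, E_kin_MeV, step_cm);
        scores: (n_voxels, n_gbin) array accumulating expected spectra."""
        for voxel, material, e_kin, length in track_steps:
            e_bin = min(max(np.searchsorted(E_EDGES, e_kin) - 1, 0), n_ebin - 1)
            scores[voxel] += length * SPECTRUM_DB[material, e_bin]

    scores = np.zeros((3, n_gbin))                  # 3 voxels
    tle_score([(0, 0, 150.0, 0.2), (1, 1, 120.0, 0.3)], scores)
    print(scores.sum(axis=1))                       # expected gammas per voxel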
Development of a Multi-Channel Piezoelectric Acoustic Sensor Based on an Artificial Basilar Membrane
Jung, Youngdo; Kwak, Jun-Hyuk; Lee, Young Hwa; Kim, Wan Doo; Hur, Shin
2014-01-01
In this research, we have developed a multi-channel piezoelectric acoustic sensor (McPAS) that mimics the function of the natural basilar membrane, capable of mechanically separating incoming acoustic signals by their frequency and generating corresponding electrical signals. The McPAS operates without an external energy source or signal processing unit, using a vibrating piezoelectric thin-film membrane. The shape of the vibrating membrane was chosen to be trapezoidal such that different locations on the membrane have different local resonance frequencies. The length of the membrane is 28 mm and its width varies from 1 mm to 8 mm. Multiphysics finite element analysis (FEA) was carried out to predict and design the mechanical behavior and piezoelectric response of the McPAS model. The designed McPAS was fabricated with a MEMS fabrication process based on the simulated results. The fabricated device was tested with a mouth simulator to measure its mechanical and piezoelectric frequency response with a laser Doppler vibrometer and an acoustic signal analyzer. The experimental results show that the as-fabricated McPAS can successfully separate incoming acoustic signals within the 2.5 kHz–13.5 kHz range, and the maximum electrical signal output for an acoustic input of 94 dB SPL was 6.33 mVpp. The performance of the fabricated McPAS coincided well with the design parameters. PMID:24361926
Dose and scatter characteristics of a novel cone beam CT system for musculoskeletal extremities
NASA Astrophysics Data System (ADS)
Zbijewski, W.; Sisniega, A.; Vaquero, J. J.; Muhit, A.; Packard, N.; Senn, R.; Yang, D.; Yorkston, J.; Carrino, J. A.; Siewerdsen, J. H.
2012-03-01
A novel cone-beam CT (CBCT) system has been developed with promising capabilities for musculoskeletal imaging (e.g., weight-bearing extremities and combined radiographic / volumetric imaging). The prototype system demonstrates diagnostic-quality imaging performance, while the compact geometry and short scan orbit raise new considerations for scatter management and dose characterization that challenge conventional methods. The compact geometry leads to elevated, heterogeneous x-ray scatter distributions - even for small anatomical sites (e.g., knee or wrist), and the short scan orbit results in a non-uniform dose distribution. These complex dose and scatter distributions were investigated via experimental measurements and GPU-accelerated Monte Carlo (MC) simulation. The combination provided a powerful basis for characterizing dose distributions in patient-specific anatomy, investigating the benefits of an antiscatter grid, and examining distinct contributions of coherent and incoherent scatter in artifact correction. Measurements with a 16 cm CTDI phantom show that the dose from the short-scan orbit (0.09 mGy/mAs at isocenter) varies from 0.16 to 0.05 mGy/mAs at various locations on the periphery (all obtained at 80 kVp). MC estimation agreed with dose measurements within 10-15%. Dose distribution in patient-specific anatomy was computed with MC, confirming such heterogeneity and highlighting the elevated energy deposition in bone (factor of ~5-10) compared to soft-tissue. Scatter-to-primary ratio (SPR) up to ~1.5-2 was evident in some regions of the knee. A 10:1 antiscatter grid was found earlier to result in significant improvement in soft-tissue imaging performance without increase in dose. The results of MC simulations elucidated the mechanism behind scatter reduction in the presence of a grid. A ~3-fold reduction in average SPR was found in the MC simulations; however, a linear grid was found to impart additional heterogeneity in the scatter distribution, mainly due to the increase in the contribution of coherent scatter with increased spatial variation. Scatter correction using MC-generated scatter distributions demonstrated significant improvement in cupping and streaks. Physical experimentation combined with GPU-accelerated MC simulation provided a sophisticated, yet practical approach in identifying low-dose acquisition techniques, optimizing scatter correction methods, and evaluating patientspecific dose.
NASA Astrophysics Data System (ADS)
Guo, Liwen
The desire to create more complex visual scenes in modern flight simulators outpaces recent increases in processor speed. As a result, the simulation transport delay remains a problem. Because of the limitations of the three prominent existing delay compensators---the lead/lag filter, the McFarland compensator and the Sobiski/Cardullo predictor---new approaches to compensating for the transport delay in a flight simulator have been developed. The first novel compensator is an adaptive predictor that uses the Kalman filter algorithm in a unique manner so that the predictor can accurately provide the desired amount of prediction, significantly reducing the large spikes caused by the McFarland predictor. Among several simplified online adaptive predictors, it is shown mathematically why the stochastic approximation algorithm achieves the best compensation results. A second novel approach employed a reference aircraft dynamics model to implement a state space predictor on a flight simulator. The practical implementation formed the filter state vector from the operator's control input and the aircraft states. The relationship between the reference model and the compensator performance was investigated in great detail, and the best performing reference model was selected for implementation in the final tests. Piloted simulation tests were conducted to assess the effectiveness of the two novel compensators in comparison with the McFarland predictor and with no compensation. Thirteen pilots with heterogeneous flight experience executed straight-in and offset approaches, at various delay configurations, on a flight simulator where different predictors were applied to compensate for transport delay. Four metrics---the glide slope and touchdown errors, power spectral density of the pilot control inputs, NASA Task Load Index, and Cooper-Harper rating on the handling qualities---were employed for the analyses. The overall analyses show that while the adaptive predictor results in slightly poorer compensation for short added delay (up to 48 ms) and better compensation for long added delay (up to 192 ms) than the McFarland compensator, the state space predictor is clearly superior for short delay and significantly superior for long delay compared to the McFarland compensator. The state space predictor also achieves better compensation than the adaptive predictor. The results of the evaluation of these predictors' effectiveness in the piloted tests agree with those of the theoretical offline tests conducted with the recorded simulation aircraft states.
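A minimal sketch of the state-space ahead-prediction idea (propagate the current state through a reference model for the delay horizon) is shown below; the model matrices are toy placeholders, not the dissertation's aircraft model:

    import numpy as np

    def predict_ahead(A, B, x, u, n_steps):
        """Propagate x_{k+1} = A x_k + B u_k for n_steps, holding u constant."""
        for _ in range(n_steps):
            x = A @ x + B @ u
        return x

    # Example: 96 ms of transport delay at a 16 ms frame time -> 6 frames ahead.
    A = np.array([[1.0, 0.016], [0.0, 1.0]])   # toy double-integrator model
    B = np.array([[0.0], [0.016]])
    x0 = np.array([[0.0], [1.0]])              # state: [attitude; rate]
    u = np.array([[0.2]])                      # pilot input, held constant
    print(predict_ahead(A, B, x0, u, 6).ravel())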
Clinical Evaluation of Young Children with the McCarthy Scales.
ERIC Educational Resources Information Center
Kaufman, Alan S.; Kaufman, Nadeen L.
The main goal of this book is to enable examiners to use the McCarthy Scales of Children's Abilities as a clinical tool for evaluating preschool and primary-grade children. However, before an examiner becomes too concerned with issues relating to test interpretation, he or she should first understand thoroughly certain basic considerations.…
ERIC Educational Resources Information Center
Shore, Bruce M.; Chichekian, Tanya; Syer, Cassidy A.; Aulls, Mark W.; Frederiksen, Carl H.
2012-01-01
Tools are needed to track the elements of students' successful engagement in inquiry. The "McGill Strategic Demands of Inquiry Questionnaire" (MSDIQ) is a 79-item, criterion-referenced, learner-focused questionnaire anchored in Schon's model and related models of self-regulated learning. The MSDIQ addresses three phases of inquiry…
NASA Astrophysics Data System (ADS)
Whittaker, Kara A.; McShane, Dan
2013-02-01
A large storm event in southwest Washington State triggered over 2500 landslides and provided an opportunity to assess two slope stability screening tools. The statistical analysis conducted demonstrated that both screening tools are effective at predicting where landslides were likely to take place (Whittaker and McShane, 2012). Here we reply to two discussions of this article related to the development of the slope stability screening tools and the accuracy and scale of the spatial data used. Neither of the discussions addresses our statistical analysis or results. We provide greater detail on our sampling criteria and also elaborate on the policy and management implications of our findings and how they complement those of a separate investigation of landslides resulting from the same storm. The conclusions made in Whittaker and McShane (2012) stand as originally published unless future analysis indicates otherwise.
Development of Simulation Methods in the Gibbs Ensemble to Predict Polymer-Solvent Phase Equilibria
NASA Astrophysics Data System (ADS)
Gartner, Thomas; Epps, Thomas; Jayaraman, Arthi
Solvent vapor annealing (SVA) of polymer thin films is a promising method for post-deposition control of polymer film morphology. The large number of important parameters relevant to SVA (polymer, solvent, and substrate chemistries, incoming film condition, annealing and solvent evaporation conditions) makes systematic experimental study of SVA a time-consuming endeavor, motivating the application of simulation and theory to the SVA system to provide both mechanistic insight and scans of this wide parameter space. However, to rigorously treat the phase equilibrium between polymer film and solvent vapor while still probing the dynamics of SVA, new simulation methods must be developed. In this presentation, we compare two methods to study polymer-solvent phase equilibrium: Gibbs Ensemble Molecular Dynamics (GEMD) and Hybrid Monte Carlo/Molecular Dynamics (Hybrid MC/MD). Liquid-vapor equilibrium results are presented for the Lennard-Jones fluid and for coarse-grained polymer-solvent systems relevant to SVA. We found that the Hybrid MC/MD method is more stable and consistent than GEMD, but GEMD has significant advantages in computational efficiency. We propose that Hybrid MC/MD simulations be used for unfamiliar systems at a few carefully chosen state points, followed by much faster GEMD simulations to map out the remainder of the phase window.
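As background for the Gibbs-ensemble machinery both methods build on, the sketch below shows the standard Metropolis acceptance rule for transferring one particle between the two simulation boxes (the textbook Panagiotopoulos form). It is a generic illustration, not code from this work; the box states and energy change are made-up inputs.

```python
import numpy as np

# Gibbs-ensemble particle transfer: move one particle from the source box to
# the destination box and accept with probability
#   min(1, (N_src * V_dst) / ((N_dst + 1) * V_src) * exp(-beta * dU)).
def accept_transfer(n_src, v_src, n_dst, v_dst, dU, beta,
                    rng=np.random.default_rng()):
    if n_src == 0:
        return False
    arg = (n_src * v_dst) / ((n_dst + 1) * v_src) * np.exp(-beta * dU)
    return rng.random() < min(1.0, arg)

# Illustrative call: liquid box (dense) -> vapor box (dilute).
print(accept_transfer(n_src=200, v_src=1000.0, n_dst=50, v_dst=4000.0,
                      dU=0.3, beta=1.0))
```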
NASA Astrophysics Data System (ADS)
Jover, J.; Haslam, A. J.; Galindo, A.; Jackson, G.; Müller, E. A.
2012-10-01
We present a continuous pseudo-hard-sphere potential based on a cut-and-shifted Mie (generalized Lennard-Jones) potential with exponents (50, 49). Using this potential one can mimic the volumetric, structural, and dynamic properties of the discontinuous hard-sphere potential over the whole fluid range. The continuous pseudo potential has the advantage that it may be incorporated directly into off-the-shelf molecular-dynamics code, allowing the user to capitalise on existing hardware and software advances. Simulation results for the compressibility factor of the fluid and solid phases of our pseudo hard spheres are presented and compared both to the Carnahan-Starling equation of state of the fluid and published data, the differences being indistinguishable within simulation uncertainty. The specific form of the potential is employed to simulate flexible chains formed from these pseudo hard spheres at contact (pearl-necklace model) for mc = 4, 5, 7, 8, 16, 20, 100, 201, and 500 monomer segments. The compressibility factor of the chains per unit of monomer, mc, approaches a limiting value at reasonably small values, mc < 50, as predicted by Wertheim's first order thermodynamic perturbation theory. Simulation results are also presented for highly asymmetric mixtures of pseudo hard spheres, with diameter ratios of 3:1, 5:1, 20:1 over the whole composition range.
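The functional form referred to above is easy to reproduce: the generalized Lennard-Jones (Mie) pair potential with exponents (50, 49), cut and shifted at its minimum so it is purely repulsive. The sketch below encodes the generic Mie form with its standard prefactor; it is a reader's illustration under those textbook conventions, and the exact constants used by the authors should be taken from the paper itself.

```python
import numpy as np

# Cut-and-shifted Mie (n=50, m=49) pseudo-hard-sphere potential:
#   U(r) = C*eps*[(sig/r)^n - (sig/r)^m] - U_min  for r < r_min, else 0,
# where C = (n/(n-m))*(n/m)^(m/(n-m)) and r_min = (n/m)^(1/(n-m))*sig.
n, m = 50, 49
C = (n / (n - m)) * (n / m) ** (m / (n - m))   # ~134.6
r_min_factor = (n / m) ** (1.0 / (n - m))      # ~1.0204 (in units of sigma)

def u_phs(r, eps=1.0, sig=1.0):
    rc = r_min_factor * sig
    if r >= rc:
        return 0.0
    mie = lambda s: C * eps * ((sig / s) ** n - (sig / s) ** m)
    return mie(r) - mie(rc)                    # shifted so U(rc) = 0

print(round(C, 1), round(r_min_factor, 4), round(u_phs(0.99), 3))
```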
NASA Astrophysics Data System (ADS)
Aklan, B.; Jakoby, B. W.; Watson, C. C.; Braun, H.; Ritt, P.; Quick, H. H.
2015-06-01
A simulation toolkit, GATE (Geant4 Application for Tomographic Emission), was used to develop an accurate Monte Carlo (MC) simulation of a fully integrated 3T PET/MR hybrid imaging system (Siemens Biograph mMR). The PET/MR components of the Biograph mMR were simulated in order to allow a detailed study of variations of the system design on the PET performance, which are not easy to access and measure on a real PET/MR system. The 3T static magnetic field of the MR system was taken into account in all Monte Carlo simulations. The validation of the MC model was carried out against actual measurements performed on the PET/MR system by following the NEMA (National Electrical Manufacturers Association) NU 2-2007 standard. The comparison of simulated and experimental performance measurements included spatial resolution, sensitivity, scatter fraction, and count rate capability. The validated system model was then used for two different applications. The first application focused on investigating the effect of an extension of the PET field-of-view on the PET performance of the PET/MR system. The second application dealt with simulating a modified system timing resolution and coincidence time window of the PET detector electronics in order to simulate time-of-flight (TOF) PET detection. A dedicated phantom was modeled to investigate the impact of TOF on overall PET image quality. The overall divergence between simulated and measured data was less than 10%. Varying the detector geometry showed that the system sensitivity and noise equivalent count rate of the PET/MR system increased progressively with an increasing number of axial detector block rings, as is to be expected. TOF-based PET reconstructions of the modeled phantom showed an improvement in signal-to-noise ratio and image contrast compared with the conventional non-TOF PET reconstructions. In conclusion, the validated MC simulation model of an integrated PET/MR system, with an overall accuracy error of less than 10%, can now be used for further MC simulation applications such as the development of hardware components and the testing of new PET/MR software algorithms, for example the assessment of point-spread-function-based reconstruction algorithms.
Pediatric personalized CT-dosimetry Monte Carlo simulations, using computational phantoms
NASA Astrophysics Data System (ADS)
Papadimitroulas, P.; Kagadis, G. C.; Ploussi, A.; Kordolaimi, S.; Papamichail, D.; Karavasilis, E.; Syrgiamiotis, V.; Loudos, G.
2015-09-01
For the last 40 years, Monte Carlo (MC) simulations have served as a “gold standard” tool for a wide range of applications in the field of medical physics and tend to be essential in daily clinical practice. Regarding diagnostic imaging applications, such as computed tomography (CT), the assessment of deposited energy is of high interest, so as to better analyze the risks and the benefits of the procedure. In recent years, a major effort has been directed toward personalized dosimetry, especially in pediatric applications. In the present study the GATE toolkit was used and computational pediatric phantoms were modeled for the assessment of CT examination dosimetry. The pediatric models used come from the XCAT and IT'IS series. The X-ray spectrum of a Brightspeed CT scanner was simulated and validated with experimental data. Specifically, a DCT-10 ionization chamber was irradiated twice using 120 kVp with 100 mAs and 200 mAs, for 1 s in 1 central axial slice (thickness = 10 mm). The absorbed dose was measured in air, resulting in differences lower than 4% between the experimental and simulated data. The simulations were run with ~10^10 primaries in order to achieve low statistical uncertainties. Dose maps were also saved for quantification of the absorbed dose in several critical organs of children during CT acquisition.
Characterization and Computational Modeling of Minor Phases in Alloy LSHR
NASA Technical Reports Server (NTRS)
Jou, Herng-Jeng; Olson, Gregory; Gabb, Timothy; Garg, Anita; Miller, Derek
2012-01-01
The minor phases of powder metallurgy disk superalloy LSHR were studied. Samples were consistently heat treated at three different temperatures for long times to approach equilibrium. Additional heat treatments were also performed for shorter times, to assess minor phase kinetics in non-equilibrium conditions. Minor phases including MC carbides, M23C6 carbides, M3B2 borides, and sigma were identified. Their average sizes and total area fractions were determined. CALPHAD thermodynamics databases and PrecipiCalc™, a computational precipitation modeling tool, were employed with Ni-base thermodynamics and diffusion databases to model and simulate the phase microstructural evolution observed in the experiments, with the objective of identifying the model limitations and the directions of model enhancement.
Higo, Junichi; Umezawa, Koji
2014-01-01
We introduce computational studies of intrinsically disordered proteins (IDPs). In particular, we present our multicanonical molecular dynamics (McMD) simulations of two IDP-partner systems: NRSF-mSin3 and pKID-KIX. McMD is an enhanced conformational sampling method useful for exploring the conformational space of biomolecular systems. An IDP adopts a specific tertiary structure upon binding to its partner molecule, although it is unstructured in the unbound state (i.e. the free state). This IDP-specific property is called "coupled folding and binding". The McMD simulation treats the biomolecules with an all-atom model immersed in an explicit solvent. In the initial configuration of the simulation, the IDP and its partner molecules are set far apart from each other, and the IDP conformation is disordered. The computationally obtained free-energy landscape for coupled folding and binding shows that native- and non-native-complex clusters are distributed in a complicated manner across the conformational space. The all-atom simulation suggests that induced folding and population selection are intricately intertwined in coupled folding and binding. Further analyses show that the conformational fluctuations (dynamical flexibility) in the bound and unbound states are essential for characterizing IDP function.
Parallel Grand Canonical Monte Carlo (ParaGrandMC) Simulation Code
NASA Technical Reports Server (NTRS)
Yamakov, Vesselin I.
2016-01-01
This report provides an overview of the Parallel Grand Canonical Monte Carlo (ParaGrandMC) simulation code. This is a highly scalable parallel FORTRAN code for simulating the thermodynamic evolution of metal alloy systems at the atomic level, and predicting the thermodynamic state, phase diagram, chemical composition and mechanical properties. The code is designed to simulate multi-component alloy systems and to predict solid-state phase transformations such as austenite-martensite transformations, precipitate formation, recrystallization, capillary effects at interfaces, surface adsorption, etc., which can aid the design of novel metallic alloys. While the software is mainly tailored for modeling metal alloys, it can also be used for other types of solid-state systems, and to some degree for liquid or gaseous systems, including multiphase systems forming solid-liquid-gas interfaces.
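For readers unfamiliar with how grand-canonical moves look for an alloy, the sketch below shows a generic transmutation (species-swap) Metropolis step of the kind such codes build on: change the species of a randomly chosen atom and accept based on the energy change and the imposed chemical-potential difference. This is the textbook semi-grand-canonical rule, not ParaGrandMC's actual implementation; the numbers are illustrative.

```python
import numpy as np

# Semi-grand-canonical transmutation move for a binary alloy: accept an
# A -> B species change with probability min(1, exp(-(dE - dmu) / (kB*T))).
kB = 8.617e-5                      # Boltzmann constant (eV/K)

def attempt_swap(dE, dmu, T, rng=np.random.default_rng()):
    """dE: energy change of the transmutation (eV);
    dmu: chemical-potential difference mu_B - mu_A (eV)."""
    return rng.random() < min(1.0, np.exp(-(dE - dmu) / (kB * T)))

print(attempt_swap(dE=0.05, dmu=0.02, T=900.0))
```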
Study on photon transport problem based on the platform of molecular optical simulation environment.
Peng, Kuan; Gao, Xinbo; Liang, Jimin; Qu, Xiaochao; Ren, Nunu; Chen, Xueli; Ma, Bin; Tian, Jie
2010-01-01
As an important molecular imaging modality, optical imaging has attracted increasing attention in recent years. Since physical experiments are usually complicated and expensive, research methods based on simulation platforms have received extensive attention. We developed a simulation platform named Molecular Optical Simulation Environment (MOSE) to simulate photon transport in both biological tissues and free space for optical imaging based on noncontact measurement. In this platform, the Monte Carlo (MC) method and the hybrid radiosity-radiance theorem are used to simulate photon transport in biological tissues and free space, respectively, so both contact and noncontact measurement modes of optical imaging can be simulated properly. In addition, a parallelization strategy for the MC method is employed to improve the computational efficiency. In this paper, we study photon transport problems in both biological tissues and free space using MOSE. The results are compared with TracePro, the simplified spherical harmonics method (SP(n)), and physical measurement to verify the performance of our study method in both accuracy and efficiency.
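As a reminder of what the tissue-side MC involves, the sketch below shows the standard weighted photon step used in this class of simulators: sample a free path from the total interaction coefficient, deposit part of the photon weight as absorption, and continue scattering. It is a generic MCML-style illustration with made-up optical properties, not MOSE source code.

```python
import numpy as np

# Core Monte Carlo photon step in tissue with implicit-capture weighting.
rng = np.random.default_rng()
mu_a, mu_s = 0.1, 10.0              # absorption / scattering coeffs (1/mm), assumed
mu_t = mu_a + mu_s

weight, absorbed = 1.0, 0.0
while weight > 1e-4:                # roulette termination omitted for brevity
    step = -np.log(rng.random()) / mu_t   # free path length to next event (mm)
    absorbed += weight * mu_a / mu_t      # deposit the absorbed fraction
    weight *= mu_s / mu_t                 # survivor weight keeps scattering
    # (position update by `step` and direction update via a phase function,
    #  e.g. Henyey-Greenstein, omitted in this cartoon)
print(f"total absorbed weight: {absorbed:.4f}")   # -> ~1.0
```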
Raman Monte Carlo simulation for light propagation for tissue with embedded objects
NASA Astrophysics Data System (ADS)
Periyasamy, Vijitha; Jaafar, Humaira Bte; Pramanik, Manojit
2018-02-01
Monte Carlo (MC) simulation is one of the most prominent simulation techniques and is rapidly becoming the model of choice for studying light-tissue interaction. Monte Carlo simulation for light transport in multi-layered tissue (MCML) is adapted and extended to different geometries by integrating embedded objects of various shapes (i.e., sphere, cylinder, cuboid and ellipsoid) into the multi-layered structure. These geometries are useful in providing realistic tissue structures, such as models of lymph nodes, tumors, blood vessels, the head and other simulation media. MC simulations were performed on the various geometric media. The simulation of MCML with embedded objects (MCML-EO) was extended to include Raman scattering during photon propagation in the defined medium, with the location of Raman photon generation recorded. Simulations were run on a modelled breast tissue with tumors (spherical and ellipsoidal) and blood vessels (cylindrical). Results are presented as both A-line and B-line scans for the embedded objects to determine the spatial locations where Raman photons were generated. Studies were done for different Raman probabilities.
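The Raman bookkeeping described above reduces, per interaction site, to a Bernoulli draw against a (typically small) Raman probability and a record of where conversions occur. The sketch below is a minimal stand-in for that logic; the probability value and interaction depths are made up for illustration.

```python
import numpy as np

# Decide at each interaction whether a Raman photon is generated, and record
# the site, mimicking the MCML-EO bookkeeping at a cartoon level.
rng = np.random.default_rng()
p_raman = 1e-3                           # Raman probability per interaction, assumed

raman_sites = []
for z in np.linspace(0.0, 5.0, 5000):    # stand-in interaction depths (mm)
    if rng.random() < p_raman:
        raman_sites.append(z)            # a Raman photon is generated here
print(f"{len(raman_sites)} Raman photons generated")
```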
Formalization and Validation of an SADT Specification Through Executable Simulation in VHDL
1991-12-01
be found in (39, 40, 41). One recent summary of the SADT methodology was written by Marca and McGowan in 1988 (32). SADT is a methodology to provide...that is required. Also, the presence of "all" inputs and controls may not be needed for the activity to proceed. Marca and McGowan (32) describe a...diagrams which describe a complete system. Marca and McGowan define an SADT Model as: "a collection of carefully coordinated descriptions, starting from a
Astronaut William McArthur prepares for a training exercise
1993-07-20
S93-38686 (20 July 1993) --- Wearing a training version of the partial pressure launch and entry garment, astronaut William S. McArthur prepares to rehearse emergency egress procedures for the STS-58 mission. McArthur, along with the five other NASA astronauts and a visiting payload specialist assigned to the seven-member crew, later simulated contingency evacuation procedures. Most of the training session took place in the crew compartment and full fuselage trainers of the Space Shuttle mockup and integration laboratory.
Li, Yongbao; Tian, Zhen; Song, Ting; Wu, Zhaoxia; Liu, Yaqiang; Jiang, Steve; Jia, Xun
2017-01-07
Monte Carlo (MC)-based spot dose calculation is highly desired for inverse treatment planning in proton therapy because of its accuracy. Recent studies on biological optimization have also indicated the use of MC methods to compute relevant quantities of interest, e.g. linear energy transfer. Although GPU-based MC engines have been developed to address inverse optimization problems, their efficiency still needs to be improved. Also, the use of a large number of GPUs in MC calculation is not favorable for clinical applications. The previously proposed adaptive particle sampling (APS) method can improve the efficiency of MC-based inverse optimization by using the computationally expensive MC simulation more effectively. This method is more efficient than the conventional approach that performs spot dose calculation and optimization in two sequential steps. In this paper, we propose a computational library to perform MC-based spot dose calculation on GPU with the APS scheme. The implemented APS method performs a non-uniform sampling of the particles from pencil beam spots during the optimization process, favoring those from the high intensity spots. The library also conducts two computationally intensive matrix-vector operations frequently used when solving an optimization problem. This library design allows a streamlined integration of the MC-based spot dose calculation into an existing proton therapy inverse planning process. We tested the developed library in a typical inverse optimization system with four patient cases. The library achieved the targeted functions by supporting inverse planning in various proton therapy schemes, e.g. single field uniform dose, 3D intensity modulated proton therapy, and distal edge tracking. The efficiency was 41.6 ± 15.3% higher than the use of a GPU-based MC package in a conventional calculation scheme. The total computation time ranged between 2 and 50 min on a single GPU card depending on the problem size.
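The adaptive sampling idea above amounts to drawing source particles from the pencil-beam spots in proportion to the current spot intensities rather than uniformly. The sketch below illustrates that weighting with numpy; the intensities and particle count are made-up values, not the library's actual sampling code.

```python
import numpy as np

# Intensity-weighted sampling of MC source particles across pencil-beam spots:
# high-intensity spots receive proportionally more simulated particles.
rng = np.random.default_rng()
intensities = np.array([5.0, 1.0, 0.2, 3.8])      # current spot weights, assumed
p = intensities / intensities.sum()

n_particles = 100_000
spot_of_particle = rng.choice(len(intensities), size=n_particles, p=p)
print(np.bincount(spot_of_particle) / n_particles)  # empirical fractions ~ p
```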
Orthogonal Multi-Carrier DS-CDMA with Frequency-Domain Equalization
NASA Astrophysics Data System (ADS)
Tanaka, Ken; Tomeba, Hiromichi; Adachi, Fumiyuki
Orthogonal multi-carrier direct sequence code division multiple access (orthogonal MC DS-CDMA) is a combination of orthogonal frequency division multiplexing (OFDM) and time-domain spreading, while multi-carrier code division multiple access (MC-CDMA) is a combination of OFDM and frequency-domain spreading. In MC-CDMA, a good bit error rate (BER) performance can be achieved by using frequency-domain equalization (FDE), since the frequency diversity gain is obtained. On the other hand, the conventional orthogonal MC DS-CDMA fails to achieve any frequency diversity gain. In this paper, we propose a new orthogonal MC DS-CDMA that can obtain the frequency diversity gain by applying FDE. The conditional BER analysis is presented. The theoretical average BER performance in a frequency-selective Rayleigh fading channel is evaluated by the Monte-Carlo numerical computation method using the derived conditional BER and is confirmed by computer simulation of the orthogonal MC DS-CDMA signal transmission.
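The evaluation strategy described (average a derived conditional BER over random realizations of a frequency-selective Rayleigh channel) can be illustrated with a toy Monte-Carlo loop. The sketch below uses the Gaussian conditional-BER formula 0.5*erfc(sqrt(gamma)) and a crude mean-SNR proxy for the post-FDE combining; it is a schematic stand-in, not the paper's derived conditional BER.

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(0)
Nc = 64                              # subcarriers, assumed
es_n0 = 10 ** (8.0 / 10)             # Es/N0 = 8 dB, assumed

def one_channel_trial():
    # i.i.d. complex Gaussian subcarrier gains (frequency-selective Rayleigh)
    h = (rng.normal(size=Nc) + 1j * rng.normal(size=Nc)) / np.sqrt(2)
    gamma_eff = es_n0 * np.mean(np.abs(h) ** 2)   # crude diversity-combining proxy
    return 0.5 * erfc(sqrt(gamma_eff))            # conditional BER given the channel

ber = np.mean([one_channel_trial() for _ in range(20_000)])
print(f"Monte-Carlo average BER ~ {ber:.2e}")
```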
Extreme Underwater Mission on This Week @NASA – July 29, 2016
2016-07-29
The 21st NASA Extreme Environment Mission Operations got underway July 21 in the Florida Keys. NASA astronauts Reid Wiseman and Megan McArthur are part of the international crew of NEEMO-21 aquanauts performing research during the 16-day mission, which takes place about 60 feet below the surface of the Atlantic Ocean, in the Aquarius habitat – the world's only undersea science station. Simulated spacewalks are designed to evaluate tools and mission operation techniques that could be used on future space missions. NEEMO-21’s objectives include testing a mini DNA sequencer similar to the one NASA astronaut Kate Rubins also will test aboard the International Space Station, and a telemedicine device that will be used for future space applications. The mission also will simulate communications delays like those that would be encountered on a mission to Mars. Also, Space Launch System Work Platforms, All-Electric X-Plane Arrives, Asteroid Mission Technology, and NASA @Comic-Con International.
The MCUCN simulation code for ultracold neutron physics
NASA Astrophysics Data System (ADS)
Zsigmond, G.
2018-02-01
Ultracold neutrons (UCN) have very low kinetic energies (0-300 neV) and can therefore be stored in suitable material or magnetic confinements for many hundreds of seconds. This makes them a very useful tool for probing fundamental symmetries of nature (for instance charge-parity violation, via neutron electric dipole moment experiments) and for contributing important parameters to Big Bang nucleosynthesis (neutron lifetime measurements). Improved precision experiments are under construction at new and planned UCN sources around the world. MC simulations play an important role in the optimization of such systems with a large number of parameters, but also in the estimation of systematic effects, in the benchmarking of analysis codes, and as part of the analysis. The MCUCN code written at PSI has been extensively used for the optimization of the UCN source optics and in the optimization and analysis of (test) experiments within the nEDM project based at PSI. In this paper we present the main features of MCUCN and interesting benchmark and application examples.
Box compression analysis of world-wide data spanning 46 years
Thomas J. Urbanik; Benjamin Frank
2006-01-01
The state of the art among most industry citations of box compression estimation is the equation by McKee developed in 1963. Because of limitations in computing tools at the time the McKee equation was developed, the equation is a simplification, with many constraints, of a more general relationship. By applying the results of sophisticated finite element modeling, in...
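For context, the 1963 simplification the abstract refers to is usually quoted in the form BCT ≈ 5.87 × ECT × sqrt(caliper × perimeter), valid only under the constraints mentioned above. The sketch below evaluates that commonly cited form with consistent units; the input values are invented for illustration, and the constant and exponents of the more general relationship differ.

```python
from math import sqrt

# Simplified McKee estimate of box compression strength (commonly cited form):
#   BCT ~ 5.87 * ECT * sqrt(board caliper * box perimeter)
def mckee_bct(ect_kn_per_m, caliper_m, perimeter_m):
    return 5.87 * ect_kn_per_m * sqrt(caliper_m * perimeter_m)

# Illustrative inputs: ECT = 8 kN/m, 4 mm board, 1.2 m box perimeter.
print(f"BCT ~ {mckee_bct(8.0, 0.004, 1.2):.2f} kN")
```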
NASA Astrophysics Data System (ADS)
Borowik, Piotr; Thobel, Jean-Luc; Adamowicz, Leszek
2017-07-01
Standard computational methods used to account for the Pauli exclusion principle in Monte Carlo (MC) simulations of electron transport in semiconductors may give unphysical results in the low-field regime, where the obtained electron distribution function takes values exceeding unity. Modified algorithms have already been proposed that correctly account for electron scattering on phonons or impurities. The present paper extends this approach and proposes an improved simulation scheme that includes the Pauli exclusion principle for electron-electron (e-e) scattering in MC simulations. Simulations with significantly reduced computational cost recreate correct values of the electron distribution function. The proposed algorithm is applied to study the transport properties of degenerate electrons in graphene with e-e interactions. This required adapting the treatment of e-e scattering to the case of a linear band dispersion relation; hence, this part of the simulation algorithm is described in detail.
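The standard way to impose Pauli exclusion in such simulations is a rejection step: a scattering event into final state k' is accepted with probability 1 - f(k'), the fraction of unoccupied final states. The sketch below illustrates that step with an equilibrium Fermi-Dirac occupation as a placeholder; the paper's algorithm instead uses (and updates) the simulated distribution function itself.

```python
import numpy as np

# Pauli-blocking rejection: accept scattering into energy E_final with
# probability (1 - f(E_final)). Placeholder equilibrium Fermi-Dirac f.
rng = np.random.default_rng()

def fermi_dirac(E, EF, kT):
    return 1.0 / (1.0 + np.exp((E - EF) / kT))

def accept_scattering(E_final, EF=0.1, kT=0.0259):   # eV; assumed EF, room temperature
    return rng.random() < 1.0 - fermi_dirac(E_final, EF, kT)

print(accept_scattering(E_final=0.05))   # well below EF: almost always blocked
```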
STS-31 MS McCandless and MS Sullivan during JSC WETF underwater simulation
1990-03-05
This overall view shows STS-31 Mission Specialist (MS) Bruce McCandless II (left) and MS Kathryn D. Sullivan making a practice space walk in JSC's Weightless Environment Training Facility (WETF) Bldg 29 pool. McCandless works with a mockup of the remote manipulator system (RMS) end effector which is attached to a grapple fixture on the Hubble Space Telescope (HST) mockup. Sullivan manipulates HST hardware on the Support System Module (SSM) forward shell. SCUBA-equipped divers monitor the extravehicular mobility unit (EMU) suited crewmembers during this simulated extravehicular activity (EVA). No EVA is planned for the Hubble Space Telescope (HST) deployment, but the duo has trained for contingencies which might arise during the STS-31 mission aboard Discovery, Orbiter Vehicle (OV) 103. Photo taken by NASA JSC photographer Sheri Dunnette.
NASA Astrophysics Data System (ADS)
Drapek, R. J.; Kim, J. B.
2013-12-01
We simulated ecosystem response to climate change in the USA and Canada at a 5 arc-minute grid resolution using the MC1 dynamic global vegetation model and nine CMIP3 future climate projections as input. The climate projections were produced by 3 GCMs simulating 3 SRES emissions scenarios. We examined MC1 outputs for the conterminous USA by summarizing them by EPA level II and III ecoregions to characterize model skill and evaluate the magnitude and uncertainties of simulated ecosystem response to climate change. First, we evaluated model skill by comparing outputs from the recent historical period with benchmark datasets. Distribution of potential natural vegetation simulated by MC1 was compared with Kuchler's map. Above ground live carbon simulated by MC1 was compared with the National Biomass and Carbon Dataset. Fire return intervals calculated by MC1 were compared with maximum and minimum values compiled for the United States. Each EPA Level III Ecoregion was scored for average agreement with corresponding benchmark data and an average score was calculated for all three types of output. Greatest agreement with benchmark data happened in the Western Cordillera, the Ozark / Ouachita-Appalachian Forests, and the Southeastern USA Plains (EPA Level II Ecoregions). The lowest agreement happened in the Everglades and the Tamaulipas-Texas Semiarid Plain. For simulated ecosystem response to future climate projections we examined MC1 output for shifts in vegetation type, vegetation carbon, runoff, and biomass consumed by fire. Each ecoregion was scored for the amount of change from historical conditions for each variable and an average score was calculated. Smallest changes were forecast for Western Cordillera and Marine West Coast Forest ecosystems. Largest changes were forecast for the Cold Deserts, the Mixed Wood Plains, and the Central USA Plains. By combining scores of model skill for the historical period for each EPA Level 3 Ecoregion with scores representing the magnitude of ecosystem changes in the future, we identified high and low uncertainty ecoregions. The largest anticipated changes and the lowest measures of model skill coincide in the Central USA Plains and the Mixed Wood Plains. The combination of low model skill and high degree of ecosystem change elevate the importance of our uncertainty in this ecoregion. The highest projected changes coincide with relatively high model skill in the Cold Deserts. Climate adaptation efforts are the most likely to pay off in these regions. Finally, highest model skill and lowest anticipated changes coincide in the Western Cordillera and the Marine West Coast Forests. These regions may be relatively low-risk for climate change impacts when compared to the other ecoregions. These results represent only the first step in this type of analysis; there exist many ways to strengthen it. One, MC1 calibrations can be optimized using a structured optimization technique. Two, a larger set of climate projections can be used to capture a fuller range of GCMs and emissions scenarios. And three, employing an ensemble of vegetation models would make the analysis more robust.
NASA Astrophysics Data System (ADS)
Kerns, B. K.; Kim, J. B.; Day, M. A.; Pitts, B.; Drapek, R. J.
2017-12-01
Ecosystem process models are increasingly being used in regional assessments to explore potential changes in future vegetation and NPP due to climate change. We use the dynamic global vegetation model MAPSS-Century 2 (MC2) as one line of evidence in regional climate change vulnerability assessments for the US Forest Service, focusing our fine-tuning of the model calibration on observational sources related to forest vegetation. However, there is much interest in understanding projected changes for arid rangelands in the western US, such as grasslands, shrublands, and woodlands. Rangelands provide many ecosystem services, support the sustainability of rural communities, furnish habitat for threatened and endangered species, and are threatened by annual grass invasion. Past work suggested that MC2 performance for arid rangeland plant functional types (PFTs) was poor, and the model has difficulty distinguishing annual from perennial grasslands. Our objectives are to improve model performance for rangeland simulations and to explore the potential for splitting the grass plant functional type into annual and perennial types. We used the tri-state Blue Mountain Ecoregion as our study area, together with maps of potential vegetation from interpolated ground data, the National Land Cover Database, and ancillary NPP data derived from the MODIS satellite. MC2 historical simulations for the area overestimated woodland occurrence and underestimated shrubland and grassland PFTs. The spatial locations of the rangeland PFTs also often did not align well with observational data. While some disagreement may be due to differences in the respective classification rules, the errors are largely linked to MC2's tree and grass biogeography and physiology algorithms. Presently, only grass and forest productivity measures and carbon stocks are used to distinguish PFTs. MC2 grass and tree productivity simulation is problematic, in particular grass seasonal phenology in relation to seasonal patterns of temperature and precipitation. The algorithm also does not accurately translate simulated carbon stocks into the canopy allometry of the woodland tree species that dominate the BME, thereby inaccurately shading out the grasses in the understory. We are devising improvements to these shortcomings in the model architecture.
SU-E-T-155: Calibration of Variable Longitudinal Strength 103Pd Brachytherapy Sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reed, J; Radtke, J; Micka, J
Purpose: Brachytherapy sources with variable longitudinal strength (VLS) allow for a customized intensity along the length of the source. These have applications in focal brachytherapy treatments of prostate cancer, where dose boosting can be achieved through modulation of intra-source strengths. This work focused on the development of a calibration methodology for VLS sources based on measurements and Monte Carlo (MC) simulations of five 1 cm 103Pd sources, each containing four regions of variable 103Pd strength. Methods: The air-kerma strengths of the sources were measured with a variable-aperture free-air chamber (VAFAC). Source strengths were also measured using a well chamber. The in-air azimuthal and polar anisotropy of the sources were measured by rotating them in front of a NaI scintillation detector and were calculated with MC simulations. Azimuthal anisotropy results were normalized to their mean intensity values. Polar anisotropy results were normalized to their average transverse-axis intensity values. The relative longitudinal strengths of the sources were measured via on-contact irradiations with radiochromic film, and were calculated with MC simulations. Results: The variable 103Pd loading of the sources was validated by VAFAC and well chamber measurements. Ratios of VAFAC air-kerma strengths and well chamber responses were within ±1.3% for all sources. Azimuthal anisotropy results indicated that ≥95% of the normalized values for all sources were within ±1.7% of the mean values. Polar anisotropy results indicated variations within ±0.3% for a ±7.6° angular region with respect to the source transverse axis. Locations and intensities of the 103Pd regions were validated by radiochromic film measurements and MC simulations. Conclusion: The calibration methodology developed in this work confirms that the VLS sources investigated have a high level of polar uniformity, and that the strength and longitudinal intensity can be verified experimentally and through MC simulations. 103Pd sources were provided by CivaTech Oncology, Inc.
A fragment-based approach to the SAMPL3 Challenge
NASA Astrophysics Data System (ADS)
Kulp, John L.; Blumenthal, Seth N.; Wang, Qiang; Bryan, Richard L.; Guarnieri, Frank
2012-05-01
The success of molecular fragment-based design depends critically on the ability to make predictions of binding poses and of affinity ranking for compounds assembled by linking fragments. The SAMPL3 Challenge provides a unique opportunity to evaluate the performance of a state-of-the-art fragment-based design methodology with respect to these requirements. In this article, we present results derived from linking fragments to predict affinity and pose in the SAMPL3 Challenge. The goal is to demonstrate how incorporating different aspects of modeling protein-ligand interactions impact the accuracy of the predictions, including protein dielectric models, charged versus neutral ligands, ΔΔGs solvation energies, and induced conformational stress. The core method is based on annealing of chemical potential in a Grand Canonical Monte Carlo (GC/MC) simulation. By imposing an initially very high chemical potential and then automatically running a sequence of simulations at successively decreasing chemical potentials, the GC/MC simulation efficiently discovers statistical distributions of bound fragment locations and orientations not found reliably without the annealing. This method accounts for configurational entropy, the role of bound water molecules, and results in a prediction of all the locations on the protein that have any affinity for the fragment. Disregarding any of these factors in affinity-rank prediction leads to significantly worse correlation with experimentally-determined free energies of binding. We relate three important conclusions from this challenge as applied to GC/MC: (1) modeling neutral ligands—regardless of the charged state in the active site—produced better affinity ranking than using charged ligands, although, in both cases, the poses were almost exactly overlaid; (2) simulating explicit water molecules in the GC/MC gave better affinity and pose predictions; and (3) applying a ΔΔGs solvation correction further improved the ranking of the neutral ligands. Using the GC/MC method under a variety of parameters in the blinded SAMPL3 Challenge provided important insights to the relevant parameters and boundaries in predicting binding affinities using simulated annealing of chemical potential calculations.
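The annealing-of-chemical-potential loop described above can be caricatured in a few lines: run grand-canonical insertion/deletion passes while stepping the chemical potential down from a very high value, watching the bound population thin out to the strongest sites. The sketch below uses the generic GC/MC acceptance rules with stand-in interaction energies; none of the parameters are the article's production settings.

```python
import numpy as np

# Grand-canonical MC insertions/deletions with an annealed chemical potential.
rng = np.random.default_rng()
beta, V = 1.0, 1000.0                       # reduced units, assumed

def gcmc_pass(n, mu, n_steps=2000):
    for _ in range(n_steps):
        dU = rng.normal(0.0, 1.0)           # stand-in interaction energy change
        if rng.random() < 0.5:              # attempt insertion
            if rng.random() < min(1.0, V / (n + 1) * np.exp(beta * (mu - dU))):
                n += 1
        elif n > 0:                         # attempt deletion
            if rng.random() < min(1.0, n / V * np.exp(-beta * (mu + dU))):
                n -= 1
    return n

n = 0
for mu in np.linspace(2.0, -6.0, 9):        # chemical potential annealed downward
    n = gcmc_pass(n, mu)
    print(f"mu = {mu:+.1f}: N = {n}")
```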
NASA Astrophysics Data System (ADS)
Bellos, Vasilis; Tsakiris, George
2016-09-01
The study presents a new hybrid method for the simulation of flood events in small catchments. It combines a physically-based two-dimensional hydrodynamic model with the hydrological unit hydrograph theory. Unit hydrographs are derived using the FLOW-R2D model, which is based on the full form of the two-dimensional Shallow Water Equations, solved by a modified McCormack numerical scheme. The method is tested on a small catchment in a suburb of Athens, Greece, for a storm event which occurred in February 2013. The catchment is divided into three friction zones, and unit hydrographs of 15 and 30 min are produced. The infiltration process is simulated by the empirical Kostiakov equation and the Green-Ampt model. The results of the proposed hybrid method are compared with recorded data at the hydrometric station at the outlet of the catchment and with the results derived from the fully hydrodynamic model FLOW-R2D. It is concluded that, for the case studied, the proposed hybrid method produces results close to those of the fully hydrodynamic simulation at substantially shorter computational time. This finding, if further verified in a variety of case studies, can be useful in devising effective hybrid tools for two-dimensional flood simulations that lead to accurate and considerably faster results than those achieved by fully hydrodynamic simulations.
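Once the unit hydrographs are available, the hybrid step is a standard discrete convolution of effective rainfall with the unit hydrograph. The sketch below shows that operation with numpy; the unit-hydrograph ordinates and rainfall depths are invented, and in the study the UHs come from FLOW-R2D while effective rainfall follows from subtracting Kostiakov or Green-Ampt infiltration losses.

```python
import numpy as np

# Outlet discharge = effective rainfall convolved with the unit hydrograph.
uh = np.array([0.05, 0.20, 0.35, 0.25, 0.10, 0.05])   # m^3/s per mm, per step (assumed)
effective_rain = np.array([2.0, 5.0, 8.0, 3.0])       # mm per time step (assumed)

q = np.convolve(effective_rain, uh)                   # discharge hydrograph (m^3/s)
print(np.round(q, 2))
```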
Simulation and analysis of a proposed replacement for the McCook port of entry inspection station
DOT National Transportation Integrated Search
1999-04-01
This report describes a study of a proposed replacement for the McCook Port of Entry inspection station at the entry to South Dakota. In order to assess the potential for a low-speed weigh in motion (WIM) scale within the station to pre-screen trucks...
Using Computer-Based "Experiments" in the Analysis of Chemical Reaction Equilibria
ERIC Educational Resources Information Center
Li, Zhao; Corti, David S.
2018-01-01
The application of the Reaction Monte Carlo (RxMC) algorithm to standard textbook problems in chemical reaction equilibria is discussed. The RxMC method is a molecular simulation algorithm for studying the equilibrium properties of reactive systems, and therefore provides the opportunity to develop computer-based "experiments" for the…
Correction for human head motion in helical x-ray CT
NASA Astrophysics Data System (ADS)
Kim, J.-H.; Sun, T.; Alcheikh, A. R.; Kuncic, Z.; Nuyts, J.; Fulton, R.
2016-02-01
Correction for rigid object motion in helical CT can be achieved by reconstructing from a modified source-detector orbit, determined by the object motion during the scan. This ensures that all projections are consistent, but it does not guarantee that the projections are complete in the sense of being sufficient for exact reconstruction. We have previously shown with phantom measurements that motion-corrected helical CT scans can suffer from data-insufficiency, in particular for severe motions and at high pitch. To study whether such data-insufficiency artefacts could also affect the motion-corrected CT images of patients undergoing head CT scans, we used an optical motion tracking system to record the head movements of 10 healthy volunteers while they executed each of the 4 different types of motion (‘no’, slight, moderate and severe) for 60 s. From these data we simulated 354 motion-affected CT scans of a voxelized human head phantom and reconstructed them with and without motion correction. For each simulation, motion-corrected (MC) images were compared with the motion-free reference, by visual inspection and with quantitative similarity metrics. Motion correction improved similarity metrics in all simulations. Of the 270 simulations performed with moderate or less motion, only 2 resulted in visible residual artefacts in the MC images. The maximum range of motion in these simulations would encompass that encountered in the vast majority of clinical scans. With severe motion, residual artefacts were observed in about 60% of the simulations. We also evaluated a new method of mapping local data sufficiency based on the degree to which Tuy’s condition is locally satisfied, and observed that areas with high Tuy values corresponded to the locations of residual artefacts in the MC images. We conclude that our method can provide accurate and artefact-free MC images with most types of head motion likely to be encountered in CT imaging, provided that the motion can be accurately determined.
Chi, Yujie; Tian, Zhen; Jia, Xun
2016-08-07
Monte Carlo (MC) particle transport simulation on a graphics processing unit (GPU) platform has been extensively studied recently due to the efficiency advantage achieved via massive parallelization. Almost all of the existing GPU-based MC packages were developed for voxelized geometry, which limited their application scope. The purpose of this paper is to develop a module to model parametric geometry and integrate it into GPU-based MC simulations. In our module, each continuous region is defined by its bounding surfaces, which are parameterized by quadratic functions. Particle navigation functions in this geometry were developed. The module was incorporated into two previously developed GPU-based MC packages and was tested on two example problems: (1) low-energy photon transport simulation in a brachytherapy case with a shielded cylinder applicator, and (2) MeV coupled photon/electron transport simulation in a phantom containing several inserts of different shapes. In both cases, the calculated dose distributions agreed well with those calculated in the corresponding voxelized geometry; the average dose differences were 1.03% and 0.29%, respectively. We also used the developed package to perform simulations of a Varian VS 2000 brachytherapy source and generated a phase-space file. The computation time under the parameterized geometry depended on the memory location storing the geometry data. When the data were stored in the GPU's shared memory, the highest computational speed was achieved. Incorporation of parameterized geometry yielded a computation time ~3 times that of the corresponding voxelized geometry. We also developed a strategy that uses an auxiliary index array to reduce the frequency of geometry calculations and hence improve efficiency. With this strategy, the computational time ranged from 1.75 to 2.03 times that of the voxelized geometry for coupled photon/electron transport, depending on the voxel dimension of the auxiliary index array, and from 0.69 to 1.23 times for photon-only transport.
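The navigation primitive such a module needs is the distance from a particle's position along its direction to a quadratic bounding surface, which reduces to solving A t^2 + B t + C = 0 and keeping the smallest positive root. The sketch below does this for a sphere as the simplest quadric; it is a generic illustration, not the package's GPU kernel.

```python
import numpy as np

# Distance along direction d from point p to a sphere |x - c|^2 = R^2:
# substitute x = p + t*d and solve the quadratic in t.
def distance_to_sphere(p, d, center, radius):
    oc = p - center
    A = d @ d
    B = 2.0 * (oc @ d)
    C = oc @ oc - radius ** 2
    disc = B * B - 4.0 * A * C
    if disc < 0.0:
        return np.inf                      # the ray misses the surface
    roots = (-B + np.array([-1.0, 1.0]) * np.sqrt(disc)) / (2.0 * A)
    forward = roots[roots > 1e-12]         # only surfaces ahead of the particle
    return forward.min() if forward.size else np.inf

p = np.array([0.0, 0.0, -5.0])
d = np.array([0.0, 0.0, 1.0])
print(distance_to_sphere(p, d, center=np.zeros(3), radius=1.0))   # -> 4.0
```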
Constant-pH Hybrid Nonequilibrium Molecular Dynamics–Monte Carlo Simulation Method
2016-01-01
A computational method is developed to carry out explicit solvent simulations of complex molecular systems under conditions of constant pH. In constant-pH simulations, preidentified ionizable sites are allowed to spontaneously protonate and deprotonate as a function of time in response to the environment and the imposed pH. The method, based on a hybrid scheme originally proposed by H. A. Stern (J. Chem. Phys.2007, 126, 164112), consists of carrying out short nonequilibrium molecular dynamics (neMD) switching trajectories to generate physically plausible configurations with changed protonation states that are subsequently accepted or rejected according to a Metropolis Monte Carlo (MC) criterion. To ensure microscopic detailed balance arising from such nonequilibrium switches, the atomic momenta are altered according to the symmetric two-ends momentum reversal prescription. To achieve higher efficiency, the original neMD–MC scheme is separated into two steps, reducing the need for generating a large number of unproductive and costly nonequilibrium trajectories. In the first step, the protonation state of a site is randomly attributed via a Metropolis MC process on the basis of an intrinsic pKa; an attempted nonequilibrium switch is generated only if this change in protonation state is accepted. This hybrid two-step inherent pKa neMD–MC simulation method is tested with single amino acids in solution (Asp, Glu, and His) and then applied to turkey ovomucoid third domain and hen egg-white lysozyme. Because of the simple linear increase in the computational cost relative to the number of titratable sites, the present method is naturally able to treat extremely large systems. PMID:26300709
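The inexpensive first step described above can be reduced to a Metropolis decision driven by the intrinsic pKa and the imposed pH, with the costly neMD switch attempted only on acceptance. The sketch below implements just that screening step; the pKa and pH values are illustrative, and the neMD switching (with its momentum-reversal bookkeeping) is not shown.

```python
import numpy as np

# Step 1 of the two-step scheme: Metropolis draw on the protonation state
# from the intrinsic pKa. The deprotonation free energy (in kT) at a given
# pH is ln(10) * (pKa - pH); the reverse move has the opposite sign.
rng = np.random.default_rng()

def accept_state_flip(is_protonated, pKa_intrinsic, pH):
    dG_deprot = np.log(10.0) * (pKa_intrinsic - pH)
    dG = dG_deprot if is_protonated else -dG_deprot   # cost of the proposed flip
    return rng.random() < min(1.0, np.exp(-dG))

# Asp-like site (pKa ~ 4) at pH 7: a protonated -> deprotonated flip is favored.
print(accept_state_flip(True, pKa_intrinsic=4.0, pH=7.0))
```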
NASA Astrophysics Data System (ADS)
Xiong, Ming; Zheng, Huinan; Wu, S. T.; Wang, Yuming; Wang, Shui
2007-11-01
Numerical studies of interplanetary "multiple magnetic clouds (Multi-MC)" are performed with a 2.5-dimensional ideal magnetohydrodynamic (MHD) model in the heliospheric meridional plane. Both the slow MC1 and the fast MC2 are initially launched along the heliospheric equator, one after another with different time intervals. The coupling of the two MCs can be considered as the comprehensive interaction between two systems, each comprising an MC body and its driven shock. The MC2-driven shock and the MC2 body are successively involved in the interaction with the MC1 body, and momentum is transferred from MC2 to MC1. After the passage of the MC2-driven shock front, the magnetic field lines in the MC1 medium previously compressed by the MC2-driven shock are prevented from relaxing by the pushing of the MC2 body. The MC1 body undergoes the most violent compression from the ambient solar wind ahead, the continuous penetration of the MC2-driven shock through the MC1 body, and the persistent pushing of the MC2 body at the MC1 tail boundary. As the evolution proceeds, the MC1 body suffers greater and greater compression, and its originally vulnerable magnetic elasticity becomes stiffer and stiffer. There thus exists a maximum compressibility of the Multi-MC, reached when the accumulated elasticity balances the external compression. This cutoff limit of compressibility largely determines the maximum attainable geoeffectiveness of a Multi-MC, because the geoeffectiveness enhancement of interacting MCs is ascribed to the compression. In particular, among all combinations of the MC helicities, the greatest geoeffectiveness occurs when the magnetic field lines in the interacting region of the Multi-MC all point southward. The Multi-MC completes its final evolutionary stage when the MC2-driven shock merges with the MC1-driven shock into a stronger compound shock. With respect to Multi-MC geoeffectiveness, the evolution stage is the dominant factor, whereas the collision intensity is a subordinate one. The magnetic elasticity and magnetic helicity of each MC, and the compression between them, are the key physical factors for the formation, propagation, evolution, and resulting geoeffectiveness of an interplanetary Multi-MC.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hong, X; Gao, H; Schuemann, J
2015-06-15
Purpose: The Monte Carlo (MC) method is a gold standard for dose calculation in radiotherapy. However, it is not a priori clear how many particles need to be simulated to achieve a given dose accuracy. Prior error estimates and stopping criteria are not well established for MC. This work aims to fill this gap. Methods: Due to the statistical nature of MC, our approach is based on a one-sample t-test. We design the prior error estimate method based on the t-test, and then use this t-test based error estimate to develop a simulation stopping criterion. The three major components are as follows. First, the source particles are randomized in energy, space and angle, so that the dose deposition from a particle to the voxel is independent and identically distributed (i.i.d.). Second, a sample under consideration in the t-test is the mean value of the dose deposition to the voxel by a sufficiently large number of source particles. Then, according to the central limit theorem, the sample, as the mean value of i.i.d. variables, is normally distributed with expectation equal to the true deposited dose. Third, the t-test is performed with the null hypothesis that the difference between the sample expectation (the same as the true deposited dose) and the on-the-fly calculated mean sample dose from MC is larger than a given error threshold; in addition, users have the freedom to specify the confidence probability and region of interest in the t-test based stopping criterion. Results: The method is validated for proton dose calculation. The difference between the MC result based on the t-test prior error estimate and the statistical result obtained by repeating numerous MC simulations is within 1%. Conclusion: The t-test based prior error estimate and stopping criterion are developed for MC and validated for proton dose calculation. Xiang Hong and Hao Gao were partially supported by the NSFC (#11405105), the 973 Program (#2015CB856000) and the Shanghai Pujiang Talent Program (#14PJ1404500).
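The flavor of such a stopping rule can be sketched in a few lines: treat batch means of the voxel dose as the i.i.d. samples, and stop once the t-distribution confidence half-width falls below the error tolerance. The sketch below is an illustration of that generic construction with stand-in batch means, not the abstract's implementation or thresholds.

```python
import numpy as np
from scipy import stats

# t-test based stopping rule for an MC dose tally: stop when the confidence
# half-width of the running mean drops below the tolerance eps.
rng = np.random.default_rng(1)
eps, alpha = 0.01, 0.05                 # tolerance (Gy) and significance, assumed

batch_means = []
for batch in range(1000):
    batch_means.append(rng.normal(1.0, 0.08))   # stand-in batch-mean doses (Gy)
    if len(batch_means) < 5:
        continue
    m = np.mean(batch_means)
    sem = stats.sem(batch_means)                 # standard error of the mean
    t_crit = stats.t.ppf(1 - alpha, df=len(batch_means) - 1)
    if t_crit * sem < eps:                       # half-width below tolerance
        print(f"stop after {len(batch_means)} batches, mean = {m:.4f} Gy")
        break
```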
On the definition of a Monte Carlo model for binary crystal growth.
Los, J H; van Enckevort, W J P; Meekes, H; Vlieg, E
2007-02-01
We show that consistency of the transition probabilities in a lattice Monte Carlo (MC) model for binary crystal growth with the thermodynamic properties of a system does not guarantee the MC simulations near equilibrium to be in agreement with the thermodynamic equilibrium phase diagram for that system. The deviations remain small for systems with small bond energies, but they can increase significantly for systems with large melting entropy, typical for molecular systems. These deviations are attributed to the surface kinetics, which is responsible for a metastable zone below the liquidus line where no growth occurs, even in the absence of a 2D nucleation barrier. Here we propose an extension of the MC model that introduces a freedom of choice in the transition probabilities while staying within the thermodynamic constraints. This freedom can be used to eliminate the discrepancy between the MC simulations and the thermodynamic equilibrium phase diagram. Agreement is achieved for that choice of the transition probabilities yielding the fastest decrease of the free energy (i.e., largest growth rate) of the system at a temperature slightly below the equilibrium temperature. An analytical model is developed, which reproduces quite well the MC results, enabling a straightforward determination of the optimal set of transition probabilities. Application of both the MC and analytical model to conditions well away from equilibrium, giving rise to kinetic phase diagrams, shows that the effect of kinetics on segregation is even stronger than that predicted by previous models.
Singh, Kunwar; Tiwari, Satish Chandra; Gupta, Maneesha
2014-01-01
The paper introduces novel architectures for the implementation of fully static master-slave flip-flops for low power, high performance, and high density. Based on the proposed structure, a traditional C(2)MOS latch (tristate inverter/clocked inverter) based flip-flop is implemented with fewer transistors. The modified C(2)MOS based flip-flop designs mC(2)MOSff1 and mC(2)MOSff2 are realized using only sixteen transistors each, while the number of clocked transistors is also reduced in the case of mC(2)MOSff1. Postlayout simulations indicate that the mC(2)MOSff1 flip-flop shows a 12.4% improvement in PDAP (power-delay-area product) when compared with the transmission gate flip-flop (TGFF) at 16X capacitive load, which is considered the best design alternative among the conventional master-slave flip-flops. To validate the correct behaviour of the proposed design, an eight-bit asynchronous counter was designed to layout level. LVS and parasitic extraction were carried out in Calibre, whereas layouts were implemented using IC Station (Mentor Graphics). HSPICE simulations were used to characterize the transient response of the flip-flop designs in a 180 nm/1.8 V CMOS technology. Simulations were also performed at 130 nm, 90 nm, and 65 nm to reveal the scalability of both designs at modern process nodes.
SU-E-T-503: IMRT Optimization Using Monte Carlo Dose Engine: The Effect of Statistical Uncertainty.
Tian, Z; Jia, X; Graves, Y; Uribe-Sanchez, A; Jiang, S
2012-06-01
With the development of ultra-fast GPU-based Monte Carlo (MC) dose engines, it becomes clinically realistic to compute the dose-deposition coefficients (DDC) for IMRT optimization using MC simulation. However, it is still time-consuming to compute the DDC with small statistical uncertainty. This work studies the effects of the statistical error in the DDC matrix on IMRT optimization. The MC-computed DDC matrices are simulated here by adding statistical uncertainties at a desired level to matrices generated with a finite-size pencil beam algorithm. A statistical uncertainty model for MC dose calculation is employed. We adopt a penalty-based quadratic optimization model and a gradient descent method to optimize the fluence map, and then recalculate the corresponding actual dose distribution using the noise-free DDC matrix. The impacts of DDC noise are assessed in terms of the deviation of the resulting dose distributions. We have also used stochastic perturbation theory to theoretically estimate the statistical errors of dose distributions on a simplified optimization model. A head-and-neck case is used to investigate the perturbation to an IMRT plan due to MC's statistical uncertainty. The relative errors of the final dose distributions of the optimized IMRT are found to be much smaller than those in the DDC matrix, which is consistent with our theoretical estimation. When the history number is decreased from 10⁸ to 10⁶, the dose-volume histograms are still very similar to the error-free DVHs, while the error in the DDC is about 3.8%. The results illustrate that the statistical errors in the DDC matrix have a relatively small effect on IMRT optimization in the dose domain. This indicates that we can use a relatively small number of histories to obtain the DDC matrix with MC simulation within a reasonable amount of time, without considerably compromising the accuracy of the optimized treatment plan. This work is supported by Varian Medical Systems through a Master Research Agreement. © 2012 American Association of Physicists in Medicine.
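The numerical experiment described above lends itself to a compact sketch (the matrix sizes, the quadratic penalty, and the random DDC stand-ins are illustrative, not the study's head-and-neck data; only the ~3.8% noise level is taken from the abstract):

```python
# Hedged sketch: perturb a noise-free DDC matrix, optimize a fluence map
# against the noisy matrix, then rescore the plan with the noise-free one.
import numpy as np

rng = np.random.default_rng(1)
n_voxels, n_beamlets = 500, 60
D_true = rng.random((n_voxels, n_beamlets)) * 0.1                    # noise-free DDC (stand-in)
D_noisy = D_true * (1 + 0.038 * rng.standard_normal(D_true.shape))   # ~3.8% relative noise
d_presc = np.full(n_voxels, 2.0)                                     # prescribed voxel doses

x = np.ones(n_beamlets)                       # fluence map
lr = 1e-3
for _ in range(5000):                         # projected gradient descent
    grad = D_noisy.T @ (D_noisy @ x - d_presc)   # gradient of the quadratic penalty
    x = np.maximum(x - lr * grad, 0.0)           # keep fluence non-negative

dose_actual = D_true @ x                      # rescore with the noise-free DDC
rel_err = np.abs(dose_actual - d_presc) / d_presc
print(f"median relative dose deviation: {np.median(rel_err):.3%}")
```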
MC3: Multi-core Markov-chain Monte Carlo code
NASA Astrophysics Data System (ADS)
Cubillos, Patricio; Harrington, Joseph; Lust, Nate; Foster, AJ; Stemm, Madison; Loredo, Tom; Stevenson, Kevin; Campo, Chris; Hardin, Matt; Hardy, Ryan
2016-10-01
MC3 (Multi-core Markov-chain Monte Carlo) is a Bayesian statistics tool that can be executed from the shell prompt or interactively through the Python interpreter with single- or multiple-CPU parallel computing. It offers Markov-chain Monte Carlo (MCMC) posterior-distribution sampling for several algorithms, Levenberg-Marquardt least-squares optimization, and uniform non-informative, Jeffreys non-informative, or Gaussian-informative priors. MC3 can share the same value among multiple parameters and fix the value of parameters to constant values, and offers Gelman-Rubin convergence testing and correlated-noise estimation with time-averaging or wavelet-based likelihood estimation methods.
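MC3's own calling convention is not reproduced here; the generic Metropolis sampler below only illustrates the posterior-sampling step that packages like MC3 wrap, parallelize, and diagnose (the toy linear model, proposal width, and burn-in length are all assumptions for the sketch):

```python
# Generic Metropolis-Hastings sketch of MCMC posterior sampling
# (this is NOT MC3's API; consult the MC3 documentation for that).
import numpy as np

def log_posterior(theta, x, y, sigma):
    a, b = theta
    resid = y - (a * x + b)                      # toy linear model
    return -0.5 * np.sum((resid / sigma) ** 2)   # flat (non-informative) priors

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 50)
y = 3.0 * x + 1.0 + rng.normal(0, 0.1, x.size)   # synthetic data

theta = np.array([1.0, 0.0])
logp = log_posterior(theta, x, y, 0.1)
chain = []
for _ in range(20_000):
    proposal = theta + rng.normal(0, 0.02, 2)    # symmetric Gaussian proposal
    logp_new = log_posterior(proposal, x, y, 0.1)
    if np.log(rng.random()) < logp_new - logp:   # Metropolis acceptance rule
        theta, logp = proposal, logp_new
    chain.append(theta.copy())

chain = np.array(chain[5000:])                   # discard burn-in
print("posterior means:", chain.mean(axis=0))
```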
Charge Structure and Counterion Distribution in Hexagonal DNA Liquid Crystal
Dai, Liang; Mu, Yuguang; Nordenskiöld, Lars; Lapp, Alain; van der Maarel, Johan R. C.
2007-01-01
A hexagonal liquid crystal of DNA fragments (double-stranded, 150 basepairs) with tetramethylammonium (TMA) counterions was investigated with small angle neutron scattering (SANS). We obtained the structure factors pertaining to the DNA and counterion density correlations with contrast matching in the water. Molecular dynamics (MD) computer simulation of a hexagonal assembly of nine DNA molecules showed that the inter-DNA distance fluctuates with a correlation time around 2 ns and a standard deviation of 8.5% of the interaxial spacing. The MD simulation also showed a minimal effect of the fluctuations in inter-DNA distance on the radial counterion density profile and significant penetration of the grooves by TMA. The radial density profile of the counterions was also obtained from a Monte Carlo (MC) computer simulation of a hexagonal array of charged rods with fixed interaxial spacing. Strong ordering of the counterions between the DNA molecules and the absence of charge fluctuations at longer wavelengths were shown by the SANS number and charge structure factors. The DNA-counterion and counterion structure factors are interpreted with the correlation functions derived from the Poisson-Boltzmann equation, MD, and MC simulation. Best agreement is observed between the experimental structure factors and the predictions based on the Poisson-Boltzmann equation and/or MC simulation. The SANS results show that TMA is too large to penetrate the grooves to a significant extent, in contrast to what is shown by MD simulation. PMID:17098791
DOE Office of Scientific and Technical Information (OSTI.GOV)
Papadimitroulas, P; Kostou, T; Kagadis, G
Purpose: The purpose of the present study was to quantify and evaluate the impact of cardiac and respiratory motion on clinical nuclear imaging protocols. Common SPECT and scintigraphic scans are studied using Monte Carlo (MC) simulations, comparing the resulting images with and without motion. Methods: Realistic simulations were executed using the GATE toolkit and the XCAT anthropomorphic phantom as a reference model for human anatomy. Three different radiopharmaceuticals based on 99mTc were studied, namely 99mTc-MDP, 99mTc-N-DBODC and 99mTc-DTPA-aerosol for bone, myocardium and lung scanning, respectively. The resolution of the phantom was set to 3.5 mm³. The impact of the motion on spatial resolution was quantified using a sphere with 3.5 mm diameter and 10 separate time frames, in the modeled ECAM SPECT scanner. Finally, the impact of respiratory motion on resolution and imaging of lung lesions was investigated. The MLEM algorithm was used for data reconstruction, while literature-derived biodistributions of the pharmaceuticals were used as activity maps in the simulations. Results: The FWHM was extracted for a static and a moving sphere located ∼23 cm from the entrance of the SPECT head. The difference in the FWHM was 20% between the two simulations. Profiles in the thorax were compared in the case of bone scintigraphy, showing displacement and blurring of the bones when respiratory motion was inserted in the simulation. Large discrepancies were noticed in the case of myocardium imaging when cardiac motion was incorporated during the SPECT acquisition. Finally, the borders of the lungs are blurred when respiratory motion is included, resulting in a dislocation of ∼2.5 cm. Conclusion: As we move to individualized imaging and therapy procedures, quantitative and qualitative imaging is of high importance in nuclear diagnosis. MC simulations combined with anthropomorphic digital phantoms can provide an accurate tool for applications such as the optimization of motion correction techniques. This research has been co-funded by the European Union (European Social Fund) and Greek national resources under the framework of the 'Archimedes III: Funding of Research Groups in TEI of Athens' project of the 'Education & Lifelong Learning' Operational Programme.
Jover, J; Haslam, A J; Galindo, A; Jackson, G; Müller, E A
2012-10-14
We present a continuous pseudo-hard-sphere potential based on a cut-and-shifted Mie (generalized Lennard-Jones) potential with exponents (50, 49). Using this potential one can mimic the volumetric, structural, and dynamic properties of the discontinuous hard-sphere potential over the whole fluid range. The continuous pseudo potential has the advantage that it may be incorporated directly into off-the-shelf molecular-dynamics code, allowing the user to capitalise on existing hardware and software advances. Simulation results for the compressibility factor of the fluid and solid phases of our pseudo hard spheres are presented and compared both to the Carnahan-Starling equation of state of the fluid and to published data, the differences being indistinguishable within simulation uncertainty. The specific form of the potential is employed to simulate flexible chains formed from these pseudo hard spheres at contact (pearl-necklace model) for m(c) = 4, 5, 7, 8, 16, 20, 100, 201, and 500 monomer segments. The compressibility factor of the chains per unit of monomer, m(c), approaches a limiting value at reasonably small chain lengths, m(c) < 50, as predicted by Wertheim's first-order thermodynamic perturbation theory. Simulation results are also presented for highly asymmetric mixtures of pseudo hard spheres, with diameter ratios of 3:1, 5:1, and 20:1 over the whole composition range.
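A sketch of the potential itself, written from the generic Mie form (the prefactor and cutoff below follow the standard cut-and-shift-at-the-minimum construction; the published parameterization, e.g. the temperature scaling of epsilon, should be checked against the paper):

```python
# Cut-and-shifted Mie (50, 49) pseudo-hard-sphere potential: purely
# repulsive, shifted to zero at its minimum, zero beyond it.
import numpy as np

LAM_R, LAM_A = 50.0, 49.0   # repulsive / attractive Mie exponents
C_MIE = (LAM_R / (LAM_R - LAM_A)) * (LAM_R / LAM_A) ** (LAM_A / (LAM_R - LAM_A))
R_CUT = (LAM_R / LAM_A) ** (1.0 / (LAM_R - LAM_A))   # location of the minimum, in sigma units

def u_phs(r, eps=1.0, sigma=1.0):
    """Pseudo-hard-sphere pair energy at separation r."""
    r = np.asarray(r, dtype=float)
    u = C_MIE * eps * ((sigma / r) ** LAM_R - (sigma / r) ** LAM_A) + eps
    return np.where(r < R_CUT * sigma, u, 0.0)

r = np.linspace(0.95, 1.05, 5)
print(R_CUT)      # ~1.0204: the potential vanishes just beyond sigma
print(u_phs(r))   # steeply repulsive below sigma, hard-sphere-like
```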
Creativity and Technology in Mathematics: From Story Telling to Algorithmic with Op'Art
ERIC Educational Resources Information Center
Mercat, Christian; Filho, Pedro Lealdino; El-Demerdash, Mohamed
2017-01-01
This article describes some of the results of the European project mcSquared (http://mc2-project.eu/) regarding the use of Op'Art and optical illusion pieces as a tool to foster modeling and creative mathematical thinking in students. We briefly present the c-book technology and some of the results we obtained while experimenting with it. The Op'Art movement, with…
A medical image-based graphical platform -- features, applications and relevance for brachytherapy.
Fonseca, Gabriel P; Reniers, Brigitte; Landry, Guillaume; White, Shane; Bellezzo, Murillo; Antunes, Paula C G; de Sales, Camila P; Welteman, Eduardo; Yoriyaz, Hélio; Verhaegen, Frank
2014-01-01
Brachytherapy dose calculation is commonly performed using the Task Group-No 43 Report-Updated protocol (TG-43U1) formalism. Recently, a more accurate approach has been proposed that can handle tissue composition, tissue density, body shape, applicator geometry, and dose reporting either in media or water. Some model-based dose calculation algorithms are based on Monte Carlo (MC) simulations. This work presents a software platform capable of processing medical images and treatment plans, and preparing the required input data for MC simulations. The A Medical Image-based Graphical platfOrm-Brachytherapy module (AMIGOBrachy) is a user interface, coupled to the MCNP6 MC code, for absorbed dose calculations. The AMIGOBrachy was first validated in water for a high-dose-rate ¹⁹²Ir source. Next, dose distributions were validated in uniform phantoms consisting of different materials. Finally, dose distributions were obtained in patient geometries. Results were compared against a treatment planning system including a linear Boltzmann transport equation (LBTE) solver capable of handling nonwater heterogeneities. The TG-43U1 source parameters are in good agreement with literature with more than 90% of anisotropy values within 1%. No significant dependence on the tissue composition was observed comparing MC results against an LBTE solver. Clinical cases showed differences up to 25%, when comparing MC results against TG-43U1. About 92% of the voxels exhibited dose differences lower than 2% when comparing MC results against an LBTE solver. The AMIGOBrachy can improve the accuracy of the TG-43U1 dose calculation by using a more accurate MC dose calculation algorithm. The AMIGOBrachy can be incorporated in clinical practice via a user-friendly graphical interface. Copyright © 2014 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
Bhola, Ruchi; Bhalla, Swaran; Gupta, Radha; Singh, Ishwar; Kumar, Sunil
2014-05-01
Literature suggests that the glottic view is better when using the McGrath® Video laryngoscope and the Truview® than with the Macintosh blade. The purpose of this study was to evaluate the effectiveness of the McGrath Video laryngoscope in comparison with the Truview laryngoscope for tracheal intubation in patients with simulated cervical spine injury using manual in-line stabilisation. This prospective randomised study was undertaken in the operation theatre of a tertiary referral centre after approval from the Institutional Review Board. A total of 100 consenting patients presenting for elective surgery requiring tracheal intubation were randomly assigned to undergo intubation using the McGrath® Video laryngoscope (n = 50) or the Truview® laryngoscope (n = 50). In all patients, we applied manual in-line stabilisation of the cervical spine throughout the airway management. Statistical testing was conducted with the Statistical Package for the Social Sciences (SPSS), version 17.0. Demographic data, airway assessment and haemodynamics were compared using the Chi-square test. A P < 0.05 was considered significant. The time to successful intubation was shorter with the McGrath Video laryngoscope than with the Truview (30.02 s vs. 38.72 s). However, there was no significant difference between the laryngoscopic views obtained in the two groups. The number of second intubation attempts required and the incidence of complications were negligible with both devices. The success rate of intubation with both devices was 100%. Intubation with the McGrath Video laryngoscope caused smaller alterations in haemodynamics. Both laryngoscopes are reliable in cases of simulated cervical spine injury using manual in-line stabilisation, with a 100% success rate and a good glottic view.
MMU development at the Martin Marietta plant in Denver, Colorado
1980-07-25
S80-36889 (24 July 1980) --- Astronaut Bruce McCandless II uses a simulator at Martin Marietta's space center near Denver to develop flight techniques for a backpack propulsion unit that will be used on Space Shuttle flights. The manned maneuvering unit (MMU) training simulator allows astronauts to "fly missions" against a full scale mockup of a portion of the orbiter vehicle. Controls of the simulator are like those of the actual MMU. Manipulating them allows the astronaut to move in three straight-line directions and in pitch, yaw and roll. One possible application of the MMU is for an extravehicular activity chore to repair damaged tiles on the vehicle. McCandless is wearing an extravehicular mobility unit (EMU).
Feaster, Toby D.; Westcott, Nancy E.; Hudson, Robert J.M.; Conrads, Paul; Bradley, Paul M.
2012-01-01
Rainfall is an important forcing function in most watershed models. As part of a previous investigation to assess interactions among hydrologic, geochemical, and ecological processes that affect fish-tissue mercury concentrations in the Edisto River Basin, the topography-based hydrological model (TOPMODEL) was applied in the McTier Creek watershed in Aiken County, South Carolina. Measured rainfall data from six National Weather Service (NWS) Cooperative (COOP) stations surrounding the McTier Creek watershed were used to calibrate the McTier Creek TOPMODEL. Since the 1990s, the next generation weather radar (NEXRAD) has provided rainfall estimates at a finer spatial and temporal resolution than the NWS COOP network. For this investigation, NEXRAD-based rainfall data were generated at the NWS COOP stations and compared with measured rainfall data for the period June 13, 2007, to September 30, 2009. Likewise, these NEXRAD-based rainfall data were used with TOPMODEL to simulate streamflow in the McTier Creek watershed and then compared with the simulations made using measured rainfall data. NEXRAD-based rainfall data for non-zero rainfall days were lower than measured rainfall data at all six NWS COOP locations. The total number of concurrent days for which both measured and NEXRAD-based data were available at the COOP stations ranged from 501 to 833, the number of non-zero days ranged from 139 to 209, and the total difference in rainfall ranged from -1.3 to -21.6 inches. With the calibrated TOPMODEL, simulations using NEXRAD-based rainfall data and those using measured rainfall data produce similar results with respect to matching the timing and shape of the hydrographs. Comparison of the bias, which is the mean of the residuals between observed and simulated streamflow, however, reveals that simulations using NEXRAD-based rainfall tended to underpredict streamflow overall. Given that the total NEXRAD-based rainfall for the simulation period is lower than the total measured rainfall at the NWS COOP locations, this bias would be expected. Therefore, to better assess the effect on the hydrologic simulations of using NEXRAD-based rainfall estimates rather than NWS COOP rainfall data, TOPMODEL was recalibrated and updated simulations were made using the NEXRAD-based rainfall data. Comparisons of observed and simulated streamflow show that the TOPMODEL results using measured rainfall data and NEXRAD-based rainfall are comparable. Nonetheless, TOPMODEL simulations using NEXRAD-based rainfall still tended to underpredict total streamflow volume, although the magnitudes of the differences were similar to those of the simulations using measured rainfall. The McTier Creek watershed was subdivided into 12 subwatersheds and NEXRAD-based rainfall data were generated for each subwatershed. Simulations of streamflow were generated for each subwatershed using NEXRAD-based rainfall and compared with subwatershed simulations using measured rainfall data, which, unlike the NEXRAD-based rainfall, were the same for all subwatersheds (derived from a weighted average of the six NWS COOP stations surrounding the basin). For the two simulations, subwatershed streamflows were summed and compared to streamflow simulations at two U.S. Geological Survey streamgages. The percentage differences at the gage near Monetta, South Carolina, were the same for simulations using measured rainfall data and NEXRAD-based rainfall.
At the gage near New Holland, South Carolina, the percentage differences using the NEXRAD-based rainfall were twice those using the measured rainfall. Single-mass curve comparisons showed an increase in the total volume of rainfall from north to south. Similar comparisons of the measured rainfall at the NWS COOP stations showed similar percentage differences, but the NEXRAD-based rainfall variations occurred over a much smaller distance than those of the measured rainfall. Nonetheless, it was concluded that in some cases, using NEXRAD-based rainfall data in TOPMODEL streamflow simulations may provide an effective alternative to using measured rainfall data. For this investigation, however, TOPMODEL streamflow simulations using NEXRAD-based rainfall data for both calibration and simulation did not show significant improvements in matching observed streamflow over simulations generated using measured rainfall data.
Microcystin distribution in physical size class separations of natural plankton communities
Graham, J.L.; Jones, J.R.
2007-01-01
Phytoplankton communities in 30 northern Missouri and Iowa lakes were physically separated into 5 size classes (>100 µm, 53-100 µm, 35-53 µm, 10-35 µm, 1-10 µm) during 15-21 August 2004 to determine the distribution of microcystin (MC) in size-fractionated lake samples and to assess how net collections influence estimates of MC concentration. MC was detected in whole water (total) from 83% of lakes sampled, and total MC values ranged from 0.1-7.0 µg/L (mean = 0.8 µg/L). On average, MC in the >100 µm size class comprised ∼40% of total MC, while other individual size classes contributed 9-20% to total MC. MC values decreased with size class and were significantly greater in the >100 µm size class (mean = 0.5 µg/L) than the 35-53 µm (mean = 0.1 µg/L), 10-35 µm (mean = 0.0 µg/L), and 1-10 µm (mean = 0.0 µg/L) size classes (p < 0.01). MC values in nets with 100-µm, 53-µm, 35-µm, and 10-µm mesh were cumulatively summed to simulate the potential bias of measuring MC with various size plankton nets. On average, a 100-µm net underestimated total MC by 51%, compared to 37% for a 53-µm net, 28% for a 35-µm net, and 17% for a 10-µm net. While plankton nets consistently underestimated total MC, concentration of algae with net sieves allowed detection of MC at low levels (∼0.01 µg/L); 93% of lakes had detectable levels of MC in concentrated samples. Thus, small-mesh plankton nets are an option for documenting MC occurrence, but whole water samples should be collected to characterize total MC concentrations. © Copyright by the North American Lake Management Society 2007.
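In code, the reported mean net biases convert directly into the implied fraction of total MC that each mesh size retains, a trivial but handy check when choosing sampling gear:

```python
# Implied retention of total microcystin (MC) per plankton-net mesh size,
# computed from the mean underestimates reported above.
underestimate_pct = {100: 51, 53: 37, 35: 28, 10: 17}  # mesh (um) -> % of total MC missed
for mesh, missed in underestimate_pct.items():
    print(f"{mesh}-um net retains ~{100 - missed}% of total MC on average")
```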
Singular Spectrum Analysis for Astronomical Time Series: Constructing a Parsimonious Hypothesis Test
NASA Astrophysics Data System (ADS)
Greco, G.; Kondrashov, D.; Kobayashi, S.; Ghil, M.; Branchesi, M.; Guidorzi, C.; Stratta, G.; Ciszak, M.; Marino, F.; Ortolan, A.
We present a data-adaptive spectral method - Monte Carlo Singular Spectrum Analysis (MC-SSA) - and its modification to tackle astrophysical problems. Through numerical simulations we show the ability of MC-SSA to deal with 1/f^β power-law noise affected by photon counting statistics. Such a noise process is simulated by a first-order autoregressive, AR(1), process corrupted by intrinsic Poisson noise. In doing so, we statistically estimate a basic stochastic variation of the source and the corresponding fluctuations due to the quantum nature of light. In addition, the MC-SSA test retains its effectiveness even when a significant percentage of the signal falls below a certain level of detection, e.g., owing to the instrument sensitivity. The parsimonious approach presented here may be broadly applied, from the search for extrasolar planets to the extraction of low-intensity coherent phenomena possibly hidden in high-energy transients.
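The surrogate-noise recipe described above is straightforward to reproduce; a minimal sketch (the AR(1) coefficient, series length, and count rate are illustrative, not the paper's values):

```python
# AR(1) red-noise surrogate corrupted by Poisson photon-counting noise.
import numpy as np

rng = np.random.default_rng(3)
n, phi = 4096, 0.8                 # series length, AR(1) coefficient

red = np.empty(n)                  # AR(1): x_t = phi * x_{t-1} + w_t
red[0] = rng.normal()
for t in range(1, n):
    red[t] = phi * red[t - 1] + rng.normal()

rate = 50.0 * np.exp(0.05 * red)   # positive mean photon rate per time bin
counts = rng.poisson(rate)         # intrinsic Poisson (counting) statistics
print(counts[:10])
```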
NASA Astrophysics Data System (ADS)
He, An; Gong, Jiaming; Shikazono, Naoki
2018-05-01
In the present study, a model is introduced to correlate the electrochemical performance of a solid oxide fuel cell (SOFC) with the 3D microstructure reconstructed by focused ion beam scanning electron microscopy (FIB-SEM), in which the solid surface is modeled by the marching cubes (MC) method. The lattice Boltzmann method (LBM) is used to solve the governing equations. In order to maintain the geometries reconstructed by the MC method, local effective diffusivities and conductivities computed based on the MC geometries are applied in each grid cell, and a partial bounce-back scheme is applied according to the boundary predicted by the MC method. From the tortuosity factor and overpotential calculation results, it is concluded that the MC geometry drastically improves the computational accuracy by giving more precise topology information.
Electrons to Reactors Multiscale Modeling: Catalytic CO Oxidation over RuO₂
Sutton, Jonathan E.; Lorenzi, Juan M.; Krogel, Jaron T.; ...
2018-04-20
First-principles kinetic Monte Carlo (1p-kMC) simulations for CO oxidation on two RuO₂ facets, RuO₂(110) and RuO₂(111), were coupled to the computational fluid dynamics (CFD) simulations package MFIX, and reactor-scale simulations were then performed. 1p-kMC coupled with CFD has recently been shown to be a feasible method for translating molecular-scale mechanistic knowledge to the reactor scale, enabling comparisons to in situ and online experimental measurements. Only a few studies with such coupling have been published. This work incorporates multiple catalytic surface facets into the scale-coupled simulation, and three possibilities were investigated: the two possibilities of each facet individually being the dominant phase in the reactor, and also the possibility that both facets were present on the catalyst particles in the ratio predicted by an ab initio thermodynamics-based Wulff construction. When lateral interactions between adsorbates were included in the 1p-kMC simulations, the two surfaces, RuO₂(110) and RuO₂(111), were found to be of similar order-of-magnitude activity for the pressure range of 1 × 10⁻⁴ bar to 1 bar, with the RuO₂(110) surface termination showing more simulated activity than the RuO₂(111) surface termination. Coupling between the 1p-kMC and CFD was achieved with a lookup table generated by the error-based modified Shepard interpolation scheme. Isothermal reactor-scale simulations were performed and compared to two separate experimental studies, conducted with reactant partial pressures of ≤0.1 bar. Simulations without an isothermality restriction were also conducted and showed that the simulated temperature gradient across the catalytic reactor bed is <0.5 K, which validated the use of the isothermality restriction for investigating the reactor-scale phenomenological temperature dependences. The approach with the Wulff-construction-based reactor simulations reproduced a trend similar to one experimental data set relatively well, with the (110) surface being more active at higher temperatures; in contrast, for the other experimental data set, our reactor simulations achieve surprisingly and perhaps fortuitously good agreement with the activity and phenomenological pressure dependence when it is assumed that the (111) facet is the only active facet present. Lastly, the active phase of catalytic CO oxidation over RuO₂ remains unsettled, but the present study presents proof of principle (and progress) toward more accurate multiscale modeling from electrons to reactors and new simulation results.
GATE Monte Carlo simulation in a cloud computing environment
NASA Astrophysics Data System (ADS)
Rowedder, Blake Austin
The GEANT4-based GATE is a unique and powerful Monte Carlo (MC) platform, which provides a single code library allowing the simulation of specific medical physics applications, e.g. PET, SPECT, CT, radiotherapy, and hadron therapy. However, this rigorous yet flexible platform is used only sparingly in the clinic due to its lengthy calculation time. By accessing the powerful computational resources of a cloud computing environment, GATE's runtime can be significantly reduced to clinically feasible levels without the sizable investment of a local high performance cluster. This study investigated a reliable and efficient execution of GATE MC simulations using a commercial cloud computing service. Amazon's Elastic Compute Cloud was used to launch several nodes equipped with GATE. Job data was initially broken up on the local computer, then uploaded to the worker nodes on the cloud. The results were automatically downloaded and aggregated on the local computer for display and analysis. Five simulations were repeated for every cluster size between 1 and 20 nodes. Ultimately, increasing cluster size resulted in a decrease in calculation time that could be expressed with an inverse power model. Comparing the benchmark results to the published values and error margins indicated that the simulation results were not affected by the cluster size and thus that the integrity of a calculation is preserved in a cloud computing environment. The runtime of a 53-minute simulation was decreased to 3.11 minutes when run on a 20-node cluster. The ability to improve the speed of simulation suggests that fast MC simulations are viable for imaging and radiotherapy applications. With high-performance computing continuing to fall in price and rise in accessibility, implementing Monte Carlo techniques with cloud computing for clinical applications will become increasingly attractive.
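The reported scaling can be illustrated by fitting the inverse power model, runtime ≈ a · n^(−b), to (cluster size, runtime) pairs; in the sketch below only the 1-node (53 min) and 20-node (3.11 min) endpoints come from the abstract, and the intermediate timings are made-up values consistent with them:

```python
# Fit an inverse power model of runtime vs. cluster size.
import numpy as np
from scipy.optimize import curve_fit

nodes = np.array([1, 5, 10, 20])
runtime_min = np.array([53.0, 11.5, 6.0, 3.11])   # only endpoints are reported values

def inverse_power(n, a, b):
    return a * n ** (-b)

(a, b), _ = curve_fit(inverse_power, nodes, runtime_min, p0=(53.0, 1.0))
print(f"runtime ~ {a:.1f} * n^(-{b:.2f}) minutes")   # b just under 1: near-linear speedup
```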
SHIELD-HIT12A - a Monte Carlo particle transport program for ion therapy research
NASA Astrophysics Data System (ADS)
Bassler, N.; Hansen, D. C.; Lühr, A.; Thomsen, B.; Petersen, J. B.; Sobolevsky, N.
2014-03-01
Purpose: The Monte Carlo (MC) code SHIELD-HIT simulates the transport of ions through matter. Since SHIELD-HIT08, we have added numerous features that improve speed, usability and the underlying physics, and thereby the user experience. The "-A" fork of SHIELD-HIT also aims to attach SHIELD-HIT to a heavy ion dose optimization algorithm to provide MC-optimized treatment plans that include radiobiology. Methods: SHIELD-HIT12A is written in FORTRAN and carefully retains platform independence. A powerful scoring engine is implemented, scoring relevant quantities such as dose and track-averaged LET. It supports native formats compatible with the heavy ion treatment planning system TRiP. Stopping power files follow the ICRU standard and are generated using the libdEdx library, which allows the user to choose from a multitude of stopping power tables. Results: SHIELD-HIT12A runs on Linux and Windows platforms. In our experience, new users quickly learn to use SHIELD-HIT12A and to set up new geometries. Contrary to previous versions of SHIELD-HIT, the 12A distribution comes with easy-to-use example files and an English manual. A new implementation of Vavilov straggling resulted in a massive reduction of computation time. Scheduled for later release are CT import and photon-electron transport. Conclusions: SHIELD-HIT12A is an interesting alternative ion transport engine. Apart from being a flexible particle therapy research tool, it can also serve as a back end for an MC ion treatment planning system. More information about SHIELD-HIT12A and a demo version can be found at http://www.shieldhit.org.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, T; Du, X; Su, L
2014-06-15
Purpose: To compare the CT doses derived from experiments and GPU-based Monte Carlo (MC) simulations, using a human cadaver and the ATOM phantom. Methods: The cadaver of an 88-year-old male and the ATOM phantom were scanned by a GE LightSpeed Pro 16 MDCT. For the cadaver study, Thimble chambers (Model 10×5−0.6CT and 10×6−0.6CT) were used to measure the absorbed dose in various deep and superficial organs. Whole-body scans were first performed to construct a complete image database for MC simulations. Abdomen/pelvis helical scans were then conducted using 120/100 kVp, 300 mAs and a pitch factor of 1.375:1. For the ATOM phantom study, OSL dosimeters were used and helical scans were performed using 120 kVp and x, y, z tube current modulation (TCM). For the MC simulations, sufficient particles were run in both cases such that the statistical errors of the results by ARCHER-CT were limited to 1%. Results: For the human cadaver scan, the doses to the stomach, liver, colon, left kidney, pancreas and urinary bladder were compared. The difference between experiments and simulations was within 19% for the 120 kVp scan and 25% for the 100 kVp scan. For the ATOM phantom scan, the doses to the lung, thyroid, esophagus, heart, stomach, liver, spleen, kidneys and thymus were compared. The difference was 39.2% for the esophagus, and within 16% for all other organs. Conclusion: In this study the experimental and simulated CT doses were compared. Their difference is primarily attributed to the systematic errors of the MC simulations, including the accuracy of the bowtie filter modeling and the algorithm used to generate the voxelized phantom from DICOM images. The experimental error is considered small and may arise from the dosimeters. This work was supported by an R01 grant (R01EB015478) from the National Institute of Biomedical Imaging and Bioengineering.
Validation of the Intelligibility in Context Scale for Jamaican Creole-Speaking Preschoolers.
Washington, Karla N; McDonald, Megan M; McLeod, Sharynne; Crowe, Kathryn; Devonish, Hubert
2017-08-15
To describe validation of the Intelligibility in Context Scale (ICS; McLeod, Harrison, & McCormack, 2012a) and ICS-Jamaican Creole (ICS-JC; McLeod, Harrison, & McCormack, 2012b) in a sample of typically developing 3- to 6-year-old Jamaicans. One hundred and forty-five preschooler-parent dyads participated in the study. Parents completed the 7-item ICS (n = 145) and ICS-JC (n = 98) to rate children's speech intelligibility (5-point scale) across communication partners (parents, immediate family, extended family, friends, acquaintances, strangers). Preschoolers completed the Diagnostic Evaluation of Articulation and Phonology (DEAP; Dodd, Hua, Crosbie, Holm, & Ozanne, 2006) in English and Jamaican Creole to establish speech-sound competency. For this sample, we examined validity and reliability (interrater, test-retest, internal consistency) evidence using measures of speech-sound production: (a) percentage of consonants correct, (b) percentage of vowels correct, and (c) percentage of phonemes correct. ICS and ICS-JC ratings showed preschoolers were always (5) to usually (4) understood across communication partners (ICS, M = 4.43; ICS-JC, M = 4.50). Both tools demonstrated excellent internal consistency (α = .91) and high interrater and test-retest reliability. Significant correlations between the two tools, and between each measure and language-specific percentage of consonants correct, percentage of vowels correct, and percentage of phonemes correct, provided criterion-validity evidence. A positive correlation between the ICS and age further strengthened the validity evidence for that measure. Both tools show promising evidence of reliability and validity in describing functional speech intelligibility for this group of typically developing Jamaican preschoolers.
NASA Astrophysics Data System (ADS)
Seo, Joo-Young; Park, Soo-Keun; Kwon, Hoon; Cho, Ki-Sub
2017-10-01
The mechanical properties of ultra-high-strength secondary hardened stainless steels with varying Co, V, and C contents have been studied. A reduced-Co alloy based on the chemical composition of Ferrium S53 was made by increasing the V and C content. This changed the M₂C-strengthened microstructure to an MC plus M₂C-strengthened microstructure, and no deteriorative effects were observed for peak-aged and over-aged samples despite the large reduction in Co content from 14 to 7 wt pct. The mechanical properties resulting from the alloying modification were associated with carbide precipitation kinetics, which was clearly outlined by combining analytical tools, including small-angle neutron scattering (SANS) and analytical TEM, with computational simulation.
Modelling the structural response of cotton plants to mepiquat chloride and population density
Gu, Shenghao; Evers, Jochem B.; Zhang, Lizhen; Mao, Lili; Zhang, Siping; Zhao, Xinhua; Liu, Shaodong; van der Werf, Wopke; Li, Zhaohu
2014-01-01
Background and Aims Cotton (Gossypium hirsutum) has indeterminate growth. The growth regulator mepiquat chloride (MC) is used worldwide to restrict vegetative growth and promote boll formation and yield. The effects of MC are modulated by complex interactions with growing conditions (nutrients, weather) and plant population density, and as a result the effects on plant form are not fully understood and are difficult to predict. The use of MC is thus hard to optimize. Methods To explore crop responses to plant density and MC, a functional–structural plant model (FSPM) for cotton (named CottonXL) was designed. The model was calibrated using 1 year's field data, and validated by using two additional years of detailed experimental data on the effects of MC and plant density in stands of pure cotton and in intercrops of cotton with wheat. CottonXL simulates development of leaf and fruits (square, flower and boll), plant height and branching. Crop development is driven by thermal time, population density, MC application, and topping of the main stem and branches. Key Results Validation of the model showed good correspondence between simulated and observed values for leaf area index with an overall root-mean-square error of 0·50 m² m⁻², and with an overall prediction error of less than 10 % for number of bolls, plant height, number of fruit branches and number of phytomers. Canopy structure became more compact with the decrease of leaf area index and internode length due to the application of MC. Moreover, MC did not have a substantial effect on boll density but increased lint yield at higher densities. Conclusions The model satisfactorily represents the effects of agronomic measures on cotton plant structure. It can be used to identify optimal agronomic management of cotton to achieve optimal plant structure for maximum yield under varying environmental conditions. PMID:24489020
A simple method to predict regional fish abundance: an example in the McKenzie River Basin, Oregon
D.J. McGarvey; J.M. Johnston
2011-01-01
Regional assessments of fisheries resources are increasingly called for, but tools with which to perform them are limited. We present a simple method that can be used to estimate regional carrying capacity and apply it to the McKenzie River Basin, Oregon. First, we use a macroecological model to predict trout densities within small, medium, and large streams in the...
A Non-Stationary Approach for Estimating Future Hydroclimatic Extremes Using Monte-Carlo Simulation
NASA Astrophysics Data System (ADS)
Byun, K.; Hamlet, A. F.
2017-12-01
There is substantial evidence that observed hydrologic extremes (e.g. floods, extreme stormwater events, and low flows) are changing and that climate change will continue to alter the probability distributions of hydrologic extremes over time. These non-stationary risks imply that conventional approaches for designing hydrologic infrastructure (or making other climate-sensitive decisions) based on retrospective analysis and stationary statistics will become increasingly problematic through time. To develop a framework for assessing risks in a non-stationary environment, our study develops a new approach using a super ensemble of simulated hydrologic extremes based on Monte Carlo (MC) methods. Specifically, using statistically downscaled future GCM projections from the CMIP5 archive (using the Hybrid Delta (HD) method), we extract daily precipitation (P) and temperature (T) at 1/16 degree resolution based on a group of moving 30-yr windows within a given design lifespan (e.g. 10, 25, 50-yr). Using these T and P scenarios we simulate daily streamflow using the Variable Infiltration Capacity (VIC) model for each year of the design lifespan and fit a Generalized Extreme Value (GEV) probability distribution to the simulated annual extremes. MC experiments are then used to construct a random series of 10,000 realizations of the design lifespan, estimating annual extremes using the unique GEV parameters estimated for each individual year of the design lifespan. Our preliminary results for two watersheds in the Midwest show that there are considerable differences in the extreme values for a given percentile between the conventional MC and the non-stationary MC approaches. Design standards based on our non-stationary approach are also directly dependent on the design lifespan of the infrastructure, a sensitivity which is notably absent from conventional approaches based on retrospective analysis. The experimental approach can be applied to a wide range of hydroclimatic variables of interest.
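A sketch of the super-ensemble step (the per-year GEV location trend, scale, shape, and the 50-yr lifespan are illustrative stand-ins for the fitted, non-stationary values; note scipy's sign convention for the GEV shape):

```python
# Non-stationary MC: draw 10,000 lifespan realizations, one annual maximum
# per year, each year from its own GEV distribution.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(4)
lifespan = 50                                  # design lifespan in years
locs = np.linspace(100.0, 130.0, lifespan)     # illustrative trend in GEV location
scale, shape = 15.0, -0.1                      # scipy uses c = -xi, so xi = 0.1 here

annual_max = genextreme.rvs(shape, loc=locs, scale=scale,
                            size=(10_000, lifespan), random_state=rng)
lifetime_max = annual_max.max(axis=1)          # extreme over each realized lifespan
print("99th-percentile lifetime maximum:", np.percentile(lifetime_max, 99))
```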
Surface tension of undercooled liquid cobalt
NASA Astrophysics Data System (ADS)
Yao, W. J.; Han, X. J.; Chen, M.; Wei, B.; Guo, Z. Y.
2002-08-01
This paper provides the results of experimentally measured and numerically predicted surface tensions of undercooled liquid cobalt. The experiments were performed using the oscillating drop technique combined with electromagnetic levitation. The simulations are carried out with the Monte Carlo (MC) method, where the surface tension is predicted through calculations of the work of cohesion, and the interatomic interaction is described with an embedded-atom method. The maximum undercooling of the liquid cobalt reached 231 K (0.13 T_m) in the experiment and 268 K (0.17 T_m) in the simulation. The surface tension and its relationship with temperature obtained in the experiment and simulation are σ_exp = 1.93 − 0.00033(T − T_m) N m⁻¹ and σ_cal = 2.26 − 0.00032(T − T_m) N m⁻¹, respectively. The temperature dependence of the surface tension calculated from the MC simulation is in reasonable agreement with that measured in the experiment.
The NASA Human Research Wiki - An Online Collaboration Tool
NASA Technical Reports Server (NTRS)
Barr, Yael; Rasbury, Jack; Johnson, Jordan; Barstend, Kristina; Saile, Lynn; Watkins, Sharmi
2012-01-01
The Exploration Medical Capability (ExMC) element is one of six elements of the Human Research Program (HRP). ExMC is charged with decreasing the risk of "inability to adequately recognize or treat an ill or injured crew member" for exploration-class missions. In preparation for such missions, ExMC has compiled a large evidence base, previously available only to persons within the NASA community, and has developed the "NASA Human Research Wiki" in an effort to make the ExMC information available to the general public and increase collaboration within and outside of NASA. The ExMC evidence base comprises several types of data, including: (1) information on more than 80 medical conditions that could occur during space flight, derived from several sources and including data on incidence and potential outcomes, as captured in the Integrated Medical Model's (IMM) Clinical Finding Forms (CliFFs); and (2) approximately 25 gap reports, which identify any "gaps" in knowledge and/or technology that would need to be addressed in order to provide adequate medical support for these novel missions.
Development of accelerated Raman and fluorescent Monte Carlo method
NASA Astrophysics Data System (ADS)
Dumont, Alexander P.; Patil, Chetan
2018-02-01
Monte Carlo (MC) modeling of photon propagation in turbid media is an essential tool for understanding optical interactions between light and tissue. Insight gathered from the outputs of MC models assists in mapping between detected optical signals and bulk tissue optical properties, and as such has proven useful for inverse calculations of tissue composition and for optimizing the design of optical probes. MC models of Raman scattering have previously been implemented without consideration of background autofluorescence, despite its presence in raw measurements. Modeling both Raman and fluorescence profiles at high spectral resolution requires a significant increase in computation, but is more appropriate for investigating issues such as detection limits. We present a new Raman-fluorescence MC model, developed atop an existing GPU-parallelized MC framework, that can run more than 300× faster than CPU methods. The robust acceleration allows for the efficient production of both Raman and fluorescence outputs from the MC model. In addition, this model can handle arbitrary sample morphologies and excitation and collection geometries to more appropriately mimic experimental settings. We will present the model framework and initial results.
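For orientation, a minimal sketch of the photon-propagation core that tissue MC codes of this kind share (Beer-Lambert step sampling, albedo weighting, isotropic scattering); real codes such as the one described add anisotropic phase functions, Raman/fluorescence event sampling, boundaries, and GPU parallelism, none of which is shown, and the coefficients below are illustrative:

```python
# Single-photon random walk in a homogeneous turbid medium.
import numpy as np

rng = np.random.default_rng(5)
mu_a, mu_s = 0.1, 10.0            # absorption / scattering coefficients (1/mm)
mu_t = mu_a + mu_s

def propagate_photon(max_steps=1000):
    pos = np.zeros(3)
    direction = np.array([0.0, 0.0, 1.0])
    weight = 1.0
    for _ in range(max_steps):
        step = -np.log(rng.random()) / mu_t    # free path from Beer-Lambert law
        pos = pos + step * direction
        weight *= mu_s / mu_t                  # survival (albedo) weighting
        if weight < 1e-4:                      # crude termination cutoff
            break
        cos_t = 2 * rng.random() - 1           # isotropic scattering direction
        phi = 2 * np.pi * rng.random()
        sin_t = np.sqrt(1 - cos_t ** 2)
        direction = np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])
    return pos, weight

print(propagate_photon())
```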
Theoretical Models of Protostellar Binary and Multiple Systems with AMR Simulations
NASA Astrophysics Data System (ADS)
Matsumoto, Tomoaki; Tokuda, Kazuki; Onishi, Toshikazu; Inutsuka, Shu-ichiro; Saigo, Kazuya; Takakuwa, Shigehisa
2017-05-01
We present theoretical models for protostellar binary and multiple systems based on high-resolution numerical simulations with an adaptive mesh refinement (AMR) code, SFUMATO. The recent ALMA observations have revealed early phases of binary and multiple star formation at high spatial resolution. These observations should be compared with theoretical models of correspondingly high spatial resolution. We present two theoretical models, for (1) a high density molecular cloud core, MC27/L1521F, and (2) a protobinary system, L1551 NE. For the MC27 model, we performed numerical simulations of the gravitational collapse of a turbulent cloud core. The cloud core exhibits fragmentation during the collapse, and dynamical interaction between the fragments produces an arc-like structure, which is one of the prominent structures observed by ALMA. For the L1551 NE model, we performed numerical simulations of gas accretion onto the protobinary. The simulations exhibit asymmetry of a circumbinary disk. Such asymmetry has also been observed by ALMA in the circumbinary disk of L1551 NE.
NASA Astrophysics Data System (ADS)
Dünser, Simon; Meyer, Daniel W.
2016-06-01
In most groundwater aquifers, dispersion of tracers is dominated by flow-field inhomogeneities resulting from the underlying heterogeneous conductivity or transmissivity field. This effect is referred to as macrodispersion. Since in practice, besides a few point measurements, the complete conductivity field is virtually never available, a probabilistic treatment is needed. To quantify the uncertainty in tracer concentrations from a given geostatistical model for the conductivity, Monte Carlo (MC) simulation is typically used. To avoid the excessive computational costs of MC, the polar Markovian velocity process (PMVP) model was recently introduced, delivering predictions at computing times about three orders of magnitude smaller. In artificial test cases, the PMVP model has provided good results in comparison with MC. In this study, we further validate the model in a more challenging and realistic setup. The setup considered is derived from the well-known benchmark macrodispersion experiment (MADE), which is highly heterogeneous and non-stationary with a large number of unevenly scattered conductivity measurements. Validations were done against reference MC simulations and good overall agreement was found. Moreover, simulations of a simplified setup with a single measurement were conducted in order to reassess the model's most fundamental assumptions and to provide guidance for model improvements.
Gravity affects the responsiveness of Runx2 to 1, 25-dihydroxyvitamin D3 (VD3)
NASA Astrophysics Data System (ADS)
Guo, Feima; Dai, Zhongquan; Wu, Feng; Liu, Zhaoxia; Tan, Yingjun; Wan, Yumin; Shang, Peng; Li, Yinghui
2013-03-01
Bone loss resulting from spaceflight is mainly caused by decreased bone formation, and decreased osteoblast proliferation and differentiation. The transcription factor Runx2 plays an important role in osteoblast differentiation and function by responding to microenvironment changes, including cytokines and mechanical factors. The effect of 1,25-dihydroxyvitamin D3 (VD3) on Runx2 in terms of mechanical competence is far less clear. This study describes how gravity affects the response of Runx2 to VD3. A MC3T3-6OSE2-Luc osteoblast model was constructed in which the activity of Runx2 is reflected by reporter luciferase activity, verified using bone-related cytokines. The results showed that luciferase activity in MC3T3-6OSE2-Luc cells transfected with Runx2 was twice that of the vacant vector. Alkaline phosphatase (ALP) activity was increased in MC3T3-6OSE2-Luc cells by different concentrations of IGF-I and BMP2. MC3T3-6OSE2-Luc cells were cultured under simulated microgravity or centrifugation, with or without VD3. In simulated microgravity, luciferase activity was decreased after 48 h of clinorotation culture, but increased in the centrifuge culture. Luciferase activity was increased after VD3 treatment in normal conditions and in simulated microgravity; the increase in luciferase activity in simulated microgravity was lower than that in the 1 g condition when simultaneously treated with VD3, and higher than that in the centrifuge condition. Co-immunoprecipitation showed that the interaction between the VD3 receptor (VDR) and Runx2 was decreased by simulated microgravity, but increased by centrifugation. From these results, we conclude that gravity affects the response of Runx2 to VD3, which results from an alteration in the interaction between VDR and Runx2 under different gravity conditions.
Importance of including ammonium sulfate ((NH4)2SO4) aerosols for ice cloud parameterization in GCMs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhattacharjee, P. S.; Sud, Yogesh C.; Liu, Xiaohong
2010-02-22
A common deficiency of many cloud-physics parameterizations, including NASA's microphysics of clouds with aerosol-cloud interactions (hereafter called McRAS-AC), is that they simulate fewer (larger) ice cloud particles than observed. A single column model (SCM) of McRAS-AC and Global Circulation Model (GCM) physics, together with an adiabatic parcel model (APM) for ice-cloud nucleation (IN) of aerosols, were used to systematically examine the influence of ammonium sulfate ((NH4)2SO4) aerosols, not included in the present formulations of McRAS-AC. Specifically, the influence of (NH4)2SO4 aerosols on the optical properties of both liquid and ice clouds was analyzed. First, an (NH4)2SO4 parameterization was included in the APM to assess its effect vis-à-vis that of the other aerosols. Subsequently, several evaluation tests were conducted over the ARM-SGP and thirteen other locations (sorted into pristine and polluted conditions) distributed over marine and continental sites with the SCM. The statistics of the simulated cloud climatology were evaluated against the available ground and satellite data. The results showed that inclusion of (NH4)2SO4 in the SCM made a remarkable improvement in the simulated effective radius of ice clouds. However, the corresponding ice-cloud optical thickness increased more than is observed. This can be caused by a lack of cloud advection and evaporation. We argue that this deficiency can be mitigated by adjusting the other tunable parameters of McRAS-AC, such as the precipitation efficiency. Inclusion of ice cloud particle splintering, introduced through well-established empirical equations, is found to further improve the results. Preliminary tests show that these changes make a substantial improvement in simulating the cloud optical properties in the GCM, particularly by simulating a far more realistic cloud distribution over the ITCZ.
Mirzaeinia, Ali; Feyzi, Farzaneh; Hashemianzadeh, Seyed Majid
2017-12-07
Simple and accurate expressions are presented for the equation of state (EOS) and absolute Helmholtz free energy of a system composed of simple atomic particles interacting through the repulsive Lennard-Jones potential model in the fluid and solid phases. The introduced EOS has 17 and 22 coefficients for the fluid and solid phases, respectively, which are regressed to the Monte Carlo (MC) simulation data over the reduced temperature range of 0.6 ≤ T* ≤ 6.0 and the packing fraction range of 0.1 ≤ η ≤ 0.72. The average absolute relative percent deviation in fitting the EOS parameters to the MC data is 0.06 and 0.14 for the fluid and solid phases, respectively. The thermodynamic integration method is used to calculate the free energy using the MC simulation results. The Helmholtz free energy of the ideal gas is employed as the reference state for the fluid phase. For the solid phase, the values of the free energy at the reduced density equivalent to the close packing of hard spheres are used as the reference state. To check the validity of the predicted values of the Helmholtz free energy, the Widom particle insertion method and the Einstein crystal technique of Frenkel and Ladd are employed. The results obtained from the MC simulation approaches agree well with the EOS results, which shows that the proposed model can reliably be utilized in the framework of thermodynamic theories.
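As a sketch of the thermodynamic-integration route used here, the excess Helmholtz energy follows from integrating (Z − 1)/η from zero density; the snippet below uses the Carnahan-Starling hard-sphere Z as a stand-in for the paper's fitted EOS, since CS has a closed-form free energy that serves as a consistency check:

```python
# Thermodynamic integration: A_ex/(N kT) = integral_0^eta (Z - 1)/eta' d(eta').
import numpy as np

def z_cs(eta):
    """Carnahan-Starling hard-sphere compressibility factor."""
    return (1 + eta + eta**2 - eta**3) / (1 - eta) ** 3

eta_target = 0.3
eta = np.linspace(1e-6, eta_target, 2001)
integrand = (z_cs(eta) - 1.0) / eta
a_ex = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(eta))  # trapezoid rule

# Closed-form CS excess free energy for comparison.
a_exact = eta_target * (4 - 3 * eta_target) / (1 - eta_target) ** 2
print(a_ex, a_exact)   # the two values should agree closely
```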
Simulating Silicon Photomultiplier Response to Scintillation Light
Jha, Abhinav K.; van Dam, Herman T.; Kupinski, Matthew A.; Clarkson, Eric
2015-01-01
The response of a Silicon Photomultiplier (SiPM) to optical signals is affected by many factors including photon-detection efficiency, recovery time, gain, optical crosstalk, afterpulsing, dark count, and detector dead time. Many of these parameters vary with overvoltage and temperature. When used to detect scintillation light, there is a complicated non-linear relationship between the incident light and the response of the SiPM. In this paper, we propose a combined discrete-time discrete-event Monte Carlo (MC) model to simulate SiPM response to scintillation light pulses. Our MC model accounts for all relevant aspects of the SiPM response, some of which were not accounted for in the previous models. We also derive and validate analytic expressions for the single-photoelectron response of the SiPM and the voltage drop across the quenching resistance in the SiPM microcell. These analytic expressions consider the effect of all the circuit elements in the SiPM and accurately simulate the time-variation in overvoltage across the microcells of the SiPM. Consequently, our MC model is able to incorporate the variation of the different SiPM parameters with varying overvoltage. The MC model is compared with measurements on SiPM-based scintillation detectors and with some cases for which the response is known a priori. The model is also used to study the variation in SiPM behavior with SiPM-circuit parameter variations and to predict the response of a SiPM-based detector to various scintillators. PMID:26236040
Sneessens, I; Veysset, P; Benoit, M; Lamadon, A; Brunschwig, G
2016-11-01
Crop-livestock production is claimed to be more sustainable than specialized production systems. However, controversial studies suggest that crop and livestock production must be combined under certain conditions to achieve higher sustainability performance. Whereas previous studies focused on the impact of crop-livestock interactions on performance, we posit here that crop-livestock organization is a key determinant of farming system sustainability. Crop-livestock organization refers to the percentage of the agricultural area that is dedicated to each production. Our objective is to investigate whether crop-livestock organization has both a direct and an indirect impact on mixed crop-livestock (MC-L) sustainability. To that end, we built a whole-farm model parametrized on representative French sheep and crop farming systems in plain areas (Vienne, France). This model simulates contrasted MC-L systems and their resulting sustainability through the following performance indicators: farm income, production, N balance, greenhouse gas (GHG) emissions (/kg product) and MJ consumption (/kg product). Two MC-L systems were simulated with contrasted crop-livestock organizations (MC20-L80: 20% of crops; MC80-L20: 80% of crops). A first scenario - constraining no crop-livestock interactions in either MC-L system - highlights that crop-livestock organization has a significant direct impact on performance that implies trade-offs between sustainability objectives. Indeed, the MC80-L20 system shows higher performance for farm income (+44%), livestock production (+18%) and crop GHG emissions (-14%), whereas the MC20-L80 system has a better N balance (-53%) and a lower livestock MJ consumption (-9%). A second scenario - allowing for crop-livestock interactions in both the MC20-L80 and MC80-L20 systems - showed that crop-livestock organization has a significant indirect impact on performance. Indeed, even though crop-livestock interactions improve performance, crop-livestock organization influences the capacity of MC-L systems to benefit from those interactions. As a consequence, we observed a decreasing performance trade-off between MC-L systems for farm income (-4%) and crop GHG emissions (-10%), whereas the gap increases for nitrogen balance (+23%), livestock production (+6%), MJ consumption (+16%), GHG emissions (+5%) and crop MJ consumption (+5%). However, the indirect impact of crop-livestock organization does not reverse the trend of trade-offs between sustainability objectives determined by its direct impact. In conclusion, crop-livestock organization is a key factor that has to be taken into account when studying the sustainability of mixed crop-livestock systems.
Destruction of a Magnetized Star
NASA Astrophysics Data System (ADS)
Kohler, Susanna
2017-01-01
What happens when a magnetized star is torn apart by the tidal forces of a supermassive black hole, in a violent process known as a tidal disruption event? Two scientists have broken new ground by simulating the disruption of stars with magnetic fields for the first time. The magnetic field configuration during a simulation of the partial disruption of a star. Top left: pre-disruption star. Bottom left: matter begins to re-accrete onto the surviving core after the partial disruption. Right: vortices form in the core as high-angular-momentum debris continues to accrete, winding up and amplifying the field. [Adapted from Guillochon & McCourt 2017]

What About Magnetic Fields?
Magnetic fields are expected to exist in the majority of stars. Though these fields don't dominate the energy budget of a star (the magnetic pressure is a million times weaker than the gas pressure in the Sun's interior, for example), they are the drivers of interesting activity, like the prominences and flares of our Sun. Given this, we can wonder what role stars' magnetic fields might play when the stars are torn apart in tidal disruption events. Do the fields change what we observe? Are they dispersed during the disruption, or can they be amplified? Might they even be responsible for launching jets of matter from the black hole after the disruption?

Star vs. Black Hole
In a recent study, James Guillochon (Harvard-Smithsonian Center for Astrophysics) and Michael McCourt (Hubble Fellow at UC Santa Barbara) have tackled these questions by performing the first simulations of tidal disruptions of stars that include magnetic fields. In their simulations, Guillochon and McCourt evolve a solar-mass star that passes close to a million-solar-mass black hole. Their simulations explore different magnetic field configurations for the star, and they consider both what happens when the star barely grazes the black hole and is only partially disrupted, as well as what happens when the black hole tears the star apart completely.

Amplifying Encounters
For stars that survive their encounter with the black hole, Guillochon and McCourt find that the process of partial disruption and re-accretion can amplify the magnetic field of the star by up to a factor of 20. Repeated encounters of the star with the black hole could amplify the field even more. The authors suggest an interesting implication of this idea: a population of highly magnetized stars may have formed in our own galactic center, resulting from their encounters with the supermassive black hole Sgr A*. A turbulent magnetic field forms after a partial stellar disruption and re-accretion of the tidal tails. [Adapted from Guillochon & McCourt 2017]

Effects in Destruction
For stars that are completely shredded and form a tidal stream after their encounter with the black hole, the authors find that the magnetic field geometry straightens within the stream of debris. There, the pressure of the magnetic field eventually dominates over the gas pressure and self-gravity. Guillochon and McCourt find that the field's new configuration isn't ideal for powering jets from the black hole, but it is strong enough to influence how the stream interacts with itself and its surrounding environment, likely affecting what we can expect to see from these short-lived events. These simulations have clearly demonstrated the need to further explore the role of magnetic fields in the disruptions of stars by black holes.

Bonus
Check out the full (brief) video from one of the simulations by Guillochon and McCourt (be sure to watch it in high-res!). It reveals the evolution of a star's magnetic field configuration as the star is partially disrupted by the forces of a supermassive black hole and then re-accretes.

Citation
James Guillochon and Michael McCourt 2017 ApJL 834 L19. doi:10.3847/2041-8213/834/2/L19
Major chest wall reconstruction after chest wall irradiation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larson, D.L.; McMurtrey, M.J.; Howe, H.J.
1982-03-15
In the last year, 12 patients have undergone extensive chest wall resection. Eight patients had recurrent cancer after prior resection and irradiation with an average defect of 160 square centimeters, usually including ribs and a portion of the sternum; four had radionecrosis of soft tissue and/or bone. Methods of reconstruction included latissimus dorsi musculocutaneous (MC) flap (five patients), pectoralis major MC flap (seven patients), and omental flap and skin graft (one patient). The donor site was usually closed primarily. All flaps survived, providing good wound coverage. The only complication was partial loss of a latissimus dorsi MC flap related to an infected wound; this reconstruction was salvaged with a pectoralis major MC flap. The hospital stay ranged from 10-25 days with a median stay of 11 days. The MC flap is a valuable tool which can be used to significantly decrease morbidity, hospital stay, and patient discomfort related to the difficult problem of chest wall reconstruction after radiation therapy.
Astronauts Grissom and Young in Gemini Mission Simulator
1964-05-22
S64-25295 (March 1964) --- Astronauts Virgil I. (Gus) Grissom (right) and John W. Young, prime crew for the first manned Gemini mission (GT-3), are shown inside a Gemini mission simulator at McDonnell Aircraft Corp., St. Louis, MO. The simulator will provide Gemini astronauts and ground crews with realistic mission simulation during intensive training prior to actual launch.
An assessment of 'shuffle algorithm' collision mechanics for particle simulations
NASA Technical Reports Server (NTRS)
Feiereisen, William J.; Boyd, Iain D.
1991-01-01
Among the algorithms for collision mechanics used at present, the 'shuffle algorithm' of Baganoff (McDonald and Baganoff, 1988; Baganoff and McDonald, 1990) not only allows efficient vectorization, but also discretizes the possible outcomes of a collision. To assess the applicability of the shuffle algorithm, a simulation was performed of flows in monoatomic gases, and the calculated characteristics of shock waves were compared with those obtained using a commonly employed isotropic scattering law. It is shown that, in general, the shuffle algorithm adequately represents the collision mechanics in cases where the goal of the calculation is mean profiles of density and temperature.
Simulation of temperature distribution in tumor Photothermal treatment
NASA Astrophysics Data System (ADS)
Zhang, Xiyang; Qiu, Shaoping; Wu, Shulian; Li, Zhifang; Li, Hui
2018-02-01
The transmission of light in biological tissue and the optical properties of biological tissue are important research topics in biomedical photonics, of great theoretical and practical significance for medical diagnosis and light-based therapy of disease. In this paper, a temperature feedback controller is presented for monitoring photothermal treatment in real time. Two-dimensional Monte Carlo (MC) simulation and the diffusion approximation were compared and analyzed. The results demonstrate that the diffusion approximation with extrapolated boundary conditions, solved by the finite element method, is a good approximation to MC simulation. Then, in order to minimize thermal damage, real-time temperature monitoring with a proportional-integral-derivative (PID) controller was evaluated during photothermal treatment.
Dosimetric investigation of proton therapy on CT-based patient data using Monte Carlo simulation
NASA Astrophysics Data System (ADS)
Chongsan, T.; Liamsuwan, T.; Tangboonduangjit, P.
2016-03-01
The aim of radiotherapy is to deliver a high radiation dose to the tumor with a low radiation dose to healthy tissues. Protons have Bragg peaks that give a high radiation dose to the tumor but a low exit dose or dose tail. Therefore, proton therapy is promising for treating deep-seated tumors and tumors located close to organs at risk. Moreover, the physical characteristics of protons are suitable for treating cancer in pediatric patients. This work developed a computational platform for calculating proton dose distributions using the Monte Carlo (MC) technique and the patient's anatomical data. The studied case is a pediatric patient with a primary brain tumor. PHITS is used for the MC simulation; therefore, patient-specific CT-DICOM files were converted to the PHITS input format. A MATLAB optimization program was developed to create a beam delivery control file for this study. The optimization program requires proton beam data, all of which were calculated in this work using analytical formulas, and the calculation accuracy was tested before the beam delivery control file was used for the MC simulation. This study will be useful for researchers aiming to investigate proton dose distributions in patients who do not have access to proton therapy machines.
Transient in-plane thermal transport in nanofilms with internal heating
Cao, Bing-Yang
2016-01-01
Wide applications of nanofilms in electronics necessitate an in-depth understanding of nanoscale thermal transport, which significantly deviates from Fourier's law. Great efforts have focused on the effective thermal conductivity under temperature difference, while it is still ambiguous whether the diffusion equation with an effective thermal conductivity can accurately characterize the nanoscale thermal transport with internal heating. In this work, transient in-plane thermal transport in nanofilms with internal heating is studied via Monte Carlo (MC) simulations in comparison to the heat diffusion model and mechanism analyses using Fourier transform. Phonon-boundary scattering leads to larger temperature rise and slower thermal response rate when compared with the heat diffusion model based on Fourier's law. The MC simulations are also compared with the diffusion model with effective thermal conductivity. In the first case of continuous internal heating, the diffusion model with effective thermal conductivity under-predicts the temperature rise by the MC simulations at the initial heating stage, while the deviation between them gradually decreases and vanishes with time. By contrast, for the one-pulse internal heating case, the diffusion model with effective thermal conductivity under-predicts both the peak temperature rise and the cooling rate, so the deviation can always exist. PMID:27118903
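For context, the Fourier-law baseline that the phonon MC results are compared against is a transient diffusion equation with a volumetric heat source. A minimal explicit finite-difference sketch of that baseline (not the paper's phonon MC) follows; film thickness, conductivity, heat capacity, source strength and boundary conditions are all assumed values.

```python
import numpy as np

# 1-D transient heat diffusion with internal heating (Fourier-law baseline).
L = 200e-9           # film thickness, m (assumed)
N = 200
dx = L / N
k_eff = 50.0         # effective thermal conductivity, W/(m K) (assumed)
rho_c = 1.6e6        # volumetric heat capacity, J/(m^3 K) (assumed)
alpha = k_eff / rho_c
q = 1e17             # volumetric heating, W/m^3 (assumed)
dt = 0.4 * dx * dx / alpha       # below the explicit-stability limit

T = np.zeros(N)      # temperature rise; boundaries held at zero
for _ in range(20000):
    T[1:-1] += dt * (alpha * (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2
                     + q / rho_c)
    T[0] = T[-1] = 0.0           # isothermal boundaries

print(f"peak temperature rise: {T.max():.2f} K")
```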
Self-Consistent Monte Carlo Study of the Coulomb Interaction under Nano-Scale Device Structures
NASA Astrophysics Data System (ADS)
Sano, Nobuyuki
2011-03-01
It has been pointed out that the Coulomb interaction between electrons is expected to be of crucial importance for predicting reliable device characteristics. In particular, device performance is greatly degraded by plasmon excitation, represented by dynamical potential fluctuations induced in the highly doped source and drain regions by the channel electrons. We employ self-consistent 3D Monte Carlo (MC) simulations, which reproduce both the correct mobility under various electron concentrations and the collective plasma waves, to study the physical impact of dynamical potential fluctuations on the performance of double-gate MOSFETs. The average force experienced by an electron due to the Coulomb interaction inside the device is evaluated by comparing the self-consistent MC simulations with fixed-potential MC simulations without the Coulomb interaction. Also, the band-tailing associated with local potential fluctuations in the highly doped source region is quantitatively evaluated, and it is found that the band-tailing becomes strongly dependent on position in real space even inside the uniform source region. This work was partially supported by Grants-in-Aid for Scientific Research B (No. 2160160) from the Ministry of Education, Culture, Sports, Science and Technology in Japan.
Simulation of substrate degradation in composting of sewage sludge
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang Jun; Gao Ding, E-mail: gaod@igsnrr.ac.c; Chen Tongbin
2010-10-15
To simulate the substrate degradation kinetics of the composting process, this paper develops a mathematical model with a first-order reaction assumption and heat/mass balance equations. A pilot-scale composting test with a mixture of sewage sludge and wheat straw was conducted in an insulated reactor. The BVS (biodegradable volatile solids) degradation process, matrix mass, MC (moisture content), DM (dry matter) and VS (volatile solids) were simulated numerically using the model and experimental data. The numerical simulation offered a method for simulating k (the first-order rate constant) and estimating k20 (the first-order rate constant at 20 °C). After comparison with experimental values, the relative error of the simulated value of the mass of the compost at maturity was 0.22%, MC 2.9%, DM 4.9% and VS 5.2%, which means that the simulation is a good fit. The k of sewage sludge was simulated, and k20 and k20s (the first-order rate coefficient of the slow fraction of BVS at 20 °C) of the sewage sludge were estimated as 0.082 and 0.015 d^-1, respectively.
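A sketch of the first-order kinetics named above: dBVS/dt = -k(T)·BVS. The k20 value is the one reported in the abstract; the exponential temperature-correction base and the pile temperature history are assumptions for illustration, not the paper's heat/mass balance model.

```python
import numpy as np

# First-order substrate degradation: dBVS/dt = -k(T) * BVS.
k20 = 0.082            # d^-1, from the abstract (rapid fraction at 20 degC)
theta = 1.066          # assumed temperature-correction base

def k(T_c):
    """Assumed exponential temperature correction anchored at 20 degC."""
    return k20 * theta ** (T_c - 20.0)

def pile_temperature(t_days):
    """Assumed bell-shaped self-heating curve of the compost pile, degC."""
    return 20.0 + 35.0 * np.exp(-((t_days - 5.0) ** 2) / 20.0)

dt, t_end = 0.01, 30.0          # days
bvs = 100.0                     # initial biodegradable volatile solids, %
for step in range(int(t_end / dt)):
    bvs -= k(pile_temperature(step * dt)) * bvs * dt
print(f"BVS remaining after {t_end:.0f} d: {bvs:.1f}%")
```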
Astronaut William S. McArthur in training for contingency EVA in WETF
1993-09-10
S93-43840 (6 Sept 1993) --- Astronaut William S. McArthur, mission specialist, participates in training for contingency Extravehicular Activity (EVA) for the STS-58 mission. For simulation purposes, McArthur was about to be submerged to a point of neutral buoyancy in the Johnson Space Center's (JSC) Weightless Environment Training Facility (WET-F). Though the Spacelab Life Sciences (SLS-2) mission does not include a planned EVA, all crews designate members to learn proper procedures to perform outside the spacecraft in the event of failure of remote means to accomplish those tasks.
New features in McStas, version 1.5
NASA Astrophysics Data System (ADS)
Åstrand, P.-O.; Lefmann, K.; Farhi, E.; Nielsen, K.; Skårup, P.
The neutron ray-tracing simulation package McStas has attracted numerous users, and the development of the package continues with version 1.5 released at the ICNS 2001 conference. New features include: support for neutron polarisation, labelling of neutrons, realistic source and sample components, and an interface to the Risø instrument-control software TASCOM. We give a general introduction to McStas and present the latest developments. In particular, we give an example of how the neutron-label option has been used to locate the origin of a spurious side-peak observed in an experiment with RITA-1 at Risø.
ERIC Educational Resources Information Center
Bunting, John David
2013-01-01
Despite claims that the use of corpus tools can have a major impact in language classrooms (e.g., Conrad, 2000, 2004; Davies, 2004; O'Keefe, McCarthy, & Carter, 2007; Sinclair, 2004b; Tsui, 2004), many language teachers express apparent apathy or even resistance towards adding corpus tools to their repertoire (Cortes, 2013b). This study…
Business Simulations in Financial Management Courses: Implications for Higher Education
ERIC Educational Resources Information Center
Wolmarans, H. P.
2006-01-01
Business simulations provide a teaching method that typically yields (1) more hands-on experience, (2) a higher level of excitement, (3) a higher noise level (and yet a lower incidence of problems), and (4) more commitment than traditional methods of teaching (McLure 1997, 3). Business simulations are experiential learning opportunities that have…
Tessonnier, Thomas; Marcelos, Tiago; Mairani, Andrea; Brons, Stephan; Parodi, Katia
2015-01-01
In the field of radiation therapy, accurate and robust dose calculation is required. For this purpose, precise modeling of the irradiation system and reliable computational platforms are needed. At the Heidelberg Ion Therapy Center (HIT), the beamline had already been modeled in the FLUKA Monte Carlo (MC) code. However, this model was kept confidential for disclosure reasons and was not available to any external team. The main goal of this study was to efficiently create phase space (PS) files for proton and carbon ion beams, for all energies and foci available at HIT. A PS represents the characteristics of each recorded particle (charge, mass, energy, coordinates, direction cosines, generation) at a certain position along the beam path. In order to achieve this goal, keeping a reasonable data size while maintaining the accuracy required for the calculation, we developed a new approach to beam PS generation with the MC code FLUKA. The PSs were generated using an infinitely narrow beam, recording the desired quantities after the last element of the beamline, with primaries and secondaries discriminated. In this way, a unique PS can be used for each energy to accommodate the different foci by combining the narrow-beam scenario with random sampling of its theoretical Gaussian beam in vacuum. The PS can also reproduce the different patterns from the delivery system when properly combined with the beam scanning information. MC simulations using PSs have been compared to simulations including the full beamline geometry and found to be in very good agreement for several cases (depth dose distributions, lateral dose profiles), with relative dose differences below 0.5%. This approach has also been compared with measured data for ion beams with different energies and foci, resulting in very satisfactory agreement. Hence, the proposed approach was able to fulfill the different requirements and has demonstrated its suitability for application to clinical treatment fields. It also offers a powerful tool for investigating the contributions of primary and secondary particles produced in the beamline. These PSs are already made available to external teams upon request, to support interpretation of their measurements.
Pauchard, J-Y; Gehri, M; Vaudaux, B
2013-01-16
The McIsaac scoring system is a tool designed to predict the probability of streptococcal pharyngitis in children aged 3 to 17 years with a sore throat. Although it does not allow the physician to make the diagnosis of streptococcal pharyngitis, it enables the identification of those children with a sore throat in whom rapid antigen detection tests have a good predictive value.
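For readers unfamiliar with the rule, here is a small sketch of the McIsaac score as it is commonly published: one point each for fever over 38 °C, absence of cough, tender anterior cervical nodes, and tonsillar swelling or exudate, plus an age adjustment. Confirm against the locally adopted version before any clinical use.

```python
def mcisaac_score(age_years, fever_over_38, no_cough,
                  tender_anterior_nodes, tonsillar_swelling_or_exudate):
    """McIsaac (modified Centor) score as commonly published."""
    score = sum(map(bool, [fever_over_38, no_cough, tender_anterior_nodes,
                           tonsillar_swelling_or_exudate]))
    if 3 <= age_years <= 14:
        score += 1          # young children are at higher risk of GABHS
    elif age_years >= 45:
        score -= 1          # older adults are at lower risk
    return score

# Example: a 7-year-old with fever and tonsillar exudate, but with a cough
print(mcisaac_score(7, True, False, False, True))   # -> 3
```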
SU-F-T-610: Comparison of Output Factors for Small Radiation Fields Used in SBRT Treatment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gupta, R; Eldib, A; Li, J
2016-06-15
Purpose: In order to fundamentally understand our previous dose verification results between measurements and calculations from the treatment planning system (TPS) for SBRT plans for different sized targets, the goal of the present work was to compare output factors for small fields measured using EDR2 films with the TPS and Monte Carlo (MC) simulations. Methods: A 6 MV beam was delivered to EDR2 films for each of the following field sizes: 1×1 cm^2, 1.5×1.5 cm^2, 2×2 cm^2, 3×3 cm^2, 4×4 cm^2, 5×5 cm^2 and 10×10 cm^2. The films were developed in a film processor, then scanned with a Vidar VXR-16 scanner and analyzed using RIT113 version 6.1. A standard calibration curve was obtained with the 6 MV beam and was used to obtain absolute dose for the measured field sizes. Similar plans for all field sizes mentioned above were generated using Eclipse with the Analytical Anisotropic Algorithm. Similarly, MC simulations were carried out using MCSIM, an in-house MC code, for the different field sizes. Output factors normalized to the 10×10 cm^2 reference field were calculated for the different field sizes in all three cases and compared. Results: For field sizes ranging from 1×1 cm^2 to 2×2 cm^2, the differences in output factors between measurements (films), TPS and MC simulations were within 0.22%. For field sizes ranging from 3×3 cm^2 to 5×5 cm^2, differences in output factors were within 0.10%. Conclusion: No clinically significant difference was found in the output factors for the different field sizes obtained from films, the TPS and MC simulations. Our results showed that the output factors are predicted accurately by the TPS when compared to actual measurements and the superior Monte Carlo dose calculation method. This study helps us understand our previously obtained dose verification results for small fields used in SBRT treatment.
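The normalization used above is simple enough to state in a few lines: each field's dose reading at the reference depth is divided by the 10×10 cm^2 reading. The numbers below are placeholders, not the paper's measurements.

```python
# Output factor: dose for a given field divided by the 10x10 cm^2 reference.
readings = {            # field size (cm) -> dose reading at reference depth
    (1, 1): 0.672, (2, 2): 0.786, (3, 3): 0.831,
    (4, 4): 0.862, (5, 5): 0.885, (10, 10): 1.000,
}
ref = readings[(10, 10)]
output_factors = {field: dose / ref for field, dose in readings.items()}
for field, of in sorted(output_factors.items()):
    print(f"{field[0]}x{field[1]} cm^2: OF = {of:.3f}")
```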
NASA Astrophysics Data System (ADS)
Aziz Hashikin, Nurul Ab; Yeong, Chai-Hong; Guatelli, Susanna; Jeet Abdullah, Basri Johan; Ng, Kwan-Hoong; Malaroda, Alessandra; Rosenfeld, Anatoly; Perkins, Alan Christopher
2017-09-01
We aimed to investigate the validity of the partition model (PM) in estimating the absorbed doses to liver tumour (D_T), normal liver tissue (D_NL) and lungs (D_L), when cross-fire irradiations between these compartments are being considered. The MIRD-5 phantom incorporating various treatment parameters, i.e. tumour involvement (TI), tumour-to-normal liver uptake ratio (T/N) and lung shunting (LS), was simulated using the Geant4 Monte Carlo (MC) toolkit. 10^8 track histories were generated for each combination of the three parameters to obtain the absorbed dose per activity uptake in each compartment (D_T/A_T, D_NL/A_NL, and D_L/A_L). The administered activities, A, were estimated using the PM, so as to achieve either the limiting dose to normal liver, D_NL,lim, or lungs, D_L,lim (70 or 30 Gy, respectively). Using these administered activities, the activity uptake in each compartment (A_T, A_NL, and A_L) was estimated and multiplied by the absorbed dose per activity uptake obtained from the MC simulations, to give the actual dose received by each compartment. The PM overestimated D_L by 11.7% in all cases, due to the particles escaping from the lungs. D_T and D_NL by MC were largely affected by T/N, which is not considered by the PM owing to the exclusion of cross-fire at the tumour-normal liver boundary. This resulted in an overestimation of D_T by up to 8% and an underestimation of D_NL by as much as -78% by the PM. When D_NL,lim was estimated via the PM, the MC simulations showed significantly higher D_NL for cases with higher T/N and LS ≤ 10%. All D_L and D_T by MC were overestimated by the PM, thus D_L,lim was never exceeded. The PM leads to inaccurate dose estimations due to the exclusion of cross-fire irradiation, i.e. between the tumour and normal liver tissue. Caution should be taken for cases with higher TI and T/N, and lower LS, as they contribute to a major underestimation of D_NL. For D_L, a different correction factor for dose calculation may be used for improved accuracy.
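A hedged sketch of the classical partition model used as the comparison baseline, splitting the administered activity between lungs, tumour and normal liver by LS and the mass-weighted T/N ratio. The 49.67 Gy·kg/GBq dose factor is the value commonly used for 90Y microspheres, which we assume applies here; all masses and parameter values are illustrative.

```python
# Classical partition model (sketch, assumed parameter values throughout).
M_liver = 1.8        # kg, whole liver (assumed)
TI = 0.20            # tumour involvement fraction
TN = 5.0             # tumour-to-normal-liver uptake ratio
LS = 0.05            # lung shunt fraction
M_lung = 1.0         # kg (assumed)
D_NL_LIM = 70.0      # Gy, limiting normal-liver dose (from the abstract)
DOSE_FACTOR = 49.67  # Gy kg/GBq, common 90Y value (assumed applicable)

M_T = TI * M_liver
M_NL = (1.0 - TI) * M_liver

def doses(A_GBq):
    """Absorbed doses (Gy) in tumour, normal liver, lungs for activity A."""
    A_L = LS * A_GBq
    A_body = (1.0 - LS) * A_GBq
    # uptake splits in proportion to T/N weighted by compartment mass
    A_T = A_body * TN * M_T / (TN * M_T + M_NL)
    A_NL = A_body - A_T
    return (DOSE_FACTOR * A_T / M_T,
            DOSE_FACTOR * A_NL / M_NL,
            DOSE_FACTOR * A_L / M_lung)

# activity that drives the normal liver to its limiting dose
A = D_NL_LIM / doses(1.0)[1]
D_T, D_NL, D_L = doses(A)
print(f"A = {A:.2f} GBq -> D_T = {D_T:.0f} Gy, "
      f"D_NL = {D_NL:.0f} Gy, D_L = {D_L:.1f} Gy")
```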
NASA Astrophysics Data System (ADS)
Preston, L. A.
2017-12-01
Marine hydrokinetic (MHK) devices offer a clean, renewable alternative energy source for the future. Responsible utilization of MHK devices, however, requires that the effects of acoustic noise produced by these devices on marine life and marine-related human activities be well understood. Paracousti is a 3-D full waveform acoustic modeling suite that can accurately propagate MHK noise signals in the complex bathymetry found in the near-shore to open ocean environment and considers real properties of the seabed, water column, and air-surface interface. However, this is a deterministic simulation that assumes the environment and source are exactly known. In reality, environmental and source characteristics are often only known in a statistical sense. Thus, to fully characterize the expected noise levels within the marine environment, this uncertainty in environmental and source factors should be incorporated into the acoustic simulations. One method is to use Monte Carlo (MC) techniques where simulation results from a large number of deterministic solutions are aggregated to provide statistical properties of the output signal. However, MC methods can be computationally prohibitive since they can require tens of thousands or more simulations to build up an accurate representation of those statistical properties. An alternative method, using the technique of stochastic partial differential equations (SPDE), allows computation of the statistical properties of output signals at a small fraction of the computational cost of MC. We are developing a SPDE solver for the 3-D acoustic wave propagation problem called Paracousti-UQ to help regulators and operators assess the statistical properties of environmental noise produced by MHK devices. In this presentation, we present the SPDE method and compare statistical distributions of simulated acoustic signals in simple models to MC simulations to show the accuracy and efficiency of the SPDE method. Sandia National Laboratories is a multimission laboratory managed and operated by National Technology and Engineering Solutions of Sandia LLC, a wholly owned subsidiary of Honeywell International Inc. for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA0003525.
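The computational argument against brute-force MC can be made concrete: the standard error of an MC-estimated statistic falls only as 1/sqrt(N), so each tenfold tightening of the error bar costs a hundredfold more simulations. A toy illustration with an arbitrary synthetic quantity (the dB values are invented):

```python
import numpy as np

rng = np.random.default_rng(2)
true_mean, true_sd = 120.0, 6.0      # assumed "received level" statistics, dB
for n in (100, 10_000, 1_000_000):
    sample = rng.normal(true_mean, true_sd, n)
    se = sample.std(ddof=1) / np.sqrt(n)   # standard error shrinks as 1/sqrt(N)
    print(f"N = {n:>9,}: sample mean = {sample.mean():8.3f}, std. error = {se:.4f}")
```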
NASA Astrophysics Data System (ADS)
Tarasov, A. P.; Egorov, A. I.; Rogatkin, D. A.
2017-07-01
Using multidetector computed tomography, the thicknesses of the cranial bone squama and the soft tissues of the human head were assessed. MC simulation revealed that the source-detector separation distances of three oximeters are inappropriate, which can cause extracerebral contamination.
Mixing of Isotactic and Syndiotactic Polypropylenes in the Melt
DOE Office of Scientific and Technical Information (OSTI.GOV)
CLANCY,THOMAS C.; PUTZ,MATHIAS; WEINHOLD,JEFFREY D.
2000-07-14
The miscibility of polypropylene (PP) melts in which the chains differ only in stereochemical composition has been investigated by two different procedures. One approach used detailed local information from a Monte Carlo simulation of a single chain, and the other approach takes this information from a rotational isomeric state model devised decades ago, for another purpose. The first approach uses PRISM theory to deduce the intermolecular packing in the polymer blend, while the second approach uses a Monte Carlo simulation of a coarse-grained representation of independent chains, expressed on a high-coordination lattice. Both approaches find a positive energy change upon mixing isotactic PP (iPP) and syndiotactic polypropylene (sPP) chains in the melt. This conclusion is qualitatively consistent with observations published recently by Muelhaupt and coworkers. The energy change on mixing is smaller in the MC/PRISM approach than in the RIS/MC simulation, with the smaller energy change being in better agreement with experiment. The RIS/MC simulation finds no demixing for iPP and atactic polypropylene (aPP) in the melt, consistent with several experimental observations in the literature. The demixing of the iPP/sPP blend may arise from attractive interactions in the sPP melt that are disrupted when the sPP chains are diluted with aPP or iPP chains.
Mak, Chi H
2015-11-25
While single-stranded (ss) segments of DNAs and RNAs are ubiquitous in biology, details about their structures have only recently begun to emerge. To study ssDNA and RNAs, we have developed a new Monte Carlo (MC) simulation using a free energy model for nucleic acids that has the atomistic accuracy to capture fine molecular details of the sugar-phosphate backbone. Formulated on the basis of a first-principles calculation of the conformational entropy of the nucleic acid chain, this free energy model correctly reproduced both the long and short length-scale structural properties of ssDNA and RNAs in a rigorous comparison against recent data from fluorescence resonance energy transfer, small-angle X-ray scattering, force spectroscopy and fluorescence correlation transport measurements on sequences up to ~100 nucleotides long. With this new MC algorithm, we conducted a comprehensive investigation of the entropy landscape of small RNA stem-loop structures. From a simulated ensemble of ~10^6 equilibrium conformations, the entropy for the initiation of different-size RNA hairpin loops was computed and compared against thermodynamic measurements. Starting from seeded hairpin loops, constrained MC simulations were then used to estimate the entropic costs associated with propagation of the stem. The numerical results provide new direct molecular insights into thermodynamic measurements from macroscopic calorimetry and melting experiments.
McIDAS-eXplorer: A version of McIDAS for planetary applications
NASA Technical Reports Server (NTRS)
Limaye, Sanjay S.; Saunders, R. Stephen; Sromovsky, Lawrence A.; Martin, Michael
1994-01-01
McIDAS-eXplorer is a set of software tools developed for the analysis of planetary data published by the Planetary Data System on CD-ROMs. It is built upon McIDAS-X, an environment which has been in use for nearly two decades for earth weather satellite data applications in research and routine operations. The environment allows convenient access, navigation, analysis, display, and animation of planetary data by utilizing the full calibration data accompanying the planetary data. Support currently exists for Voyager images of the giant planets and their satellites; Magellan radar images (F-MIDRs and C-MIDRs), global map products (GxDRs), and altimetry data (ARCDRs); Galileo SSI images of the earth, moon, and Venus; Viking Mars images and MDIMs; as well as most earth-based telescopic images of solar system objects (FITS). The NAIF/JPL SPICE kernels are used for image navigation when available. For data without SPICE kernels (such as the bulk of the Voyager Jupiter and Saturn imagery and Pioneer Orbiter images of Venus), tools based on the NAIF toolkit allow the user to navigate the images interactively. Multiple navigation types can be attached to a given image (e.g., ring navigation and planet navigation in the same image). Tools are available to perform common image processing tasks such as digital filtering, cartographic mapping, map overlays, and data extraction. It is also possible to have different planetary radii for an object such as Venus, which requires a different radius for the surface and for the cloud level. A graphical user interface based on the Tcl/Tk scripting language is provided (UNIX only at present) for using the environment and also to provide on-line help. It is possible for end users to add applications of their own to the environment at any time.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Altsybeev, Igor
2016-01-22
In the present work, a Monte Carlo toy model with repulsing quark-gluon strings in hadron-hadron collisions is described. String repulsion creates transverse boosts for the string decay products, modifying observables. As an example, long-range correlations between the mean transverse momenta of particles in two observation windows are studied in MC toy simulations of heavy-ion collisions.
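The observable mentioned, the long-range correlation between event-mean transverse momenta in two windows, can be illustrated with a toy generator in which an event-wide transverse boost (standing in for the string repulsion effect) is shared by both windows. This is only a schematic stand-in, not the paper's string model; all distributions and parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(3)
n_events = 20_000
mean_pt_F, mean_pt_B = [], []
for _ in range(n_events):
    boost = rng.normal(0.0, 0.08)          # event-wide transverse boost, GeV/c
    nF, nB = rng.poisson(15, 2) + 1        # multiplicities in the two windows
    # the shared boost correlates the two otherwise independent windows
    mean_pt_F.append(np.mean(rng.exponential(0.4, nF)) + boost)
    mean_pt_B.append(np.mean(rng.exponential(0.4, nB)) + boost)

corr = np.corrcoef(mean_pt_F, mean_pt_B)[0, 1]
print(f"forward-backward <pT> correlation coefficient: {corr:.3f}")
```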
Vertical Temperature Simulation of Pegasus Runway, McMurdo Station, Antarctica
2015-01-01
Report approved for public release; distribution is unlimited. Prepared for the National Science Foundation, Division of Polar Programs, by the U.S. Army Engineer Research and Development Center, Cold Regions Research and Engineering Laboratory (ERDC/CRREL TR-15-2).
SU-E-T-535: Proton Dose Calculations in Homogeneous Media.
Chapman, J; Fontenot, J; Newhauser, W; Hogstrom, K
2012-06-01
To develop a pencil beam dose calculation algorithm for scanned proton beams that improves modeling of scatter events. Our pencil beam algorithm (PBA) was developed for calculating dose from monoenergetic, parallel proton beams in homogeneous media. Fermi-Eyges theory was implemented for pencil beam transport. Elastic and nonelastic scatter effects were each modeled as a Gaussian distribution, with root mean square (RMS) widths determined from theoretical calculations and a nonlinear fit to a Monte Carlo (MC) simulated 1 mm × 1 mm proton beam, respectively. The PBA was commissioned using MC simulations in a flat water phantom. The resulting PBA calculations were compared with results of other models reported in the literature on the basis of differences between PBA and MC calculations of 80-20% penumbral widths. Our model was further tested by comparing PBA and MC results for oblique beams (45° incidence) and surface irregularities (step heights of 1 and 4 cm) for energies of 50-250 MeV and field sizes of 4 cm × 4 cm and 10 cm × 10 cm. Agreement between PBA and MC distributions was quantified by computing the percentage of points within 2% dose difference or 1 mm distance to agreement. Our PBA improved agreement between calculated and simulated penumbral widths by an order of magnitude compared with previously reported values. For comparisons of oblique beams and surface irregularities, agreement between PBA and MC distributions was better than 99%. Our algorithm showed improved accuracy over other models reported in the literature in predicting the overall shape of the lateral profile through the Bragg peak. This improvement was achieved by incorporating nonelastic scatter events into our PBA. The increased modeling accuracy of our PBA, incorporated into a treatment planning system, may improve the reliability of treatment planning calculations for patient treatments. This research was supported by contract W81XWH-10-1-0005 awarded by The U.S. Army Research Acquisition Activity, 820 Chandler Street, Fort Detrick, MD 21702-5014. This report does not necessarily reflect the position or policy of the Government, and no official endorsement should be inferred. © 2012 American Association of Physicists in Medicine.
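The two-component lateral model described (a narrow elastic Gaussian plus a broad nonelastic "halo" Gaussian) and the 80-20% penumbra metric can be sketched as follows; the RMS widths and halo weight are placeholders, not the paper's fitted values.

```python
import numpy as np

def lateral_profile(x_mm, sigma_el=3.0, sigma_nuc=12.0, w=0.08):
    """Core + halo lateral dose: (1-w)*Gaussian(elastic) + w*Gaussian(nonelastic)."""
    g = lambda s: np.exp(-0.5 * (x_mm / s) ** 2) / (s * np.sqrt(2.0 * np.pi))
    return (1.0 - w) * g(sigma_el) + w * g(sigma_nuc)

x = np.linspace(-40.0, 40.0, 801)
d = lateral_profile(x)
d /= d.max()                      # normalize to the central-axis value

# 80%-20% penumbral width on the positive side of the normalized profile
right = x > 0
x80 = np.interp(0.8, d[right][::-1], x[right][::-1])
x20 = np.interp(0.2, d[right][::-1], x[right][::-1])
print(f"80-20% penumbral width: {x20 - x80:.2f} mm")
```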
DOE Office of Scientific and Technical Information (OSTI.GOV)
Randeniya, S; Mirkovic, D; Titt, U
2014-06-01
Purpose: In intensity modulated proton therapy (IMPT), energy dependent protons-per-monitor-unit (MU) calibration factors are important parameters that determine absolute dose values from energy deposition data obtained from Monte Carlo (MC) simulations. The purpose of this study was to assess the sensitivity of MC-computed absolute dose distributions to the protons/MU calibration factors in IMPT. Methods: A "verification plan" (i.e., treatment beams applied individually to a water phantom) of a head and neck patient plan was calculated using the MC technique. The patient plan had three beams: one posterior-anterior (PA) and two anterior oblique. The dose prescription was 66 Gy in 30 fractions. Of the total MUs, 58% was delivered in the PA beam, and 25% and 17% in the other two. Energy deposition data obtained from the MC simulation were converted to Gy using energy dependent protons/MU calibration factors obtained from two methods. The first method is based on experimental measurements and MC simulations. The second is based on hand calculations of how many ion pairs are produced per proton in the dose monitor and how many ion pairs equal 1 MU (the vendor-recommended method). Dose distributions obtained from method one were compared with those from method two. Results: An average difference of 8% in the protons/MU calibration factors between methods one and two translated into a 27% difference in absolute dose values for the PA beam; although the dose distributions preserved the shape of the 3D dose distribution qualitatively, they were different quantitatively. For the two oblique beams, no significant difference in absolute dose was observed. Conclusion: The results demonstrate that protons/MU calibration factors can have a significant impact on absolute dose values in IMPT depending on the fraction of MUs delivered. As the number of MUs increases, the effect of the calibration factors is amplified. In determining protons/MU calibration factors, the experimental method should be preferred for MC dose calculations. Research supported by National Cancer Institute grant P01CA021239.
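The conversion at issue is a plain scaling: MC codes tally dose per source proton, and absolute dose follows from the protons-per-MU calibration times the delivered MU. The values below are invented to show the mechanics of the sensitivity only; they are not the paper's numbers.

```python
# Absolute dose from an MC tally: D = (dose per source proton) x (protons/MU) x MU.
edep_per_proton = 4.0e-11     # Gy per simulated proton in a voxel (assumed)
mu_delivered = 120.0          # monitor units for the beam (assumed)

calibrations = {
    "measurement-based method": 9.0e8,   # protons/MU (assumed)
    "vendor hand calculation":  9.7e8,   # protons/MU, ~8% higher (assumed)
}
for label, ppmu in calibrations.items():
    dose = edep_per_proton * ppmu * mu_delivered
    print(f"{label}: D = {dose:.3f} Gy")
```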
Tao, Min; Xie, Ping; Chen, Jun; Qin, Boqiang; Zhang, Dawen; Niu, Yuan; Zhang, Meng; Wang, Qing; Wu, Laiyan
2012-01-01
Lake Taihu is the third largest freshwater lake in China and is suffering from serious cyanobacterial blooms with the associated drinking water contamination by microcystin (MC) for millions of citizens. So far, most studies on MCs have been limited to two small bays, while systematic research on the whole lake is lacking. To explain the variations in MC concentrations during cyanobacterial bloom, a large-scale survey at 30 sites across the lake was conducted monthly in 2008. The health risks of MC exposure were high, especially in the northern area. Both Microcystis abundance and MC cellular quotas presented positive correlations with MC concentration in the bloom seasons, suggesting that the toxic risks during Microcystis proliferations were affected by variations in both Microcystis density and MC production per Microcystis cell. Use of a powerful predictive modeling tool named generalized additive model (GAM) helped visualize significant effects of abiotic factors related to carbon fixation and proliferation of Microcystis (conductivity, dissolved inorganic carbon (DIC), water temperature and pH) on MC cellular quotas from recruitment period of Microcystis to the bloom seasons, suggesting the possible use of these factors, in addition to Microcystis abundance, as warning signs to predict toxic events in the future. The interesting relationship between macrophytes and MC cellular quotas of Microcystis (i.e., high MC cellular quotas in the presence of macrophytes) needs further investigation. PMID:22384128
NASA Astrophysics Data System (ADS)
Piecuch, Christopher G.; Landerer, Felix W.; Ponte, Rui M.
2018-05-01
Monthly ocean bottom pressure solutions from the Gravity Recovery and Climate Experiment (GRACE), derived using surface spherical cap mass concentration (MC) blocks and spherical harmonics (SH) basis functions, are compared to tide gauge (TG) monthly averaged sea level data over 2003-2015 to evaluate improved gravimetric data processing methods near the coast. MC solutions can explain ≳42% of the monthly variance in TG time series over broad shelf regions and in semi-enclosed marginal seas. MC solutions also generally explain ~5-32% more TG data variance than SH estimates. Applying a coastline resolution improvement algorithm in the GRACE data processing leads to ~31% more variance in TG records explained by the MC solution on average compared to not using this algorithm. Synthetic observations sampled from an ocean general circulation model exhibit similar patterns of correspondence between modeled TG and MC time series and differences between MC and SH time series in terms of their relationship with TG time series, suggesting that the observational results here are generally consistent with expectations from ocean dynamics. This work demonstrates the improved quality of recent MC solutions compared to earlier SH estimates over the coastal ocean, and suggests that the MC solutions could be a useful tool for understanding contemporary coastal sea level variability and change.
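A common estimator behind "percent of TG variance explained" is 1 - var(TG - MC)/var(TG); whether the study uses exactly this form is an assumption. Synthetic monthly series illustrate the computation:

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.arange(156)                    # monthly samples spanning 2003-2015
seasonal = np.sin(2.0 * np.pi * t / 12.0)
tg = seasonal + 0.3 * rng.normal(size=t.size)   # synthetic tide gauge series
mc = seasonal + 0.2 * rng.normal(size=t.size)   # synthetic mascon series

variance_explained = 1.0 - np.var(tg - mc) / np.var(tg)
print(f"variance explained: {100.0 * variance_explained:.1f}%")
```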
Dynamic multi-coil tailored excitation for transmit B1 correction at 7 Tesla.
Umesh Rudrapatna, S; Juchem, Christoph; Nixon, Terence W; de Graaf, Robin A
2016-07-01
Tailored excitation (TEx) based on interspersing multiple radio frequency pulses with linear gradient and higher-order shim pulses can be used to obtain a uniform flip angle in the presence of large radio frequency transmission (B1+) inhomogeneity. Here, an implementation of dynamic, multislice tailored excitation using the recently developed multi-coil nonlinear shim hardware (MC-DTEx) is reported. MC-DTEx was developed and tested both in a phantom and in vivo at 7 T, and its efficacy was quantitatively assessed. Predicted outcomes of MC-DTEx and DTEx based on spherical harmonic shims (SH-DTEx) were also compared. For a planned 30° flip angle, in a phantom, the standard deviation in excitation improved from 28% (regular excitation) to 12% with MC-DTEx. The SD in in vivo excitation improved from 22 to 12%. The improvements achieved with experimental MC-DTEx closely matched the theoretical predictions. Simulations further showed that MC-DTEx outperforms SH-DTEx in both scenarios. Successful implementation of multislice MC-DTEx is presented and is shown to be capable of homogenizing excitation over more than twofold B1+ variations. Its benefits over SH-DTEx are also demonstrated. A distinct advantage of MC hardware over SH shim hardware is the absence of significant eddy current effects, which allows for a straightforward, multislice implementation of MC-DTEx. Magn Reson Med 76:83-93, 2016. © 2015 Wiley Periodicals, Inc.
Furstoss, C; Reniers, B; Bertrand, M J; Poon, E; Carrier, J-F; Keller, B M; Pignol, J P; Beaulieu, L; Verhaegen, F
2009-05-01
A Monte Carlo (MC) study was carried out to evaluate the effects of interseed attenuation and tissue composition for two models of 125I low dose rate (LDR) brachytherapy seeds (Medi-Physics 6711, IBt InterSource) in a permanent breast implant. The effect of tissue composition was investigated because the breast region presents heterogeneities such as glandular and adipose tissue surrounded by air, lungs, and ribs. The absolute MC dose calculations were benchmarked by comparison with the absolute dose obtained from experimental results. Before modeling a clinical case of an implant in a heterogeneous breast, the effects of tissue composition and interseed attenuation were studied in homogeneous phantoms. To investigate the tissue composition effect, the dose along the transverse axis of each of the two seed models was calculated and compared in different materials. For each seed model, three seeds sharing the same transverse axis were simulated to evaluate the interseed effect in water as a function of distance from the seed. A clinical study of a permanent breast 125I implant for a single patient was carried out using four dose calculation techniques: (1) a TG-43 based calculation, (2) a full MC simulation with realistic tissues and seed models, (3) a MC simulation in water with modeled seeds, and (4) a MC simulation without modeling the seed geometry but with realistic tissues. In the latter, a phase space file corresponding to the particles emitted from the external surface of the seed is used at each seed location. The results were compared by calculating the clinical metrics V85, V100, and V200 relevant to this kind of treatment in the target. D90 and D50 were also determined to evaluate the differences in dose and to compare the results to the studies published for permanent prostate seed implants in the literature. The experimental results are in agreement with the MC absolute doses (within 5% for EBT Gafchromic film and within 7% for TLD-100). Important differences between the dose along the transverse axis of the seed in water and in adipose tissue are obtained (10% at 3.5 cm). The comparisons between the full MC and the TG-43 calculations show no significant differences for V85 and V100. For V200, an 8.4% difference is found, coming mainly from the tissue composition effect. Larger differences (about 10.5% for the model 6711 seed and about 13% for the InterSource125) are found for D90 and D50. These differences depend on the composition of the breast tissue modeled in the simulation. A variation in the percentage by mass of mammary gland and adipose tissue can cause important differences in the clinical dose metrics V200, D90, and D50. Even though it can be concluded that, clinically, the differences in V85, V100, and V200 are acceptable in comparison to the large variation in dose in the treated volume, this work demonstrates that the development of a MC treatment planning system for LDR brachytherapy will improve dose determination in the treated region and consequently the dose-outcome relationship, especially for skin toxicity.
Monte Carlo modeling of a conventional X-ray computed tomography scanner for gel dosimetry purposes.
Hayati, Homa; Mesbahi, Asghar; Nazarpoor, Mahmood
2016-01-01
Our purpose in the current study was to model an X-ray CT scanner with the Monte Carlo (MC) method for gel dosimetry. A conventional CT scanner with a single detector array was modeled using the MCNPX MC code. The MC-calculated photon fluence in the detector arrays was used for image reconstruction of a simple water phantom as well as of the polyacrylamide polymer gel (PAG) used for radiation therapy. Image reconstruction was performed with the filtered back-projection method with a Hann filter and spline interpolation. Using the MC results, we obtained the dose-response curve for images of the irradiated gel at different absorbed doses. A spatial resolution of about 2 mm was found for our simulated MC model. The MC-based CT images of the PAG gel showed a reliable increase in CT number with increasing absorbed dose for the studied gel. Also, our results showed that the current MC model of a CT scanner can be used for further studies of the parameters that influence the usability and reliability of results, such as the photon energy spectra and exposure techniques, in X-ray CT gel dosimetry.
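The reconstruction step named above, filtered back-projection with a Hann filter, can be sketched with scikit-image on a synthetic disk phantom. The paper's reconstruction is its own implementation, so treat this only as an illustration of the algorithm; the filter_name argument assumes a reasonably recent scikit-image version.

```python
import numpy as np
from skimage.transform import radon, iradon

# Synthetic disk phantom standing in for the water phantom
N = 128
y, x = np.mgrid[-1:1:N * 1j, -1:1:N * 1j]
phantom = (x**2 + y**2 < 0.6**2).astype(float)

theta = np.linspace(0.0, 180.0, 180, endpoint=False)
sinogram = radon(phantom, theta=theta)                      # forward projection
recon = iradon(sinogram, theta=theta, filter_name="hann")   # FBP, Hann filter
print("RMS reconstruction error:", np.sqrt(np.mean((recon - phantom) ** 2)))
```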
Thillaivanam, Saravanapriya; Amin, Arwa M; Gopalakrishnan, Sheila; Ibrahim, Baharudin
2016-10-01
Sore throats may be due to either viral or group A beta hemolytic streptococcus (GABHS) infections; but diagnosis of the etiology of a sore throat is difficult, often leading to unnecessary antibiotic prescriptions and consequent increases in bacterial resistance. Scoring symptoms using the McIsaac clinical decision rule can help physicians to diagnose and manage streptococcal infections leading to sore throat and have been recommended by the Ministry of Health, Malaysia. In this paper, we offer the first assessment of the effectiveness of the McIsaac rule in a clinical setting in Malaysia. This study is a retrospective review of 116 pediatric patients presenting with sore throat. Group A comprised patients before the implementation of the McIsaac rule and Group B comprised patients after the implementation. Unnecessary throat swab cultures were reduced by 40% (P = 0.003). Redundant antibiotic prescriptions were reduced by 26.5% (P = 0.003) and the overall use of antibiotics was reduced by 22.1% (P = 0.003). The pediatricians' compliance rate to McIsaac rule criteria was 45% before implementation of the McIsaac rule, but improved to 67.9% (P = 0.0005) after implementation. The McIsaac rule is an effective tool for the management of sore throat in children in Malaysia.
Should adhesive debonding be simulated for intra-radicular post stress analyses?
Caldas, Ricardo A; Bacchi, Atais; Barão, Valentim A R; Versluis, Antheunis
2018-06-23
To elucidate the influence of debonding on stress distribution and maximum stresses for intra-radicular restorations. Five intra-radicular restorations were analyzed by finite element analysis (FEA): MP = metallic cast post core; GP = glass fiber post core; PP = pre-fabricated metallic post core; RE = resin endocrown; CE = single-piece ceramic endocrown. Two cervical preparations were considered: no ferrule (f0) and 2 mm ferrule (f1). The simulation was conducted in three steps: (1) intact bonds at all contacts; (2) bond failure between crown and tooth; (3) bond failure among tooth, post and crown interfaces. Contact friction and separation between interfaces were modeled where bond failure occurred. Mohr-Coulomb stress ratios (σMC ratio) and fatigue safety factors (SF) for the dentin structure were compared with published strength values, fatigue life, and fracture patterns of teeth with intra-radicular restorations. The σMC ratio showed no differences among models in the first step. The second step increased the σMC ratio at the ferrule compared to step 1. In the third step, the σMC ratio and SF for the f0 models were highly influenced by the post material. The CE and RE models had the highest σMC ratios and the lowest SF. MP had the lowest σMC ratio and the highest SF. The f1 models showed no relevant differences among them in the third step. FEA most closely predicted the failure performance of intra-radicular posts when frictional contact was modeled. Results of analyses in which all interfaces are assumed to be perfectly bonded should be considered with caution. Copyright © 2018 The Academy of Dental Materials. Published by Elsevier Inc. All rights reserved.
Featured Image: Mixing Chemicals in Stars
NASA Astrophysics Data System (ADS)
Kohler, Susanna
2017-10-01
How do stars mix chemicals in their interiors, leading to the abundances we measure at their surfaces? Two scientists from the Planetary Science Institute in Arizona, Tamara Rogers (Newcastle University, UK) and Jim McElwaine (Durham University, UK), have investigated the role that internal gravity waves have in chemical mixing in stellar interiors. Internal gravity waves (not to be confused with the currently topical gravitational waves) are waves that oscillate within a fluid that has a density gradient. Rogers and McElwaine used simulations to explore how these waves can cause particles in a star's interior to move around, gradually mixing the different chemical elements. Snapshots from four different times in their simulation can be seen below, with the white dots marking tracer particles and the colors indicating vorticity. You can see how the particles move in response to wave motion after the first panel. For more information, check out the paper below!

Citation
T. M. Rogers and J. N. McElwaine 2017 ApJL 848 L1. doi:10.3847/2041-8213/aa8d13
The High performance of nanocrystalline CVD diamond coated hip joints in wear simulator test.
Maru, M M; Amaral, M; Rodrigues, S P; Santos, R; Gouvea, C P; Archanjo, B S; Trommer, R M; Oliveira, F J; Silva, R F; Achete, C A
2015-09-01
The superior biotribological performance of nanocrystalline diamond (NCD) coatings grown by chemical vapor deposition (CVD), in the form of high wear resistance under physiological liquid lubrication, was already demonstrated in ball-on-plate experiments. However, tests with a close-to-real approach were missing, and these constitute the aim of the present work. Hip joint wear simulator tests were performed with cups and heads made of silicon nitride coated with NCD of ~10 μm in thickness. Five million testing cycles (Mc) were run, which represent nearly five years of hip joint implant activity in a patient. For the wear analysis, gravimetry, profilometry, scanning electron microscopy and Raman spectroscopy techniques were used. After 0.5 Mc of wear testing, truncation of the protruded regions of the NCD film occurred as a result of a fine-scale abrasive wear mechanism, evolving to extensive plateau regions and a highly polished surface condition (Ra < 10 nm). This surface modification took place without any catastrophic features such as cracking, grain pullouts or delamination of the coatings. A steady-state volumetric wear rate of 0.02 mm^3/Mc, equivalent to a linear wear of 0.27 μm/Mc, compares favorably with the best performance reported in the literature for the fourth-generation alumina ceramic (0.05 mm^3/Mc). Also, squeaking, a quite common phenomenon in hard-on-hard systems, was absent in the present all-NCD system. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Bourasseau, Emeric; Dubois, Vincent; Desbiens, Nicolas; Maillet, Jean-Bernard
2007-06-01
The simultaneous use of the Reaction Ensemble Monte Carlo (ReMC) method and the Adaptive Erpenbeck EOS (AE-EOS) method allows us to calculate directly the thermodynamic and chemical equilibrium of a mixture on the Hugoniot curve. The ReMC method reaches the chemical equilibrium of the detonation products, and the AE-EOS method constrains the system to satisfy the Hugoniot relation. Once the Crussard curve of the detonation products has been established, CJ state properties may be calculated. An additional NPT simulation is performed at the CJ conditions in order to compute derivative thermodynamic quantities such as Cp, Cv, the Grüneisen gamma, the sound velocity, and the compressibility factor. Several explosives have been studied, among which PETN, nitromethane, tetranitromethane, and hexanitroethane. In these first simulations, solid carbon, when present, is treated using an EOS.
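For reference, the Hugoniot (Rankine-Hugoniot energy) relation that the AE-EOS procedure enforces connects the shocked state (P, v, e) to the initial state (P_0, v_0, e_0). This is the standard form; the paper's exact notation is assumed:

```latex
e - e_0 = \frac{1}{2}\,(P + P_0)\,(v_0 - v)
```

where e is the specific internal energy and v = 1/ρ the specific volume; the Crussard curve is the locus of detonation-product states satisfying this relation.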
A simulation model of IT risk on program trading
NASA Astrophysics Data System (ADS)
Xia, Bingying; Jiang, Wenbao; Luo, Guangxuan
2015-12-01
The biggest difficulty in measuring IT risk for program trading lies in the scarcity of loss data. In view of this situation, the current approach among scholars is to collect reports of IT incidents at home and abroad from court records, the internet, and other public media, and to base the quantitative analysis of IT risk losses on the resulting database. However, an IT risk loss database established by this method can only fuzzily reflect the real situation and cannot explain it at a fundamental level. In this paper, building on the concept and steps of MC simulation, we use a computer simulation method: the MC simulation method is applied within the "program trading simulation system" developed by our team to simulate real program trading, and IT risk loss data are obtained through IT failure experiments; at the end of the article, the validity of the experimental data is verified. In this way we better overcome the deficiencies of the traditional research method, address the lack of IT risk data in quantitative research, and provide researchers with a template for the ideas and process of simulation-based study.
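To make the simulation idea concrete, a frequency-severity Monte Carlo sketch for IT failure losses is given below. The Poisson/lognormal distributions and every parameter name are illustrative assumptions standing in for the failure experiments run inside the paper's "program trading simulation system", not the authors' actual model.

```python
import numpy as np

def simulate_it_losses(n_periods, failure_rate, log_mu, log_sigma, rng=None):
    """Frequency-severity Monte Carlo for program-trading IT losses (sketch).

    Failures per trading period are drawn as Poisson(failure_rate) and each
    individual loss as a lognormal; both distributions are assumptions made
    for illustration only.
    """
    rng = rng or np.random.default_rng()
    failures = rng.poisson(failure_rate, n_periods)
    return np.array([rng.lognormal(log_mu, log_sigma, k).sum()
                     for k in failures])

# Example: 10 000 simulated periods, then a tail-risk summary.
losses = simulate_it_losses(10_000, failure_rate=0.3, log_mu=10.0, log_sigma=1.0)
print(np.percentile(losses, 99))   # 99th-percentile loss, a VaR-style figure
```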
Traffic accident simulation : final report.
DOT National Transportation Integrated Search
1992-06-01
The purpose of this research was to determine if HVOSM (Highway Vehicle Object Simulation Model) could be used to model a vehicle with a modern front (or rear) suspension system such as a McPherson strut and have the results of the dynamic model be v...
NASA Astrophysics Data System (ADS)
Lin, Y.; Wukitch, S. J.; Edlund, E.; Ennever, P.; Hubbard, A. E.; Porkolab, M.; Rice, J.; Wright, J.
2017-10-01
In recent three-ion-species (majority D and H plus a trace level of 3He) ICRF heating experiments on Alcator C-Mod, double mode conversion (MC) on both sides of the 3He cyclotron resonance has been observed using the phase contrast imaging (PCI) system. The MC locations are used to estimate the species concentrations in the plasma. Simulation using TORIC shows that with the 3He level <1%, most of the RF power is absorbed by the 3He ions, and the process can generate energetic 3He ions. In MC flow drive experiments in D(3He) plasma at 8 T, MC waves were also monitored by PCI. The MC ion cyclotron wave (ICW) amplitude and wavenumber kR have been found to correlate with the flow drive force. The MC efficiency, the wavenumber kR of the MC ICW, and their dependence on plasma parameters such as Te0 have been studied. Based on the experimental observations and a numerical study of the dispersion solutions, a hypothesis for the flow drive mechanism has been proposed.
Siebers, Jeffrey V
2008-04-04
Monte Carlo (MC) is rarely used for IMRT plan optimization outside of research centres due to the extensive computational resources or long computation times required to complete the process. Time can be reduced by degrading the statistical precision of the MC dose calculation used within the optimization loop. However, this eventually introduces optimization convergence errors (OCEs). This study determines the statistical noise levels that can be tolerated during MC-IMRT optimization under the condition that the optimized plan has OCEs <100 cGy (1.5% of the prescription dose) for MC-optimized IMRT treatment plans. Seven-field prostate IMRT treatment plans for 10 prostate patients are used in this study. Pre-optimization is performed for deliverable beams with a pencil-beam (PB) dose algorithm. Further deliverable-based optimization proceeds using: (1) MC-based optimization, where dose is recomputed with MC after each intensity update, or (2) a once-corrected (OC) MC-hybrid optimization, where an MC dose computation defines beam-by-beam dose correction matrices that are used during a PB-based optimization. Optimizations are performed with nominal per-beam MC statistical precisions of 2, 5, 8, 10, 15, and 20%. Following optimizer convergence, beams are recomputed with MC using 2% per-beam nominal statistical precision, and the 2 PTV and 10 OAR dose indices used in the optimization objective function are tallied. For both the MC-optimization and OC-optimization methods, statistical equivalence tests found that OCEs are less than 1.5% of the prescription dose for plans optimized with nominal statistical uncertainties of up to 10% per beam. The achieved statistical uncertainty in the patient for the 10% per-beam simulations from the combination of the 7 beams is ~3% with respect to maximum dose for voxels with D>0.5D(max). The MC dose computation time for the OC-optimization is only 6.2 minutes on a single 3 GHz processor, with results clinically equivalent to high-precision MC computations.
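For readers unfamiliar with the once-corrected hybrid scheme, a minimal sketch of the idea follows. All array shapes, the quadratic objective, and the names pb_dose and mc_dose are assumptions made for illustration; this is not the study's actual clinical optimizer.

```python
import numpy as np

def oc_hybrid_optimize(pb_dose, mc_dose, weights, target, n_iter=100, lr=1e-3):
    """Once-corrected MC hybrid optimization (sketch).

    pb_dose, mc_dose : (n_beams, n_voxels, n_beamlets) dose-deposition
        matrices from the pencil-beam and Monte Carlo engines.
    weights          : (n_beams, n_beamlets) initial beamlet intensities.
    target           : (n_voxels,) prescribed dose.
    """
    # Beam-by-beam correction matrices: ratio of MC to PB dose, evaluated
    # once for the starting intensities (hence "once-corrected").
    eps = 1e-12
    correction = mc_dose / (pb_dose + eps)

    for _ in range(n_iter):
        # Corrected PB dose plays the role of MC dose inside the loop.
        dose = np.einsum('bvk,bvk,bk->v', pb_dose, correction, weights)
        grad_dose = dose - target                       # d(objective)/d(dose)
        # Chain rule back to the beamlet weights.
        grad_w = np.einsum('bvk,bvk,v->bk', pb_dose, correction, grad_dose)
        weights = np.maximum(weights - lr * grad_w, 0.0)  # intensities >= 0
    return weights
```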
NASA Astrophysics Data System (ADS)
Laura, Jason; Skinner, James A.; Hunter, Marc A.
2017-08-01
In this paper we present the Large Crater Clustering (LCC) tool set, an ArcGIS plugin that supports the quantitative approximation of a primary impact location from user-identified locations of possible secondary impact craters or the long axes of clustered secondary craters. The identification of primary impact craters directly supports planetary geologic mapping and topical science studies where the chronostratigraphic age of some geologic units may be known, but more distant features have questionable geologic ages. Previous works (e.g., McEwen et al., 2005; Dundas and McEwen, 2007) have shown that the location of a primary impact can be estimated from its secondary impact craters. This work adapts those methods into a statistically robust tool set. We describe the four individual tools within the LCC tool set, which support: (1) processing individually digitized point observations (craters); (2) estimating the directional distribution of a clustered set of craters and back-projecting the potential flight paths (crater clusters or linearly approximated catenae or lineaments); (3) intersecting the projected paths; and (4) intersecting the back-projected trajectories to approximate the location of potential source primary craters. We present two case studies using secondary impact features mapped in two regions of Mars. We demonstrate that the tool is able to quantitatively identify primary impacts and supports improved qualitative interpretation of potential secondary crater flight trajectories.
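The geometric core of steps (3) and (4), finding the point that best explains a set of back-projected paths, can be sketched as a least-squares line intersection. The planar, straight-line treatment below is an illustrative assumption (reasonable over small regions of a planetary surface); the LCC tool set itself operates within ArcGIS and is statistically more elaborate.

```python
import numpy as np

def least_squares_intersection(points, directions):
    """Estimate the common source of several back-projected trajectories.

    points     : (n, 2) a point on each trajectory (e.g. a cluster centroid).
    directions : (n, 2) vectors along the back-projected flight paths.
    Returns the location minimizing the summed squared distance to all lines.
    """
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for p, d in zip(np.asarray(points, float), np.asarray(directions, float)):
        d = d / np.linalg.norm(d)
        proj = np.eye(2) - np.outer(d, d)   # projector onto the line's normal
        A += proj
        b += proj @ p
    return np.linalg.solve(A, b)            # needs >= 2 non-parallel lines
```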
Stochastic dynamics for reinfection by transmitted diseases
NASA Astrophysics Data System (ADS)
Barros, Alessandro S.; Pinho, Suani T. R.
2017-06-01
The use of stochastic models to study the dynamics of infectious diseases is an important tool for understanding the epidemiological process. For several directly transmitted diseases, reinfection is a relevant process; it can be expressed by endogenous reactivation of the pathogen or by exogenous reinfection due to direct contact with an infected individual (with a reinfection rate σβ smaller than the infection rate β). In this paper, we examine the stochastic susceptible, infected, recovered, infected (SIRI) model, simulating endogenous reactivation as a spontaneous reaction and exogenous reinfection as a catalytic reaction. Analyzing the mean-field approximations at the level of one site and of pairs of sites, together with Monte Carlo (MC) simulations, for the particular case of exogenous reinfection we obtained continuous phase transitions involving endemic, epidemic, and no-transmission phases for the simple approach; judged against the MC results, the pair approximation describes better the phase transition from the endemic phase (susceptible, infected, susceptible (SIS)-like model) to the epidemic phase (susceptible, infected, and removed or recovered (SIR)-like model); reinfection increases the peaks of the outbreaks until the system reaches the endemic phase. For the particular case of endogenous reactivation, the pair approach leads to a continuous phase transition from the endemic phase (SIS-like model) to the no-transmission phase. Finally, there is no phase transition when both effects are taken into account. We hope the results of this study can be generalized to the susceptible, exposed, infected, and removed or recovered (SEIRIE) model, in which the exposed state (infected but not infectious) describes more realistically transmitted diseases such as tuberculosis. In future work, we also intend to investigate the effect of network topology on phase transitions when the SIRI model describes both transmitted diseases (σ < 1) and social contagions (σ > 1).
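For readers who want to experiment with the model, a minimal well-mixed stochastic SIRI simulation via the Gillespie algorithm is sketched below. The 1/N contact normalization and the reactivation rate name rho are assumptions of this sketch; the paper itself works on a lattice with site and pair approximations.

```python
import numpy as np

def gillespie_siri(S, I, R, beta, gamma, sigma, rho, t_max, rng=None):
    """Well-mixed stochastic SIRI model (sketch).

    Reactions: infection (beta), recovery (gamma), exogenous reinfection
    (sigma*beta, contact-driven) and endogenous reactivation (rho,
    spontaneous). Returns the event-by-event trajectory (t, S, I, R).
    """
    rng = rng or np.random.default_rng()
    N, t, traj = S + I + R, 0.0, []
    while t < t_max and I > 0:
        rates = np.array([beta * S * I / N,          # S -> I
                          gamma * I,                 # I -> R
                          sigma * beta * R * I / N,  # R -> I (exogenous)
                          rho * R])                  # R -> I (endogenous)
        total = rates.sum()
        if total == 0.0:
            break
        t += rng.exponential(1.0 / total)            # time to next event
        event = rng.choice(4, p=rates / total)
        if event == 0:
            S, I = S - 1, I + 1
        elif event == 1:
            I, R = I - 1, R + 1
        else:
            R, I = R - 1, I + 1
        traj.append((t, S, I, R))
    return traj
```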
Proton-induced x-ray fluorescence CT imaging
Bazalova-Carter, Magdalena; Ahmad, Moiz; Matsuura, Taeko; Takao, Seishin; Matsuo, Yuto; Fahrig, Rebecca; Shirato, Hiroki; Umegaki, Kikuo; Xing, Lei
2015-01-01
Purpose: To demonstrate the feasibility of proton-induced x-ray fluorescence CT (pXFCT) imaging of gold in a small-animal-sized object by means of experiments and Monte Carlo (MC) simulations. Methods: First, proton-induced gold x-ray fluorescence (pXRF) was measured as a function of gold concentration. Vials of 2.2 cm in diameter filled with 0%–5% Au solutions were irradiated with a 220 MeV proton beam, and the x-ray fluorescence induced by the interaction of protons with Au was detected with a 3 × 3 mm² CdTe detector placed at 90° with respect to the incident proton beam at a distance of 45 cm from the vials. Second, a 7-cm diameter water phantom containing three 2.2-cm diameter vials with 3%–5% Au solutions was imaged with a 7-mm FWHM 220 MeV proton beam in a first-generation CT scanning geometry. X-rays scattered perpendicular to the incident proton beam were acquired with the CdTe detector placed at 45 cm from the phantom, which was positioned on a translation/rotation stage. Twenty-one translational steps spaced by 3 mm at each of 36 projection angles spaced by 10° were acquired, and pXFCT images of the phantom were reconstructed with filtered back projection. A simplified geometry of the experimental data acquisition setup was modeled with the MC TOPAS code, and the simulation results were compared to the experimental data. Results: A linear relationship between gold pXRF and gold concentration was observed in both the experimental and MC simulation data (R2 > 0.99). All Au vials were apparent in the experimental and simulated pXFCT images. Specifically, the 3% Au vial was detectable in the experimental [contrast-to-noise ratio (CNR) = 5.8] and simulated (CNR = 11.5) pXFCT images. Due to fluorescence x-ray attenuation in the higher-concentration vials, the 4% and 5% Au contrasts were underestimated by 10% and 15%, respectively, in both the experimental and simulated pXFCT images. Conclusions: Proton-induced x-ray fluorescence CT imaging of 3%–5% gold solutions in a small-animal-sized water phantom has been demonstrated for the first time by means of experiments and MC simulations. PMID:25652502
Electrolytes in a nanometer slab-confinement: Ion-specific structure and solvation forces
NASA Astrophysics Data System (ADS)
Kalcher, Immanuel; Schulz, Julius C. F.; Dzubiella, Joachim
2010-10-01
We study the liquid structure and solvation forces of dense monovalent electrolytes (LiCl, NaCl, CsCl, and NaI) in a nanometer slab-confinement by explicit-water molecular dynamics (MD) simulations, implicit-water Monte Carlo (MC) simulations, and modified Poisson-Boltzmann (PB) theories. In order to coarse-grain consistently and to account for specific hydration effects in the implicit methods, realistic ion-ion and ion-surface pair potentials have been derived from infinite-dilution MD simulations. The electrolyte structure calculated from the MC simulations is in good agreement with the corresponding MD simulations, thereby validating the coarse-graining approach. The agreement improves if a realistic, MD-derived dielectric constant is employed, which partially corrects for (water-mediated) many-body effects. Further analysis of the ionic structure and solvation pressure demonstrates that nonlocal extensions to PB (NPB) perform well for a wide parameter range when compared to the MC simulations, whereas all local extensions mostly fail. A Barker-Henderson mapping of the ions onto a charged, asymmetric, and nonadditive binary hard-sphere mixture shows that the strength of the structural correlations is strongly related to the magnitude and sign of the salt-specific nonadditivity. Furthermore, a grand canonical NPB analysis shows that the Donnan effect is dominated by steric correlations, whereas solvation forces and overcharging effects are mainly governed by ion-surface interactions. However, steric corrections to the solvation forces are strongly repulsive for high concentrations and low surface charges, while overcharging can also be triggered by steric interactions in strongly correlated systems. Generally, we find that ion-surface and ion-ion correlations are strongly coupled and that coarse-grained methods should include both, the latter nonlocally and nonadditively (as given by our specific ionic diameters), when studying electrolytes in highly inhomogeneous situations.
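To make the implicit-water MC step concrete, here is a minimal Metropolis sweep for ions in a slab. The callables pair_u and wall_u stand in for the MD-derived ion-ion and ion-surface potentials described above; the scalar in-plane box, hard z-confinement, and single-particle moves are simplifying assumptions of this sketch, and proper long-range electrostatics handling is omitted.

```python
import numpy as np

def metropolis_sweep(pos, box, slab_z, pair_u, wall_u, beta, step, rng):
    """One Metropolis sweep for ions confined in a slab (sketch).

    pos is an (n, 3) array; periodic boundaries apply in x and y (box is
    the in-plane box length), with confinement 0 < z < slab_z.
    """
    n = len(pos)
    for i in rng.permutation(n):
        trial = pos[i] + step * rng.uniform(-1.0, 1.0, 3)
        trial[:2] %= box                      # periodic in-plane
        if not (0.0 < trial[2] < slab_z):
            continue                          # reject moves outside the slab

        def energy(ri):
            dr = pos - ri
            dr[:, :2] -= box * np.round(dr[:, :2] / box)   # minimum image
            r = np.delete(np.linalg.norm(dr, axis=1), i)   # no self-term
            return pair_u(r).sum() + wall_u(ri[2])

        dE = energy(trial) - energy(pos[i])
        if dE <= 0.0 or rng.random() < np.exp(-beta * dE):
            pos[i] = trial                    # accept the move
    return pos
```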
On the Monte Carlo simulation of electron transport in the sub-1 keV energy range.
Thomson, Rowan M; Kawrakow, Iwan
2011-08-01
The validity of classical Monte Carlo (MC) simulations of electron and positron transport at sub-1 keV energies is investigated in the context of quantum theory. Quantum theory dictates that uncertainties on the position and energy-momentum four-vectors of radiation quanta obey Heisenberg's uncertainty relation; however, these uncertainties are neglected in classical MC simulations of radiation transport, in which position and momentum are known precisely. Using the quantum uncertainty relation and the electron mean free path, the magnitudes of the uncertainties on electron position and momentum are calculated for different kinetic energies, and a validity bound on the classical simulation of electron transport is derived. In order to satisfy the Heisenberg uncertainty principle, uncertainties of 5% must be assigned to position and momentum for 1 keV electrons in water; at 100 eV, these uncertainties are 17 to 20% and are even larger at lower energies. In gaseous media such as air, these uncertainties are much smaller (less than 1% for electrons with energy 20 eV or greater). The classical Monte Carlo transport treatment is questionable for sub-1 keV electrons in condensed water, as the uncertainties on position and momentum must be large (relative to the electron momentum and mean free path) to satisfy the quantum uncertainty principle. Simulations which do not account for these uncertainties are not faithful representations of the physical processes, calling into question the results of MC track structure codes simulating sub-1 keV electron transport. Further, the large difference in the scale at which quantum effects are important in gaseous and condensed media suggests that track structure measurements in gases are not necessarily representative of track structure in condensed materials on a micrometer or a nanometer scale.
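The bound can be reproduced in a few lines. Writing the position uncertainty as a fraction of the mean free path λ and the momentum uncertainty as a fraction of the momentum p, the smallest equal relative uncertainty compatible with Δx·Δp ≥ ħ/2 is sqrt(ħ/(2λp)). The ~1 nm mean free path used below is an illustrative assumption, not the paper's tabulated value; it happens to reproduce the quoted ~5% at 1 keV.

```python
import numpy as np

HBAR = 1.054571817e-34      # J s
ME   = 9.1093837015e-31     # kg
EV   = 1.602176634e-19      # J

def min_relative_uncertainty(kinetic_energy_eV, mean_free_path_m):
    """Smallest equal relative uncertainty on position (in units of the
    mean free path) and momentum compatible with dx * dp >= hbar / 2.

    The non-relativistic momentum is adequate below a few keV.
    """
    p = np.sqrt(2.0 * ME * kinetic_energy_eV * EV)
    return np.sqrt(HBAR / (2.0 * mean_free_path_m * p))

# Illustrative only: an assumed ~1 nm mean free path for 1 keV electrons
# in water gives ~5-6%, in line with the abstract's 5% figure.
print(min_relative_uncertainty(1000.0, 1e-9))
```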
NASA Astrophysics Data System (ADS)
Lu, D.; Ricciuto, D. M.; Evans, K. J.
2017-12-01
Data-worth analysis plays an essential role in improving the understanding of the subsurface system, in developing and refining subsurface models, and in supporting rational water resources management. However, data-worth analysis is computationally expensive, as it requires quantifying parameter uncertainty, prediction uncertainty, and both current and potential data uncertainties. Assessment of these uncertainties in large-scale stochastic subsurface simulations using standard Monte Carlo (MC) sampling or advanced surrogate modeling is extremely computationally intensive, and sometimes even infeasible. In this work, we propose an efficient Bayesian data-worth analysis using a multilevel Monte Carlo (MLMC) method. Compared to standard MC, which requires a significantly large number of high-fidelity model executions to achieve a prescribed accuracy in estimating expectations, MLMC can substantially reduce the computational cost through the use of multifidelity approximations. As data-worth analysis involves a great many expectation estimations, the resulting cost savings can be substantial. While the proposed MLMC-based data-worth analysis is broadly applicable, we apply it to a highly heterogeneous oil reservoir simulation to select the optimal candidate data set, the one yielding the largest uncertainty reduction in predicting mass flow rates at four production wells. The choices made by the MLMC estimation are validated by the actual measurements of the potential data and are consistent with the estimation obtained from standard MC. Compared to standard MC, however, MLMC greatly reduces the computational cost of the uncertainty-reduction estimation, saving up to 600 days of computation on a single processor.
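The core MLMC idea, telescoping the expectation over a hierarchy of model fidelities, fits in a short sketch. The interface below (a user-supplied sample_pair returning correlated fine/coarse evaluations) is an assumption made for illustration, not the authors' implementation.

```python
import numpy as np

def mlmc_expectation(sample_pair, n_samples, rng=None):
    """Multilevel Monte Carlo estimator of E[P] (sketch).

    sample_pair(l, rng) must return a correlated pair (P_l, P_{l-1})
    computed from the same random input, with P_{-1} defined as 0.
    The telescoping identity E[P_L] = sum_{l=0..L} E[P_l - P_{l-1}]
    lets many cheap coarse-model samples absorb most of the variance,
    so only a few expensive fine-level corrections are needed.
    """
    rng = rng or np.random.default_rng()
    total = 0.0
    for level, n in enumerate(n_samples):      # n_samples decreases with level
        corrections = [fine - coarse
                       for fine, coarse in (sample_pair(level, rng)
                                            for _ in range(n))]
        total += float(np.mean(corrections))
    return total
```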
NASA Astrophysics Data System (ADS)
Marek, Repka
2015-01-01
The original McEliece PKC proposal is interesting thanks to its resistance against all known attacks, even those using quantum cryptanalysis, in an IND-CCA2 secure conversion. Here we present a generic implementation of the original McEliece PKC proposal, which provides test vectors for all important intermediate results and in which a measurement tool for side-channel analysis is employed. To the best of our knowledge, this is the first such implementation. This calculator is valuable for implementation optimization, for further investigations of the properties of McEliece/Niederreiter-like PKCs, and also for teaching. Thanks to it, one can, for example, examine the side-channel vulnerability of a certain implementation, or determine and test particular parameters of the cryptosystem in order to make them appropriate for an efficient hardware implementation. This implementation is available [1] in executable binary format and as a static C++ library, as well as in the form of source code, for Linux and Windows operating systems.
2017-12-08
Goddard's Ritsko Wins 2011 SAVE Award The winner of the 2011 SAVE Award is Matthew Ritsko, a Goddard financial manager. His tool lending library would track and enable sharing of expensive space-flight tools and hardware after projects no longer need them. This set of images represents the types of tools used at NASA. To read more go to: www.nasa.gov/topics/people/features/ritsko-save.html Exploration Systems Project Manager Mike Weiss speaks about a Hubble Servicing Mission hand tool, developed at Goddard. Credit: NASA/GSFC/Debbie McCallum
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nusrat, H; Pang, G; Sarfehnia, A
Purpose: This work seeks to develop a beam quality meter based on multiple, differently doped plastic scintillators whose responses are intrinsically beam-quality dependent. Plastic scintillators spontaneously emit visible light upon irradiation; the amount of light produced depends on the stopping power (closely related to LET) according to Birks' law. Doping the plastic scintillators can be used to tune their sensitivity to specific LET ranges. Methods: GEANT4.10.1 Monte Carlo (MC) was used to evaluate the response of various scintillator-dopant combinations. The MC radiation transport and scintillator light response were validated against previously published literature. Current work involves evaluating the detector response experimentally; to that end, a detector prototype with an interchangeable scintillator housing was constructed. The measurement set-up guides the light emitted by the scintillator to a photomultiplier tube via a glass taper junction coupled to an optical fiber. The resulting signal is measured by an electrometer and normalized to the dose readout from a diode. Measurements have been done using clinical electron and orthovoltage beams. The MC response (simulated scintillator light normalized to the dose scored inside the scintillating volume) was evaluated for four different LET radiations for an undoped and a 1% Pb-doped scintillator (σ=0.85%). Simulated incident electron energies included 0.05, 0.1, 0.2, 6, 12, and 18 MeV; these energies correspond to stopping power (related to LET) values ranging from 1.824 to 11.09 MeV cm²/g (SCOL from NIST-ESTAR). Results: Initial MC results show a distinct divergence in scintillator response as LET increases. The response of the undoped plastic scintillator indicated a 35.0% increase in signal when going from 18 MeV (low LET) to 0.05 MeV (high LET), while the 1% Pb-doped scintillator indicated a 100.9% increase. Conclusion: After validating MC against measurement, simulations will be used to test various concentrations (2%, 4%, 6%) of different high-Z material dopants (W, Mo) to optimize the scintillator types for the beam quality meter. NSERC Discovery Grant RGPIN-435608.
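Birks' law, which underlies the LET sensitivity exploited here, is simple enough to evaluate directly. In the sketch below the scintillation efficiency S and the Birks constant kB are illustrative placeholders, not fitted values for the abstract's detector.

```python
def birks_light_yield(dEdx, S=1.0, kB=0.01):
    """Scintillation light per unit path length from Birks' law (sketch):

        dL/dx = S * (dE/dx) / (1 + kB * dE/dx)

    dEdx is the mass stopping power in MeV cm^2/g, so kB carries units of
    g/(MeV cm^2); the numerical values here are assumptions for
    illustration only.
    """
    return S * dEdx / (1.0 + kB * dEdx)

# Relative response between the abstract's low- and high-LET extremes
# (stopping powers 1.824 and 11.09 MeV cm^2/g):
low, high = birks_light_yield(1.824), birks_light_yield(11.09)
print(high / low)   # quenching makes light sub-linear in stopping power
```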
Gu, Yanqing; Wang, Qing; Cui, Weiding; Fan, Weimin
2012-01-01
Background: Recent studies have shown that the acetabular component frequently becomes deformed during press-fit insertion. The aim of this study was to explore the deformation of the Durom cup after implantation and to clarify the impact of deformation on the wear and ion release of Durom large-head metal-on-metal (MOM) total hips in simulators. Methods: Six Durom cups impacted into reamed acetabula of fresh cadavers were used as the experimental group, and another 6 size-paired intact Durom cups constituted the control group. All 12 Durom MOM total hips were put through a 3 million cycle (MC) wear test in simulators. Results: The 6 cups in the experimental group were all deformed, with a mean deformation of 41.78±8.86 µm. The average volumetric wear rates in the experimental group and in the control group in the first million cycles were 6.65±0.29 mm³/MC and 0.89±0.04 mm³/MC (t = 48.43, p = 0.000). The ion levels of Cr and Co in the experimental group were also higher than those in the control group before 2.0 MC. However, there was no difference in the ion levels between 2.0 and 3.0 MC. Conclusions: This finding implies that the non-modular acetabular component of the Durom total hip prosthesis is likely to become deformed during press-fit insertion, and that the deformation will result in increased volumetric wear and increased ion release. Clinical Relevance: This study was designed to explore the deformation of the Durom cup after implantation and to clarify the impact of deformation on the wear and ion release of the prosthesis. Deformation of the cup after implantation increases the wear of MOM bearings and the resulting ion levels. The clinical use of the Durom large-head prosthesis should therefore be undertaken with great care. PMID:23144694
Qin, Nan; Shen, Chenyang; Tsai, Min-Yu; Pinto, Marco; Tian, Zhen; Dedes, Georgios; Pompos, Arnold; Jiang, Steve B; Parodi, Katia; Jia, Xun
2018-01-01
One of the major benefits of carbon ion therapy is the enhanced biological effectiveness in the Bragg peak region. For intensity modulated carbon ion therapy (IMCT), it is desirable to use Monte Carlo (MC) methods to compute the properties of each pencil beam spot for treatment planning, because of their accuracy in modeling physics processes and estimating biological effects. We previously developed goCMC, a graphics processing unit (GPU)-oriented MC engine for carbon ion therapy. The purpose of the present study was to build a biological treatment plan optimization system using goCMC. The repair-misrepair-fixation model was implemented to compute the spatial distribution of linear-quadratic model parameters for each spot. A treatment plan optimization module was developed to minimize the difference between the prescribed and actual biological effect. We used a gradient-based algorithm to solve the optimization problem. The system was embedded in the Varian Eclipse treatment planning system under a client-server architecture to achieve a user-friendly planning environment. We tested the system with a 1-dimensional homogeneous water case and three 3-dimensional patient cases. Our system generated treatment plans with biological spread-out Bragg peaks covering the targeted regions and sparing critical structures. Using 4 NVidia GTX 1080 GPUs, the total computation time, including spot simulation, optimization, and final dose calculation, was 0.6 hours for the prostate case (8282 spots), 0.2 hours for the pancreas case (3795 spots), and 0.3 hours for the brain case (6724 spots). The computation time was dominated by the MC spot simulation. We built a biological treatment plan optimization system for IMCT that performs simulations using a fast MC engine, goCMC. To the best of our knowledge, this is the first time that full MC-based IMCT inverse planning has been achieved in a clinically viable time frame.
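A stripped-down version of such a gradient-based biological-effect optimization can be sketched as follows. The additive linear-quadratic effect model and all array names are assumptions of this illustration; the actual goCMC system uses the repair-misrepair-fixation model and MC-computed spot data.

```python
import numpy as np

def optimize_biological_effect(alpha_d, beta_d, effect_rx, w0,
                               n_iter=200, lr=1e-2):
    """Gradient descent on biological effect for IMCT (sketch).

    alpha_d, beta_d : (n_voxels, n_spots) per-unit-weight distributions of
        alpha*dose and sqrt(beta)*dose from the linear-quadratic model.
    effect_rx       : (n_voxels,) prescribed biological effect.
    w0              : (n_spots,) initial spot weights.
    """
    w = w0.copy()
    for _ in range(n_iter):
        lin = alpha_d @ w                       # alpha * D term
        quad = (beta_d @ w) ** 2                # beta * D^2 term
        r = lin + quad - effect_rx              # residual effect
        # Gradient of 0.5 * ||r||^2 with respect to the spot weights.
        grad = alpha_d.T @ r + 2.0 * beta_d.T @ (r * (beta_d @ w))
        w = np.maximum(w - lr * grad, 0.0)      # keep spot weights >= 0
    return w
```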
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2011-11-01
NREL's new imaging tool could provide manufacturers with insight into their processes. Scientists at the National Renewable Energy Laboratory (NREL) have used capabilities within the Process Development and Integration Laboratory (PDIL) to generate quantitative minority-carrier lifetime maps of multicrystalline silicon (mc-Si) bricks. This feat has been accomplished by using the PDIL's photoluminescence (PL) imaging system in conjunction with transient lifetime measurements obtained using a custom NREL-designed resonance-coupled photoconductive decay (RCPCD) system. PL imaging can obtain rapid high-resolution images that provide a qualitative assessment of the material lifetime, with the lifetime proportional to the pixel intensity. In contrast, the RCPCD technique provides a fast quantitative measure of the lifetime at lower resolution and penetrates millimeters into the mc-Si brick, providing information on bulk lifetimes and material quality. This approach contrasts with commercially available minority-carrier lifetime mapping systems that use microwave conductivity measurements; such measurements are dominated by surface recombination and lack information on the material quality within the bulk of the brick. By combining these two complementary techniques, we obtain high-resolution lifetime maps at very fast data acquisition times, attributes necessary for a production-based diagnostic tool. Because the combined technique is not hindered by surface recombination, researchers can look deeper into the material to map bulk lifetimes. These bulk lifetime measurements provide manufacturers with invaluable feedback on their silicon ingot casting processes. NREL has been applying the PL imaging of lifetime in mc-Si bricks in collaboration with a U.S. photovoltaic industry partner through Recovery Act Funded Project ARRA T24.
EDITORIAL: International Workshop on Current Topics in Monte Carlo Treatment Planning
NASA Astrophysics Data System (ADS)
Verhaegen, Frank; Seuntjens, Jan
2005-03-01
The use of Monte Carlo particle transport simulations in radiotherapy was pioneered in the early nineteen-seventies, but it was not until the eighties that they gained recognition as an essential research tool for radiation dosimetry, health physics and, later on, radiation therapy treatment planning. Since the mid-nineties, there has been a boom in the number of workers using MC techniques in radiotherapy, and in the quantity of papers published on the subject. Research and applications of MC techniques in radiotherapy span a very wide range, from fundamental studies of cross sections and the development of particle transport algorithms, to the clinical evaluation of treatment plans for a variety of radiotherapy modalities. The International Workshop on Current Topics in Monte Carlo Treatment Planning took place at Montreal General Hospital, which is part of McGill University, halfway up Mount Royal on Montreal Island. It was held from 3-5 May, 2004, right after the freezing winter had lost its grip on Canada. About 120 workers attended the Workshop, representing 18 countries. Most of the pioneers in the field were present, but also a large group of young scientists. In a very full programme, 41 long papers were presented (of which 12 were invited) and 20 posters were on display during the whole meeting. The topics covered included the latest developments in MC algorithms, statistical issues, source modelling and MC treatment planning for photon, electron and proton treatments. The final day was entirely devoted to clinical implementation issues. Monte Carlo radiotherapy treatment planning has only now made a slow entrée into the clinical environment, taking considerably longer than envisaged ten years ago. Of the twenty-five papers in this dedicated special issue, about a quarter deal with this topic, with probably many more studies to follow in the near future. If anything, we hope the Workshop served as an accelerator for more clinical evaluation of MC applications. The remainder of the papers in this issue demonstrate that there is still plenty of work to be undertaken on other topics such as source modelling, calculation speed, data analysis, and the development of user-friendly applications. We acknowledge the financial support of the National Cancer Institute of Canada, the Institute of Cancer Research of the Canadian Institutes of Health Research, the Research Grants Office and the Post Graduate Student Society of McGill University, and Institute of Physics Publishing (IOPP). A final word of thanks goes out to all of those who contributed to the successful Workshop: our local medical physics students and staff, the many colleagues who acted as guest associate editors for the reviewing process, the IOPP staff, and the authors who generated new and exciting work.
NASA Astrophysics Data System (ADS)
Mirzaeinia, Ali; Feyzi, Farzaneh; Hashemianzadeh, Seyed Majid
2018-03-01
Based on Wertheim's second-order thermodynamic perturbation theory (TPT2), equations of state (EOSs) are presented for the fluid and solid phases of tangent, freely jointed spheres. The spheres are considered to interact with each other through the Weeks-Chandler-Andersen (WCA) potential. The developed TPT2 EOS is the sum of a monomeric reference term and a perturbation contribution due to bonding. MC NVT simulations are performed to determine the structural properties of the reference system in the reduced temperature range 0.6 ≤ T* ≤ 4.0 and the packing fraction range 0.1 ≤ η ≤ 0.72. Mathematical functions are fitted to the simulation results for the reference system and employed in the framework of Wertheim's theory to develop TPT2 EOSs for the fluid and solid phases. The extended EOSs are compared to MC NPT simulation results for the compressibility factor and internal energy of fully flexible chain systems. Simulations are performed for the WCA chain system for chain lengths of up to 15 at T* = 1.0, 1.5, 2.0, 3.0. Across all the reduced temperatures, the agreement between the results of the TPT2 EOS and the MC simulations is remarkable. The overall average absolute relative percent deviation of the compressibility factor at T* = 1.0, over the entire range of chain lengths covered, is 0.51 for the solid and 0.77 for the fluid phase. Similar features are observed for the residual internal energy.
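For reference, the WCA interaction used throughout is the Lennard-Jones potential cut at its minimum and shifted so that it is purely repulsive; a short sketch in reduced units follows.

```python
import numpy as np

def wca_potential(r, epsilon=1.0, sigma=1.0):
    """Weeks-Chandler-Andersen potential (sketch): the Lennard-Jones
    potential truncated at its minimum r_c = 2**(1/6) * sigma and shifted
    up by epsilon, leaving a purely repulsive core. Assumes r > 0."""
    r = np.asarray(r, dtype=float)
    rc = 2.0 ** (1.0 / 6.0) * sigma
    sr6 = (sigma / r) ** 6
    u = 4.0 * epsilon * (sr6 ** 2 - sr6) + epsilon
    return np.where(r < rc, u, 0.0)   # zero beyond the cutoff
```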
Sadeghi, Mohammad Hosein; Sina, Sedigheh; Mehdizadeh, Amir; Faghihi, Reza; Moharramzadeh, Vahed; Meigooni, Ali Soleimani
2018-02-01
The dosimetry procedure by simple superposition accounts only for the self-shielding of the source and does not take into account the attenuation of photons by the applicators. The purpose of this investigation is to estimate the effects of the tandem and ovoid applicator on the dose distribution inside the phantom using MCNP5 Monte Carlo simulations. In this study, the superposition method is used to obtain the dose distribution in the phantom without the applicator for a typical gynecological brachytherapy treatment (superposition-1). Then, the sources are simulated inside the tandem and ovoid applicator to identify the effect of applicator attenuation (superposition-2), and the doses at points A and B and in the bladder and rectum are compared with the results of the superposition. The exact dwell positions and dwell times of the source, and the positions of the dosimetry points, were determined from the images and treatment data of an adult female patient from a cancer center. The MCNP5 Monte Carlo (MC) code was used for the simulation of the phantoms, applicators, and sources. The results of this study showed no significant differences between the superposition method and the MC simulations for the different dosimetry points; the difference at all important dosimetry points was found to be less than 5%. According to the results, applicator attenuation has no significant effect on the calculated point doses, and the superposition method, adding the dose of each source obtained by MC simulation, can estimate the dose to points A and B and to the bladder and rectum with good accuracy.
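The superposition step itself is just a dwell-time-weighted sum of single-source MC doses; a one-function sketch follows. The array layout (per-dwell dose rates per unit dwell time) is an assumption made for illustration.

```python
import numpy as np

def superposition_dose(per_source_dose, dwell_times):
    """Superposition dose estimate for an HDR implant (sketch).

    per_source_dose : (n_dwell, n_points) MC dose rate at each dosimetry
        point from a single source at each dwell position, per unit time.
    dwell_times     : (n_dwell,) dwell times from the treatment plan.
    Returns the total dose at each dosimetry point.
    """
    return np.asarray(dwell_times) @ np.asarray(per_source_dose)
```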
Toward GPGPU accelerated human electromechanical cardiac simulations
Vigueras, Guillermo; Roy, Ishani; Cookson, Andrew; Lee, Jack; Smith, Nicolas; Nordsletten, David
2014-01-01
In this paper, we look at the acceleration of weakly coupled electromechanics using the graphics processing unit (GPU). Specifically, we port to the GPU a number of components of Heart, a CPU-based finite element code developed for simulating multi-physics problems. On the basis of a criterion of computational cost, we implemented on the GPU the ODE and PDE solution steps for the electrophysiology problem and the Jacobian and residual evaluation for the mechanics problem. Performance of the GPU implementation is then compared with single-core CPU (SC) execution as well as multi-core CPU (MC) computations with equivalent theoretical performance. Results show that for a human-scale left ventricle mesh, GPU acceleration of the electrophysiology problem provided speedups of 164× compared with SC and 5.5× compared with MC for the solution of the ODE model. Speedups of up to 72× compared with SC and 2.6× compared with MC were also observed for the PDE solve. Using the same human geometry, the GPU implementation of the mechanics residual/Jacobian computation provided speedups of up to 44× compared with SC and 2.0× compared with MC. © 2013 The Authors. International Journal for Numerical Methods in Biomedical Engineering published by John Wiley & Sons, Ltd. PMID:24115492
Performance Analysis of HF Band FB-MC-SS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hussein Moradi; Stephen Andrew Laraway; Behrouz Farhang-Boroujeny
In a recent paper [1] the filter bank multicarrier spread spectrum (FB-MC-SS) waveform was proposed for wideband spread spectrum HF communications. A significant benefit of this waveform is robustness against narrow- and partial-band interference. Simulation results in [1] demonstrated good performance in a wideband HF channel over a wide range of conditions. In this paper we present a theoretical analysis of the bit error probability for this system. Our analysis tailors the results from [2], where BER performance was analyzed for maximum ratio combining systems, accounting for correlation between subcarriers and channel estimation error. Equations are given for the BER that closely match the simulated performance in most situations.
McStas-model of the delft SESANS
NASA Astrophysics Data System (ADS)
Knudsen, E.; Udby, L.; Willendrup, P. K.; Lefmann, K.; Bouwman, W. G.
2011-06-01
We present simulation results constituting the first virtual data from a model of the Spin-Echo Small Angle Scattering (SESANS) instrument situated in Delft, built in the framework of the McStas Monte Carlo software package. The main focus has been on making a model of the Delft SESANS instrument, and we can now present the first virtual data from it, using a refracting prism-like sample model. As a consequence of this work, polarisation instrumentation is now included natively in the McStas kernel, including options for magnetic fields and a number of utility components. This development has brought us to a point where realistic models of polarisation-enabled instrumentation can be built.
Data decomposition of Monte Carlo particle transport simulations via tally servers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Romano, Paul K.; Siegel, Andrew R.; Forget, Benoit
An algorithm for decomposing large tally data in Monte Carlo particle transport simulations is developed, analyzed, and implemented in a continuous-energy Monte Carlo code, OpenMC. The algorithm is based on a non-overlapping decomposition of compute nodes into tracking processors and tally servers. The former are used to simulate the movement of particles through the domain, while the latter continuously receive and update tally data. A performance model for this approach is developed, suggesting that, for a range of parameters relevant to LWR analysis, the tally server algorithm should perform with minimal overhead on contemporary supercomputers. An implementation of the algorithm in OpenMC is then tested on the Intrepid and Titan supercomputers, supporting the key predictions of the model over a wide range of parameters. We thus conclude that the tally server algorithm is a successful approach to circumventing classical on-node memory constraints en route to unprecedentedly detailed Monte Carlo reactor simulations.
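The tracker/server split can be mimicked in a toy message-passing program. The sketch below uses mpi4py with placeholder tally events, a fixed number of batches, and a dictionary accumulator; all of these are assumptions of this illustration, not OpenMC's actual implementation. Run with, e.g., mpirun -n 4.

```python
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
N_SERVERS = 2                      # the last ranks act as tally servers
TALLY, DONE = 1, 2                 # message tags
n_trackers = size - N_SERVERS

if rank >= n_trackers:             # ----- tally server -----
    tallies, finished = {}, 0
    while finished < n_trackers:   # run until every tracker has signed off
        status = MPI.Status()
        msg = comm.recv(source=MPI.ANY_SOURCE, tag=MPI.ANY_TAG, status=status)
        if status.Get_tag() == DONE:
            finished += 1
        else:
            for cell, score in msg:            # accumulate tally scores
                tallies[cell] = tallies.get(cell, 0.0) + score
else:                              # ----- tracking processor -----
    for batch in range(10):
        scores = [(batch % 5, 1.0)]            # placeholder tally events
        server = n_trackers + batch % N_SERVERS
        comm.send(scores, dest=server, tag=TALLY)
    for s in range(n_trackers, size):          # tell every server we are done
        comm.send(None, dest=s, tag=DONE)
```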
Peter, Emanuel K; Shea, Joan-Emma; Pivkin, Igor V
2016-05-14
In this paper, we present a coarse replica exchange molecular dynamics (REMD) approach based on kinetic Monte Carlo (kMC). The new development can significantly reduce the number of replicas and the computational cost needed to enhance sampling in protein simulations. We introduce two different methods, which differ primarily in the exchange scheme between the parallel ensembles. We apply this approach to the folding of two different β-stranded peptides: the C-terminal β-hairpin fragment of GB1 and TrpZip4. Additionally, we use the new simulation technique to study the folding of TrpCage, a small fast-folding α-helical peptide. Subsequently, we apply the new methodology to conformational changes in the signaling of the light-oxygen-voltage (LOV) sensitive domain from Avena sativa (AsLOV2). Our results agree well with data reported in the literature. In simulations of dialanine, we compare the statistical sampling of the two techniques with conventional REMD and analyze their performance. The new techniques can reduce the computational cost of REMD significantly and can be used in enhanced sampling simulations of biomolecules.
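For context, conventional REMD (the baseline the authors compare against) accepts a swap between neighboring temperature replicas with the standard Metropolis criterion sketched below; the paper's kMC-based scheme replaces this direct swap with rate-based moves, which is what reduces the replica count. The kcal/(mol K) value of k_B is a common MD convention assumed here.

```python
import numpy as np

def attempt_exchange(E_i, E_j, T_i, T_j, rng):
    """Standard REMD swap test between neighboring replicas (sketch).

    Accepts with probability min(1, exp[(1/kT_i - 1/kT_j) * (E_i - E_j)]),
    where E is the potential energy of each replica's configuration.
    Returns True if the configurations should be exchanged.
    """
    k_B = 0.0019872041                      # kcal/(mol K)
    delta = (1.0 / (k_B * T_i) - 1.0 / (k_B * T_j)) * (E_i - E_j)
    return delta >= 0.0 or rng.random() < np.exp(delta)

# Example: rng = np.random.default_rng(); attempt_exchange(-120.0, -118.5, 300.0, 310.0, rng)
```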
Metabolite-cycled STEAM and semi-LASER localization for MR spectroscopy of the human brain at 9.4T.
Giapitzakis, Ioannis-Angelos; Shao, Tingting; Avdievich, Nikolai; Mekle, Ralf; Kreis, Roland; Henning, Anke
2018-04-01
Metabolite cycling (MC) is an MRS technique for the simultaneous acquisition of water and metabolite spectra that avoids chemical exchange saturation transfer effects and in which water may serve as a reference signal or contain additional information in functional or diffusion studies. Here, MC was developed for human investigations at ultrahigh field. MC-STEAM and MC-semi-LASER are introduced at 9.4T with an optimized inversion pulse and an elaborate coil setup. Experimental and simulation results are given for the implementation of adiabatic inversion pulses for MC. The two techniques are compared, and the effect of frequency and phase correction based on the MC water spectra is evaluated. Finally, absolute quantification of metabolites is performed. The proposed coil configuration results in a maximum B1+ of 48 μT in a voxel within the occipital lobe. Frequency and phase correction of single acquisitions improve the signal-to-noise ratio (SNR) and linewidth, leading to high-resolution spectra. The improvements in the SNR of N-acetylaspartate (SNR of NAA) for frequency-aligned data acquired with MC-STEAM and MC-semi-LASER are 37% and 30%, respectively (P < 0.05). Moreover, a doubling of the SNR of NAA for MC-semi-LASER in comparison with MC-STEAM is observed (P < 0.05). Concentration levels for 18 metabolites from the human occipital lobe are reported, as acquired with both MC-STEAM and MC-semi-LASER. This work introduces a novel methodology for single-voxel MRS on a 9.4T whole-body scanner and highlights the advantages of semi-LASER compared to STEAM in terms of excitation profile. In comparison with MC-STEAM, MC-semi-LASER yields spectra with higher SNR. Magn Reson Med 79:1841-1850, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
Fast GPU-based Monte Carlo code for SPECT/CT reconstructions generates improved 177Lu images.
Rydén, T; Heydorn Lagerlöf, J; Hemmingsson, J; Marin, I; Svensson, J; Båth, M; Gjertsson, P; Bernhardt, P
2018-01-04
Full Monte Carlo (MC)-based SPECT reconstructions have a strong potential for correcting for image-degrading factors, but the reconstruction times are long. The objective of this study was to develop a highly parallel Monte Carlo code for fast, ordered subset expectation maximization (OSEM) reconstructions of SPECT/CT images. The MC code was written in the Compute Unified Device Architecture language for a computer with four graphics processing units (GPUs) (GeForce GTX Titan X, Nvidia, USA). This enabled simulations of parallel photon emissions from the voxel matrices (128³ or 256³). Each computed tomography (CT) number was converted to attenuation coefficients for photoabsorption, coherent scattering, and incoherent scattering. For photon scattering, the deflection angle was determined by the differential scattering cross sections. An angular response function was developed and used to model the accepted angles for photon interaction with the crystal, and a detector scattering kernel was used for modeling the photon scattering in the detector. Predefined energy and spatial resolution kernels for the crystal were used. The MC code was implemented in the OSEM reconstruction of clinical and phantom 177Lu SPECT/CT images. The Jaszczak image quality phantom was used to evaluate the performance of the MC reconstruction in comparison with attenuation-corrected (AC) OSEM reconstructions and attenuation-corrected OSEM reconstructions with resolution recovery corrections (RRC). The performance of the MC code was 3200 million photons/s. The required number of photons emitted per voxel to obtain a sufficiently low noise level in the simulated image was 200 for a 128³ voxel matrix. With this number of emitted photons/voxel, the MC-based OSEM reconstruction with ten subsets was performed within 20 s/iteration. The images converged after around six iterations, so the reconstruction time was around 3 min. The activity recovery for the spheres in the Jaszczak phantom was clearly improved with the MC-based OSEM reconstruction; e.g., the activity recovery was 88% for the largest sphere, compared with 66% for AC-OSEM and 79% for RRC-OSEM. The GPU-based MC code generated an MC-based SPECT/CT reconstruction within a few minutes, and reconstructed patient images of 177Lu-DOTATATE treatments revealed clearly improved resolution and contrast.
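The OSEM update that the MC forward model plugs into has a compact generic form; a sketch with explicit system matrices follows. In the paper the forward and back projections are evaluated by the GPU Monte Carlo transport model rather than by stored matrices, so the matrix interface below is an assumption for illustration.

```python
import numpy as np

def osem_iteration(x, A_subsets, y_subsets, eps=1e-12):
    """One OSEM iteration over all projection subsets (sketch).

    x         : (n_voxels,) current activity image estimate.
    A_subsets : list of system matrices (n_bins_s, n_voxels), one per
        ordered subset of projection angles.
    y_subsets : list of measured projection-count vectors (n_bins_s,).
    """
    for A, y in zip(A_subsets, y_subsets):
        fwd = A @ x                                  # forward projection
        ratio = y / np.maximum(fwd, eps)             # measured / estimated
        sens = A.T @ np.ones_like(y)                 # per-voxel sensitivity
        x = x * (A.T @ ratio) / np.maximum(sens, eps)  # multiplicative update
    return x
```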
NASA Astrophysics Data System (ADS)
Spezi, Emiliano; Leal, Antonio
2013-04-01
The Third European Workshop on Monte Carlo Treatment Planning (MCTP2012) was held from 15-18 May, 2012 in Seville, Spain. The event was organized by the Universidad de Sevilla with the support of the European Workgroup on Monte Carlo Treatment Planning (EWG-MCTP). MCTP2012 followed two successful meetings, one held in Ghent (Belgium) in 2006 (Reynaert 2007) and one in Cardiff (UK) in 2009 (Spezi 2010). The recurrence of these workshops, together with the successful events held in parallel by McGill University in Montreal (Seuntjens et al 2012), shows consolidated interest from the scientific community in Monte Carlo (MC) treatment planning. The workshop was attended by a total of 90 participants, mainly coming from a medical physics background. A total of 48 oral presentations and 15 posters were delivered in specific scientific sessions including dosimetry, code development, imaging, modelling of photon and electron radiation transport, external beam radiation therapy, nuclear medicine, brachytherapy and hadrontherapy. A copy of the programme is available on the workshop's website (www.mctp2012.com). In this special section of Physics in Medicine and Biology we report six papers that were selected following the journal's rigorous peer review procedure. These papers provide a good cross section of the areas of application of MC in treatment planning that were discussed at MCTP2012. Czarnecki and Zink (2013) and Wagner et al (2013) present the results of their work in small field dosimetry. Czarnecki and Zink (2013) studied field-size- and detector-dependent correction factors for diodes and ion chambers within a clinical 6 MV photon beam generated by a Siemens linear accelerator. Their modelling work, based on the BEAMnrc/EGSnrc codes and experimental measurements, revealed that unshielded diodes were the best choice for small field dosimetry because of their independence from the electron beam spot size and their correction factors close to unity. Wagner et al (2013) investigated the recombination effect on liquid ionization chambers for stereotactic radiotherapy, a field of increasing importance in external beam radiotherapy. They modelled both the radiation source (a Cyberknife unit) and the detector with the BEAMnrc/EGSnrc codes and quantified the dependence of the response of this type of detector on factors such as the volume effect and the electrode. They also recommended that these dependences be accounted for in measurements involving small fields. In the field of external beam radiotherapy, Chakarova et al (2013) showed how total body irradiation (TBI) could be improved by simulating patient treatments with MC. In particular, BEAMnrc/EGSnrc-based simulations highlighted the importance of optimizing individual compensators for TBI treatments. In the same area of application, Mairani et al (2013) reported on a new tool for treatment planning in proton therapy based on the FLUKA MC code. The software, used to model both the proton therapy beam and the patient anatomy, supports single-field and multiple-field optimization and can be used to optimize physical and relative biological effectiveness (RBE)-weighted dose distributions, using both constant and variable RBE models. In the field of nuclear medicine, Marcatili et al (2013) presented RAYDOSE, a Geant4-based code specifically developed for applications in molecular radiotherapy (MRT).
RAYDOSE has been designed to work in MRT trials using sequential positron emission tomography (PET) or single-photon emission tomography (SPECT) imaging to model patient-specific time-dependent metabolic uptake and to calculate the total 3D dose distribution. The code was validated through experimental measurements in homogeneous and heterogeneous phantoms. Finally, in the field of code development, Miras et al (2013) reported on CloudMC, a Windows Azure-based application for the parallelization of MC calculations in a dynamic cluster environment. Although the performance of CloudMC has been tested with the PENELOPE MC code, the authors report that the software has been designed to be independent of the type of MC code, provided that the simulation meets a number of operational criteria. We wish to thank Elekta/CMS Inc., the University of Seville, the Junta of Andalusia and the European Regional Development Fund for their financial support. We would also like to acknowledge the members of EWG-MCTP for their help in peer-reviewing all the abstracts, and all the invited speakers who kindly agreed to deliver keynote presentations in their areas of expertise. A final word of thanks to our colleagues who worked on the reviewing process of the papers selected for this special section and to the IOP Publishing staff who made it possible. MCTP2012 was accredited by the European Federation of Organisations for Medical Physics as a CPD event for medical physicists.
Emiliano Spezi and Antonio Leal, Guest Editors
References
Chakarova R, Müntzing K, Krantz M, Hedin E and Hertzman S 2013 Monte Carlo optimization of total body irradiation in a phantom and patient geometry Phys. Med. Biol. 58 2461-69
Czarnecki D and Zink K 2013 Monte Carlo calculated correction factors for diodes and ion chambers in small photon fields Phys. Med. Biol. 58 2431-44
Mairani A, Böhlen T T, Schiavi A, Tessonnier T, Molinelli S, Brons S, Battistoni G, Parodi K and Patera V 2013 A Monte Carlo-based treatment planning tool for proton therapy Phys. Med. Biol. 58 2471-90
Marcatili S, Pettinato C, Daniels S, Lewis G, Edwards P, Fanti S and Spezi E 2013 Development and validation of RAYDOSE: a Geant4 based application for molecular radiotherapy Phys. Med. Biol. 58 2491-508
Miras H, Jiménez R, Miras C and Gomà C 2013 CloudMC: a cloud computing application for Monte Carlo simulation Phys. Med. Biol. 58 N125-33
Reynaert N 2007 First European Workshop on Monte Carlo Treatment Planning J. Phys.: Conf. Ser. 74 011001
Seuntjens J, Beaulieu L, El Naqa I and Després P 2012 Special section: Selected papers from the Fourth International Workshop on Recent Advances in Monte Carlo Techniques for Radiation Therapy Phys. Med. Biol. 57 (11) E01
Spezi E 2010 Special section: Selected papers from the Second European Workshop on Monte Carlo Treatment Planning (MCTP2009) Phys. Med. Biol. 55 (16) E01
Wagner A, Crop F, Lacornerie T, Vandevelde F and Reynaert N 2013 Use of a liquid ionization chamber for stereotactic radiotherapy dosimetry Phys. Med. Biol. 58 2445-59
2011-12-01
[Fragmentary record; only parts of the abstract and reference list survive.] ... REMD while reproducing the energy landscape of explicit solvent simulations. Surviving reference fragments: Mongan J and McCammon J A, Accelerated molecular dynamics: a promising and efficient simulation method for biomolecules, J. Chem. Phys. 2004, 120 (24); Abraham M J and Gready J E, Ensuring mixing efficiency of replica-exchange molecular dynamics simulations, J. Chem. Theory Comput.
Introducing MCgrid 2.0: Projecting cross section calculations on grids
NASA Astrophysics Data System (ADS)
Bothmann, Enrico; Hartland, Nathan; Schumann, Steffen
2015-11-01
MCgrid is a software package that provides access to interpolation tools for Monte Carlo event generator codes, allowing for the fast and flexible variation of scales, coupling parameters and PDFs in cutting-edge leading- and next-to-leading-order QCD calculations. We present the upgrade to version 2.0, which has a broader scope of interfaced interpolation tools, now providing access to fastNLO, and features an approximate treatment for the projection of MC@NLO-type calculations onto interpolation grids. MCgrid 2.0 also supports the extended information provided through the HepMC event record used in the recent SHERPA version 2.2.0. The additional information provided therein allows for the support of multi-jet merged QCD calculations in a future update of MCgrid.
Peterson, Lars E; Blackburn, Brenna; Phillips, Robert L; Mainous, Arch G
2014-04-01
Participation in Maintenance of Certification for Family Physicians (MC-FP) is now a requirement for residents to take the American Board of Family Medicine (ABFM) certification examination. The objective of this study was to determine baseline use of MC-FP products prior to this requirement and assess how family medicine residency program directors (FMPD) intended to integrate MC-FP into residency education. We used the CERA platform to survey FMPDs. In addition to the core CERA demographic questions, we asked about the use of MC-FP in residency, how FMPDs intended to incorporate MC-FP, and how useful they believe MC-FP will be for resident evaluation. Additionally, we compared select results with the ABFM administrative database. A total of 224 FMPDs responded, for a 50.6% response rate. There was agreement between CERA and ABFM data on the percentage of residencies already using Part 4 modules (39.3% versus 38.8%) but not Part 2 modules (24.7% versus 62.8%). Group MC-FP activities were the preferred method for both Part 2 (45.0%) and Part 4 (54.4%). Most FMPDs agreed that MC-FP will be effective in teaching quality improvement and assessing competencies. Respondents from dually accredited programs were more likely to have used Part 4, but not Part 2, activities prior to 2012. Prior to MC-FP becoming a requirement in residency, a sizeable minority of residencies were already using these tools for education. Assessment of competencies will be crucial in the Next Accreditation System, and MC-FP may help in tracking clinical development over a physician's career.
Mathematical model of organic substrate degradation in solid waste windrow composting.
Seng, Bunrith; Kristanti, Risky Ayu; Hadibarata, Tony; Hirayama, Kimiaki; Katayama-Hirayama, Keiko; Kaneko, Hidehiro
2016-01-01
Organic solid waste composting is a complex process that involves many coupled physical, chemical and biological mechanisms. To understand this complexity and to ease the planning, design and management of composting plants, mathematical models are usually applied for simulation. The aim of this paper is to develop a mathematical model of organic substrate degradation and to evaluate its performance in a solid waste windrow composting system. The present model is a biomass-dependent model that considers biological growth processes under the limitation of moisture, oxygen and substrate contents, and temperature. The main output of this model is the substrate content, which was divided into two categories: slowly and rapidly degradable substrates. To validate the model, it was applied to a laboratory-scale windrow composting of a mixture of wood chips and dog food. The waste was filled into a cylindrical reactor of 6 cm diameter and 1 m height. The simulation program was run for 3 weeks with a 1 s time step. The simulated results were in reasonably good agreement with the experimental results. The simulated moisture content (MC) and temperature were found to match those of the experiment, although the agreement was limited for the rapidly degradable substrates. The degradation of rapidly degradable substrate in the anaerobic zone needs to be incorporated into the model to achieve a full simulation of long-period static pile composting. This model is a useful tool to estimate the changes in substrate content during the composting period, and acts as a basic model for the further development of more sophisticated models.
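The backbone of such a biomass-dependent model is a small ODE system for the two substrate pools and the biomass; an explicit-Euler sketch follows. The Monod-type limitation and every rate constant below are illustrative assumptions, and the moisture, oxygen and temperature correction factors of the paper are omitted.

```python
import numpy as np

def step_compost(state, dt, k_r=0.05, k_s=0.005, Y=0.5, K=10.0):
    """One explicit Euler step of a biomass-dependent degradation model
    (sketch). state = (S_rapid, S_slow, X_biomass), in arbitrary units.

    k_r, k_s : maximum specific degradation rates for the rapidly and
               slowly degradable pools (assumed values).
    Y        : biomass yield on degraded substrate (assumed).
    K        : Monod half-saturation constant (assumed).
    """
    S_r, S_s, X = state
    r_r = k_r * X * S_r / (K + S_r)      # rapidly degradable pool
    r_s = k_s * X * S_s / (K + S_s)      # slowly degradable pool
    dX = Y * (r_r + r_s)                 # growth on degraded substrate
    return np.array([S_r - dt * r_r, S_s - dt * r_s, X + dt * dX])
```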
NASA Astrophysics Data System (ADS)
Costa, Filipa; Doran, Simon J.; Hanson, Ian M.; Nill, Simeon; Billas, Ilias; Shipley, David; Duane, Simon; Adamovics, John; Oelfke, Uwe
2018-03-01
Dosimetric quality assurance (QA) of the new Elekta Unity (MR-linac) will differ from the QA performed on a conventional linac because of the constant magnetic field, which creates an electron return effect (ERE). In this work we aim to validate PRESAGE® dosimetry in a transverse magnetic field and to assess its use in validating the research version of the Monaco TPS for the MR-linac. Cylindrical samples of the PRESAGE® 3D dosimeter separated by an air gap were irradiated with a cobalt-60 unit while placed between the poles of an electromagnet at 0.5 T and 1.5 T. This set-up was simulated with the EGSnrc/Cavity Monte Carlo (MC) code, and relative dose distributions were compared with measurements using 1D and 2D gamma criteria of 3% and 1.5 mm. The irradiation conditions were then adapted for the MR-linac and compared with Monaco TPS calculations. Measured and EGSnrc/Cavity-simulated profiles showed good agreement, with a gamma passing rate of 99.9% at 0.5 T and 99.8% at 1.5 T. Measurements on the MR-linac also compared well with Monaco TPS calculations, with a gamma passing rate of 98.4% at 1.5 T. The results demonstrate that PRESAGE® can accurately measure dose and detect the ERE, encouraging its use as a QA tool to validate the Monaco TPS of the MR-linac for clinically relevant dose distributions at tissue-air boundaries.
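The 3%/1.5 mm comparison quoted above is the standard gamma-index test. A minimal 1D version with global dose normalization might look like this (a sketch, not the analysis code used in the study):

```python
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dd=0.03, dta=1.5):
    """1D gamma index: dd is the dose-difference criterion as a fraction
    of the maximum reference dose (3% here), dta the distance-to-agreement
    in mm. Returns the gamma value at each reference point."""
    d_max = d_ref.max()
    gammas = np.empty_like(d_ref, dtype=float)
    for i, (x, d) in enumerate(zip(x_ref, d_ref)):
        # Capital-Gamma surface over all evaluated points; gamma is its minimum.
        cap_gamma = np.sqrt(((x_eval - x) / dta) ** 2
                            + ((d_eval - d) / (dd * d_max)) ** 2)
        gammas[i] = cap_gamma.min()
    return gammas
```

The passing rate reported in such comparisons is then simply `np.mean(gamma_1d(...) <= 1.0)`.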
STS-92 Mission Specialist McArthur has his launch and entry suit adjusted
NASA Technical Reports Server (NTRS)
2000-01-01
In the Operations and Checkout Building, STS-92 Mission Specialist William S. McArthur Jr. has the gloves on his launch and entry suit adjusted during fit check. McArthur and the rest of the crew are at KSC for Terminal Countdown Demonstration Test activities. The TCDT provides emergency egress training, simulated countdown exercises and opportunities to inspect the mission payload. This mission will be McArthur's third Shuttle flight. STS-92 is scheduled to launch Oct. 5 at 9:38 p.m. EDT from Launch Pad 39A on the fifth flight to the International Space Station. It will carry two elements of the Space Station, the Integrated Truss Structure Z1 and the third Pressurized Mating Adapter. The mission is also the 100th flight in the Shuttle program.
STS-92 Mission Specialist McArthur has his launch and entry suit adjusted
NASA Technical Reports Server (NTRS)
2000-01-01
During pre-pack and fit check in the Operations and Checkout Building, STS-92 Mission Specialist William S. McArthur Jr. uses a laptop computer while garbed in his full launch and entry suit. McArthur and the rest of the crew are at KSC for Terminal Countdown Demonstration Test activities. The TCDT provides emergency egress training, simulated countdown exercises and opportunities to inspect the mission payload. This mission will be McArthur's third Shuttle flight. STS-92 is scheduled to launch Oct. 5 at 9:38 p.m. EDT from Launch Pad 39A on the fifth flight to the International Space Station. It will carry two elements of the Space Station, the Integrated Truss Structure Z1 and the third Pressurized Mating Adapter. The mission is also the 100th flight in the Shuttle program.
A Study of Neutron Leakage in Finite Objects
NASA Technical Reports Server (NTRS)
Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.
2015-01-01
A computationally efficient 3DHZETRN code, capable of simulating high charge (Z) and energy (HZE) ions and light ions (including neutrons) under space-like boundary conditions with enhanced neutron and light ion propagation, was recently developed for simple shielded objects. Monte Carlo (MC) benchmarks were used to verify the 3DHZETRN methodology in slab and spherical geometry, and it was shown that 3DHZETRN agrees with MC codes to the degree that the various MC codes agree among themselves. One limitation of the verification process is that all of the codes (3DHZETRN and three MC codes) utilize different nuclear models/databases. In the present report, the new algorithm, with well-defined convergence criteria, is used to quantify the neutron leakage from simple geometries, providing a means of verifying 3D effects and guidance for further code development.
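As an illustration of the quantity being benchmarked, a one-group MC estimate of the neutron leakage probability from a homogeneous sphere with isotropic scattering can be written in a few lines; the one-group treatment and the cross sections are placeholder physics, far simpler than the HZE transport in 3DHZETRN:

```python
import numpy as np

def leakage_fraction(radius, sigma_t, sigma_s, n=100_000, rng=None):
    """One-group MC estimate of the leakage probability from a homogeneous
    sphere with isotropic scattering. sigma_t/sigma_s are total/scattering
    macroscopic cross sections (1/cm); sources are born uniformly in the
    volume. Illustrative physics only."""
    rng = rng or np.random.default_rng(0)
    leaked = 0
    for _ in range(n):
        pos = rng.normal(size=3)                 # uniform point in the sphere
        pos *= radius * rng.random() ** (1 / 3) / np.linalg.norm(pos)
        direction = rng.normal(size=3)           # isotropic initial direction
        direction /= np.linalg.norm(direction)
        while True:
            pos = pos + direction * rng.exponential(1.0 / sigma_t)
            if np.linalg.norm(pos) > radius:
                leaked += 1                      # escaped the object
                break
            if rng.random() > sigma_s / sigma_t: # absorbed
                break
            direction = rng.normal(size=3)       # isotropic re-scatter
            direction /= np.linalg.norm(direction)
    return leaked / n
```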
NASA Astrophysics Data System (ADS)
Amenomori, M.; Bi, X. J.; Chen, D.; Chen, T. L.; Chen, W. Y.; Cui, S. W.; Danzengluobu; Ding, L. K.; Feng, C. F.; Feng, Zhaoyang; Feng, Z. Y.; Gou, Q. B.; Guo, Y. Q.; He, H. H.; He, Z. T.; Hibino, K.; Hotta, N.; Hu, Haibing; Hu, H. B.; Huang, J.; Jia, H. Y.; Jiang, L.; Kajino, F.; Kasahara, K.; Katayose, Y.; Kato, C.; Kawata, K.; Kozai, M.; Labaciren; Le, G. M.; Li, A. F.; Li, H. J.; Li, W. J.; Liu, C.; Liu, J. S.; Liu, M. Y.; Lu, H.; Meng, X. R.; Miyazaki, T.; Munakata, K.; Nakajima, T.; Nakamura, Y.; Nanjo, H.; Nishizawa, M.; Niwa, T.; Ohnishi, M.; Ohta, I.; Ozawa, S.; Qian, X. L.; Qu, X. B.; Saito, T.; Saito, T. Y.; Sakata, M.; Sako, T. K.; Shao, J.; Shibata, M.; Shiomi, A.; Shirai, T.; Sugimoto, H.; Takita, M.; Tan, Y. H.; Tateyama, N.; Torii, S.; Tsuchiya, H.; Udo, S.; Wang, H.; Wu, H. R.; Xue, L.; Yamamoto, Y.; Yamauchi, K.; Yang, Z.; Yuan, A. F.; Zhai, L. M.; Zhang, H. M.; Zhang, J. L.; Zhang, X. Y.; Zhang, Y.; Zhang, Yi; Zhang, Ying; Zhaxisangzhu; Zhou, X. X.; Tibet ASγ Collaboration
2018-06-01
We examine the possible influence of Earth-directed coronal mass ejections (ECMEs) on the Sun's shadow in the 3 TeV cosmic-ray intensity observed by the Tibet-III air shower (AS) array. We confirm a clear solar-cycle variation of the intensity deficit in the Sun's shadow over the ten years from 2000 to 2009. This solar-cycle variation is overall reproduced by our Monte Carlo (MC) simulations of the Sun's shadow based on the potential field model of the solar magnetic field averaged over each solar rotation period. We find, however, that the magnitude of the observed intensity deficit in the Sun's shadow is significantly smaller than that predicted by the MC simulations, particularly during the period around solar maximum, when a significant number of ECMEs are recorded. The χ² tests of the agreement between the observations and the MC simulations show that the discrepancy is larger during the periods when ECMEs occur and is reduced when the periods of ECMEs are excluded from the analysis. This constitutes the first experimental evidence of ECMEs affecting the Sun's shadow observed in the 3 TeV cosmic-ray intensity.
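The χ² comparison described above reduces, per analysis period, to the usual sum of squared, uncertainty-weighted residuals between the observed and simulated deficits; a minimal sketch (array names hypothetical):

```python
import numpy as np

def chi2(observed, simulated, sigma):
    """Chi-squared agreement between observed and MC-simulated shadow
    deficits; all three arguments are equal-length arrays giving the
    per-period deficit and its uncertainty."""
    obs, sim, sig = map(np.asarray, (observed, simulated, sigma))
    return float(np.sum(((obs - sim) / sig) ** 2))
```

Computing this separately for the ECME and non-ECME periods is what exposes the larger discrepancy during ECME activity.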
Supernova Driving. II. Compressive Ratio in Molecular-cloud Turbulence
NASA Astrophysics Data System (ADS)
Pan, Liubin; Padoan, Paolo; Haugbølle, Troels; Nordlund, Åke
2016-07-01
The compressibility of molecular cloud (MC) turbulence plays a crucial role in star formation models, because it controls the amplitude and distribution of density fluctuations. The relation between the compressive ratio (the ratio of powers in compressive and solenoidal motions) and the statistics of turbulence has been previously studied systematically only in idealized simulations with random external forces. In this work, we analyze a simulation of large-scale turbulence (250 pc) driven by supernova (SN) explosions that has been shown to yield realistic MC properties. We demonstrate that SN driving results in MC turbulence with a broad lognormal distribution of the compressive ratio, with a mean value ≈0.3, lower than the equilibrium value of ≈0.5 found in the inertial range of isothermal simulations with random solenoidal driving. We also find that the compressibility of the turbulence is not noticeably affected by gravity, nor are the mean cloud radial (expansion or contraction) and solid-body rotation velocities. Furthermore, the clouds follow a general relation between the rms density and the rms Mach number similar to that of supersonic isothermal turbulence, though with a large scatter, and their average gas density probability density function is described well by a lognormal distribution, with the addition of a high-density power-law tail when self-gravity is included.
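The compressive ratio itself is obtained via a Helmholtz decomposition of the velocity field in Fourier space: the projection of each mode onto the wavevector direction is the compressive (longitudinal) part, and the remainder is solenoidal. A minimal numpy sketch for a periodic, uniformly gridded box (a schematic version, not the paper's analysis pipeline):

```python
import numpy as np

def compressive_ratio(vx, vy, vz):
    """Ratio of compressive to solenoidal power for a periodic velocity
    field on a uniform 3D grid, via Helmholtz decomposition in k-space."""
    v = [u - u.mean() for u in (vx, vy, vz)]   # remove the mean flow
    k = np.meshgrid(*(np.fft.fftfreq(n) for n in vx.shape), indexing="ij")
    k2 = k[0] ** 2 + k[1] ** 2 + k[2] ** 2
    k2[0, 0, 0] = 1.0                          # k = 0 mode now carries no power
    v_k = [np.fft.fftn(u) for u in v]
    # Longitudinal (compressive) power: |k-hat . v_k|^2 summed over modes.
    p_comp = np.sum(np.abs(sum(ki * vki for ki, vki in zip(k, v_k))) ** 2 / k2)
    p_total = sum(np.sum(np.abs(vki) ** 2) for vki in v_k)
    return p_comp / (p_total - p_comp)
```

A value near 0.5 corresponds to the equipartition-like equilibrium of randomly forced isothermal turbulence, against which the SN-driven mean of roughly 0.3 is compared.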
NASA Astrophysics Data System (ADS)
Wang, Xin; Utsumi, Motoo; Yang, Yingnan; Li, Dawei; Zhao, Yingxin; Zhang, Zhenya; Feng, Chuanping; Sugiura, Norio; Cheng, Jay Jiayang
2015-01-01
A novel photocatalyst, AgBr/Ag3PO4/TiO2, was developed by a simple facile in situ deposition method and used for the degradation of microcystin-LR. TiO2 (P25), a cost-effective material, was used to improve the stability of AgBr/Ag3PO4 under simulated solar light irradiation. The photocatalytic activity of this heterojunction was tested under simulated solar light irradiation using methyl orange as the target pollutant. The results indicated that the optimal Ag to Ti molar ratio for the photocatalytic activity of the resulting AgBr/Ag3PO4/TiO2 heterojunction was 1.5 (denoted 1.5 BrPTi), which possessed a higher photocatalytic capacity than AgBr/Ag3PO4. The 1.5 BrPTi heterojunction was also more stable than AgBr/Ag3PO4 in photocatalysis. This highly efficient and relatively stable photocatalyst was further tested for degradation of the hepatotoxin microcystin-LR (MC-LR). The results showed that MC-LR was much more easily degraded by 1.5 BrPTi than by AgBr/Ag3PO4. Quenching experiments with different scavengers showed that reactive h+ and •OH played important roles in MC-LR degradation.
Monitoring System for the GRID Monte Carlo Mass Production in the H1 Experiment at DESY
NASA Astrophysics Data System (ADS)
Bystritskaya, Elena; Fomenko, Alexander; Gogitidze, Nelly; Lobodzinski, Bogdan
2014-06-01
The H1 Virtual Organization (VO), as one of the small VOs, employs most components of the EMI or gLite middleware. In this framework, a monitoring system was designed for the H1 Experiment to identify, within the GRID, the resources best suited to the execution of CPU-intensive Monte Carlo (MC) simulation tasks (jobs). The monitored resources are Computing Elements (CEs), Storage Elements (SEs), WMS servers (WMSs), the CernVM File System (CVMFS) available to the VO HONE, and local GRID User Interfaces (UIs). The general principle of monitoring GRID elements is based on the execution of short test jobs on different CE queues, submitted through various WMSs as well as directly to the CREAM-CEs. Real H1 MC production jobs with a small number of events are used as the tests. Test jobs are periodically submitted into GRID queues, their status is checked, output files of completed jobs are retrieved, the result of each job is analyzed, and the waiting time and run time are derived. Using this information, the status of the GRID elements is estimated and the most suitable ones are included in automatically generated configuration files for use in H1 MC production. The monitoring system allows problems at GRID sites to be identified and reacted to promptly (for example, by sending GGUS (Global Grid User Support) trouble tickets). The system can easily be adapted to identify the optimal resources for tasks other than MC production, simply by changing to the relevant test jobs. The monitoring system is written mostly in Python and Perl, with a few shell scripts. In addition to the test monitoring system, we use information from real production jobs to monitor the availability and quality of the GRID resources. The monitoring tools register the number of job resubmissions and the percentage of failed and finished jobs relative to all jobs on the CEs, and determine the average waiting and running times for the GRID queues involved. CEs that do not meet the set criteria can be removed from the production chain by including them in an exception table. All of these monitoring actions lead to more reliable and faster execution of MC requests.
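The selection logic described above, deriving per-CE statistics from test jobs and moving under-performing sites to the exception table, can be sketched as follows; the record fields and thresholds are hypothetical, not the H1 system's actual criteria:

```python
from dataclasses import dataclass

@dataclass
class JobRecord:
    ce: str            # Computing Element queue name
    status: str        # "finished" or "failed"
    wait_s: float      # time spent queued
    run_s: float       # wall-clock run time

def select_ces(records, max_fail_rate=0.2, max_wait_s=3600.0):
    """Group test-job results by CE, compute failure rate and mean waiting
    time, and return the CEs that meet the criteria; the rest would go into
    the exception table. Thresholds are illustrative."""
    by_ce = {}
    for r in records:
        by_ce.setdefault(r.ce, []).append(r)
    good, excluded = [], []
    for ce, jobs in by_ce.items():
        fail_rate = sum(j.status == "failed" for j in jobs) / len(jobs)
        mean_wait = sum(j.wait_s for j in jobs) / len(jobs)
        if fail_rate <= max_fail_rate and mean_wait <= max_wait_s:
            good.append(ce)
        else:
            excluded.append(ce)
    return good, excluded
```

Running this over each monitoring cycle yields the automatically generated configuration (the `good` list) and the exception table (`excluded`) that the abstract describes.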
NASA Astrophysics Data System (ADS)
Atmatzidou, Soumela; Demetriadis, Stavros; Nika, Panagiota
2018-02-01
Educational robotics (ER) is an innovative learning tool that offers students opportunities to develop higher-order thinking skills. This study investigates the development of students' metacognitive (MC) and problem-solving (PS) skills in the context of ER activities, implementing different modes of guidance in two student groups (11-12 years old, N1 = 30, and 15-16 years old, N2 = 22). The students of each age group took part in an 18-h group-based activity after being randomly assigned to one of two conditions: "minimal" (with minimal MC and PS guidance) and "strong" (with strong MC and PS guidance). Evaluations were based on the Metacognitive Awareness Inventory, measuring students' metacognitive awareness, and on a think-aloud protocol asking students to describe the process they would follow to solve a certain robot-programming task. The results suggest that (a) strong guidance in solving problems can have a positive impact on students' MC and PS skills and (b) students eventually reach the same level of MC and PS skill development independently of their age and gender.
Ganczak, Maria; Korzeń, Marcin; Olszewski, Maciej
2017-09-21
Objective: To evaluate the beliefs of medical university students regarding male circumcision (MC), as well as their attitudes toward, and the predictors of, its promotion for adults at risk of HIV. Methods: A cross-sectional survey was conducted between 2013 and 2016 at the Medical University in Szczecin, Poland, among final-year Polish and foreign students from Northern Europe, using a standardized questionnaire. Results: There were 539 participants, with a median age of 25 years; 40.8% were male and 66.8% were Polish nationals. The MC rate was 16.7%. Regarding HIV/AIDS knowledge, 66.6% of the students scored more than 75%, and 34.2% knew that MC reduces the risk of HIV infection. One in eleven respondents (9.1%) believed that circumcised men feel more intense sexual pleasure. More than half of the respondents (54.8%) declared that they would recommend MC to adult patients at risk for HIV. The belief that circumcised men feel more intense sexual pleasure and knowledge of MC-related HIV risk reduction were associated with greater odds of recommending adult MC (OR = 3.35 and OR = 2.13, respectively). Conclusions: Poor knowledge of its benefits and a low willingness to promote the procedure, strongly dependent on personal beliefs, suggest that medical students may need additional training to help them discuss MC more openly with adult men at risk for HIV infection. Knowledge may be an effective tool when making decisions regarding MC promotion.
Science Fiction: Popular Culture as Reading and Learning Motivation.
ERIC Educational Resources Information Center
Ontell, Val
Tools for teaching students how to question intelligently are badly needed. Science fiction provides many such tools in a variety of subjects by stimulating the imagination and thus motivating students to learn. Such vehicles are available at all grade levels. From Mark Twain and H.G. Wells to Anne McCaffrey and Isaac Asimov, novels and short…