Sample records for purpose cellular monte

  1. Simulation of Changes in Diffusion Related to Different Pathologies at Cellular Level After Traumatic Brain Injury

    PubMed Central

    Lin, Mu; He, Hongjian; Schifitto, Giovanni; Zhong, Jianhui

    2016-01-01

    Purpose: The goal of the current study was to investigate tissue pathology at the cellular level in traumatic brain injury (TBI) as revealed by Monte Carlo simulation of diffusion tensor imaging (DTI)-derived parameters, and to elucidate the possible sources of the conflicting findings of DTI abnormalities reported in the TBI literature. Methods: A model with three compartments separated by permeable membranes was employed to represent the diffusion environment of water molecules in brain white matter. The dynamic diffusion process was simulated with a Monte Carlo method using adjustable parameters of intra-axonal diffusivity, axon separation, glial cell volume fraction, and myelin sheath permeability. The effects of tissue pathology on DTI parameters were investigated by adjusting the parameters of the model corresponding to different stages of brain injury. Results: The results suggest that the model is appropriate and that the DTI-derived parameters reflect the predominant cellular pathology after TBI. Our results further indicate that when edema is not prevalent, axial and radial diffusivity have better sensitivity to axonal injury and demyelination than other DTI parameters. Conclusion: DTI is a promising biomarker to detect and stage tissue injury after TBI. The observed inconsistencies among previous studies are likely due to scanning at different stages of tissue injury after TBI. PMID:26256558
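    The compartment-and-membrane walker model described above can be illustrated with a much-simplified toy: a hypothetical 1D Monte Carlo random walk in which evenly spaced permeable barriers stand in for myelin membranes. All function names and parameter values below are illustrative, not taken from the paper:

```python
import random

def apparent_diffusivity(p_cross, n_walkers=1500, n_steps=400,
                         step=1.0, barrier_spacing=20.0, seed=1):
    """Toy 1D Monte Carlo diffusion with permeable barriers.

    Each walker attempts a +/-step move per time step; a move that would
    cross one of the barriers placed every `barrier_spacing` (a crude
    stand-in for a myelin sheath) succeeds only with probability
    `p_cross`. Returns D estimated from <x^2> = 2*D*t (t in steps).
    """
    rng = random.Random(seed)
    total_sq = 0.0
    for _ in range(n_walkers):
        x = 0.0
        for _ in range(n_steps):
            new_x = x + (step if rng.random() < 0.5 else -step)
            crosses = int(x // barrier_spacing) != int(new_x // barrier_spacing)
            if not crosses or rng.random() < p_cross:
                x = new_x
        total_sq += x * x
    return total_sq / n_walkers / (2.0 * n_steps)

D_free = apparent_diffusivity(p_cross=1.0)       # fully permeable: free diffusion
D_restricted = apparent_diffusivity(p_cross=0.05)
print(D_free, D_restricted)
```

    Lowering `p_cross` mimics intact membranes and drives the apparent diffusivity down; raising it mimics membrane breakdown, the qualitative effect such simulations probe when relating DTI diffusivities to demyelination.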

  2. WE-EF-BRA-02: A Monte Carlo Study of Macroscopic and Microscopic Dose Descriptors for Kilovoltage Cellular Dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oliver, P; Thomson, R

    2015-06-15

    Purpose: To investigate how doses to cellular (microscopic) targets depend on cell morphology, and how cellular doses relate to doses to bulk tissues and water for 20 to 370 keV photon sources using Monte Carlo (MC) simulations. Methods: Simulation geometries involve cell clusters, single cells, and single nuclear cavities embedded in various healthy and cancerous bulk tissue phantoms. A variety of nucleus and cytoplasm elemental compositions are investigated. Cell and nucleus radii range from 5 to 10 microns and 2 to 9 microns, respectively. Doses to water and bulk tissue cavities are compared to nucleus and cytoplasm doses. Results: Variations in cell dose with simulation geometry are most pronounced for lower energy sources. Nuclear doses are sensitive to the surrounding geometry: the nuclear dose in a multicell model differs from the dose to a cavity of nuclear medium in an otherwise homogeneous bulk tissue phantom by more than 7% at 20 keV. Nuclear doses vary with cell size by up to 20% at 20 keV, with 10% differences persisting up to 90 keV. Bulk tissue and water cavity doses differ from cellular doses by up to 16%. MC results are compared to cavity theory predictions; large and small cavity theories qualitatively predict nuclear doses for energies below and above 50 keV, respectively. Burlin’s (1969) intermediate cavity theory best predicts MC results with an average discrepancy of 4%. Conclusion: Cellular doses vary as a function of source energy, subcellular compartment size, elemental composition, and tissue morphology. Neither water nor bulk tissue is an appropriate surrogate for subcellular targets in radiation dosimetry. The influence of microscopic inhomogeneities in the surrounding environment on the nuclear dose and the importance of the nucleus as a target for radiation-induced cell death emphasize the potential importance of cellular dosimetry for understanding radiation effects.
    Funded by the Natural Sciences and Engineering Research Council of Canada (NSERC), the Canada Research Chairs Program (CRC), and the Ontario Ministry of Training, Colleges and Universities.

  3. Particle-Based Simulations of Microscopic Thermal Properties of Confined Systems

    DTIC Science & Technology

    2014-11-01

    velocity versus electric field in gallium arsenide (GaAs) computed with the original CMC table structure (squares) at temperature T=150K, and the new... Keywords: computer-aided design, Cellular Monte Carlo, Ensemble Monte Carlo, gallium arsenide, Heat Transport Equation, DARPA (Defense Advanced Research Projects Agency)

  4. Quantifying white matter tract diffusion parameters in the presence of increased extra-fiber cellularity and vasogenic edema

    PubMed Central

    Chiang, Chia-Wen; Wang, Yong; Sun, Peng; Lin, Tsen-Hsuan; Trinkaus, Kathryn; Cross, Anne H.; Song, Sheng-Kwei

    2014-01-01

    The effect of extra-fiber structural and pathological components confounding diffusion tensor imaging (DTI) computation was quantitatively investigated using data generated by both Monte-Carlo simulations and tissue phantoms. An increased extent of vasogenic edema, produced by adding various amounts of gel to fixed normal mouse trigeminal nerves or by increasing non-restricted isotropic diffusion tensor components in Monte-Carlo simulations, significantly decreased fractional anisotropy (FA) and increased radial diffusivity, while increasing axial diffusivity less markedly, as derived by DTI. Increased cellularity, mimicked by a graded increase of the restricted isotropic diffusion tensor component in Monte-Carlo simulations, significantly decreased FA and axial diffusivity with limited impact on radial diffusivity derived by DTI. The MC simulation and tissue phantom data were also analyzed by the recently developed diffusion basis spectrum imaging (DBSI) to simultaneously distinguish and quantify axon/myelin integrity and extra-fiber diffusion components. Results showed that increased cellularity or vasogenic edema did not affect the DBSI-derived fiber FA, axial or radial diffusivity. Importantly, the extent of extra-fiber cellularity and edema estimated by DBSI correlated with the amounts added experimentally and in the Monte-Carlo simulations. We also examined the feasibility of applying a 25-direction diffusion encoding scheme for DBSI analysis on coherent white matter tracts. Results from both phantom experiments and simulations suggested that the 25-direction diffusion scheme provided DBSI estimates of both fiber diffusion parameters and extra-fiber cellularity/edema extent comparable to those from the 99-direction scheme.
    An in vivo 25-direction DBSI analysis was performed on the optic nerve in experimental autoimmune encephalomyelitis (EAE, an animal model of human multiple sclerosis) as an example to examine the validity of the derived DBSI parameters, with post-imaging immunohistochemistry verification. Results support that in vivo DBSI using the 25-direction diffusion scheme correctly reflects the underlying axonal injury, demyelination, and inflammation of optic nerves in EAE mice. PMID:25017446

  5. Modeling 2D and 3D diffusion.

    PubMed

    Saxton, Michael J

    2007-01-01

    Modeling obstructed diffusion is essential to the understanding of diffusion-mediated processes in the crowded cellular environment. Simple Monte Carlo techniques for modeling obstructed random walks are explained and related to Brownian dynamics and more complicated Monte Carlo methods. Random number generation is reviewed in the context of random walk simulations. Programming techniques and event-driven algorithms are discussed as ways to speed simulations.
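    The obstructed random walks this review covers are straightforward to reproduce in a minimal form: walkers on a periodic square lattice reject moves onto immobile point obstacles, and crowding suppresses the mean-square displacement. A sketch (function name and parameters are illustrative, not from the review):

```python
import random

def obstructed_msd(obstacle_fraction, n_walkers=500, n_steps=200,
                   size=64, seed=2):
    """Mean-square displacement of Monte Carlo random walkers on a
    periodic square lattice with immobile point obstacles; moves onto
    an obstacle are simply rejected (blocked)."""
    rng = random.Random(seed)
    n_obstacles = int(obstacle_fraction * size * size)
    blocked = set()
    while len(blocked) < n_obstacles:
        blocked.add((rng.randrange(size), rng.randrange(size)))
    moves = ((1, 0), (-1, 0), (0, 1), (0, -1))
    total = 0.0
    for _ in range(n_walkers):
        while True:  # start each walker on an unobstructed site
            sx, sy = rng.randrange(size), rng.randrange(size)
            if (sx, sy) not in blocked:
                break
        x, y = sx, sy  # unwrapped coordinates track true displacement
        for _ in range(n_steps):
            dx, dy = moves[rng.randrange(4)]
            if ((x + dx) % size, (y + dy) % size) not in blocked:
                x, y = x + dx, y + dy
        total += (x - sx) ** 2 + (y - sy) ** 2
    return total / n_walkers

free = obstructed_msd(0.0)      # unobstructed 2D walk: MSD ~ n_steps
crowded = obstructed_msd(0.3)   # 30% obstacles slow the walkers
print(free, crowded)
```

    On an unobstructed lattice the MSD grows linearly with the number of steps; with obstacles it falls below that line, the anomalous-subdiffusion signature that motivates the faster event-driven algorithms discussed in the article.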

  6. A cellular automaton model of wildfire propagation and extinction

    Treesearch

    Keith C. Clarke; James A. Brass; Phillip J. Riggan

    1994-01-01

    We propose a new model to predict the spatial and temporal behavior of wildfires. Fire spread and intensity were simulated using a cellular automaton model. Monte Carlo techniques were used to provide fire risk probabilities for areas where fuel loadings and topography are known. The model assumes predetermined or measurable environmental variables such as wind...
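    The combination described above can be sketched as a probabilistic cellular automaton replayed many times: cells hold fuel, burn for one step, and ignite neighbours with a spread probability, and Monte Carlo repetition over stochastic replays yields a burn-risk estimate. This is a toy sketch; the actual model conditions spread on measured fuel loadings, topography, and wind:

```python
import random

def burn(grid, p_spread, seed=0):
    """Propagate a fire to completion on a square grid (in place).

    Cell states: 0 = unburned fuel, 1 = burning, 2 = burned out.
    Each step, every burning cell tries to ignite each 4-neighbour
    fuel cell with probability p_spread (a crude stand-in for fuel
    loading, topography and wind), then burns out. Returns the
    number of burned-out cells.
    """
    rng = random.Random(seed)
    n = len(grid)
    while any(1 in row for row in grid):
        ignite = []
        for i in range(n):
            for j in range(n):
                if grid[i][j] == 1:
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        a, b = i + di, j + dj
                        if (0 <= a < n and 0 <= b < n and grid[a][b] == 0
                                and rng.random() < p_spread):
                            ignite.append((a, b))
                    grid[i][j] = 2
        for a, b in ignite:
            grid[a][b] = 1
    return sum(row.count(2) for row in grid)

def burned_fraction(p_spread, trials=100, n=21):
    """Monte Carlo burn-risk estimate: mean burned fraction over many
    stochastic replays from a central ignition point."""
    total = 0
    for t in range(trials):
        grid = [[0] * n for _ in range(n)]
        grid[n // 2][n // 2] = 1
        total += burn(grid, p_spread, seed=t)
    return total / (trials * n * n)

low_risk = burned_fraction(0.2)
high_risk = burned_fraction(0.6)
print(low_risk, high_risk)
```

    Below a critical spread probability the fire self-extinguishes in a small cluster; above it the burn typically sweeps most of the grid, which is why Monte Carlo averaging over replays is needed to turn the automaton into risk probabilities.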

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shin, J; Park, S; Jeong, J

    Purpose: In particle therapy and radiobiology, the investigation of mechanisms leading to the death of target cancer cells induced by ionising radiation is an active field of research. Recently, several studies based on Monte Carlo simulation codes have been initiated in order to simulate physical interactions of ionising particles at the cellular scale and in DNA. Geant4-DNA is one of them; it is an extension of the general-purpose Geant4 Monte Carlo simulation toolkit for the simulation of physical interactions at the sub-micrometre scale. In this study, we present Geant4-DNA Monte Carlo simulations for the prediction of DNA strand breakage using a geometrical modelling of DNA structure. Methods: For the simulation of DNA strand breakage, we developed a specific DNA geometrical structure. This structure consists of DNA components, such as the deoxynucleotide pairs, the DNA double helix, the nucleosomes and the chromatin fibre. Each component is made of water because the cross section models currently available in Geant4-DNA for protons apply to liquid water only. Also, at the macroscopic scale, protons were generated with various energies available for proton therapy at the National Cancer Center, obtained using validated proton beam simulations developed in previous studies. These multi-scale simulations were combined for the validation of Geant4-DNA in radiobiology. Results: In the double helix structure, the energy deposited in a strand allowed us to determine direct DNA damage from physical interactions. In other words, the amount of dose and frequency of damage in microscopic geometries was related to the direct radiobiological effect. Conclusion: In this report, we calculated the frequency of DNA strand breakage using Geant4-DNA physics processes for liquid water. This study is now ongoing in order to develop geometries that use realistic DNA material instead of liquid water.
    This will be tested as soon as cross sections for DNA material become available in Geant4-DNA.

  8. Monte Carlo calculations of the cellular S-values for α-particle-emitting radionuclides incorporated into the nuclei of cancer cells of the MDA-MB231, MCF7 and PC3 lines.

    PubMed

    Rojas-Calderón, E L; Ávila, O; Ferro-Flores, G

    2018-05-01

    S-values (dose per unit of cumulated activity) for alpha-particle-emitting radionuclides and monoenergetic alpha sources placed in the nuclei of three cancer cell models (MCF7 and MDA-MB231 breast cancer cells and PC3 prostate cancer cells) were obtained by Monte Carlo simulation. The MCNPX code was used to calculate the fraction of energy deposited in the subcellular compartments due to the alpha sources in order to obtain the S-values. A comparison with internationally accepted S-values reported by the MIRD Cellular Committee for alpha sources in three sizes of spherical cells was also performed, leading to agreement within 4% when an extended alpha source uniformly distributed in the nucleus is simulated. This result allowed us to apply the Monte Carlo methodology to evaluate S-values for alpha particles in cancer cells. The calculation of S-values for the nucleus, cytoplasm and membrane of cancer cells by means of Monte Carlo simulation, considering their particular geometry, radionuclide source distribution and chemical composition, provides a good approach for dosimetry assessment of alpha emitters inside cancer cells. Results from this work provide information and tools that may help researchers in the selection of appropriate radiopharmaceuticals in alpha-targeted cancer therapy and improve its dosimetry evaluation. Copyright © 2018 Elsevier Ltd. All rights reserved.

  9. A hybrid parallel framework for the cellular Potts model simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Yi; He, Kejing; Dong, Shoubin

    2009-01-01

    The Cellular Potts Model (CPM) has been widely used for biological simulations. However, most current implementations are either sequential or approximated, which cannot be used for large-scale complex 3D simulation. In this paper we present a hybrid parallel framework for CPM simulations. The time-consuming PDE solving, cell division, and cell reaction operations are distributed to clusters using the Message Passing Interface (MPI). The Monte Carlo lattice update is parallelized on shared-memory SMP systems using OpenMP. Because the Monte Carlo lattice update is much faster than the PDE solving and SMP systems are more and more common, this hybrid approach achieves good performance and high accuracy at the same time. Based on the parallel Cellular Potts Model, we studied avascular tumor growth using a multiscale model. The application and performance analysis show that the hybrid parallel framework is quite efficient. The hybrid parallel CPM can be used for the large-scale simulation (~10^8 sites) of the complex collective behavior of numerous cells (~10^6).
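    The Monte Carlo lattice update being parallelized here is, at its core, a Metropolis copy-attempt sweep over the lattice. A minimal serial 2D sketch, assuming the standard CPM Hamiltonian of adhesion between unlike spins plus a quadratic volume constraint (parameter values and function name are illustrative, not from the paper):

```python
import math
import random

def cpm_mcs(lattice, target_volume, J=1.0, lam=2.0, T=1.0, rng=None):
    """One Monte Carlo step (N*N copy attempts) of a minimal 2D
    cellular Potts model: adhesion energy J between unlike spins plus
    a quadratic volume constraint (spin 0 is medium, unconstrained).
    Periodic boundaries; modifies `lattice` in place.
    """
    rng = rng or random.Random()
    n = len(lattice)
    vol = {}
    for row in lattice:
        for s in row:
            vol[s] = vol.get(s, 0) + 1

    def adhesion(i, j, s):
        # boundary energy of site (i, j) if it held spin s
        return sum(J for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                   if lattice[(i + di) % n][(j + dj) % n] != s)

    for _ in range(n * n):
        i, j = rng.randrange(n), rng.randrange(n)
        di, dj = ((1, 0), (-1, 0), (0, 1), (0, -1))[rng.randrange(4)]
        s_new = lattice[(i + di) % n][(j + dj) % n]
        s_old = lattice[i][j]
        if s_new == s_old:
            continue
        d_e = adhesion(i, j, s_new) - adhesion(i, j, s_old)
        for s, delta in ((s_old, -1), (s_new, 1)):
            if s != 0:  # volume-constraint change for the two cells involved
                v = vol[s]
                d_e += lam * ((v + delta - target_volume) ** 2
                              - (v - target_volume) ** 2)
        if d_e <= 0 or rng.random() < math.exp(-d_e / T):  # Metropolis rule
            lattice[i][j] = s_new
            vol[s_old] -= 1
            vol[s_new] += 1
```

    Seeding a small cell of spin 1 in medium and sweeping repeatedly keeps its volume near `target_volume` while the boundary fluctuates. It is sweeps like this that the paper's framework shares among OpenMP threads, while the slower PDE solves are farmed out over MPI.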

  10. Monte Carlo simulation of proton track structure in biological matter

    DOE PAGES

    Quinto, Michele A.; Monti, Juan M.; Weck, Philippe F.; ...

    2017-05-25

    Here, understanding the radiation-induced effects at the cellular and subcellular levels remains crucial for predicting the evolution of irradiated biological matter. In this context, Monte Carlo track-structure simulations have rapidly emerged among the most suitable and powerful tools. However, most existing Monte Carlo track-structure codes rely heavily on the use of semi-empirical cross sections as well as water as a surrogate for biological matter. In the current work, we report on the up-to-date version of our homemade Monte Carlo code TILDA-V – devoted to the modeling of the slowing-down of 10 keV–100 MeV protons in both water and DNA – where the main collisional processes are described by means of an extensive set of ab initio differential and total cross sections.

  11. Monte Carlo simulation of proton track structure in biological matter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quinto, Michele A.; Monti, Juan M.; Weck, Philippe F.

    Here, understanding the radiation-induced effects at the cellular and subcellular levels remains crucial for predicting the evolution of irradiated biological matter. In this context, Monte Carlo track-structure simulations have rapidly emerged among the most suitable and powerful tools. However, most existing Monte Carlo track-structure codes rely heavily on the use of semi-empirical cross sections as well as water as a surrogate for biological matter. In the current work, we report on the up-to-date version of our homemade Monte Carlo code TILDA-V – devoted to the modeling of the slowing-down of 10 keV–100 MeV protons in both water and DNA – where the main collisional processes are described by means of an extensive set of ab initio differential and total cross sections.

  12. Monte Carlo Modeling of the Initial Radiation Emitted by a Nuclear Device in the National Capital Region

    DTIC Science & Technology

    2013-07-01

    also simulated in the models. Data was derived from calculations using the three-dimensional Monte Carlo radiation transport code MCNP (Monte Carlo N-Particle)... ('input deck') for the MCNP, Monte Carlo N-Particle, radiation transport code. MCNP is a general-purpose code designed to simulate neutron, photon

  13. Monte Carlo simulation of a simple gene network yields new evolutionary insights.

    PubMed

    Andrecut, M; Cloud, D; Kauffman, S A

    2008-02-07

    Monte Carlo simulations of a genetic toggle switch show that its behavior can be more complex than analytic models would suggest. We show here that as a result of the interplay between frequent and infrequent reaction events, such a switch can have more stable states than an analytic model would predict, and that the number and character of these states depend to a large extent on the propensity of transcription factors to bind to and dissociate from promoters. The effects of gene duplications differ even more; in analytic models, these seem to result in the disappearance of bi-stability and thus a loss of the switching function, but a Monte Carlo simulation shows that they can result in the appearance of new stable states without the loss of old ones, and thus in an increase of the complexity of the switch's behavior which may facilitate the evolution of new cellular functions. These differences are of interest with respect to the evolution of gene networks, particularly in clonal lines of cancer cells, where the duplication of active genes is an extremely common event, and often seems to result in the appearance of viable new cellular phenotypes.
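    Monte Carlo simulation of reaction networks like this toggle switch is exemplified by Gillespie's stochastic simulation algorithm: draw an exponential waiting time from the total propensity, then pick a reaction in proportion to its rate. A minimal sketch for a mutually repressing two-gene toggle (Hill repression; all rate constants are illustrative, not taken from the paper):

```python
import math
import random

def gillespie_toggle(t_end=500.0, k=10.0, K=4.0, hill=2, g=0.1, seed=0):
    """Gillespie stochastic simulation of a two-gene toggle switch in
    which each protein represses the other's synthesis (Hill kinetics).
    Returns the final copy numbers (a, b)."""
    rng = random.Random(seed)
    a, b, t = 0, 0, 0.0
    while t < t_end:
        rates = (k / (1.0 + (b / K) ** hill),  # synthesize A
                 k / (1.0 + (a / K) ** hill),  # synthesize B
                 g * a,                        # degrade A
                 g * b)                        # degrade B
        total = sum(rates)
        t += -math.log(1.0 - rng.random()) / total  # exponential waiting time
        r = rng.random() * total                    # pick a reaction
        if r < rates[0]:
            a += 1
        elif r < rates[0] + rates[1]:
            b += 1
        elif r < rates[0] + rates[1] + rates[2]:
            a -= 1
        else:
            b -= 1
    return a, b

# replicate runs settle into one of two states: A high/B low, or the reverse
states = [gillespie_toggle(seed=s) for s in range(10)]
print(states)
```

    Each replicate commits at random to one of the two attractors, which is the discrete, fluctuation-driven behavior that deterministic rate-equation models of the same network cannot capture.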

  14. Cellular dosimetry calculations for Strontium-90 using Monte Carlo code PENELOPE.

    PubMed

    Hocine, Nora; Farlay, Delphine; Boivin, Georges; Franck, Didier; Agarande, Michelle

    2014-11-01

    To improve risk assessments associated with chronic exposure to Strontium-90 (Sr-90), for both the environment and human health, it is necessary to know the energy distribution in specific cells or tissue. Monte Carlo (MC) simulation codes are extremely useful tools for calculating deposited energy. The present work focused on the validation of the MC code PENetration and Energy LOss of Positrons and Electrons (PENELOPE) and the assessment of the dose distribution to bone marrow cells from a point Sr-90 source localized within the cortical bone. S-value (absorbed dose per unit cumulated activity) calculations were performed by Monte Carlo simulation using PENELOPE and Monte Carlo N-Particle eXtended (MCNPX). The cytoplasm, nucleus, cell surface, mouse femur bone and Sr-90 radiation source were simulated. Cells are assumed to be spherical, with the radii of the cell and cell nucleus ranging from 2 to 10 μm. The Sr-90 source is assumed to be uniformly distributed in the cell nucleus, cytoplasm or cell surface. S-values calculated with PENELOPE agreed very well with the MCNPX results and the Medical Internal Radiation Dose (MIRD) values, with relative deviations of less than 4.5%. The dose distribution to mouse bone marrow cells showed that the cells localized near the cortical part received the maximum dose. The MC code PENELOPE may prove useful for cellular dosimetry involving radiation transport through materials other than water, or for complex distributions of radionuclides and geometries.

  15. SU-E-T-667: Radiosensitization Due to Gold Nanoparticles: A Monte Carlo Cellular Dosimetry Investigation of An Expansive Parameter Space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martinov, M; Thomson, R

    2015-06-15

    Purpose: To investigate dose enhancement to cellular compartments following gold nanoparticle (GNP) uptake in tissue, varying cell and tissue morphology, intra- and extracellular GNP distribution, and source energy using Monte Carlo (MC) simulations. Methods: Models of single and multiple cells are developed for normal and cancerous tissues; cells (outer radii 5–10 µm) are modeled as concentric spheres comprising the nucleus (radii 2.5–7.5 µm) and cytoplasm. GNP distributions modeled include homogeneous distributions throughout the cytoplasm, variable numbers of GNP-containing endosomes within the cytoplasm, or distribution in a spherical shell about the nucleus. Gold concentrations range from 1 to 30 mg/g. Doses to the nucleus and to the cytoplasm for simulations including GNPs are compared to simulations without GNPs to compute Nuclear and Cytoplasm Dose Enhancement Factors (NDEF, CDEF). Photon source energies are between 20 keV and 1.25 MeV. Results: DEFs are highly sensitive to GNP intracellular distribution; for a 2.5 µm radius nucleus irradiated by a 30 keV source, NDEF varies from 1.2 for a single endosome containing all GNPs to 8.2 for GNPs distributed about the nucleus (7 mg/g). DEFs vary with cell dimensions and source energy: NDEFs vary from 2.5 (90 keV) to 8.2 (30 keV) for a 2.5 µm radius nucleus and from 1.1 (90 keV) to 1.3 (30 keV) for a 7.5 µm radius nucleus, both with GNPs in a spherical shell about the nucleus (7 mg/g). NDEF and CDEF are generally different within a single cell. For multicell models, the presence of gold within intervening tissues between source and target perturbs the fluence reaching cellular targets, resulting in DEF inhomogeneities within a population of irradiated cells. Conclusion: DEFs vary by an order of magnitude for different cell models, GNP distributions, and source energies, demonstrating the importance of detailed modelling for advancing GNP development for radiotherapy.
    Funding provided by the Natural Sciences and Engineering Research Council of Canada (NSERC) and the Canada Research Chairs Program (CRC).

  16. SU-E-T-188: Film Dosimetry Verification of Monte Carlo Generated Electron Treatment Plans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Enright, S; Asprinio, A; Lu, L

    2014-06-01

    Purpose: The purpose of this study was to compare dose distributions from film measurements to Monte Carlo generated electron treatment plans. Irradiation with electrons offers the advantages of dose uniformity in the target volume and of minimizing the dose to deeper healthy tissue. Using the Monte Carlo algorithm will improve dose accuracy in regions with heterogeneities and irregular surfaces. Methods: Dose distributions from GafChromic™ EBT3 films were compared to dose distributions from the Electron Monte Carlo algorithm in the Eclipse™ radiotherapy treatment planning system. These measurements were obtained for 6 MeV, 9 MeV and 12 MeV electrons at two depths. All phantoms studied were imported into Eclipse by CT scan. A 1 cm thick solid water template with holes for bone-like and lung-like plugs was used. Different configurations were used with the different plugs inserted into the holes. Configurations with solid-water plugs stacked on top of one another were also used to create an irregular surface. Results: The dose distributions measured from the film agreed with those from the Electron Monte Carlo treatment plan. Accuracy of the Electron Monte Carlo algorithm was also compared to that of Pencil Beam. Dose distributions from Monte Carlo had much higher pass rates than distributions from Pencil Beam when compared to the film. The pass rate for Monte Carlo was in the 80%–99% range, whereas the pass rate for Pencil Beam was as low as 10.76%. Conclusion: The dose distribution from Monte Carlo agreed with the measured dose from the film. When compared to the Pencil Beam algorithm, pass rates for Monte Carlo were much higher. Monte Carlo should be used over Pencil Beam for regions with heterogeneities and irregular surfaces.

  17. Diffusional mechanisms augment the fluorine magnetic resonance relaxation in paramagnetic perfluorocarbon nanoparticles that provides a “relaxation switch” for detecting cellular endosomal activation

    PubMed Central

    Hu, Lingzhi; Zhang, Lei; Chen, Junjie; Lanza, Gregory M.; Wickline, Samuel A.

    2011-01-01

    Purpose: To develop a physical model for the 19F relaxation enhancement in paramagnetic perfluorocarbon nanoparticles (PFC NP) and demonstrate its application in monitoring cellular endosomal functionality through a “19F relaxation switch” phenomenon. Materials and Methods: An explicit expression for 19F longitudinal relaxation enhancement was derived analytically. Monte-Carlo simulation was performed to confirm the gadolinium-induced magnetic field inhomogeneity inside the PFC NP. Field-dependent T1 measurements for three types of paramagnetic PFC NPs were carried out to validate the theoretical prediction. Based on the physical model, 19F and 1H relaxation properties of macrophage-internalized paramagnetic PFC NPs were measured to evaluate the intracellular processing of NPs by macrophages in vitro. Results: The theoretical description was confirmed experimentally by field-dependent T1 measurements. The shortening of 19F T1 was found to be attributable to the Brownian motion of PFC molecules inside the NP in conjunction with their ability to permeate into the lipid surfactant coating. A dramatic change of 19F T1 was observed upon endocytosis, revealing the transition from intact bound PFC NP to processed constituents. Conclusion: The proposed first-principles analysis of 19F spins in paramagnetic PFC NP relates their structural parameters to the special MR relaxation features. The demonstrated “19F relaxation switch” phenomenon is potentially useful for monitoring cellular endosomal functionality. PMID:21761488

  18. Kinetic Monte Carlo and cellular particle dynamics simulations of multicellular systems

    NASA Astrophysics Data System (ADS)

    Flenner, Elijah; Janosi, Lorant; Barz, Bogdan; Neagu, Adrian; Forgacs, Gabor; Kosztin, Ioan

    2012-03-01

    Computer modeling of multicellular systems has been a valuable tool for interpreting and guiding in vitro experiments relevant to embryonic morphogenesis, tumor growth, angiogenesis and, lately, structure formation following the printing of cell aggregates as bioink particles. Here we formulate two computer simulation methods: (1) a kinetic Monte Carlo (KMC) and (2) a cellular particle dynamics (CPD) method, which are capable of describing and predicting the shape evolution in time of three-dimensional multicellular systems during their biomechanical relaxation. Our work is motivated by the need of developing quantitative methods for optimizing postprinting structure formation in bioprinting-assisted tissue engineering. The KMC and CPD model parameters are determined and calibrated by using an original computational-theoretical-experimental framework applied to the fusion of two spherical cell aggregates. The two methods are used to predict the (1) formation of a toroidal structure through fusion of spherical aggregates and (2) cell sorting within an aggregate formed by two types of cells with different adhesivities.

  19. Benchmarking PARTISN with Analog Monte Carlo: Moments of the Neutron Number and the Cumulative Fission Number Probability Distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Rourke, Patrick Francis

    The purpose of this report is to provide the reader with an understanding of how a Monte Carlo neutron transport code was written, developed, and evolved to calculate the probability distribution functions (PDFs) and their moments for the neutron number at a final time as well as the cumulative fission number, along with introducing several basic Monte Carlo concepts.
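    The quantity being benchmarked, the probability distribution of the neutron number, can be illustrated with a toy space- and energy-independent (lumped) analog Monte Carlo of a branching process. The fission probability and discrete nu distribution below are invented for illustration, not taken from the report:

```python
import random

def neutron_moments(p_fission=0.4, nu_dist=((0, 0.2), (2, 0.5), (3, 0.3)),
                    n_gen=10, trials=20000, seed=1):
    """Analog Monte Carlo for a lumped fission chain: each neutron
    either disappears (absorption/leakage) or, with probability
    p_fission, causes fission emitting nu secondaries drawn from
    nu_dist ((value, probability) pairs). Tallies the first two
    moments of the neutron number after n_gen generations, starting
    from a single neutron."""
    rng = random.Random(seed)
    outcomes, weights = zip(*nu_dist)
    m1 = m2 = 0.0
    for _ in range(trials):
        n = 1
        for _ in range(n_gen):
            if n == 0:
                break  # chain has died out
            n = sum(rng.choices(outcomes, weights)[0]
                    for _ in range(n) if rng.random() < p_fission)
        m1 += n
        m2 += n * n
    return m1 / trials, m2 / trials

mean, second = neutron_moments()
print(mean, second)
```

    For a lumped chain like this the first moment has a closed form, (p_fission · E[nu])^n_gen, which gives an analytic check on the tally, in the same spirit as benchmarking the transport code against PARTISN.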

  20. Sample Size and Power Estimates for a Confirmatory Factor Analytic Model in Exercise and Sport: A Monte Carlo Approach

    ERIC Educational Resources Information Center

    Myers, Nicholas D.; Ahn, Soyeon; Jin, Ying

    2011-01-01

    Monte Carlo methods can be used in data analytic situations (e.g., validity studies) to make decisions about sample size and to estimate power. The purpose of using Monte Carlo methods in a validity study is to improve the methodological approach within a study where the primary focus is on construct validity issues and not on advancing…
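    The Monte Carlo recipe for power estimation is generic: simulate data under an assumed effect, run the test, and count rejections. In the sketch below a simple one-sample z-test stands in for the confirmatory factor analysis model fit; all settings are illustrative:

```python
import math
import random

def mc_power(effect, n, trials=2000, seed=0):
    """Monte Carlo power estimate for a two-sided one-sample z-test of
    mean 0 (known sigma = 1) when the true standardized effect size is
    `effect`. The same recipe extends to CFA models by swapping the
    z-test for a model-fitting and fit-testing step."""
    rng = random.Random(seed)
    z_crit = 1.96  # two-sided alpha = 0.05
    rejections = 0
    for _ in range(trials):
        sample_mean = sum(rng.gauss(effect, 1.0) for _ in range(n)) / n
        if abs(sample_mean) * math.sqrt(n) > z_crit:  # reject H0?
            rejections += 1
    return rejections / trials

power = mc_power(effect=0.5, n=30)           # analytic power here is ~0.78
type_i = mc_power(effect=0.0, n=30, seed=1)  # should hover near alpha = 0.05
print(power, type_i)
```

    Repeating the estimate over a grid of sample sizes, and checking that the null-effect rejection rate stays near alpha, is exactly the sample-size/power decision procedure the article advocates.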

  21. Accuracy of Monte Carlo simulations compared to in-vivo MDCT dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bostani, Maryam, E-mail: mbostani@mednet.ucla.edu; McMillan, Kyle; Cagnon, Chris H.

    Purpose: The purpose of this study was to assess the accuracy of a Monte Carlo simulation-based method for estimating radiation dose from multidetector computed tomography (MDCT) by comparing simulated doses in ten patients to in-vivo dose measurements. Methods: MD Anderson Cancer Center Institutional Review Board approved the acquisition of in-vivo rectal dose measurements in a pilot study of ten patients undergoing virtual colonoscopy. The dose measurements were obtained by affixing TLD capsules to the inner lumen of rectal catheters. Voxelized patient models were generated from the MDCT images of the ten patients, and the dose to the TLD for all exposures was estimated using Monte Carlo based simulations. The Monte Carlo simulation results were compared to the in-vivo dose measurements to determine accuracy. Results: The calculated mean percent difference between TLD measurements and Monte Carlo simulations was −4.9% with standard deviation of 8.7% and a range of −22.7% to 5.7%. Conclusions: The results of this study demonstrate very good agreement between simulated and measured doses in-vivo. Taken together with previous validation efforts, this work demonstrates that the Monte Carlo simulation methods can provide accurate estimates of radiation dose in patients undergoing CT examinations.

  22. Performance Comparison of Orthogonal and Quasi-orthogonal Codes in Quasi-Synchronous Cellular CDMA Communication

    NASA Astrophysics Data System (ADS)

    Jos, Sujit; Kumar, Preetam; Chakrabarti, Saswat

    Orthogonal and quasi-orthogonal codes are an integral part of any DS-CDMA based cellular system. Orthogonal codes are ideal for use in perfectly synchronous scenarios like downlink cellular communication. Quasi-orthogonal codes are preferred over orthogonal codes in the uplink communication, where perfect synchronization cannot be achieved. In this paper, we attempt to compare orthogonal and quasi-orthogonal codes in the presence of timing synchronization error. This will give insight into the synchronization demands of DS-CDMA systems employing the two classes of sequences. The synchronization error considered is smaller than the chip duration. Monte-Carlo simulations have been carried out to verify the analytical and numerical results.

  23. Intrinsic fluorescence of protein in turbid media using empirical relation based on Monte Carlo lookup table

    NASA Astrophysics Data System (ADS)

    Einstein, Gnanatheepam; Udayakumar, Kanniyappan; Aruna, Prakasarao; Ganesan, Singaravelu

    2017-03-01

    Fluorescence of protein has been widely used in diagnostic oncology for characterizing cellular metabolism. However, the intensity of fluorescence emission is affected by absorbers and scatterers in tissue, which may lead to error in estimating the exact protein content in tissue. Extraction of intrinsic fluorescence from measured fluorescence has been achieved by different methods. Among them, Monte Carlo based methods yield the highest accuracy for extracting intrinsic fluorescence. In this work, we have attempted to generate a lookup table for Monte Carlo simulation of fluorescence emission by protein. Furthermore, we fitted the generated lookup table using an empirical relation. The empirical relation between measured and intrinsic fluorescence is validated using tissue phantom experiments. The proposed relation can be used for estimating the intrinsic fluorescence of protein for real-time diagnostic applications, thereby improving the clinical interpretation of fluorescence spectroscopic data.

  24. Effect of the multiple scattering of electrons in Monte Carlo simulation of LINACS.

    PubMed

    Vilches, Manuel; García-Pareja, Salvador; Guerrero, Rafael; Anguiano, Marta; Lallena, Antonio M

    2008-01-01

    Results obtained from Monte Carlo simulations of the transport of electrons in thin slabs of dense material media and air slabs with different widths are analyzed. Various general purpose Monte Carlo codes have been used: PENELOPE, GEANT3, GEANT4, EGSNRC, MCNPX. Non-negligible differences between the angular and radial distributions after the slabs have been found. The effects of these differences on the depth doses measured in water are also discussed.

  5. Discriminating cellular heterogeneity using microwell-based RNA cytometry

    PubMed Central

    Dimov, Ivan K.; Lu, Rong; Lee, Eric P.; Seita, Jun; Sahoo, Debashis; Park, Seung-min; Weissman, Irving L.; Lee, Luke P.

    2014-01-01

    Discriminating cellular heterogeneity is important for understanding cellular physiology. However, it is limited by the technical difficulties of single-cell measurements. Here, we develop a two-stage system to determine cellular heterogeneity. In the first stage, we perform multiplex single-cell RNA-cytometry in a microwell array containing over 60,000 reaction chambers. In the second stage, we use the RNA-cytometry data to determine cellular heterogeneity by providing a heterogeneity likelihood score. Moreover, we use Monte-Carlo simulation and RNA-cytometry data to calculate the minimum number of cells required for detecting heterogeneity. We applied this system to characterize the RNA distributions of aging-related genes in a highly purified mouse hematopoietic stem cell population. We identified genes that reveal novel heterogeneity of these cells. We also show that changes in expression of genes such as Birc6 during aging can be attributed to the shift of relative portions of cells in the high-expressing subgroup versus the low-expressing subgroup. PMID:24667995
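
    The second-stage calculation above (using Monte-Carlo simulation to find the minimum number of cells needed to detect heterogeneity) can be sketched as follows. This is a minimal illustration only: the two-subgroup Gaussian expression model, the gap-based detection test, and every parameter value are assumptions made for the sketch, not the authors' actual statistical procedure.

```python
import random

def detect_bimodal(sample, gap=2.0):
    """Crude heterogeneity call: flag a sample whose ordered values
    contain a jump wider than `gap` (two separated subgroups)."""
    s = sorted(sample)
    return max(b - a for a, b in zip(s, s[1:])) > gap

def min_cells_for_detection(p_high=0.3, mu_low=2.0, mu_high=8.0, sigma=0.5,
                            power=0.95, trials=2000, seed=1):
    """Smallest sample size whose Monte-Carlo detection rate reaches
    `power` under a two-subgroup log-expression model."""
    rng = random.Random(seed)
    for n in range(4, 200):
        hits = 0
        for _ in range(trials):
            sample = [rng.gauss(mu_high if rng.random() < p_high else mu_low,
                                sigma)
                      for _ in range(n)]
            hits += detect_bimodal(sample)
        if hits / trials >= power:
            return n
    return None
```

    With these assumed parameters the answer is driven mainly by the chance of missing the rarer subgroup entirely, so the required sample size is roughly the n at which (1 - p_high)^n drops below 1 - power.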

  6. PyMercury: Interactive Python for the Mercury Monte Carlo Particle Transport Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iandola, F N; O'Brien, M J; Procassini, R J

    2010-11-29

    Monte Carlo particle transport applications are often written in low-level languages (C/C++) for optimal performance on clusters and supercomputers. However, this development approach often sacrifices straightforward usability and testing in the interest of fast application performance. To improve usability, some high-performance computing applications employ mixed-language programming with high-level and low-level languages. In this study, we consider the benefits of incorporating an interactive Python interface into a Monte Carlo application. With PyMercury, a new Python extension to the Mercury general-purpose Monte Carlo particle transport code, we improve application usability without diminishing performance. In two case studies, we illustrate how PyMercury improves usability and simplifies testing and validation in a Monte Carlo application. In short, PyMercury demonstrates the value of interactive Python for Monte Carlo particle transport applications. In the future, we expect interactive Python to play an increasingly significant role in Monte Carlo usage and testing.

  7. Self-learning Monte Carlo method

    DOE PAGES

    Liu, Junwei; Qi, Yang; Meng, Zi Yang; ...

    2017-01-04

    Monte Carlo simulation is an unbiased numerical tool for studying classical and quantum many-body systems. One of its bottlenecks is the lack of a general and efficient update algorithm for large systems close to the phase transition, for which local updates perform badly. In this Rapid Communication, we propose a general-purpose Monte Carlo method, dubbed self-learning Monte Carlo (SLMC), in which an efficient update algorithm is first learned from the training data generated in trial simulations and then used to speed up the actual simulation. Lastly, we demonstrate the efficiency of SLMC in a spin model at the phase transition point, achieving a 10-20 times speedup.

  8. Monte Carlo Simulation of a Segmented Detector for Low-Energy Electron Antineutrinos

    NASA Astrophysics Data System (ADS)

    Qomi, H. Akhtari; Safari, M. J.; Davani, F. Abbasi

    2017-11-01

    Detection of low-energy electron antineutrinos is important for several purposes, such as ex-vessel reactor monitoring and neutrino oscillation studies. The inverse beta decay (IBD) interaction is responsible for the detection mechanism in (organic) plastic scintillation detectors. Here, a detailed study is presented of the radiation and optical transport simulation of a typical segmented antineutrino detector with the Monte Carlo method using the MCNPX and FLUKA codes. This study shows different aspects of the detector, benefiting from the inherent capabilities of the Monte Carlo simulation codes.

  9. APS undulator and wiggler sources: Monte-Carlo simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, S.L.; Lai, B.; Viccaro, P.J.

    1992-02-01

    Standard insertion devices will be provided to each sector by the Advanced Photon Source. It is important to define the radiation characteristics of these general-purpose devices. In this document, results of Monte-Carlo simulations are presented. These results, based on the SHADOW program, include the APS Undulator A (UA), Wiggler A (WA), and Wiggler B (WB).

  10. The Development of Design Tools for Fault Tolerant Quantum Dot Cellular Automata Based Logic

    NASA Technical Reports Server (NTRS)

    Armstrong, Curtis D.; Humphreys, William M.

    2003-01-01

    We are developing software to explore the fault tolerance of quantum dot cellular automata gate architectures in the presence of manufacturing variations and device defects. The Topology Optimization Methodology using Applied Statistics (TOMAS) framework extends the capabilities of the Quantum Interconnected Network Array Simulator (AQUINAS) by adding front-end and back-end software and creating an environment that integrates all of these components. The front-end tools establish all simulation parameters, configure the simulation system, automate the Monte Carlo generation of simulation files, and execute the simulation of these files. The back-end tools perform automated data parsing, statistical analysis and report generation.

  11. DNA double strand break (DSB) induction and cell survival in iodine-enhanced computed tomography (CT)

    NASA Astrophysics Data System (ADS)

    Streitmatter, Seth W.; Stewart, Robert D.; Jenkins, Peter A.; Jevremovic, Tatjana

    2017-08-01

    A multi-scale Monte Carlo model is proposed to assess the dosimetric and biological impact of iodine-based contrast agents commonly used in computed tomography. As presented, the model integrates the general purpose MCNP6 code system for larger-scale radiation transport and dose assessment with the Monte Carlo damage simulation to determine the sub-cellular characteristics and spatial distribution of initial DNA damage. The repair-misrepair-fixation model is then used to relate DNA double strand break (DSB) induction to reproductive cell death. Comparisons of measured and modeled changes in reproductive cell survival for ultrasoft characteristic k-shell x-rays (0.25-4.55 keV) up to orthovoltage (200-500 kVp) x-rays indicate that the relative biological effectiveness (RBE) for DSB induction is within a few percent of the RBE for cell survival. Because of the very short range of secondary electrons produced by low energy x-ray interactions with contrast agents, the concentration and subcellular distribution of iodine within and near cellular targets have a significant impact on the estimated absorbed dose and number of DSB produced in the cell nucleus. For some plausible models of the cell-level distribution of contrast agent, the model predicts an increase in RBE-weighted dose (RWD) for the endpoint of DSB induction of 1.22-1.40 for a 5-10 mg ml-1 iodine concentration in blood compared to an RWD increase of 1.07  ±  0.19 from a recent clinical trial. The modeled RWD of 2.58  ±  0.03 is also in good agreement with the measured RWD of 2.3  ±  0.5 for an iodine concentration of 50 mg ml-1 relative to no iodine. The good agreement between modeled and measured DSB and cell survival estimates provides some confidence that the presented model can be used to accurately assess biological dose for other concentrations of the same or different contrast agents.

  12. Kinetic Monte Carlo Method for Rule-based Modeling of Biochemical Networks

    PubMed Central

    Yang, Jin; Monine, Michael I.; Faeder, James R.; Hlavacek, William S.

    2009-01-01

    We present a kinetic Monte Carlo method for simulating chemical transformations specified by reaction rules, which can be viewed as generators of chemical reactions, or equivalently, definitions of reaction classes. A rule identifies the molecular components involved in a transformation, how these components change, conditions that affect whether a transformation occurs, and a rate law. The computational cost of the method, unlike conventional simulation approaches, is independent of the number of possible reactions, which need not be specified in advance or explicitly generated in a simulation. To demonstrate the method, we apply it to study the kinetics of multivalent ligand-receptor interactions. We expect the method will be useful for studying cellular signaling systems and other physical systems involving aggregation phenomena. PMID:18851068
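
    For orientation, a conventional Gillespie-style kinetic Monte Carlo loop for a single reversible binding rule, L + R <-> LR, is sketched below. Note the contrast with the paper's method: here the reaction list is fixed and enumerated up front, whereas the rule-based algorithm avoids that enumeration. The rate constants and copy numbers are arbitrary illustrative choices.

```python
import math
import random

def gillespie_binding(L0=100, R0=50, kon=0.01, koff=0.1, t_end=50.0, seed=7):
    """Stochastic simulation of the single reversible rule L + R <-> LR.
    Returns the final copy numbers (L, R, LR)."""
    rng = random.Random(seed)
    L, R, LR = L0, R0, 0
    t = 0.0
    while True:
        a_on = kon * L * R        # propensity of the binding rule
        a_off = koff * LR         # propensity of the unbinding rule
        a_tot = a_on + a_off
        if a_tot == 0.0:
            break
        t += -math.log(1.0 - rng.random()) / a_tot   # exponential waiting time
        if t > t_end:
            break
        if rng.random() * a_tot < a_on:              # fire binding
            L, R, LR = L - 1, R - 1, LR + 1
        else:                                        # fire unbinding
            L, R, LR = L + 1, R + 1, LR - 1
    return L, R, LR
```

    The two conservation laws (L + LR and R + LR are constant) make this sketch easy to sanity-check.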

  13. An efficient Cellular Potts Model algorithm that forbids cell fragmentation

    NASA Astrophysics Data System (ADS)

    Durand, Marc; Guesnet, Etienne

    2016-11-01

    The Cellular Potts Model (CPM) is a lattice-based modeling technique which is widely used for simulating cellular patterns such as foams or biological tissues. Despite its realism and generality, the standard Monte Carlo algorithm used in the scientific literature to evolve this model preserves the connectivity of cells over only a limited range of simulation temperatures. We present a new algorithm in which cell fragmentation is forbidden at all simulation temperatures. This significantly enhances the realism of the simulated patterns. It also increases computational efficiency compared with the standard CPM algorithm, even at the same simulation temperature, thanks to the time saved by not attempting unrealistic moves. Moreover, our algorithm restores the detailed balance equation, ensuring that the long-term state is independent of the chosen acceptance rate and the chosen path in temperature space.
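
    A minimal sketch of the standard CPM Metropolis algorithm that the paper improves upon is given below (boundary energy plus an area constraint). Deliberately, it does not enforce cell connectivity, which is exactly the deficiency the proposed algorithm addresses; all parameter names and values are illustrative assumptions.

```python
import math
import random

def neighbors(i, j, n):
    """Von Neumann neighbours on an n x n periodic lattice."""
    return [((i + 1) % n, j), ((i - 1) % n, j), (i, (j + 1) % n), (i, (j - 1) % n)]

def boundary_energy(lat, i, j, spin, n, J):
    """Contact energy site (i, j) would contribute if it held `spin`."""
    return sum(J for a, b in neighbors(i, j, n) if lat[a][b] != spin)

def cpm_sweep(lat, areas, targets, n, T, J=1.0, lam=0.5, rng=random):
    """One sweep of Metropolis copy attempts (boundary + area terms).
    Cell connectivity is NOT enforced, as in the standard algorithm."""
    for _ in range(n * n):
        i, j = rng.randrange(n), rng.randrange(n)
        a, b = rng.choice(neighbors(i, j, n))
        s_old, s_new = lat[i][j], lat[a][b]
        if s_old == s_new:
            continue
        dH = (boundary_energy(lat, i, j, s_new, n, J)
              - boundary_energy(lat, i, j, s_old, n, J))
        for s, d in ((s_old, -1), (s_new, +1)):   # area-constraint change
            if s in targets:
                dH += lam * ((areas[s] + d - targets[s]) ** 2
                             - (areas[s] - targets[s]) ** 2)
        if dH <= 0 or rng.random() < math.exp(-dH / T):
            lat[i][j] = s_new
            areas[s_old] -= 1
            areas[s_new] = areas.get(s_new, 0) + 1
```

    At high T this sketch readily fragments cells, which is the pathology the paper's forbidden-fragmentation algorithm removes.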

  14. An NCME Instructional Module on Estimating Item Response Theory Models Using Markov Chain Monte Carlo Methods

    ERIC Educational Resources Information Center

    Kim, Jee-Seon; Bolt, Daniel M.

    2007-01-01

    The purpose of this ITEMS module is to provide an introduction to Markov chain Monte Carlo (MCMC) estimation for item response models. A brief description of Bayesian inference is followed by an overview of the various facets of MCMC algorithms, including discussion of prior specification, sampling procedures, and methods for evaluating chain…

  15. Cellular reprogramming dynamics follow a simple 1D reaction coordinate

    NASA Astrophysics Data System (ADS)

    Teja Pusuluri, Sai; Lang, Alex H.; Mehta, Pankaj; Castillo, Horacio E.

    2018-01-01

    Cellular reprogramming, the conversion of one cell type to another, induces global changes in gene expression involving thousands of genes, and understanding how cells globally alter their gene expression profile during reprogramming is an ongoing problem. Here we reanalyze time-course data on cellular reprogramming from differentiated cell types to induced pluripotent stem cells (iPSCs) and show that gene expression dynamics during reprogramming follow a simple 1D reaction coordinate. This reaction coordinate is independent of both the time it takes to reach the iPSC state as well as the details of the experimental protocol used. Using Monte-Carlo simulations, we show that such a reaction coordinate emerges from epigenetic landscape models where cellular reprogramming is viewed as a ‘barrier-crossing’ process between cell fates. Overall, our analysis and model suggest that gene expression dynamics during reprogramming follow a canonical trajectory consistent with the idea of an ‘optimal path’ in gene expression space for reprogramming.

  16. Accelerate quasi Monte Carlo method for solving systems of linear algebraic equations through shared memory

    NASA Astrophysics Data System (ADS)

    Lai, Siyan; Xu, Ying; Shao, Bo; Guo, Menghan; Lin, Xiaola

    2017-04-01

    In this paper we study the Monte Carlo method for solving systems of linear algebraic equations (SLAE) using shared memory. Previous research demonstrated that GPUs can effectively speed up these computations. Our purpose is to optimize the Monte Carlo simulation specifically for the GPU memory architecture. Random numbers are organized and stored in shared memory, which accelerates the parallel algorithm, and bank conflicts are avoided by our Collaborative Thread Arrays (CTA) scheme. Experimental results show that the shared-memory-based strategy speeds up the computations by more than 3X in the best case.
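
    The underlying Monte Carlo method for SLAE can be illustrated with a plain serial sketch (no GPU or shared-memory machinery): for x = Hx + b with a convergent Neumann series, each component x_i is estimated by random walks whose weights accumulate products of entries of H. The uniform transition probabilities, the cutoff rule and all numerical values below are illustrative assumptions.

```python
import random

def mc_component(H, b, i, walks=20000, cutoff=1e-4, seed=3):
    """Estimate x_i for x = H x + b (Neumann series x = b + H b + H^2 b + ...)
    by random walks with uniform transition probability p = 1/n and
    importance weights W *= H[k][j] / p."""
    rng = random.Random(seed)
    n = len(b)
    p = 1.0 / n
    total = 0.0
    for _ in range(walks):
        k, W, est = i, 1.0, b[i]
        while abs(W) > cutoff:          # truncate once the weight is tiny
            j = rng.randrange(n)
            W *= H[k][j] / p
            est += W * b[j]
            k = j
        total += est
    return total / walks
```

    For a small contraction matrix the estimate can be checked against the direct solution of (I - H) x = b.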

  17. Physical limits to autofluorescence signals in vivo recordings in the rat olfactory bulb: a Monte Carlo study

    NASA Astrophysics Data System (ADS)

    L'Heureux, B.; Gurden, H.; Pinot, L.; Mastrippolito, R.; Lefebvre, F.; Lanièce, P.; Pain, F.

    2007-07-01

    Understanding the cellular mechanisms of energy supply to neurons following physiological activation is still challenging and has strong implications for the interpretation of clinical functional images based on metabolic signals such as Blood Oxygen Level Dependent Magnetic Resonance Imaging or 18F-Fluorodeoxy-Glucose Positron Emission Tomography. Intrinsic Optical Signal Imaging provides high spatio-temporal resolution in vivo imaging in the anaesthetized rat. In this context, intrinsic signals are mainly related to changes in the optical absorption of haemoglobin depending on its oxygenation state. This technique has been validated for imaging of the rat olfactory bulb, providing maps of the activated olfactory glomeruli, the functional modules involved in the first step of olfactory coding. A complementary approach would be autofluorescence imaging, relying on the fluorescence properties of endogenous Flavin Adenine Dinucleotide (FAD) and Nicotinamide Adenine Dinucleotide (NADH), both involved in intracellular metabolic pathways. The purpose of the present study was to investigate the feasibility of in vivo autofluorescence imaging in the rat olfactory bulb. We performed standard Monte Carlo simulations of photon scattering and absorption at the excitation and emission wavelengths of FAD and NADH fluorescence. The distribution of fluorescence in the glomerulus, the effect of haemoglobin absorption at the excitation and emission wavelengths, as well as the blurring due to photon scattering and the depth of focus of the optical apparatus have been characterized. Finally, optimal experimental parameters are proposed to achieve in vivo validation of the technique in the rat olfactory bulb.
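
    The kind of photon-transport Monte Carlo used in such studies reduces to a weighted random walk with exponential step lengths between interaction events. The sketch below is a generic textbook scheme in a semi-infinite medium with isotropic scattering and made-up optical coefficients, not the authors' tissue geometry or phase function.

```python
import math
import random

def diffuse_reflectance(mu_a=0.1, mu_s=10.0, n_photons=2000, seed=5):
    """Weighted photon random walk in a semi-infinite medium.
    Returns the fraction of launched weight that re-escapes the
    surface (z < 0); the rest is absorbed at depth."""
    rng = random.Random(seed)
    mu_t = mu_a + mu_s
    escaped = 0.0
    for _ in range(n_photons):
        z, uz, w = 0.0, 1.0, 1.0                  # launch straight down
        while w > 1e-4:                           # kill tiny-weight photons
            z += uz * (-math.log(1.0 - rng.random()) / mu_t)
            if z < 0.0:                           # crossed back out: escapes
                escaped += w
                break
            w *= mu_s / mu_t                      # deposit fraction mu_a/mu_t
            uz = rng.uniform(-1.0, 1.0)           # isotropic: uniform cos(theta)
    return escaped / n_photons
```

    Increasing the absorption coefficient (e.g. to mimic haemoglobin at a strongly absorbed wavelength) lowers the escaping fraction, which is the mechanism behind the absorption artefacts discussed in the abstract.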

  18. Joint simulation of regional areas burned in Canadian forest fires: A Markov Chain Monte Carlo approach

    Treesearch

    Steen Magnussen

    2009-01-01

    Areas burned annually in 29 Canadian forest fire regions show a patchy and irregular correlation structure that significantly influences the distribution of annual totals for Canada and for groups of regions. A binary Monte Carlo Markov Chain (MCMC) is constructed for the purpose of joint simulation of regional areas burned in forest fires. For each year the MCMC...

  19. Self-Learning Monte Carlo Method

    NASA Astrophysics Data System (ADS)

    Liu, Junwei; Qi, Yang; Meng, Zi Yang; Fu, Liang

    Monte Carlo simulation is an unbiased numerical tool for studying classical and quantum many-body systems. One of its bottlenecks is the lack of a general and efficient update algorithm for large systems close to a phase transition or with strong frustration, for which local updates perform badly. In this work, we propose a new general-purpose Monte Carlo method, dubbed self-learning Monte Carlo (SLMC), in which an efficient update algorithm is first learned from the training data generated in trial simulations and then used to speed up the actual simulation. We demonstrate the efficiency of SLMC in a spin model at the phase transition point, achieving a 10-20 times speedup. This work is supported by the DOE Office of Basic Energy Sciences, Division of Materials Sciences and Engineering under Award DE-SC0010526.

  20. Positron follow-up in liquid water: I. A new Monte Carlo track-structure code.

    PubMed

    Champion, C; Le Loirec, C

    2006-04-07

    When biological matter is irradiated by charged particles, a wide variety of interactions occur, which lead to a deep modification of the cellular environment. To understand the fine structure of the microscopic distribution of energy deposits, Monte Carlo event-by-event simulations are particularly suitable. However, the development of these track-structure codes needs accurate interaction cross sections for all the electronic processes: ionization, excitation, positronium formation and even elastic scattering. Under these conditions, we have recently developed a Monte Carlo code for positrons in water, the latter being commonly used to simulate the biological medium. All the processes are studied in detail via theoretical differential and total cross-section calculations performed by using partial wave methods. Comparisons with existing theoretical and experimental data in terms of stopping powers, mean energy transfers and ranges show very good agreements. Moreover, thanks to the theoretical description of positronium formation, we have access, for the first time, to the complete kinematics of the electron capture process. Then, the present Monte Carlo code is able to describe the detailed positronium history, which will provide useful information for medical imaging (like positron emission tomography) where improvements are needed to define with the best accuracy the tumoural volumes.

  1. NOTE: Monte Carlo evaluation of kerma in an HDR brachytherapy bunker

    NASA Astrophysics Data System (ADS)

    Pérez-Calatayud, J.; Granero, D.; Ballester, F.; Casal, E.; Crispin, V.; Puchades, V.; León, A.; Verdú, G.

    2004-12-01

    In recent years, the use of high dose rate (HDR) after-loader machines has greatly increased due to the shift from traditional Cs-137/Ir-192 low dose rate (LDR) brachytherapy to HDR brachytherapy. The method used to calculate the required concrete and, where appropriate, lead shielding in the door is based on analytical methods provided by documents published by the ICRP, the IAEA and the NCRP. The purpose of this study is to perform a more realistic kerma evaluation at the entrance maze door of an HDR bunker using the Monte Carlo code GEANT4. The Monte Carlo results were validated experimentally. The spectrum at the maze entrance door, obtained with Monte Carlo, has an average energy of about 110 keV, maintaining a similar value along the length of the maze. Comparison of the analytical estimates with the Monte Carlo results shows that the estimates obtained using the albedo coefficient from the ICRP document match the Monte Carlo values most closely, although the maximum value given by the MC calculations is 30% greater.

  2. A High-Performance Cellular Automaton Model of Tumor Growth with Dynamically Growing Domains

    PubMed Central

    Poleszczuk, Jan; Enderling, Heiko

    2014-01-01

    Tumor growth from a single transformed cancer cell up to a clinically apparent mass spans many spatial and temporal orders of magnitude. Implementation of cellular automata simulations of such tumor growth can be straightforward but computing performance often counterbalances simplicity. Computationally convenient simulation times can be achieved by choosing appropriate data structures, memory and cell handling as well as domain setup. We propose a cellular automaton model of tumor growth with a domain that expands dynamically as the tumor population increases. We discuss memory access, data structures and implementation techniques that yield high-performance multi-scale Monte Carlo simulations of tumor growth. We discuss tumor properties that favor the proposed high-performance design and present simulation results of the tumor growth model. We estimate to which parameters the model is the most sensitive, and show that tumor volume depends on a number of parameters in a non-monotonic manner. PMID:25346862
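
    A sparse-lattice toy version of a tumor-growth CA with an effectively unbounded, dynamically growing domain can be written as follows: only occupied sites are ever stored, so the domain expands with the population. The division rule, probability and neighbourhood are illustrative assumptions, far simpler than the paper's high-performance implementation.

```python
import random

def grow_tumor(steps=30, p_div=0.3, seed=11):
    """Toy CA: each occupied site divides with probability p_div into a
    random empty von Neumann neighbour. The set of occupied sites is the
    whole data structure, so no fixed-size grid is ever allocated."""
    rng = random.Random(seed)
    occupied = {(0, 0)}
    for _ in range(steps):
        for x, y in list(occupied):           # snapshot of this generation
            if rng.random() < p_div:
                free = [(x + dx, y + dy)
                        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                        if (x + dx, y + dy) not in occupied]
                if free:                      # contact-inhibited division
                    occupied.add(rng.choice(free))
    return occupied
```

    Because only frontier cells find empty neighbours, growth is surface-limited, one of the properties that favors the expanding-domain design discussed in the abstract.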

  3. Multistep Lattice-Voxel method utilizing lattice function for Monte-Carlo treatment planning with pixel based voxel model.

    PubMed

    Kumada, H; Saito, K; Nakamura, T; Sakae, T; Sakurai, H; Matsumura, A; Ono, K

    2011-12-01

    Treatment planning for boron neutron capture therapy generally utilizes Monte-Carlo methods for calculation of the dose distribution. The new treatment planning system JCDS-FX employs the multi-purpose Monte-Carlo code PHITS to calculate the dose distribution. JCDS-FX allows building a precise voxel model consisting of pixel-based voxel cells as small as 0.4×0.4×2.0 mm³ in order to perform high-accuracy dose estimation, e.g. for the purpose of calculating the dose distribution in a human body. However, the miniaturization of the voxel size increases calculation time considerably. The aim of this study is to investigate sophisticated modeling methods which can perform Monte-Carlo calculations for human geometry efficiently. Thus, we devised a new voxel modeling method, the "Multistep Lattice-Voxel method," which can configure a voxel model that combines different voxel sizes by utilizing the lattice function repeatedly. To verify the performance of calculations with this modeling method, several calculations for human geometry were carried out. The results demonstrated that the Multistep Lattice-Voxel method enabled the precise voxel model to reduce calculation time substantially while maintaining high-accuracy dose estimation. Copyright © 2011 Elsevier Ltd. All rights reserved.

  4. VoxelMages: a general-purpose graphical interface for designing geometries and processing DICOM images for PENELOPE.

    PubMed

    Giménez-Alventosa, V; Ballester, F; Vijande, J

    2016-12-01

    The design and construction of geometries for Monte Carlo calculations is an error-prone, time-consuming, and complex step in simulations describing particle interactions and transport in the field of medical physics. The software VoxelMages has been developed to help the user in this task. It allows the user to design complex geometries and to process DICOM image files for simulations with the general-purpose Monte Carlo code PENELOPE in an easy and straightforward way. VoxelMages also allows importing DICOM-RT structure contour information as delivered by a treatment planning system. Its main characteristics, usage and performance benchmarking are described in detail. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Collision of Physics and Software in the Monte Carlo Application Toolkit (MCATK)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sweezy, Jeremy Ed

    2016-01-21

    The topic is presented in a series of slides organized as follows: MCATK overview, development strategy, available algorithms, problem modeling (sources, geometry, data, tallies), parallelism, miscellaneous tools/features, example MCATK application, recent areas of research, and summary and future work. MCATK is a C++ component-based Monte Carlo neutron-gamma transport software library with continuous energy neutron and photon transport. Designed to build specialized applications and to provide new functionality in existing general-purpose Monte Carlo codes like MCNP, it reads ACE formatted nuclear data generated by NJOY. The motivation behind MCATK was to reduce costs. MCATK physics involves continuous energy neutron and gamma transport with multi-temperature treatment, static eigenvalue (keff and α) algorithms, a time-dependent algorithm, and fission chain algorithms. MCATK geometry includes mesh geometries and solid body geometries. MCATK provides verified, unit-tested Monte Carlo components, flexibility in Monte Carlo application development, and numerous tools such as geometry and cross section plotters.

  6. A Monte Carlo study of macroscopic and microscopic dose descriptors for kilovoltage cellular dosimetry

    NASA Astrophysics Data System (ADS)

    Oliver, P. A. K.; Thomson, Rowan M.

    2017-02-01

    This work investigates how doses to cellular targets depend on cell morphology, as well as relations between cellular doses and doses to bulk tissues and water. Multicellular models of five healthy and cancerous soft tissues are developed based on typical values of cell compartment sizes, elemental compositions and number densities found in the literature. Cells are modelled as two concentric spheres with nucleus and cytoplasm compartments. Monte Carlo simulations are used to calculate the absorbed dose to the nucleus and cytoplasm for incident photon energies of 20-370 keV, relevant for brachytherapy, diagnostic radiology, and out-of-field radiation in higher-energy external beam radiotherapy. Simulations involving cell clusters, single cells and single nuclear cavities are carried out for cell radii between 5 and 10 μm, and nuclear radii between 2 and 9 μm. Seven nucleus and cytoplasm elemental compositions representative of animal cells are considered. The presence of a cytoplasm, extracellular matrix and surrounding cells can affect the nuclear dose by up to 13%. Differences in cell and nucleus size can affect dose to the nucleus (cytoplasm) of the central cell in a cluster of 13 cells by up to 13% (8%). Furthermore, the results of this study demonstrate that neither water nor bulk tissue are reliable substitutes for subcellular targets for incident photon energies <50 keV: nuclear (cytoplasm) doses differ from dose-to-medium by up to 32% (18%), and from dose-to-water by up to 21% (8%). The largest differences between dose descriptors are seen for the lowest incident photon energies; differences are less than 3% for energies ≥90 keV. The sensitivity of results with regard to the parameters of the microscopic tissue structure model and cell model geometry, and the importance of the nucleus and cytoplasm as targets for radiation-induced cell death, emphasize the importance of accurate models for cellular dosimetry studies.

  7. Nanoparticle Contrast Agents for Enhanced Microwave Imaging and Thermal Treatment of Breast Cancer

    DTIC Science & Technology

    2010-10-01

    continue to increase in step with decreasing critical dimensions, electrodynamic effects directly influence high-frequency device performance, and... computational burden is significant. The Cellular Monte Carlo (CMC) method, originally developed by Kometer et al. [50], was designed to reduce this... combination of a full-wave FDTD solver with a device simulator based upon a stochastic transport kernel is conceptually straightforward, but the

  8. Game of Life on the Equal Degree Random Lattice

    NASA Astrophysics Data System (ADS)

    Shao, Zhi-Gang; Chen, Tao

    2010-12-01

    An effective matrix method is used to build the equal degree random (EDR) lattice, and a cellular automaton game of life on the EDR lattice is then studied by Monte Carlo (MC) simulation. Applying the standard mean field approximation (MFA), the density of live cells is found to be ρ=0.37017, which is consistent with the result ρ=0.37±0.003 from MC simulation.
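
    The quoted MFA density can be reproduced by iterating the standard mean-field map for Conway's rules on a degree-8 lattice (a dead site is born with exactly 3 live neighbours; a live site survives with 2 or 3). The sketch below is generic mean-field bookkeeping, not the paper's matrix method for constructing the EDR lattice.

```python
from math import comb

def life_mfa_density(rho=0.3, k=8, iters=200):
    """Iterate the mean-field map for the game of life on a lattice of
    uniform degree k until it settles on its stable fixed point."""
    def p(m, r):
        # probability that exactly m of the k neighbours are alive
        return comb(k, m) * r**m * (1.0 - r)**(k - m)
    for _ in range(iters):
        rho = (1.0 - rho) * p(3, rho) + rho * (p(2, rho) + p(3, rho))
    return rho
```

    Starting anywhere above the unstable lower fixed point, the iteration converges to ρ ≈ 0.370, matching the value quoted in the abstract.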

  9. An Expanded Multi-scale Monte Carlo Simulation Method for Personalized Radiobiological Effect Estimation in Radiotherapy: a feasibility study

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Feng, Yuanming; Wang, Wei; Yang, Chengwen; Wang, Ping

    2017-03-01

    A novel and versatile "bottom-up" approach is developed to estimate the radiobiological effect of clinical radiotherapy. The model consists of multi-scale Monte Carlo simulations from the organ level down to the cell level. At the cellular level, accumulated damages are computed using a spectrum-based accumulation algorithm and a predefined cellular damage database. The damage repair mechanism is modeled by an expanded reaction-rate two-lesion kinetic model, which was calibrated by replicating a radiobiological experiment. Multi-scale modeling is then performed on a lung cancer patient under conventional fractionated irradiation. The cell killing effects of two representative voxels (the isocenter and a peripheral voxel of the tumor) are computed and compared. At the microscopic level, the nucleus dose and damage yields vary among the nucleuses within each voxel. A slightly larger percentage of cDSB yield is observed for the peripheral voxel (55.0%) compared to the isocenter one (52.5%). For the isocenter voxel, the survival fraction increases monotonically in reduced oxygen environments. Under an extreme anoxic condition (0.001%), the survival fraction is calculated to be 80% and the hypoxia reduction factor reaches a maximum value of 2.24. In conclusion, with biologically related variations, the proposed multi-scale approach is more versatile than existing approaches for evaluating personalized radiobiological effects in radiotherapy.

  10. Accuracy of Monte Carlo simulations compared to in-vivo MDCT dosimetry.

    PubMed

    Bostani, Maryam; Mueller, Jonathon W; McMillan, Kyle; Cody, Dianna D; Cagnon, Chris H; DeMarco, John J; McNitt-Gray, Michael F

    2015-02-01

    The purpose of this study was to assess the accuracy of a Monte Carlo simulation-based method for estimating radiation dose from multidetector computed tomography (MDCT) by comparing simulated doses in ten patients to in-vivo dose measurements. MD Anderson Cancer Center Institutional Review Board approved the acquisition of in-vivo rectal dose measurements in a pilot study of ten patients undergoing virtual colonoscopy. The dose measurements were obtained by affixing TLD capsules to the inner lumen of rectal catheters. Voxelized patient models were generated from the MDCT images of the ten patients, and the dose to the TLD for all exposures was estimated using Monte Carlo based simulations. The Monte Carlo simulation results were compared to the in-vivo dose measurements to determine accuracy. The calculated mean percent difference between TLD measurements and Monte Carlo simulations was -4.9% with standard deviation of 8.7% and a range of -22.7% to 5.7%. The results of this study demonstrate very good agreement between simulated and measured doses in-vivo. Taken together with previous validation efforts, this work demonstrates that the Monte Carlo simulation methods can provide accurate estimates of radiation dose in patients undergoing CT examinations.

  11. A coarse-grained Monte Carlo approach to diffusion processes in metallic nanoparticles

    NASA Astrophysics Data System (ADS)

    Hauser, Andreas W.; Schnedlitz, Martin; Ernst, Wolfgang E.

    2017-06-01

    A kinetic Monte Carlo approach on a coarse-grained lattice is developed for the simulation of surface diffusion processes of Ni, Pd and Au structures with diameters in the range of a few nanometers. Intensity information obtained via standard two-dimensional transmission electron microscopy imaging techniques is used to create three-dimensional structure models as input for a cellular automaton. A series of update rules based on reaction kinetics is defined to allow for a stepwise evolution in time with the aim to simulate surface diffusion phenomena such as Rayleigh breakup and surface wetting. The material flow, in our case represented by the hopping of discrete portions of metal on a given grid, is driven by the attempt to minimize the surface energy, which can be achieved by maximizing the number of filled neighbor cells.
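
    A toy version of such a surface-energy-driven update scheme is sketched below: filled cells hop into empty neighbour sites with a Metropolis rule on the change in filled-neighbour (bond) count, so material tends to compact, loosely mimicking the wetting/breakup dynamics described above. The lattice, the inverse-temperature parameter and the update rule are illustrative stand-ins for the paper's reaction-kinetics rules on a coarse-grained 3D grid.

```python
import math
import random

def surface_kmc(occupied, steps=2000, beta=2.0, seed=9):
    """Metropolis hops of filled cells into empty neighbours on a 2D
    lattice; the bond count plays the role of (negative) surface energy.
    Material is conserved: every move relocates exactly one cell."""
    rng = random.Random(seed)
    occ = set(occupied)
    def nbrs(p):
        return [(p[0] + dx, p[1] + dy)
                for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
    def bonds(p, s):
        return sum(q in s for q in nbrs(p))
    for _ in range(steps):
        src = rng.choice(sorted(occ))          # pick a filled cell
        dst = rng.choice(nbrs(src))            # pick a hop target
        if dst in occ:
            continue
        occ.remove(src)                        # vacate before counting bonds
        d_broken = bonds(src, occ) - bonds(dst, occ)  # bonds lost minus gained
        if d_broken <= 0 or rng.random() < math.exp(-beta * d_broken):
            occ.add(dst)                       # accept the hop
        else:
            occ.add(src)                       # reject: put the cell back
    return occ
```

    Driving hops by the bond-count change is the same minimize-the-surface-energy principle the abstract describes, just without the Arrhenius rate catalogue of a real kMC code.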

  12. Analytic variance estimates of Swank and Fano factors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutierrez, Benjamin; Badano, Aldo; Samuelson, Frank, E-mail: frank.samuelson@fda.hhs.gov

    Purpose: Variance estimates for detector energy resolution metrics can be used as stopping criteria in Monte Carlo simulations for the purpose of ensuring a small uncertainty of those metrics and for the design of variance reduction techniques. Methods: The authors derive an estimate for the variance of two energy resolution metrics, the Swank factor and the Fano factor, in terms of statistical moments that can be accumulated without significant computational overhead. The authors examine the accuracy of these two estimators and demonstrate how the estimates of the coefficient of variation of the Swank and Fano factors behave with data from a Monte Carlo simulation of an indirect x-ray imaging detector. Results: The authors' analyses suggest that the accuracy of their variance estimators is appropriate for estimating the actual variances of the Swank and Fano factors for a variety of distributions of detector outputs. Conclusions: The variance estimators derived in this work provide a computationally convenient way to estimate the error or coefficient of variation of the Swank and Fano factors during Monte Carlo simulations of radiation imaging systems.
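For readers unfamiliar with the two metrics, a small sketch of how they are computed from sampled detector outputs may help. The moment-based definitions (Swank factor I = m1²/m2 with m_n the n-th raw moment, Fano factor F = variance/mean) are standard; the bootstrap coefficient of variation shown here is a generic stand-in for the analytic moment-based estimators derived in the paper, not a reproduction of them.

```python
import random
import statistics

def swank_factor(samples):
    """Swank factor I = m1^2 / m2, with m_n the n-th raw moment of the
    detector-output distribution (I = 1 for a delta-function response)."""
    m1 = statistics.fmean(samples)
    m2 = statistics.fmean(x * x for x in samples)
    return m1 * m1 / m2

def fano_factor(samples):
    """Fano factor F = variance / mean of the output distribution."""
    return statistics.pvariance(samples) / statistics.fmean(samples)

def bootstrap_cv(metric, samples, n_boot=200, seed=1):
    """Coefficient of variation of a metric by bootstrap resampling: a
    generic substitute for the paper's analytic variance estimators,
    usable as a cross-check or stopping criterion."""
    rng = random.Random(seed)
    estimates = [metric([rng.choice(samples) for _ in samples])
                 for _ in range(n_boot)]
    return statistics.pstdev(estimates) / statistics.fmean(estimates)

# Toy detector outputs with sub-Poisson (binomial) counting statistics.
rng = random.Random(0)
outputs = [sum(rng.random() < 0.5 for _ in range(100)) for _ in range(2000)]
I = swank_factor(outputs)
F = fano_factor(outputs)
cv = bootstrap_cv(swank_factor, outputs)
print(I, F, cv)
```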

  13. Delivery of Nano-Tethered Therapies to Brain Metastases of Primary Breast Cancer Using a Cellular Trojan Horse

    DTIC Science & Technology

    2015-12-01

    Hounsfield units (HU) of the brain were translated into corresponding optical properties (absorption coefficient, scattering coefficient, and anisotropy...factor) using lookup tables (Fig 2). The lookup tables were prepared from earlier studies which derived the Hounsfield units and optical properties of... Hounsfield Units /HU) are segmented and translated into optical properties of the brain tissue (white/gray matter, CSF, skull bone, etc.). Monte

  14. Sci-Thur AM: YIS – 06: A Monte Carlo study of macro- and microscopic dose descriptors and the microdosimetric spread using detailed cellular models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oliver, Patricia; Thomson, Rowan

    2016-08-15

    Purpose: To develop Monte Carlo models of cell clusters to investigate the relationships between macro- and microscopic dose descriptors, quantify the microdosimetric spread in energy deposition for subcellular targets, and determine how these results depend on the computational model. Methods: Microscopic tissue structure is modelled as clusters of 13 to 150 cells, with cell (nuclear) radii between 5 and 10 microns (2 and 9 microns). Energy imparted per unit mass (specific energy or dose) is scored in the nucleus (D{sub nuc}) and cytoplasm (D{sub cyt}) for incident photon energies from 20 to 370 keV. Dose-to-water (D{sub w,m}) and dose-to-medium (D{sub m,m}) are compared to D{sub nuc} and D{sub cyt}. Single cells and single nuclear cavities are also simulated. Results: D{sub nuc} and D{sub cyt} are sensitive to the surrounding environment with deviations of up to 13% for a single nucleus/cell compared with a multicellular cluster. These dose descriptors vary with cell and nucleus size by up to 10%. D{sub nuc} and D{sub cyt} differ from D{sub w,m} and D{sub m,m} by up to 32%. The microdosimetric spread is sensitive to whether cells are arranged randomly or in a hexagonal lattice, and whether subcellular compartment sizes are sampled from a normal distribution or are constant throughout the cluster. Conclusions: D{sub nuc} and D{sub cyt} are sensitive to cell morphology, elemental composition and the presence of surrounding cells. The microdosimetric spread was investigated using realistic elemental compositions for the nucleus and cytoplasm, and depends strongly on subcellular compartment size, source energy and dose.

  15. The proton therapy nozzles at Samsung Medical Center: A Monte Carlo simulation study using TOPAS

    NASA Astrophysics Data System (ADS)

    Chung, Kwangzoo; Kim, Jinsung; Kim, Dae-Hyun; Ahn, Sunghwan; Han, Youngyih

    2015-07-01

    To expedite the commissioning process of the proton therapy system at Samsung Medical Center (SMC), we have developed a Monte Carlo simulation model of the proton therapy nozzles by using TOol for PArticle Simulation (TOPAS). At the SMC proton therapy center, we have two gantry rooms with different types of nozzles: a multi-purpose nozzle and a dedicated scanning nozzle. Each nozzle has been modeled in detail following the geometry information provided by the manufacturer, Sumitomo Heavy Industries, Ltd. For this purpose, the novel features of TOPAS, such as the time feature or the ridge filter class, have been used, and the appropriate physics models for proton nozzle simulation have been defined. Dosimetric properties, such as the percent depth dose curve, the spread-out Bragg peak (SOBP), and the beam spot size, have been simulated and verified against measured beam data. Beyond the Monte Carlo nozzle modeling, we have developed an interface between TOPAS and the treatment planning system (TPS), RayStation. An exported radiotherapy (RT) plan from the TPS is interpreted by the interface and translated into TOPAS input text. The developed Monte Carlo nozzle model can be used to estimate the non-beam performance, such as the neutron background, of the nozzles. Furthermore, the nozzle model can be used to study the mechanical optimization of the design of the nozzle.

  16. SU-E-T-586: Field Size Dependence of Output Factor for Uniform Scanning Proton Beams: A Comparison of TPS Calculation, Measurement and Monte Carlo Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Y; Singh, H; Islam, M

    2014-06-01

    Purpose: Output dependence on field size for uniform scanning beams, and the accuracy of treatment planning system (TPS) calculations, are not well studied. The purpose of this work is to investigate the dependence of output on field size for uniform scanning beams and to compare it among TPS calculation, measurements and Monte Carlo simulations. Methods: Field size dependence was studied using various field sizes between 2.5 cm and 10 cm in diameter. The field size factor was studied for a number of proton range and modulation combinations based on output at the center of the spread-out Bragg peak, normalized to a 10 cm diameter field. Three methods were used and compared in this study: 1) TPS calculation, 2) ionization chamber measurement, and 3) Monte Carlo simulation. The XiO TPS (Elekta, St. Louis) was used to calculate the output factor using a pencil beam algorithm; a pinpoint ionization chamber was used for measurements; and the Fluka code was used for Monte Carlo simulations. Results: The field size factor varied with proton beam parameters, such as range, modulation, and calibration depth, and could decrease by over 10% from a 10 cm to a 3 cm diameter field for a large-range proton beam. The XiO TPS predicted the field size factor relatively well at large field sizes, but could differ from measurements by 5% or more for small-field, large-range beams. Monte Carlo simulations predicted the field size factor within 1.5% of measurements. Conclusion: The output factor can vary considerably with field size and must be accounted for to ensure accurate proton beam delivery. This is especially important for small-field beams, such as in stereotactic proton therapy, where the field size dependence is large and TPS calculation is inaccurate. Measurements or Monte Carlo simulations are recommended for output determination in such cases.

  17. Path integral Monte Carlo ground state approach: formalism, implementation, and applications

    NASA Astrophysics Data System (ADS)

    Yan, Yangqian; Blume, D.

    2017-11-01

    Monte Carlo techniques have played an important role in understanding strongly correlated systems across many areas of physics, covering a wide range of energy and length scales. Among the many Monte Carlo methods applicable to quantum mechanical systems, the path integral Monte Carlo approach with its variants has been employed widely. Since semi-classical or classical approaches will not be discussed in this review, path integral based approaches can for our purposes be divided into two categories: approaches applicable to quantum mechanical systems at zero temperature and approaches applicable to quantum mechanical systems at finite temperature. While these two approaches are related to each other, the underlying formulation and aspects of the algorithm differ. This paper reviews the path integral Monte Carlo ground state (PIGS) approach, which solves the time-independent Schrödinger equation. Specifically, the PIGS approach allows for the determination of expectation values with respect to eigenstates of the few- or many-body Schrödinger equation provided the system Hamiltonian is known. The theoretical framework behind the PIGS algorithm, implementation details, and sample applications for fermionic systems are presented.
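The imaginary-time projection underlying the PIGS approach can be summarized in one formula (standard notation, not quoted from the review): for a trial state |ψ_T⟩ with non-zero overlap with the ground state |φ_0⟩,

```latex
% e^{-\tau \hat H}\,|\psi_T\rangle \to |\phi_0\rangle as \tau \to \infty,
% so ground-state expectation values follow from
\langle \hat O \rangle \;=\; \lim_{\tau \to \infty}
\frac{\langle \psi_T \,|\, e^{-\tau \hat H}\, \hat O\, e^{-\tau \hat H} \,|\, \psi_T \rangle}
     {\langle \psi_T \,|\, e^{-2\tau \hat H} \,|\, \psi_T \rangle}
```

In practice the projector is factorized into many short-time slices, which yields the path-integral representation that is then sampled by Monte Carlo.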

  18. Targeting mitochondria in cancer cells using gold nanoparticle-enhanced radiotherapy: A Monte Carlo study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kirkby, Charles, E-mail: charles.kirkby@albertahealthservices.ca; Ghasroddashti, Esmaeel; Department of Physics and Astronomy, University of Calgary, Calgary, Alberta T2N 1N4

    2015-02-15

    Purpose: Radiation damage to mitochondria has been shown to alter cellular processes and even lead to apoptosis. Gold nanoparticles (AuNPs) may be used to enhance these effects in scenarios where they collect on the outer membranes of mitochondria. A Monte Carlo (MC) approach is used to estimate mitochondrial dose enhancement under a variety of conditions. Methods: The PENELOPE MC code was used to generate dose distributions resulting from photons striking a 13 nm diameter AuNP with various thicknesses of water-equivalent coatings. Similar dose distributions were generated with the AuNP replaced by water so as to estimate the gain in dose on a microscopic scale due to the presence of AuNPs within an irradiated volume. Models of mitochondria with AuNPs affixed to their outer membrane were then generated—considering variation in mitochondrial size and shape, number of affixed AuNPs, and AuNP coating thickness—and exposed (in a dose calculation sense) to source spectra ranging from 6 MV to 90 kVp. Subsequently, dose enhancement ratios (DERs), i.e., the ratio of the dose with the AuNPs present to that with no AuNPs, for the entire mitochondrion and its components were tallied under these scenarios. Results: For a representative case of a 1000 nm diameter mitochondrion affixed with 565 AuNPs, each with a 13 nm thick coating, the mean DER over the whole organelle ranged from roughly 1.1 to 1.6 for the kilovoltage sources, but was generally less than 1.01 for the megavoltage sources. The outer membrane DERs remained less than 1.01 for the megavoltage sources, but rose to 2.3 for 90 kVp. The voxel maximum DER values were as high as 8.2 for the 90 kVp source and increased further when the particles clustered together. The DER exhibited dependence on the mitochondrion dimensions, number of AuNPs, and the AuNP coating thickness. Conclusions: Substantial dose enhancement directly to the mitochondria can be achieved under the conditions modeled. If the mitochondrion dose can be directly enhanced, as these simulations show, this work suggests the potential for both a tool to study the role of mitochondria in cellular response to radiation and a novel avenue for radiation therapy in that the mitochondria may be targeted, rather than the nuclear DNA.

  19. Feasibility of antihydrogen atom containment in helium: a problem of electron-positron correlation investigated by the Monte Carlo method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jackman, T.M.

    1987-01-01

    A theoretical investigation of the interaction potential between the helium atom and the antihydrogen atom was performed for the purpose of determining the feasibility of antihydrogen atom containment. The interaction potential showed an energy barrier to collapse of this system. A variational estimate of the height of this energy barrier and estimates of lifetime with respect to electron-positron annihilation were determined by the Variational Monte Carlo method. This calculation allowed for an improvement over an SCF result through the inclusion of explicit correlation factors in the trial wave function. An estimate of the correlation energy of this system was determined by the Green's Function Monte Carlo (GFMC) method.

  20. A Dedicated Computational Platform for Cellular Monte Carlo T-CAD Software Tools

    DTIC Science & Technology

    2015-07-14

    computer that establishes an encrypted Virtual Private Network ( OpenVPN [44]) based on the Secure Socket Layer (SSL) paradigm. Each user is given a...security certificate for each device used to connect to the computing nodes. Stable OpenVPN clients are available for Linux, Microsoft Windows, Apple OSX...platform is granted by an encrypted connection base on the Secure Socket Layer (SSL) protocol, and implemented in the OpenVPN Virtual Personal Network

  1. Shielding analyses of an AB-BNCT facility using Monte Carlo simulations and simplified methods

    NASA Astrophysics Data System (ADS)

    Lai, Bo-Lun; Sheu, Rong-Jiun

    2017-09-01

    Accurate Monte Carlo simulations and simplified methods were used to investigate the shielding requirements of a hypothetical accelerator-based boron neutron capture therapy (AB-BNCT) facility that included an accelerator room and a patient treatment room. The epithermal neutron beam for BNCT purposes was generated by coupling a neutron production target with a specially designed beam shaping assembly (BSA), which was embedded in the partition wall between the two rooms. Neutrons were produced from a beryllium target bombarded by 1-mA 30-MeV protons. The MCNP6-generated surface sources around all the exterior surfaces of the BSA were established to facilitate repeated Monte Carlo shielding calculations. In addition, three simplified models based on a point-source line-of-sight approximation were developed and their predictions were compared with the reference Monte Carlo results. The comparison determined which model resulted in better dose estimation, forming the basis of future design activities for the first AB-BNCT facility in Taiwan.
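A point-source line-of-sight approximation of the kind compared against the Monte Carlo reference combines inverse-square geometric spreading with exponential attenuation along the sight line. The sketch below is a generic illustration: the source strength, attenuation coefficient, slab thickness and buildup factor are placeholder numbers, not values from the facility design.

```python
import math

def dose_rate(source, r_cm, mu_per_cm, t_cm, buildup=1.0):
    """Point-source line-of-sight estimate: isotropic inverse-square
    spreading times exponential attenuation through a slab of thickness
    t_cm, optionally scaled by a buildup factor for scattered radiation."""
    geometry = source / (4.0 * math.pi * r_cm ** 2)
    return geometry * buildup * math.exp(-mu_per_cm * t_cm)

# Placeholder numbers (not from the paper): a unit source scored at 3 m
# behind 50 cm of shielding with an effective mu of 0.1 per cm.
unshielded = dose_rate(1.0, 300.0, 0.0, 0.0)
shielded = dose_rate(1.0, 300.0, 0.1, 50.0)
print(shielded / unshielded)   # attenuation factor exp(-5), about 6.7e-3
```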

  2. Efficient Simulation of Secondary Fluorescence Via NIST DTSA-II Monte Carlo.

    PubMed

    Ritchie, Nicholas W M

    2017-06-01

    Secondary fluorescence, the final term in the familiar matrix correction triumvirate Z·A·F, is the most challenging for Monte Carlo models to simulate. In fact, only two implementations of Monte Carlo models commonly used to simulate electron probe X-ray spectra can calculate secondary fluorescence: PENEPMA and NIST DTSA-II (DTSA-II is discussed herein). These two models share many physical models, but there are some important differences in the way each implements X-ray emission, including secondary fluorescence. PENEPMA is based on PENELOPE, a general-purpose software package for simulation of both relativistic and subrelativistic electron/positron interactions with matter. On the other hand, NIST DTSA-II was designed exclusively for simulation of X-ray spectra generated by subrelativistic electrons. NIST DTSA-II uses variance reduction techniques unsuited to general-purpose code. These optimizations help NIST DTSA-II to be orders of magnitude more computationally efficient while retaining detector position sensitivity. Simulations execute in minutes rather than hours and can model differences that result from detector position. Both PENEPMA and NIST DTSA-II are capable of handling complex sample geometries, and we will demonstrate that both are of similar accuracy when modeling experimental secondary fluorescence data from the literature.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sudhyadhom, A; McGuinness, C; Descovich, M

    Purpose: To develop a methodology for validation of a Monte-Carlo dose calculation model for robotic small field SRS/SBRT deliveries. Methods: In a robotic treatment planning system, a Monte-Carlo model was iteratively optimized to match beam data. A two-part analysis was developed to verify this model. 1) The Monte-Carlo model was validated in a simulated water phantom versus a Ray-Tracing calculation on a single-beam, collimator-by-collimator basis. 2) The Monte-Carlo model was validated to be accurate in the most challenging situation, lung, by acquiring in-phantom measurements. A plan was created and delivered in a CIRS lung phantom with a film insert. Separately, plans were delivered in an in-house created lung phantom with a PinPoint chamber insert within a lung-simulating material. For medium to large collimator sizes, a single beam was delivered to the phantom. For small collimators (10, 12.5, and 15 mm), a robotically delivered plan was created to generate a uniform dose field of irradiation over a 2×2 cm{sup 2} area. Results: Dose differences in simulated water between Ray-Tracing and Monte-Carlo were all within 1% at dmax and deeper. Maximum dose differences occurred prior to dmax but were all within 3%. Film measurements in a lung phantom show high correspondence of over 95% gamma at the 2%/2mm level for Monte-Carlo. Ion chamber measurements for collimator sizes of 12.5 mm and above were within 3% of Monte-Carlo calculated values. Uniform irradiation involving the 10 mm collimator resulted in a dose difference of ∼8% for both Monte-Carlo and Ray-Tracing, indicating that there may be limitations with the dose calculation. Conclusion: We have developed a methodology to validate a Monte-Carlo model by verifying that it matches in water and, separately, that it corresponds well in lung-simulating materials. The Monte-Carlo model and algorithm tested may have more limited accuracy for 10 mm fields and smaller.

  4. A Monte Carlo method for the simulation of coagulation and nucleation based on weighted particles and the concepts of stochastic resolution and merging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kotalczyk, G., E-mail: Gregor.Kotalczyk@uni-due.de; Kruis, F.E.

    Monte Carlo simulations based on weighted simulation particles can solve a variety of population balance problems and thus allow the formulation of a solution framework for many chemical engineering processes. This study presents a novel concept for the calculation of coagulation rates of weighted Monte Carlo particles by introducing a family of transformations to non-weighted Monte Carlo particles. The tuning of the accuracy (named ‘stochastic resolution’ in this paper) of those transformations allows the construction of a constant-number coagulation scheme. Furthermore, a parallel algorithm for the inclusion of newly formed Monte Carlo particles due to nucleation is presented in the scope of a constant-number scheme: the low-weight merging. This technique is found to create significantly less statistical simulation noise than the conventional technique (named ‘random removal’ in this paper). Both concepts are combined into a single GPU-based simulation method which is validated by comparison with the discrete-sectional simulation technique. Two test models describing a constant-rate nucleation coupled to a simultaneous coagulation in 1) the free-molecular regime or 2) the continuum regime are simulated for this purpose.
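To make the constant-number idea concrete, here is a heavily simplified sketch of weighted simulation particles undergoing coagulation and nucleation. The refill-with-a-copy rule and the mass-weighted "low-weight merging" below are illustrative stand-ins for the paper's stochastic-resolution transformations, and the physical coagulation and nucleation rates are omitted entirely.

```python
import random

def coagulate(particles, rng):
    """One coagulation event among weighted particles (volume, weight):
    particle i absorbs the volume of particle j, and slot j is refilled
    with a copy of a randomly chosen particle so the sample size stays
    constant (a crude constant-number refill, for illustration only)."""
    i, j = rng.sample(range(len(particles)), 2)
    vi, wi = particles[i]
    vj, _ = particles[j]
    particles[i] = (vi + vj, wi)
    particles[j] = particles[rng.randrange(len(particles))]

def nucleate(particles, volume, weight):
    """Insert a freshly nucleated particle without growing the ensemble:
    merge the two lowest-weight particles into one slot (the spirit of
    low-weight merging) to free a slot for the newcomer."""
    order = sorted(range(len(particles)), key=lambda k: particles[k][1])
    a, b = order[0], order[1]
    va, wa = particles[a]
    vb, wb = particles[b]
    particles[a] = ((va * wa + vb * wb) / (wa + wb), wa + wb)
    particles[b] = (volume, weight)

rng = random.Random(1)
pop = [(1.0, 1.0) for _ in range(100)]       # monodisperse start
for step in range(500):
    coagulate(pop, rng)
    if step % 10 == 0:
        nucleate(pop, volume=1.0, weight=0.1)
print(len(pop))   # the ensemble size never changes
```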

  5. Case Study: The Mystery of the Seven Deaths--A Case Study in Cellular Respiration

    ERIC Educational Resources Information Center

    Gazdik, Michaela

    2014-01-01

    Cellular respiration, the central component of cellular metabolism, can be a difficult concept for many students to fully understand. In this interrupted, problem-based case study, students explore the purpose of cellular respiration as they play the role of medical examiner, analyzing autopsy evidence to determine the mysterious cause of death…

  6. Influence of time dependent longitudinal magnetic fields on the cooling process, exchange bias and magnetization reversal mechanism in FM core/AFM shell nanoparticles: a Monte Carlo study.

    PubMed

    Yüksel, Yusuf; Akıncı, Ümit

    2016-12-07

    Using Monte Carlo simulations, we have investigated the dynamic phase transition properties of magnetic nanoparticles with a ferromagnetic core coated by an antiferromagnetic shell structure. Effects of the field amplitude and frequency on the thermal dependence of the magnetizations, on the magnetization reversal mechanisms during hysteresis cycles, and on the exchange bias and coercive fields have been examined, and the feasibility of applying dynamic magnetic fields to the particle has been discussed for technological and biomedical purposes.
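The kind of dynamic response studied here can be demonstrated on a much smaller toy system: a 2D Ising ferromagnet driven by a sinusoidal field. This sketch omits the core/shell geometry and the antiferromagnetic coupling entirely; the lattice size, couplings and field parameters are arbitrary illustrative choices, not values from the paper.

```python
import math
import random

def metropolis_sweep(spins, L, J, h, beta, rng):
    """One Metropolis sweep of an L x L Ising lattice in external field h."""
    for _ in range(L * L):
        x, y = rng.randrange(L), rng.randrange(L)
        s = spins[x][y]
        nb = (spins[(x + 1) % L][y] + spins[(x - 1) % L][y] +
              spins[x][(y + 1) % L] + spins[x][(y - 1) % L])
        dE = 2.0 * s * (J * nb + h)          # energy cost of flipping s
        if dE <= 0.0 or rng.random() < math.exp(-beta * dE):
            spins[x][y] = -s

# Drive the lattice with a sinusoidal field h(t) = h0*cos(2*pi*t/P) and
# record the magnetization over one period (all parameters are toy values).
L, J, h0, P, beta = 16, 1.0, 0.5, 50, 1.0
rng = random.Random(3)
spins = [[1] * L for _ in range(L)]
mags = []
for t in range(P):
    h = h0 * math.cos(2.0 * math.pi * t / P)
    metropolis_sweep(spins, L, J, h, beta, rng)
    mags.append(sum(map(sum, spins)) / (L * L))
print(min(mags), max(mags))
```

Whether the magnetization follows the oscillating field or stays pinned near one branch is the dynamic-phase-transition question the abstract refers to.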

  7. An empirical approach to estimate near-infra-red photon propagation and optically induced drug release in brain tissues

    NASA Astrophysics Data System (ADS)

    Prabhu Verleker, Akshay; Fang, Qianqian; Choi, Mi-Ran; Clare, Susan; Stantz, Keith M.

    2015-03-01

    The purpose of this study is to develop an alternate empirical approach to estimate near-infra-red (NIR) photon propagation and quantify optically induced drug release in brain metastasis, without relying on computationally expensive Monte Carlo techniques (the gold standard). Targeted drug delivery with optically induced drug release is a noninvasive means to treat cancers and metastasis. This study is part of a larger project to treat brain metastasis by delivering lapatinib-drug-nanocomplexes and activating NIR-induced drug release. The empirical model was developed using a weighted approach to estimate photon scattering in tissues and calibrated using a GPU-based 3D Monte Carlo. The empirical model was developed and tested against the Monte Carlo in optical brain phantoms for pencil beams (width 1 mm) and broad beams (width 10 mm). The empirical algorithm was tested against the Monte Carlo for different albedos, along with the diffusion equation, in simulated brain phantoms resembling white matter (μs'=8.25 mm^-1, μa=0.005 mm^-1) and gray matter (μs'=2.45 mm^-1, μa=0.035 mm^-1) at a wavelength of 800 nm. The goodness of fit between the two models was determined using the coefficient of determination (R-squared analysis). Preliminary results show the empirical algorithm matches the Monte Carlo simulated fluence over a wide range of albedos (0.7 to 0.99), while the diffusion equation fails for lower albedos. The photon fluence generated by the empirical code matched the Monte Carlo in homogeneous phantoms (R2=0.99). While the GPU-based Monte Carlo achieved a 300X acceleration compared to earlier CPU-based models, the empirical code is 700X faster than the Monte Carlo for a typical super-Gaussian laser beam.
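The goodness-of-fit measure used to compare the two models, the coefficient of determination, can be computed directly from paired fluence values. The depth-profile numbers below are made up for illustration; only the R² formula itself is standard.

```python
def r_squared(reference, model):
    """Coefficient of determination of model values against a reference
    (here the Monte Carlo fluence plays the role of the reference)."""
    mean_ref = sum(reference) / len(reference)
    ss_res = sum((r - m) ** 2 for r, m in zip(reference, model))
    ss_tot = sum((r - mean_ref) ** 2 for r in reference)
    return 1.0 - ss_res / ss_tot

mc_fluence = [1.00, 0.61, 0.37, 0.22, 0.14]   # made-up depth profile
empirical = [0.98, 0.62, 0.36, 0.23, 0.13]    # made-up model output
r2 = r_squared(mc_fluence, empirical)
print(r2)   # close to 1 when the two models agree
```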

  8. Phenomenology based multiscale models as tools to understand cell membrane and organelle morphologies

    PubMed Central

    Ramakrishnan, N.; Radhakrishnan, Ravi

    2016-01-01

    An intriguing question in cell biology is “how do cells regulate their shape?” It is commonly believed that the observed cellular morphologies are a result of the complex interaction among the lipid molecules (constituting the cell membrane), and with a number of other macromolecules, such as proteins. It is also believed that the common biophysical processes essential for the functioning of a cell also play an important role in cellular morphogenesis. At the cellular scale—where typical dimensions are on the order of micrometers—the effects arising from the molecular scale can either be modeled as equilibrium or non-equilibrium processes. In this chapter, we discuss the dynamically triangulated Monte Carlo technique to model and simulate membrane morphologies at the cellular scale, which in turn can be used to investigate several questions related to shape regulation in cells. In particular, we focus on two specific problems within the framework of isotropic and anisotropic elasticity theories: namely, (i) the origin of complex, physiologically relevant, membrane shapes due to the interaction of the membrane with curvature remodeling proteins, and (ii) the genesis of steady state cellular shapes due to the action of non-equilibrium forces that are generated by the fission and fusion of transport vesicles and by the binding and unbinding of proteins from the parent membrane. PMID:27087801
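The isotropic elasticity framework referred to above is conventionally the Helfrich bending energy; in dynamically triangulated Monte Carlo, trial moves (vertex displacements and bond flips) are accepted by a Metropolis rule on this energy. The formulas below are the standard textbook forms, not quotations from the chapter:

```latex
% Helfrich elastic energy of a fluid membrane and Metropolis acceptance:
E \;=\; \int \left[ \frac{\kappa}{2}\,\bigl(2H - C_0\bigr)^2
        \;+\; \bar{\kappa}\,K \right] dA \;+\; \sigma \int dA ,
\qquad
p_{\text{accept}} \;=\; \min\!\left(1,\; e^{-\Delta E / k_B T}\right)
```

Here H is the mean curvature, K the Gaussian curvature, C_0 the spontaneous curvature, κ and κ̄ the bending moduli, and σ the surface tension; on a triangulated surface the curvatures are evaluated discretely at each vertex.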

  9. Hierarchical random cellular neural networks for system-level brain-like signal processing.

    PubMed

    Kozma, Robert; Puljic, Marko

    2013-09-01

    Sensory information processing and cognition in brains are modeled using dynamic systems theory. The brain's dynamic state is described by a trajectory evolving in a high-dimensional state space. We introduce a hierarchy of random cellular automata as the mathematical tools to describe the spatio-temporal dynamics of the cortex. The corresponding brain model is called neuropercolation, which has distinct advantages compared to traditional models using differential equations, especially in describing spatio-temporal discontinuities in the form of phase transitions. Phase transitions demarcate singularities in brain operations at critical conditions, which are viewed as hallmarks of higher cognition and awareness experience. Monte Carlo simulations performed using parallel computing point to the importance of computer implementations using very large-scale integration (VLSI) and analog platforms.
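A minimal flavor of a random cellular automaton in this family is a noisy majority-vote rule: each site copies its local majority but errs with a small probability, and the noise level acts as the control parameter of the phase transition. The grid size, neighborhood and noise value below are arbitrary illustrative choices, not the neuropercolation model itself.

```python
import random

def update(grid, L, noise, rng):
    """Synchronous update of a noisy majority-vote cellular automaton:
    each site adopts the majority state of its von Neumann neighborhood
    (itself plus 4 neighbors) but errs with probability `noise`."""
    nxt = [[0] * L for _ in range(L)]
    for x in range(L):
        for y in range(L):
            votes = (grid[x][y] +
                     grid[(x + 1) % L][y] + grid[(x - 1) % L][y] +
                     grid[x][(y + 1) % L] + grid[x][(y - 1) % L])
            majority = 1 if votes >= 3 else 0
            nxt[x][y] = majority if rng.random() > noise else 1 - majority
    return nxt

L, rng = 32, random.Random(7)
grid = [[rng.randrange(2) for _ in range(L)] for _ in range(L)]
activity = []
for _ in range(100):
    grid = update(grid, L, noise=0.05, rng=rng)
    activity.append(sum(map(sum, grid)) / (L * L))
print(activity[-1])   # fraction of active sites after the transient
```

Sweeping the noise parameter and tracking the activity time series is the Monte Carlo experiment that reveals the ordered/disordered phase transition.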

  10. A Quantitative Three-Dimensional Image Analysis Tool for Maximal Acquisition of Spatial Heterogeneity Data.

    PubMed

    Allenby, Mark C; Misener, Ruth; Panoskaltsis, Nicki; Mantalaris, Athanasios

    2017-02-01

    Three-dimensional (3D) imaging techniques provide spatial insight into environmental and cellular interactions and are implemented in various fields, including tissue engineering, but have been restricted by limited quantification tools that misrepresent or underutilize the cellular phenomena captured. This study develops image postprocessing algorithms pairing complex Euclidean metrics with Monte Carlo simulations to quantitatively assess cell and microenvironment spatial distributions while utilizing, for the first time, the entire 3D image captured. Although current methods only analyze a central fraction of presented confocal microscopy images, the proposed algorithms can utilize 210% more cells to calculate 3D spatial distributions that can span a 23-fold longer distance. These algorithms seek to leverage the high sample cost of 3D tissue imaging techniques by extracting maximal quantitative data throughout the captured image.
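A common way to turn segmented 3D positions into a spatial-distribution statistic, in the spirit of the Monte Carlo comparison described above, is to measure nearest-neighbour distances and compare them against randomly resampled point sets. This sketch is generic: the point counts, the box size, and the use of a uniform-random stand-in for the "observed" cell centres are all assumptions, not the paper's algorithms.

```python
import math
import random

def nn_distances(points):
    """Nearest-neighbour distance for every 3D point (brute force)."""
    return [min(math.dist(p, q) for j, q in enumerate(points) if j != i)
            for i, p in enumerate(points)]

def random_points(n, box, rng):
    """Monte Carlo reference: n points uniform in a cube of side `box`."""
    return [(rng.uniform(0, box), rng.uniform(0, box), rng.uniform(0, box))
            for _ in range(n)]

rng = random.Random(0)
n, box = 150, 100.0
observed = random_points(n, box, rng)   # stand-in for segmented cell centres
mean_obs = sum(nn_distances(observed)) / n
mean_ref = sum(sum(nn_distances(random_points(n, box, rng))) / n
               for _ in range(10)) / 10
ratio = mean_obs / mean_ref   # near 1: consistent with spatial randomness;
print(ratio)                  # well below 1 would indicate clustering
```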

  11. Parallel Markov chain Monte Carlo - bridging the gap to high-performance Bayesian computation in animal breeding and genetics.

    PubMed

    Wu, Xiao-Lin; Sun, Chuanyu; Beissinger, Timothy M; Rosa, Guilherme Jm; Weigel, Kent A; Gatti, Natalia de Leon; Gianola, Daniel

    2012-09-25

    Most Bayesian models for the analysis of complex traits are not analytically tractable and inferences are based on computationally intensive techniques. This is true of Bayesian models for genome-enabled selection, which uses whole-genome molecular data to predict the genetic merit of candidate animals for breeding purposes. In this regard, parallel computing can overcome the bottlenecks that can arise from series computing. Hence, a major goal of the present study is to bridge the gap to high-performance Bayesian computation in the context of animal breeding and genetics. Parallel Markov chain Monte Carlo algorithms and strategies are described in the context of animal breeding and genetics. Parallel Monte Carlo algorithms are introduced as a starting point, including their applications to computing single-parameter and certain multiple-parameter models. Then, two basic approaches for parallel Markov chain Monte Carlo are described: one aims at parallelization within a single chain; the other is based on running multiple chains, yet some variants are discussed as well. Features and strategies of the parallel Markov chain Monte Carlo are illustrated using real data, including a large beef cattle dataset with 50K SNP genotypes. Parallel Markov chain Monte Carlo algorithms are useful for computing complex Bayesian models, which not only leads to a dramatic speedup in computing but also makes it possible to optimize model parameters in complex Bayesian models. Hence, we anticipate that use of parallel Markov chain Monte Carlo will have a profound impact on revolutionizing the computational tools for genomic selection programs.
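The multiple-chains strategy mentioned above is embarrassingly parallel: each chain is an independent job distinguished only by its seed, and the draws are pooled after burn-in. The sketch samples a standard normal target with a random-walk Metropolis kernel; the target, step size, chain count and burn-in length are illustrative choices, and the plain list comprehension can be swapped for a process pool to actually distribute the chains.

```python
import math
import random

def run_chain(seed, n_samples, step=0.5):
    """One random-walk Metropolis chain targeting a standard normal."""
    rng = random.Random(seed)
    x, draws = 0.0, []
    for _ in range(n_samples):
        prop = x + rng.uniform(-step, step)
        # acceptance ratio for the N(0,1) target (values > 1 always accept)
        if rng.random() < math.exp(0.5 * (x * x - prop * prop)):
            x = prop
        draws.append(x)
    return draws

# Multiple-chains parallelism: independent chains differ only in their
# seed; this loop can be replaced by multiprocessing.Pool.map (or any
# job scheduler) to spread the chains across cores.
chains = [run_chain(seed, 20000) for seed in range(4)]
pooled = [x for c in chains for x in c[5000:]]   # discard burn-in
mean = sum(pooled) / len(pooled)
var = sum((x - mean) ** 2 for x in pooled) / len(pooled)
print(mean, var)   # should approach 0 and 1 for the standard normal target
```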

  13. A TPS kernel for calculating survival vs. depth: distributions in a carbon radiotherapy beam, based on Katz's cellular Track Structure Theory.

    PubMed

    Waligórski, M P R; Grzanka, L; Korcyl, M; Olko, P

    2015-09-01

    An algorithm was developed for a treatment planning system (TPS) kernel for carbon radiotherapy in which Katz's Track Structure Theory of cellular survival (TST) is applied as its radiobiology component. The physical beam model is based on available tabularised data, prepared by Monte Carlo simulations of a set of pristine carbon beams of different input energies. An optimisation tool developed for this purpose is used to find the combination of input energies and fluences of pristine carbon beams that delivers a pre-selected depth-dose distribution profile over the spread-out Bragg peak (SOBP) region. Using an extrapolation algorithm, energy-fluence spectra of the primary carbon ions and of all their secondary fragments are obtained at regular steps of beam depth. To obtain survival vs. depth distributions, the TST calculation is applied to the energy-fluence spectra of the mixed field of primary ions and of their secondary products at the given beam depths. Katz's TST offers a unique analytical and quantitative prediction of cell survival in such mixed ion fields. By optimising the pristine beam composition to a published depth-dose profile over the SOBP region of a carbon beam and using TST model parameters representing the survival of CHO (Chinese Hamster Ovary) cells in vitro, it was possible to satisfactorily reproduce a published data set of CHO cell survival vs. depth measurements after carbon ion irradiation. The authors also show by a TST calculation that 'biological dose' is neither linear nor additive.

  14. Monte Carlo track-structure calculations for aqueous solutions containing biomolecules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turner, J.E.; Hamm, R.N.; Ritchie, R.H.

    1993-10-01

    Detailed Monte Carlo calculations provide a powerful tool for understanding mechanisms of radiation damage to biological molecules irradiated in aqueous solution. This paper describes the computer codes, OREC and RADLYS, which have been developed for this purpose over a number of years. Some results are given for calculations of the irradiation of pure water. Comparisons are presented between computations for liquid water and water vapor. Detailed calculations of the chemical yields of several products from X-irradiated, oxygen-free glycylglycine solutions have been performed as a function of solute concentration. Excellent agreement is obtained between calculated and measured yields. The Monte Carlo analysis provides a complete mechanistic picture of pathways to observed radiolytic products. This approach, successful with glycylglycine, will be extended to study the irradiation of oligonucleotides in aqueous solution.

  15. Integration within the Felsenstein equation for improved Markov chain Monte Carlo methods in population genetics

    PubMed Central

    Hey, Jody; Nielsen, Rasmus

    2007-01-01

    In 1988, Felsenstein described a framework for assessing the likelihood of a genetic data set in which all of the possible genealogical histories of the data are considered, each in proportion to their probability. Although not analytically solvable, several approaches, including Markov chain Monte Carlo methods, have been developed to find approximate solutions. Here, we describe an approach in which Markov chain Monte Carlo simulations are used to integrate over the space of genealogies, whereas other parameters are integrated out analytically. The result is an approximation to the full joint posterior density of the model parameters. For many purposes, this function can be treated as a likelihood, thereby permitting likelihood-based analyses, including likelihood ratio tests of nested models. Several examples, including an application to the divergence of chimpanzee subspecies, are provided. PMID:17301231

  16. Monte Carlo simulations in Nuclear Medicine

    NASA Astrophysics Data System (ADS)

    Loudos, George K.

    2007-11-01

    Molecular imaging technologies provide unique abilities to localise signs of disease before symptoms appear, assist in drug testing, optimize and personalize therapy, and assess the efficacy of treatment regimes for different types of cancer. Monte Carlo simulation packages are used as an important tool for the optimal design of detector systems; in addition, they have demonstrated potential to improve image quality and acquisition protocols. Many general-purpose (MCNP, Geant4, etc.) and dedicated (SimSET, etc.) codes have been developed, aiming to provide accurate and fast results. Special emphasis is given to the GATE toolkit. The GATE code, currently under development by the OpenGATE collaboration, is the most accurate and promising code for performing realistic simulations. The purpose of this article is to introduce the non-expert reader to the current status of MC simulations in nuclear medicine, briefly provide examples of currently simulated systems, and present future challenges, including the simulation of clinical studies and dosimetry applications.

  17. ALGEBRA: ALgorithm for the heterogeneous dosimetry based on GEANT4 for BRAchytherapy.

    PubMed

    Afsharpour, H; Landry, G; D'Amours, M; Enger, S; Reniers, B; Poon, E; Carrier, J-F; Verhaegen, F; Beaulieu, L

    2012-06-07

    Task group 43 (TG43)-based dosimetry algorithms are efficient for brachytherapy dose calculation in water. However, human tissues have chemical compositions and densities different from those of water. Moreover, the mutual shielding effect of seeds on each other (interseed attenuation) is neglected in TG43-based dosimetry platforms. The scientific community has expressed the need for an accurate dosimetry platform in brachytherapy. The purpose of this paper is to present ALGEBRA, a Monte Carlo platform for dosimetry in brachytherapy which is sufficiently fast and accurate for clinical and research purposes. ALGEBRA is based on the GEANT4 Monte Carlo code and is capable of handling the DICOM RT standard to recreate a virtual model of the treated site. Here, the performance of ALGEBRA is presented for the special case of LDR brachytherapy in permanent prostate and breast seed implants. However, the algorithm is also capable of handling other treatments such as HDR brachytherapy.
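
For context, the water-based TG43 formalism that ALGEBRA improves on reduces, in its point-source approximation, to a product of tabulated factors. A schematic Python sketch (the formula follows the AAPM TG-43 point-source form; the numerical values below are illustrative placeholders, not consensus data for any real seed):

```python
def tg43_point_dose_rate(sk, lam, r, g_r, phi_an):
    """AAPM TG-43 point-source approximation:

        dose_rate(r) = S_K * Lambda * (r0 / r)**2 * g(r) * phi_an(r),  r0 = 1 cm

    g_r and phi_an are radial dose function and anisotropy factor values,
    normally interpolated from published consensus tables.
    """
    r0 = 1.0  # reference distance, cm
    return sk * lam * (r0 / r) ** 2 * g_r * phi_an

if __name__ == "__main__":
    # Illustrative, made-up numbers (S_K in U, Lambda in cGy / (h * U)).
    rate = tg43_point_dose_rate(sk=0.5, lam=0.965, r=2.0, g_r=0.61, phi_an=0.94)
    print(rate)  # cGy/h at r = 2 cm in water
```

A Monte Carlo platform such as ALGEBRA effectively replaces the water-based factors g(r) and phi_an with explicit radiation transport through the patient's actual tissues and neighboring seeds.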

  18. The specific purpose Monte Carlo code McENL for simulating the response of epithermal neutron lifetime well logging tools

    NASA Astrophysics Data System (ADS)

    Prettyman, T. H.; Gardner, R. P.; Verghese, K.

    1993-08-01

    A new specific purpose Monte Carlo code called McENL for modeling the time response of epithermal neutron lifetime tools is described. The weight windows technique, employing splitting and Russian roulette, is used with an automated importance function based on the solution of an adjoint diffusion model to improve the code efficiency. Complete composition and density correlated sampling is also included in the code, and can be used to study the effect on tool response of small variations in the formation, borehole, or logging tool composition and density. An illustration of the latter application is given for the density of a thermal neutron filter. McENL was benchmarked against test-pit data for the Mobil pulsed neutron porosity tool and was found to be very accurate. Results of the experimental validation and details of code performance are presented.
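
The weight-windows technique named above combines splitting (for particles above the window) with Russian roulette (below it), keeping the estimator unbiased while controlling the spread of statistical weights. A generic sketch, assuming a simple survival weight at the window midpoint (an assumption for illustration; McENL's actual windows are driven by an importance function from an adjoint diffusion solution):

```python
import random

def apply_weight_window(weight, w_low, w_high, rng):
    """Weight-window check: split heavy particles, roulette light ones.

    Returns the list of surviving particle weights (empty if killed).
    The survival weight is taken as the window midpoint (an assumption;
    production codes derive windows from an importance function).
    """
    if weight > w_high:
        n = int(weight / w_high) + 1           # split into n lighter copies
        return [weight / n] * n
    if weight < w_low:
        w_survive = 0.5 * (w_low + w_high)
        if rng.random() < weight / w_survive:  # Russian roulette
            return [w_survive]
        return []                              # killed
    return [weight]

if __name__ == "__main__":
    rng = random.Random(42)
    # Unbiasedness check: the expected surviving weight equals the input weight.
    mean = sum(sum(apply_weight_window(0.01, 0.1, 1.0, rng))
               for _ in range(100_000)) / 100_000
    print(mean)  # ≈ 0.01 on average
```

Splitting conserves weight exactly, while roulette conserves it only in expectation; that trade is what buys variance reduction in the important regions of phase space.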

  19. Validation of the Monte Carlo simulator GATE for indium-111 imaging.

    PubMed

    Assié, K; Gardin, I; Véra, P; Buvat, I

    2005-07-07

    Monte Carlo simulations are useful for optimizing and assessing single photon emission computed tomography (SPECT) protocols, especially when aiming at measuring quantitative parameters from SPECT images. Before Monte Carlo simulated data can be trusted, the simulation model must be validated. The purpose of this work was to validate the use of GATE, a new Monte Carlo simulation platform based on GEANT4, for modelling indium-111 SPECT data, the quantification of which is of foremost importance for dosimetric studies. To that end, acquisitions of (111)In line sources in air and in water and of a cylindrical phantom were performed, together with the corresponding simulations. The simulation model included Monte Carlo modelling of the camera collimator and of a back-compartment accounting for photomultiplier tubes and associated electronics. Energy spectra, spatial resolution, sensitivity values, images and count profiles obtained for experimental and simulated data were compared. An excellent agreement was found between experimental and simulated energy spectra. For source-to-collimator distances varying from 0 to 20 cm, simulated and experimental spatial resolution differed by less than 2% in air, while the simulated sensitivity values were within 4% of the experimental values. The simulation of the cylindrical phantom closely reproduced the experimental data. These results suggest that GATE enables accurate simulation of (111)In SPECT acquisitions.

  20. Monte Carlo simulation of photon migration in a cloud computing environment with MapReduce

    PubMed Central

    Pratx, Guillem; Xing, Lei

    2011-01-01

    Monte Carlo simulation is considered the most reliable method for modeling photon migration in heterogeneous media. However, its widespread use is hindered by the high computational cost. The purpose of this work is to report on our implementation of a simple MapReduce method for performing fault-tolerant Monte Carlo computations in a massively parallel cloud computing environment. We ported the MC321 Monte Carlo package to Hadoop, an open-source MapReduce framework. In this implementation, Map tasks compute photon histories in parallel while a Reduce task scores photon absorption. The distributed implementation was evaluated on a commercial compute cloud. The simulation time was found to be linearly dependent on the number of photons and inversely proportional to the number of nodes. For a cluster size of 240 nodes, the simulation of 100 billion photon histories took 22 min, a 1258× speed-up compared to the single-threaded Monte Carlo program. The overall computational throughput was 85,178 photon histories per node per second, with a latency of 100 s. The distributed simulation produced the same output as the original implementation and was resilient to hardware failure: the correctness of the simulation was unaffected by the shutdown of 50% of the nodes. PMID:22191916
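
The division of labor described above (Map tasks generate photon histories; a Reduce task aggregates tallies) can be illustrated with a toy example. The sketch below is not the MC321/Hadoop code: it scores only the fraction of photons absorbed within a purely absorbing slab, where the analytic Beer-Lambert answer 1 − exp(−μt·d) provides a check:

```python
import math
import random

def map_task(seed, n_photons, mu_t=1.0, slab_depth=2.0):
    """'Map' task: simulate a batch of photon histories, tally absorptions."""
    rng = random.Random(seed)  # independent seed per task, as in the paper
    absorbed = 0
    for _ in range(n_photons):
        # Sample the photon's free path from an exponential distribution.
        depth = -math.log(1.0 - rng.random()) / mu_t
        if depth < slab_depth:
            absorbed += 1
    return absorbed

def reduce_task(partial_tallies, n_total):
    """'Reduce' task: aggregate partial tallies into an absorbed fraction."""
    return sum(partial_tallies) / n_total

if __name__ == "__main__":
    n_tasks, n_per_task = 4, 25000
    partials = [map_task(seed, n_per_task) for seed in range(n_tasks)]
    frac = reduce_task(partials, n_tasks * n_per_task)
    print(frac)  # ≈ 1 - exp(-2) ≈ 0.8647 (Beer-Lambert)
```

Because each Map task is independent given its seed, losing a node simply means re-running its batch, which is the fault-tolerance property the Hadoop implementation exploits.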

  1. Determination of correction factors in beta radiation beams using Monte Carlo method.

    PubMed

    Polo, Ivón Oramas; Santos, William de Souza; Caldas, Linda V E

    2018-06-15

    The absorbed dose rate is the main characterization quantity for beta radiation. The extrapolation chamber is considered the primary standard instrument. To determine absorbed dose rates in beta radiation beams, it is necessary to establish several correction factors. In this work, the correction factors for the backscatter due to the collecting electrode and to the guard ring, and the correction factor for Bremsstrahlung in beta secondary standard radiation beams, are presented. For this purpose, the Monte Carlo method was applied. The results obtained are considered acceptable, and they agree within the uncertainties. The differences between the backscatter factors determined by the Monte Carlo method and those of the ISO standard were 0.6%, 0.9% and 2.04% for (90)Sr/(90)Y, (85)Kr and (147)Pm sources, respectively. The differences between the Bremsstrahlung factors determined by the Monte Carlo method and those of the ISO standard were 0.25%, 0.6% and 1% for (90)Sr/(90)Y, (85)Kr and (147)Pm sources, respectively.

  2. Geant4 hadronic physics for space radiation environment.

    PubMed

    Ivantchenko, Anton V; Ivanchenko, Vladimir N; Molina, Jose-Manuel Quesada; Incerti, Sebastien L

    2012-01-01

    To test and to develop Geant4 (Geometry And Tracking version 4) Monte Carlo hadronic models with focus on applications in a space radiation environment. The Monte Carlo simulations have been performed using the Geant4 toolkit. The Binary cascade (BIC), its extension for incident light ions (BIC-ion), and the Bertini cascade (BERT) were used as the main Monte Carlo generators; for comparison purposes, some other models were tested as well. The hadronic testing suite has been used as a primary tool for model development and validation against experimental data. The Geant4 pre-compound (PRECO) and de-excitation (DEE) models were revised and improved. Proton, neutron, pion, and ion nuclear interactions were simulated with the recent Geant4 version 9.4 and compared with experimental data from thin- and thick-target experiments. The Geant4 toolkit offers a large set of models allowing effective simulation of interactions of particles with matter. We have tested different Monte Carlo generators with our hadronic testing suite, and accordingly we can propose an optimal configuration of Geant4 models for the simulation of the space radiation environment.

  3. Evaluation of an analytic linear Boltzmann transport equation solver for high-density inhomogeneities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lloyd, S. A. M.; Ansbacher, W.; Department of Physics and Astronomy, University of Victoria, Victoria, British Columbia V8W 3P6

    2013-01-15

    Purpose: Acuros external beam (Acuros XB) is a novel dose calculation algorithm implemented through the ECLIPSE treatment planning system. The algorithm finds a deterministic solution to the linear Boltzmann transport equation, the same equation commonly solved stochastically by Monte Carlo methods. This work is an evaluation of Acuros XB, by comparison with Monte Carlo, for dose calculation applications involving high-density materials. Existing non-Monte Carlo clinical dose calculation algorithms, such as the analytic anisotropic algorithm (AAA), do not accurately model dose perturbations due to increased electron scatter within high-density volumes. Methods: Acuros XB, AAA, and EGSnrc-based Monte Carlo are used to calculate dose distributions from 18 MV and 6 MV photon beams delivered to a cubic water phantom containing a rectangular high-density (4.0-8.0 g/cm³) volume at its center. The algorithms are also used to recalculate a clinical prostate treatment plan involving a unilateral hip prosthesis, originally evaluated using AAA. These results are compared graphically and numerically using gamma-index analysis. Radiochromic film measurements are presented to augment Monte Carlo and Acuros XB dose perturbation data. Results: Using a 2% and 1 mm gamma-analysis, between 91.3% and 96.8% of Acuros XB dose voxels containing greater than 50% of the normalized dose were in agreement with Monte Carlo data for virtual phantoms involving 18 MV and 6 MV photons, stainless steel and titanium alloy implants, and on-axis and oblique field delivery. A similar gamma-analysis of AAA against Monte Carlo data showed between 80.8% and 87.3% agreement. Comparing Acuros XB and AAA evaluations of a clinical prostate patient plan involving a unilateral hip prosthesis, Acuros XB showed good overall agreement with Monte Carlo, while AAA underestimated dose on the upstream medial surface of the prosthesis due to electron scatter from the high-density material. Film measurements support the dose perturbations demonstrated by Monte Carlo and Acuros XB data. Conclusions: Acuros XB is shown to perform as well as Monte Carlo methods and better than existing clinical algorithms for dose calculations involving high-density volumes.
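
The gamma-index analysis used in this comparison combines a dose-difference criterion with a distance-to-agreement (DTA) criterion; a point passes when the minimum combined metric is ≤ 1. A simplified 1D, global-dose sketch (a brute-force search over reference points; clinical implementations interpolate and work in 3D):

```python
import math

def gamma_index_1d(reference, evaluated, spacing, dose_tol, dist_tol):
    """Per-point 1D gamma: minimum combined dose-difference / DTA metric.

    dose_tol is a fraction of the global reference maximum (e.g. 0.02),
    spacing and dist_tol share the same length unit (e.g. mm).
    """
    d_max = max(reference)
    gammas = []
    for i, d_eval in enumerate(evaluated):
        best = float("inf")
        for j, d_ref in enumerate(reference):
            dose_term = (d_eval - d_ref) / (dose_tol * d_max)
            dist_term = (i - j) * spacing / dist_tol
            best = min(best, math.hypot(dose_term, dist_term))
        gammas.append(best)
    return gammas

def pass_rate(gammas):
    """Fraction of points with gamma <= 1 (the usual passing criterion)."""
    return sum(g <= 1.0 for g in gammas) / len(gammas)

if __name__ == "__main__":
    ref = [0.0, 0.5, 1.0, 0.5, 0.0]
    shifted = [0.0, 0.0, 0.5, 1.0, 0.5]  # same profile, one sample later
    g = gamma_index_1d(ref, shifted, spacing=1.0, dose_tol=0.03, dist_tol=1.0)
    print(pass_rate(g))  # the 1-sample shift lies within the DTA: 1.0
```

The DTA term is what lets a spatially shifted but otherwise correct distribution pass, which a pure point-by-point dose difference would fail in steep gradients.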

  4. The COOLER Code: A Novel Analytical Approach to Calculate Subcellular Energy Deposition by Internal Electron Emitters.

    PubMed

    Siragusa, Mattia; Baiocco, Giorgio; Fredericia, Pil M; Friedland, Werner; Groesser, Torsten; Ottolenghi, Andrea; Jensen, Mikael

    2017-08-01

    COmputation Of Local Electron Release (COOLER), a software program has been designed for dosimetry assessment at the cellular/subcellular scale, with a given distribution of administered low-energy electron-emitting radionuclides in cellular compartments, which remains a critical step in risk/benefit analysis for advancements in internal radiotherapy. The software is intended to overcome the main limitations of the medical internal radiation dose (MIRD) formalism for calculations of cellular S-values (i.e., dose to a target region in the cell per decay in a given source region), namely, the use of the continuous slowing down approximation (CSDA) and the assumption of a spherical cell geometry. To this aim, we developed an analytical approach, entrusted to a MATLAB-based program, using as input simulated data for electron spatial energy deposition directly derived from full Monte Carlo track structure calculations with PARTRAC. Results from PARTRAC calculations on electron range, stopping power and residual energy versus traveled distance curves are presented and, when useful for implementation in COOLER, analytical fit functions are given. Example configurations for cells in different culture conditions (V79 cells in suspension or adherent culture) with realistic geometrical parameters are implemented for use in the tool. Finally, cellular S-value predictions by the newly developed code are presented for different cellular geometries and activity distributions (uniform activity in the nucleus, in the entire cell or on the cell surface), validated against full Monte Carlo calculations with PARTRAC, and compared to MIRD standards, as well as results based on different track structure calculations (Geant4-DNA). The largest discrepancies between COOLER and MIRD predictions were generally found for electrons between 25 and 30 keV, where the magnitude of disagreement in S-values can vary from 50 to 100%, depending on the activity distribution. 
In calculations for activity distribution on the cell surface, MIRD predictions appeared to fail the most. The proposed method is suitable for Auger-cascade electrons, but can be extended to any energy of interest and to beta spectra; as an example, the 3 H case is also discussed. COOLER is intended to be accessible to everyone (preclinical and clinical researchers included), and may provide important information for the selection of radionuclides, the interpretation of radiobiological or preclinical results, and the general establishment of doses in any scenario, e.g., with cultured cells in the laboratory or with therapeutic or diagnostic applications. The software will be made available for download from the DTU-Nutech website: http://www.nutech.dtu.dk/ .

  5. Signatures of the Mott transition in the antiferromagnetic state of the two-dimensional Hubbard model

    DOE PAGES

    Fratino, L.; Sémon, P.; Charlebois, M.; ...

    2017-06-06

    The properties of a phase with large correlation length can be strongly influenced by the underlying normal phase. Here, we illustrate this by studying the half-filled two-dimensional Hubbard model using cellular dynamical mean-field theory with continuous-time quantum Monte Carlo. Sharp crossovers in the mechanism that favors antiferromagnetic correlations and in the corresponding local density of states are observed. We found that these crossovers occur at values of the interaction strength U and temperature T that are controlled by the underlying normal-state Mott transition.

  6. Dose response of surfactants to attenuate gas embolism related platelet aggregation

    NASA Astrophysics Data System (ADS)

    Eckmann, David M.; Eckmann, Yonaton Y.; Tomczyk, Nancy

    2014-03-01

    Intravascular gas embolism promotes blood clot formation, cellular activation, and adhesion events, particularly with platelets. Populating the interface with surfactants is a chemical-based intervention to reduce injury from gas embolism. We studied platelet activation and platelet aggregation, prominent adverse responses to blood contact with bubbles. We examined dose-response relationships for two chemically distinct surfactants to attenuate the rise in platelet function stimulated by exposure to microbubbles. Significant reduction in platelet aggregation and platelet activation occurred with increasing concentration of the surfactants, indicating presence of a saturable system. A population balance model for platelet aggregation in the presence of embolism bubbles and surfactants was developed. Monte Carlo simulations for platelet aggregation were performed. Results agree qualitatively with experimental findings. Surfactant dose-dependent reductions in platelet activation and aggregation indicate inhibition of the gas/liquid interface's ability to stimulate cellular activation mechanically.

  7. Estimating statistical uncertainty of Monte Carlo efficiency-gain in the context of a correlated sampling Monte Carlo code for brachytherapy treatment planning with non-normal dose distribution.

    PubMed

    Mukhopadhyay, Nitai D; Sampson, Andrew J; Deniz, Daniel; Alm Carlsson, Gudrun; Williamson, Jeffrey; Malusek, Alexandr

    2012-01-01

    Correlated sampling Monte Carlo methods can shorten computing times in brachytherapy treatment planning. Monte Carlo efficiency is typically estimated via efficiency gain, defined as the reduction in computing time by correlated sampling relative to conventional Monte Carlo methods when equal statistical uncertainties have been achieved. The determination of the efficiency gain uncertainty arising from random effects, however, is not a straightforward task, especially when the error distribution is non-normal. The purpose of this study is to evaluate the applicability of the F distribution and standardized uncertainty propagation methods (widely used in metrology to estimate uncertainty of physical measurements) for predicting confidence intervals about efficiency gain estimates derived from single Monte Carlo runs using fixed-collision correlated sampling in a simplified brachytherapy geometry. A bootstrap-based algorithm was used to simulate the probability distribution of the efficiency gain estimates, and the shortest 95% confidence interval was estimated from this distribution. It was found that the corresponding relative uncertainty was as large as 37% for this particular problem. The uncertainty propagation framework predicted confidence intervals reasonably well; however, its main disadvantage was that uncertainties of input quantities had to be calculated in a separate Monte Carlo run. The F distribution noticeably underestimated the confidence interval. These discrepancies were influenced by several photons with large statistical weights which made extremely large contributions to the scored absorbed dose difference. The mechanism of acquiring high statistical weights in the fixed-collision correlated sampling method was explained and a mitigation strategy was proposed.
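
The bootstrap procedure described above can be sketched generically: resample the per-history scores, recompute the efficiency-gain statistic, and take the shortest interval covering 95% of the replicates. This illustrative Python version equates gain with a variance ratio between two synthetic score samples (an assumption for the demo, not the paper's estimator, which also accounts for computing time):

```python
import random
import statistics

def efficiency_gain(conv_scores, corr_scores):
    # Demo statistic: variance ratio (assumes comparable per-history cost).
    return statistics.variance(conv_scores) / statistics.variance(corr_scores)

def bootstrap_shortest_ci(conv, corr, n_boot=2000, level=0.95, seed=1):
    """Shortest interval containing a fraction `level` of bootstrap replicates."""
    rng = random.Random(seed)
    boots = []
    for _ in range(n_boot):
        re_conv = [rng.choice(conv) for _ in conv]  # resample with replacement
        re_corr = [rng.choice(corr) for _ in corr]
        boots.append(efficiency_gain(re_conv, re_corr))
    boots.sort()
    k = int(level * n_boot)
    # Slide a window of k replicates and keep the narrowest one.
    lo_idx = min(range(n_boot - k), key=lambda i: boots[i + k] - boots[i])
    return boots[lo_idx], boots[lo_idx + k]

if __name__ == "__main__":
    rng = random.Random(0)
    conv = [rng.gauss(1.0, 1.0) for _ in range(200)]  # conventional MC scores
    corr = [rng.gauss(1.0, 0.5) for _ in range(200)]  # correlated-sampling scores
    lo, hi = bootstrap_shortest_ci(conv, corr)
    print(lo, hi)  # 95%-level interval; the true variance ratio here is 4
```

The shortest-interval choice matters precisely because the gain distribution is skewed (non-normal), as the abstract notes; an equal-tailed percentile interval would be wider.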

  8. Teen Perceptions of Cellular Phones as a Communication Tool

    ERIC Educational Resources Information Center

    Jonas, Denise D.

    2011-01-01

    The excitement and interest in innovative technologies has spanned centuries. However, the invention of the cellular phone has surpassed previous technology interests, and changed the way we communicate today. Teens make up the fastest growing market of current cellular phone users. Consequently, the purpose of this study was to determine teen…

  9. Development of a new multi-modal Monte-Carlo radiotherapy planning system.

    PubMed

    Kumada, H; Nakamura, T; Komeda, M; Matsumura, A

    2009-07-01

    A new multi-modal Monte-Carlo radiotherapy planning system (development code: JCDS-FX) is under development at the Japan Atomic Energy Agency. This system builds on fundamental technologies of JCDS, which was applied to actual boron neutron capture therapy (BNCT) trials at JRR-4. One of the features of JCDS-FX is that PHITS, a multi-purpose particle Monte-Carlo transport code, has been applied to the particle transport calculation. Application of PHITS thus makes it possible to evaluate the total dose given to a patient by a combined-modality therapy. Moreover, JCDS-FX with PHITS can be used for the study of accelerator-based BNCT. To verify the calculation accuracy of JCDS-FX, dose evaluations for neutron irradiation of a cylindrical water phantom and for an actual clinical trial were performed, and the results were compared with calculations by JCDS with MCNP. The verification results demonstrated that JCDS-FX is applicable to BNCT treatment planning in practical use.

  10. Acceleration of Monte Carlo simulation of photon migration in complex heterogeneous media using Intel many-integrated core architecture.

    PubMed

    Gorshkov, Anton V; Kirillin, Mikhail Yu

    2015-08-01

    Over two decades, the Monte Carlo technique has become a gold standard in simulation of light propagation in turbid media, including biotissues. Technological solutions provide further advances of this technique. The Intel Xeon Phi coprocessor is a new type of accelerator for highly parallel general-purpose computing, which allows execution of a wide range of applications without substantial code modification. We present a technical approach to porting our previously developed Monte Carlo (MC) code for simulation of light transport in tissues to the Intel Xeon Phi coprocessor. We show that employing the accelerator reduces the computational time of MC simulation, yielding a speed-up comparable to that of a GPU. We demonstrate the performance of the developed code by simulating light transport in the human head and determining the measurement volume in near-infrared spectroscopy brain sensing.

  11. SU-F-T-619: Dose Evaluation of Specific Patient Plans Based On Monte Carlo Algorithm for a CyberKnife Stereotactic Radiosurgery System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piao, J; PLA 302 Hospital, Beijing; Xu, S

    2016-06-15

    Purpose: This study uses Monte Carlo simulation of the CyberKnife system to develop a third-party tool for the dose verification of specific patient plans in the TPS. Methods: By simulating the treatment head using the BEAMnrc and DOSXYZnrc software, calculated and measured data were compared to determine the beam parameters. The dose distributions calculated by the Ray-tracing and Monte Carlo algorithms of the TPS (MultiPlan Ver. 4.0.2) and by the in-house Monte Carlo simulation method were analyzed for 30 patient plans (10 head, 10 lung and 10 liver cases). A γ analysis with the combined 3 mm/3% criteria was introduced to quantitatively evaluate the difference in accuracy between the three algorithms. Results: More than 90% of the global error points differed by less than 2% in the comparison of the PDD and OAR curves after determining the mean energy and FWHM, establishing a reasonably ideal Monte Carlo beam model. Based on the quantitative evaluation of dose accuracy for the three algorithms, the γ analysis showed good passing rates for the PTV in the 30 plans between the Monte Carlo simulation and the TPS Monte Carlo algorithm (84.88±9.67% for head, 98.83±1.05% for liver, 98.26±1.87% for lung). The passing rates for the PTV in head and liver plans between the Monte Carlo simulation and the TPS Ray-tracing algorithm were also good (95.93±3.12% and 99.84±0.33%, respectively). However, the difference in DVHs for lung plans between the Monte Carlo simulation and the Ray-tracing algorithm was obvious, and the passing rate of the γ criteria (51.263±38.964%) was not good. It is feasible to use Monte Carlo simulation for verifying the dose distribution of patient plans. Conclusion: The Monte Carlo simulation algorithm developed for the CyberKnife system in this study can be used as a third-party reference tool, which plays an important role in dose verification of patient plans. This work was supported in part by a grant from the Chinese Natural Science Foundation (Grant No. 11275105). Thanks for the support from Accuray Corp.

  12. Cellular uptake and in vitro antitumor efficacy of composite liposomes for neutron capture therapy.

    PubMed

    Peters, Tanja; Grunewald, Catrin; Blaickner, Matthias; Ziegner, Markus; Schütz, Christian; Iffland, Dorothee; Hampel, Gabriele; Nawroth, Thomas; Langguth, Peter

    2015-02-22

    Neutron capture therapy for glioblastoma has focused mainly on the use of (10)B as neutron capture isotope. However, (157)Gd offers several advantages over boron, such as higher cross section for thermal neutrons and the possibility to perform magnetic resonance imaging during neutron irradiation, thereby combining therapy and diagnostics. We have developed different liposomal formulations of gadolinium-DTPA (Magnevist®) for application in neutron capture therapy of glioblastoma. The formulations were characterized physicochemically and tested in vitro in a glioma cell model for their effectiveness. Liposomes entrapping gadolinium-DTPA as neutron capture agent were manufactured via lipid/film-extrusion method and characterized with regard to size, entrapment efficiency and in vitro release. For neutron irradiation, F98 and LN229 glioma cells were incubated with the newly developed liposomes and subsequently irradiated at the thermal column of the TRIGA reactor in Mainz. The dose rate derived from neutron irradiation with (157)Gd as neutron capturing agent was calculated via Monte Carlo simulations and set in relation to the respective cell survival. The liposomal Gd-DTPA reduced cell survival of F98 and LN229 cells significantly. Differences in liposomal composition of the formulations led to distinctly different outcome in cell survival. The amount of cellular Gd was not at all times proportional to cell survival, indicating that intracellular deposition of formulated Gd has a major influence on cell survival. The majority of the dose contribution arises from photon cross irradiation compared to a very small Gd-related dose. Liposomal gadolinium formulations represent a promising approach for neutron capture therapy of glioblastoma cells. The liposome composition determines the uptake and the survival of cells following radiation, presumably due to different uptake pathways of liposomes and intracellular deposition of gadolinium-DTPA. 
Due to the small range of the Auger and conversion electrons produced in (157)Gd capture, the proximity of Gd-atoms to cellular DNA is a crucial factor for infliction of lethal damage. Furthermore, Gd-containing liposomes may be used as MRI contrast agents for diagnostic purposes and surveillance of tumor targeting, thus enabling a theranostic approach for tumor therapy.

  13. SU-F-BRD-07: Fast Monte Carlo-Based Biological Optimization of Proton Therapy Treatment Plans for Thyroid Tumors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wan Chan Tseung, H; Ma, J; Ma, D

    2015-06-15

    Purpose: To demonstrate the feasibility of fast Monte Carlo (MC) based biological planning for the treatment of thyroid tumors in spot-scanning proton therapy. Methods: Recently, we developed a fast and accurate GPU-based MC simulation of proton transport that was benchmarked against Geant4.9.6 and used as the dose calculation engine in a clinically-applicable GPU-accelerated IMPT optimizer. Besides dose, it can simultaneously score the dose-averaged LET (LETd), which makes fast biological dose (BD) estimates possible. To convert from LETd to BD, we used a linear relation based on cellular irradiation data. Given a thyroid patient with a 93cc tumor volume, we created a 2-field IMPT plan in Eclipse (Varian Medical Systems). This plan was re-calculated with our MC to obtain the BD distribution. A second 5-field plan was made with our in-house optimizer, using pre-generated MC dose and LETd maps. Constraints were placed to maintain the target dose to within 25% of the prescription, while maximizing the BD. The plan optimization and calculation of dose and LETd maps were performed on a GPU cluster. The conventional IMPT and biologically-optimized plans were compared. Results: The mean target physical and biological doses from our biologically-optimized plan were, respectively, 5% and 14% higher than those from the MC re-calculation of the IMPT plan. Dose sparing to critical structures in our plan was also improved. The biological optimization, including the initial dose and LETd map calculations, can be completed in a clinically viable time (∼30 minutes) on a cluster of 25 GPUs. Conclusion: Taking advantage of GPU acceleration, we created a MC-based, biologically optimized treatment plan for a thyroid patient. Compared to a standard IMPT plan, a 5% increase in the target's physical dose resulted in ∼3 times as much increase in the BD. Biological planning was thus effective in escalating the target BD.
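
The LETd-to-BD conversion mentioned above can be written as a voxel-wise linear scaling of the physical dose. A minimal sketch (the slope and intercept below are illustrative placeholders, not the coefficients fitted to the cellular irradiation data):

```python
def biological_dose(physical_dose, let_d, a=1.0, b=0.04):
    """Voxel-wise BD via a linear LETd model: BD = D * (a + b * LETd).

    a (dimensionless) and b (um/keV) are illustrative placeholders,
    not the fitted coefficients from the cellular irradiation data.
    physical_dose in Gy, let_d in keV/um.
    """
    return physical_dose * (a + b * let_d)

if __name__ == "__main__":
    # Same physical dose, higher LETd -> higher biological dose.
    print(biological_dose(2.0, 2.5))   # 2.0 * (1 + 0.04 * 2.5) = 2.2
    print(biological_dose(2.0, 10.0))  # 2.0 * (1 + 0.04 * 10.0) = 2.8
```

Because BD depends on LETd and not only on dose, an optimizer working from pre-generated MC dose and LETd maps can raise the target BD disproportionately to the physical-dose increase, which is the effect reported in the Results.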

  14. Increased dose near the skin due to electromagnetic surface beacon transponder.

    PubMed

    Ahn, Kang-Hyun; Manger, Ryan; Halpern, Howard J; Aydogan, Bulent

    2015-05-08

    The purpose of this study was to evaluate the increased dose near the skin from an electromagnetic surface beacon transponder, which is used for localization and tracking of organ motion. The bolus effect due to the copper-coil surface beacon was evaluated with radiographic film measurements and Monte Carlo simulations. Various beam incidence angles were evaluated experimentally for both 6 MV and 18 MV. We performed simulations using the general-purpose Monte Carlo code MCNPX (Monte Carlo N-Particle) to supplement the experimental data. We modeled the surface beacon geometry using the actual mass of the glass vial and copper coil placed in its L-shaped polyethylene terephthalate tubing casing. Film dosimetry measured factors of 2.2 and 3.0 enhancement in the surface dose for normally incident 6 MV and 18 MV beams, respectively. Although surface dose further increased with incidence angle, the relative contribution from the bolus effect was reduced at oblique incidence; the enhancement factors were 1.5 and 1.8 for 6 MV and 18 MV, respectively, at an incidence angle of 60°. Monte Carlo simulation confirmed the experimental results and indicated that the epidermal skin dose can reach approximately 50% of the dose at dmax at normal incidence. The overall effect could be acceptable considering that the skin dose enhancement is confined to a small area (~1 cm²) and can be further reduced by using an opposite beam technique. Further clinical studies are justified in order to weigh the dosimetric benefit against possible cosmetic effects of the surface beacon. One such clinical situation would be intact-breast radiation therapy, especially in large-breasted women.

  15. Monte Carlo dose distribution calculation at nuclear level for Auger-emitting radionuclide energies.

    PubMed

    Di Maria, S; Belchior, A; Romanets, Y; Paulo, A; Vaz, P

    2018-05-01

    The distribution of radiopharmaceuticals in tumor cells represents a fundamental aspect for successful molecular targeted radiotherapy. It has been largely demonstrated at the microscopic level that only a fraction of cells in tumoral tissues incorporate the radiolabel. In addition, the distribution of the radionuclides at the sub-cellular level, namely inside each nucleus, should also be investigated for accurate dosimetry estimation. The most used method to perform cellular dosimetry is the MIRD one, where S-values estimate cellular absorbed doses for several electron energies and nucleus diameters, assuming homogeneous source distributions. However, the radionuclide distribution inside nuclei can also be highly non-homogeneous. The aim of this study is to show to what extent a non-accurate cellular dosimetry could lead to misinterpretations of the surviving cell fraction vs dose relationship; in this context, a dosimetric case study with 99mTc is also presented. The state-of-the-art MCNP6 Monte Carlo code was used to model cell structures both in the MIRD geometry (MG) and in MIRD modified geometries (MMG), where entire mitotic chromosome volumes were also considered (each structure was modeled as liquid water). In order to cover a wide energy range of Auger-emitting radionuclides, four monoenergetic electron emissions were considered, namely 213 eV, 6 keV, 11 keV and 20 keV. A dosimetric calculation for 99mTc undergoing inhomogeneous nuclear internalization was also performed. After a successful validation step between MIRD and our computed S-values for three Auger-emitting radionuclides (99mTc, 125I and 64Cu), absorbed dose results showed that the standard MG could differ from the MMG by one to three orders of magnitude. These results were also confirmed by considering the 99mTc emission spectrum (Auger and internal conversion electrons). Moreover, considering an inhomogeneous radionuclide distribution, the average electron energy that maximizes the absorbed dose was found to be different for MG and MMG. The modeling of realistic radionuclide localization inside cells, including an inhomogeneous nuclear distribution, revealed that i) a strong bias in surviving cell fraction vs dose relationships (leading to different radiobiological models) can arise; ii) the alternative models might contribute to a more accurate prediction of the radiobiological effects inherent to more specific molecular targeted radiotherapy strategies. Copyright © 2018 Elsevier Ltd. All rights reserved.
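
The MIRD formalism that this study extends reduces, at its core, to multiplying a cumulated activity by a precomputed S value. A minimal sketch, with purely illustrative numbers (the S value and decay count below are not the paper's results):

```python
# MIRD dose equation sketch: absorbed dose to a target region equals the
# cumulated (time-integrated) activity in the source region times the
# S value (absorbed dose per decay) for that source-target pair.
def absorbed_dose_gy(cumulated_activity_bq_s, s_value_gy_per_bq_s):
    return cumulated_activity_bq_s * s_value_gy_per_bq_s

# e.g. 1e6 decays in the nucleus with a hypothetical nucleus<-nucleus
# S value of 2e-3 Gy per decay gives ~2000 Gy:
print(absorbed_dose_gy(1.0e6, 2.0e-3))
```

The paper's point is that the S value itself depends strongly on where the decays actually sit inside the nucleus, which this single-number formalism cannot capture.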

  16. A self-modifying cellular automaton model of historical urbanization in the San Francisco Bay area

    USGS Publications Warehouse

    Clarke, K.C.; Hoppen, S.; Gaydos, L.

    1997-01-01

    In this paper we describe a cellular automaton (CA) simulation model developed to predict urban growth as part of a project for estimating the regional and broader impact of urbanization on the San Francisco Bay area's climate. The rules of the model are more complex than those of a typical CA and involve the use of multiple data sources, including topography, road networks, and existing settlement distributions, and their modification over time. In addition, the control parameters of the model are allowed to self-modify: that is, the CA adapts itself to the circumstances it generates, in particular, during periods of rapid growth or stagnation. In addition, the model was written to allow the accumulation of probabilistic estimates based on Monte Carlo methods. Calibration of the model has been accomplished by the use of historical maps to compare model predictions of urbanization, based solely upon the distribution in year 1900, with observed data for years 1940, 1954, 1962, 1974, and 1990. The complexity of this model has made calibration a particularly demanding step. Lessons learned about the methods, measures, and strategies developed to calibrate the model may be of use in other environmental modeling contexts. With the calibration complete, the model is being used to generate a set of future scenarios for the San Francisco Bay area along with their probabilities based on the Monte Carlo version of the model. Animated dynamic mapping of the simulations will be used to allow visualization of the impact of future urban growth.
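
A toy version of the probabilistic growth rule at the heart of such a CA might look like the sketch below. This is not the calibrated model itself (whose rules incorporate terrain, road networks, and self-modifying parameters); it only shows how an empty cell can urbanize with a probability that grows with its urbanized neighbors, and how repeated Monte Carlo runs would accumulate per-cell probabilities.

```python
import random

# Toy probabilistic CA growth step: an empty cell urbanizes with
# probability p times its count of urbanized Moore neighbors.
def grow(grid, p=0.25, rng=None):
    rng = rng or random.Random()
    n, new = len(grid), [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            if grid[i][j]:
                continue
            nb = sum(grid[a][b]
                     for a in range(max(0, i - 1), min(n, i + 2))
                     for b in range(max(0, j - 1), min(n, j + 2)))
            if rng.random() < p * nb:
                new[i][j] = 1
    return new

grid = [[0] * 9 for _ in range(9)]
grid[4][4] = 1                      # seed settlement
rng = random.Random(0)
for _ in range(5):                  # one Monte Carlo realization
    grid = grow(grid, rng=rng)
```

Averaging the final grids over many seeded runs yields the probabilistic urbanization maps the paper describes.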

  17. Monte Carlo simulations for angular and spatial distributions in therapeutic-energy proton beams

    NASA Astrophysics Data System (ADS)

    Lin, Yi-Chun; Pan, C. Y.; Chiang, K. J.; Yuan, M. C.; Chu, C. H.; Tsai, Y. W.; Teng, P. K.; Lin, C. H.; Chao, T. C.; Lee, C. C.; Tung, C. J.; Chen, A. E.

    2017-11-01

    The purpose of this study is to compare the angular and spatial distributions of therapeutic-energy proton beams obtained from the FLUKA, GEANT4 and MCNP6 Monte Carlo codes. The Monte Carlo simulations of proton beams passing through two thin targets and a water phantom were investigated to compare the primary and secondary proton fluence distributions and dosimetric differences among these codes. The angular fluence distributions, central axis depth-dose profiles, and lateral distributions of the Bragg peak cross-field were calculated to compare the proton angular and spatial distributions and energy deposition. Benchmark verifications from three different Monte Carlo simulations could be used to evaluate the residual proton fluence for the mean range and to estimate the depth and lateral dose distributions and the characteristic depths and lengths along the central axis as the physical indices corresponding to the evaluation of treatment effectiveness. The results showed a general agreement among codes, except that some deviations were found in the penumbra region. These calculated results are also particularly helpful for understanding primary and secondary proton components for stray radiation calculation and reference proton standard determination, as well as for determining lateral dose distribution performance in proton small-field dosimetry. By demonstrating these calculations, this work could serve as a guide to the recent field of Monte Carlo methods for therapeutic-energy protons.

  18. 47 CFR 27.2 - Permissible communications.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... bands. Operators in the 775-776 MHz and 805-806 MHz bands may not employ a cellular system architecture. A cellular system architecture is defined, for purposes of this part, as one that consists of many...

  19. 47 CFR 27.2 - Permissible communications.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... bands. Operators in the 775-776 MHz and 805-806 MHz bands may not employ a cellular system architecture. A cellular system architecture is defined, for purposes of this part, as one that consists of many...

  20. Poster — Thur Eve — 14: Improving Tissue Segmentation for Monte Carlo Dose Calculation using DECT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Di Salvio, A.; Bedwani, S.; Carrier, J-F.

    2014-08-15

    Purpose: To improve Monte Carlo dose calculation accuracy through a new tissue segmentation technique with dual energy CT (DECT). Methods: Electron density (ED) and effective atomic number (EAN) can be extracted directly from DECT data with a stoichiometric calibration method. Images are acquired with Monte Carlo CT projections using the user code egs-cbct and reconstructed using an FDK backprojection algorithm. Calibration is performed using projections of a numerical RMI phantom. A weighted parameter algorithm then uses both EAN and ED to assign materials to voxels from DECT simulated images. This new method is compared to a standard tissue characterization from single energy CT (SECT) data using a segmented calibrated Hounsfield unit (HU) to ED curve. Both methods are compared to the reference numerical head phantom. Monte Carlo simulations on uniform phantoms of different tissues using dosxyz-nrc show discrepancies in depth-dose distributions. Results: Both SECT and DECT segmentation methods show similar performance in assigning soft tissues. Performance is, however, improved with DECT in regions of higher density, such as bone, where it assigns materials correctly 8% more often than segmentation with SECT, considering the same set of tissues and simulated clinical CT images, i.e., including noise and reconstruction artifacts. Furthermore, Monte Carlo results indicate that kV photon beam depth-dose distributions can double between two tissues of density higher than muscle. Conclusions: A direct acquisition of ED and the added information of EAN with DECT data improve tissue segmentation and increase the accuracy of Monte Carlo dose calculation in kV photon beams.
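
A hedged sketch of the weighted (ED, EAN) material assignment described above: each voxel receives the reference tissue minimizing a weighted squared distance in (electron density, effective atomic number) space. The reference values and the weight w are illustrative placeholders, not the authors' calibration.

```python
# Approximate (relative ED, EAN) pairs for a few reference tissues.
# These numbers are illustrative assumptions only.
TISSUES = {
    "adipose":       (0.95, 6.3),
    "muscle":        (1.04, 7.6),
    "cortical_bone": (1.78, 13.8),
}

def assign_material(ed, ean, w=0.5):
    """Nearest reference tissue under a weighted squared (ED, EAN) distance."""
    return min(TISSUES, key=lambda t: w * (ed - TISSUES[t][0]) ** 2
                                      + (1 - w) * (ean - TISSUES[t][1]) ** 2)

print(assign_material(1.02, 7.4))   # soft-tissue-like voxel -> muscle
print(assign_material(1.70, 13.0))  # dense voxel -> cortical_bone
```

SECT segmentation has only the HU-to-ED axis available, which is why the second axis (EAN) helps most in the high-density regime the abstract highlights.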

  1. Concept of Fractal Dimension use of Multifractal Cloud Liquid Models Based on Real Data as Input to Monte Carlo Radiation Models

    NASA Technical Reports Server (NTRS)

    Wiscombe, W.

    1999-01-01

    The purpose of this paper is to discuss the concept of fractal dimension; multifractal statistics as an extension of this concept; the use of simple multifractal statistics (power spectrum, structure function) to characterize cloud liquid water data; and the use of multifractal cloud liquid water models based on real data as input to Monte Carlo models of shortwave radiative transfer in 3D clouds, with consequences in two areas: the design of aircraft field programs to measure cloud absorptance, and the explanation of the famous "Landsat scale break" in measured radiance.
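
One of the multifractal statistics named above, the structure function, is simple to compute from a 1-D liquid water series. A minimal sketch (the data here is a synthetic ramp, not cloud data):

```python
# q-th order structure function S_q(r) = <|x(t+r) - x(t)|^q>.
# Its scaling S_q(r) ~ r^zeta(q) across lags r is what characterizes
# multifractal behavior in a measured series.
def structure_function(x, q, lags):
    return [sum(abs(x[i + r] - x[i]) ** q for i in range(len(x) - r))
            / (len(x) - r) for r in lags]

# For a linear ramp, the first-order structure function equals the lag:
print(structure_function(list(range(10)), 1, [1, 2, 4]))  # [1.0, 2.0, 4.0]
```

For real cloud liquid water data one would fit the log-log slope of S_q(r) versus r for several q to extract the multifractal exponents.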

  2. Stochastic Analysis of Orbital Lifetimes of Spacecraft

    NASA Technical Reports Server (NTRS)

    Sasamoto, Washito; Goodliff, Kandyce; Cornelius, David

    2008-01-01

    A document discusses (1) a Monte-Carlo-based methodology for probabilistic prediction and analysis of orbital lifetimes of spacecraft and (2) Orbital Lifetime Monte Carlo (OLMC)--a Fortran computer program, consisting of a previously developed long-term orbit-propagator integrated with a Monte Carlo engine. OLMC enables modeling of variances of key physical parameters that affect orbital lifetimes through the use of probability distributions. These parameters include altitude, speed, and flight-path angle at insertion into orbit; solar flux; and launch delays. The products of OLMC are predicted lifetimes (durations above specified minimum altitudes) for the number of user-specified cases. Histograms generated from such predictions can be used to determine the probabilities that spacecraft will satisfy lifetime requirements. The document discusses uncertainties that affect modeling of orbital lifetimes. Issues of repeatability, smoothness of distributions, and code run time are considered for the purpose of establishing values of code-specific parameters and number of Monte Carlo runs. Results from test cases are interpreted as demonstrating that solar-flux predictions are primary sources of variations in predicted lifetimes. Therefore, it is concluded, multiple sets of predictions should be utilized to fully characterize the lifetime range of a spacecraft.
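
The OLMC approach described above can be caricatured in a few lines: draw uncertain inputs from probability distributions, evaluate a lifetime model per draw, and read requirement-satisfaction probabilities off the resulting histogram. The lifetime_years() model below is a made-up placeholder, not the real long-term orbit propagator, and all distribution parameters are illustrative.

```python
import random
import statistics

# Placeholder lifetime model: longer life for higher insertion altitude,
# shorter life for higher solar flux (more atmospheric drag).
def lifetime_years(altitude_km, solar_flux_sfu):
    return max(0.0, 0.05 * (altitude_km - 300.0)
                    - 0.02 * (solar_flux_sfu - 100.0))

def sample_lifetimes(n=10000, seed=1):
    rng = random.Random(seed)
    return [lifetime_years(rng.gauss(450.0, 10.0),   # insertion altitude
                           rng.gauss(120.0, 30.0))   # solar flux
            for _ in range(n)]

draws = sample_lifetimes()
mean = statistics.mean(draws)
p_meets_5yr = sum(d >= 5.0 for d in draws) / len(draws)
```

As the abstract notes, the solar-flux distribution dominates the spread of the predicted lifetimes in this kind of analysis.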

  3. Accelerating Monte Carlo simulations of photon transport in a voxelized geometry using a massively parallel graphics processing unit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Badal, Andreu; Badano, Aldo

    Purpose: It is a known fact that Monte Carlo simulations of radiation transport are computationally intensive and may require long computing times. The authors introduce a new paradigm for the acceleration of Monte Carlo simulations: the use of a graphics processing unit (GPU) as the main computing device instead of a central processing unit (CPU). Methods: A GPU-based Monte Carlo code that simulates photon transport in a voxelized geometry with the accurate physics models from PENELOPE has been developed using the CUDA programming model (NVIDIA Corporation, Santa Clara, CA). Results: An outline of the new code and a sample x-ray imaging simulation with an anthropomorphic phantom are presented. A remarkable 27-fold speed-up factor was obtained using a GPU compared to a single-core CPU. Conclusions: The reported results show that GPUs are currently a good alternative to CPUs for the simulation of radiation transport. Since the performance of GPUs is currently increasing at a faster pace than that of CPUs, the advantages of GPU-based software are likely to be more pronounced in the future.

  4. Experimental validation of a Monte Carlo proton therapy nozzle model incorporating magnetically steered protons.

    PubMed

    Peterson, S W; Polf, J; Bues, M; Ciangaru, G; Archambault, L; Beddar, S; Smith, A

    2009-05-21

    The purpose of this study is to validate the accuracy of a Monte Carlo calculation model of a proton magnetic beam scanning delivery nozzle developed using the Geant4 toolkit. The Monte Carlo model was used to produce depth dose and lateral profiles, which were compared to data measured in the clinical scanning treatment nozzle at several energies. Comparisons were also made between measured and simulated off-axis profiles to test the accuracy of the model's magnetic steering. Comparison of the 80% distal dose fall-off values for the measured and simulated depth dose profiles agreed to within 1 mm for the beam energies evaluated. Agreement of the full width at half maximum values for the measured and simulated lateral fluence profiles was within 1.3 mm for all energies. The position of measured and simulated spot positions for the magnetically steered beams agreed to within 0.7 mm of each other. Based on these results, we found that the Geant4 Monte Carlo model of the beam scanning nozzle has the ability to accurately predict depth dose profiles, lateral profiles perpendicular to the beam axis and magnetic steering of a proton beam during beam scanning proton therapy.

  5. Algorithm for repairing the damaged images of grain structures obtained from the cellular automata and measurement of grain size

    NASA Astrophysics Data System (ADS)

    Ramírez-López, A.; Romero-Romo, M. A.; Muñoz-Negron, D.; López-Ramírez, S.; Escarela-Pérez, R.; Duran-Valencia, C.

    2012-10-01

    Computational models are developed to create grain structures using mathematical algorithms based on chaos theory, such as cellular automata, geometrical models, fractals, and stochastic methods. Because of the chaotic nature of grain structures, some of the most popular routines are based on the Monte Carlo method, statistical distributions, and random walk methods, which can be easily programmed and included in nested loops. Nevertheless, grain structures are often not well defined, as a result of computational errors and numerical inconsistencies in the mathematical methods. Due to the finite representation of numbers and numerical restrictions during the simulation of solidification, damaged images appear on the screen. These images must be repaired to obtain a good measurement of grain geometrical properties. In the present work, mathematical algorithms were developed to repair, measure, and characterize grain structures obtained from cellular automata. An appropriate measurement of grain size and the correct identification of interfaces and their lengths are very important topics in materials science because they are the representation and validation of mathematical models with real samples. As a result, the developed algorithms are tested and proved to be appropriate and efficient for eliminating the errors and characterizing the grain structures.
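
One measurement step the paper automates can be sketched as a flood fill over a labeled 2-D structure: counting the grains, from which an average grain area (a simple grain-size measure) follows. The tiny map below is illustrative; the repair algorithms themselves are beyond this sketch.

```python
# Count connected same-label regions (grains) in a 2-D labeled structure
# using an iterative flood fill over 4-connected neighbors.
def count_grains(grid):
    rows, cols, seen, grains = len(grid), len(grid[0]), set(), 0
    for i in range(rows):
        for j in range(cols):
            if (i, j) in seen:
                continue
            grains += 1
            stack, label = [(i, j)], grid[i][j]
            while stack:
                a, b = stack.pop()
                if (a, b) in seen:
                    continue
                seen.add((a, b))
                for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    na, nb = a + da, b + db
                    if (0 <= na < rows and 0 <= nb < cols
                            and grid[na][nb] == label and (na, nb) not in seen):
                        stack.append((na, nb))
    return grains

structure = [[1, 1, 2],
             [1, 2, 2],
             [3, 3, 3]]
print(count_grains(structure))  # 3 grains
```

Dividing the total cell count by the grain count gives a mean grain area, one of the geometrical properties the damaged images would otherwise corrupt.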

  6. 47 CFR 27.2 - Permissible communications.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    .... Operators in the 775-776 MHz and 805-806 MHz bands may not employ a cellular system architecture. A cellular system architecture is defined, for purposes of this part, as one that consists of many small areas or...

  7. 47 CFR 27.2 - Permissible communications.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    .... Operators in the 775-776 MHz and 805-806 MHz bands may not employ a cellular system architecture. A cellular system architecture is defined, for purposes of this part, as one that consists of many small areas or...

  8. 21 CFR 1271.1 - What are the purpose and scope of this part?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... HUMAN CELLS, TISSUES, AND CELLULAR AND TISSUE-BASED PRODUCTS General Provisions § 1271.1 What are the... listing system for establishments that manufacture human cells, tissues, and cellular and tissue-based...

  9. 21 CFR 1271.1 - What are the purpose and scope of this part?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... HUMAN CELLS, TISSUES, AND CELLULAR AND TISSUE-BASED PRODUCTS General Provisions § 1271.1 What are the... listing system for establishments that manufacture human cells, tissues, and cellular and tissue-based...

  10. EXPERIMENTAL AND MONTE CARLO INVESTIGATIONS OF BCF-12 SMALL‑AREA PLASTIC SCINTILLATION DETECTORS FOR NEUTRON PINHOLE CAMERA.

    PubMed

    Bielecki, J; Drozdowicz, K; Dworak, D; Igielski, A; Janik, W; Kulinska, A; Marciniak, L; Scholz, M; Turzanski, M; Wiacek, U; Woznicka, U; Wójcik-Gargula, A

    2017-12-11

    Plastic organic scintillators such as the blue-emitting BCF-12 are versatile and inexpensive tools. Recently, BCF-12 scintillators have been foreseen for investigation of the spatial distribution of neutrons emitted from dense magnetized plasma. For this purpose, small-area (5 mm × 5 mm) detectors based on BCF-12 scintillation rods and Hamamatsu photomultiplier tubes were designed and constructed at the Institute of Nuclear Physics. They will be located inside the neutron pinhole camera of the PF-24 plasma focus device. Two different geometrical layouts and approaches to the construction of the scintillation element were tested. The aim of this work was to determine the efficiency of the detectors. For this purpose, the experimental investigations using a neutron generator and a Pu-Be source were combined with Monte Carlo computations using the Geant4 code. © The Author(s) 2017. Published by Oxford University Press. All rights reserved.

  11. SU-E-T-202: Impact of Monte Carlo Dose Calculation Algorithm On Prostate SBRT Treatments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Venencia, C; Garrigo, E; Cardenas, J

    2014-06-01

    Purpose: The purpose of this work was to quantify the dosimetric impact of using a Monte Carlo algorithm on SBRT prostate treatments previously calculated with a pencil beam dose calculation algorithm. Methods: A 6MV photon beam produced by a Novalis TX (BrainLAB-Varian) linear accelerator equipped with HDMLC was used. Treatment plans were done using 9 fields with Iplan v4.5 (BrainLAB) and dynamic IMRT modality. The institutional SBRT protocol uses a total dose to the prostate of 40Gy in 5 fractions, every other day. Dose calculation is done by pencil beam (2mm dose resolution) with heterogeneity correction and dose volume constraints (UCLA): PTV D95%=40Gy and D98%>39.2Gy; Rectum V20Gy<50%, V32Gy<20%, V36Gy<10% and V40Gy<5%; Bladder V20Gy<40% and V40Gy<10%; femoral heads V16Gy<5%; penile bulb V25Gy<3cc; urethra and the overlap region between PTV and PRV Rectum Dmax<42Gy. 10 SBRT treatment plans were selected and recalculated using Monte Carlo with 2mm spatial resolution and a mean variance of 2%. DVH comparisons between plans were done. Results: The average differences between PTV dose constraints were within 2%. However, 3 plans had differences higher than 3%, did not meet the D98% criterion (>39.2Gy), and should have been renormalized. Dose volume constraint differences for rectum, bladder, femoral heads and penile bulb were less than 2% and within tolerances. The urethra region and the overlap between PTV and PRV Rectum showed dose increments in all plans. The average difference for the urethra region was 2.1% with a maximum of 7.8%, and for the overlap region 2.5% with a maximum of 8.7%. Conclusion: Monte Carlo dose calculation on dynamic IMRT treatments can affect plan normalization. The dose increments in the critical urethra region and in the overlap region with the PTV could have clinical consequences, which need to be studied. The use of a Monte Carlo dose calculation algorithm is limited because inverse-planning dose optimization uses only the pencil beam algorithm.
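
The V_xGy dose-volume metrics used as plan constraints above (e.g. rectum V20Gy < 50%) are simply the percentage of a structure's voxels receiving at least x Gy. A minimal sketch on an illustrative per-voxel dose array:

```python
# V_xGy metric: percentage of structure voxels receiving at least
# threshold_gy, evaluated on a flat per-voxel dose list.
def v_gy(dose_voxels, threshold_gy):
    return 100.0 * sum(d >= threshold_gy for d in dose_voxels) / len(dose_voxels)

rectum_dose = [5.0, 15.0, 22.0, 31.0, 41.0]   # illustrative voxel doses (Gy)
print(v_gy(rectum_dose, 20.0))  # 60.0 -> would fail a V20Gy < 50% constraint
```

The study's DVH comparison amounts to recomputing these metrics on the Monte Carlo dose grid and checking which constraints still hold.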

  12. SU-F-T-12: Monte Carlo Dosimetry of the 60Co Bebig High Dose Rate Source for Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campos, L T; Almeida, C E V de

    Purpose: The purpose of this work is to obtain the dosimetry parameters in accordance with the AAPM TG-43U1 formalism with Monte Carlo calculations for the BEBIG 60Co high-dose-rate brachytherapy source. The geometric design and material details of the source were provided by the manufacturer and were used to define the Monte Carlo geometry. Methods: The dosimetry studies included the calculation of the air kerma strength Sk, the collision kerma in water along the transverse axis with an unbounded phantom, the dose rate constant and the radial dose function. The Monte Carlo code system used was EGSnrc with the new cavity code, which is part of EGS++ and allows calculating the radial dose function around the source. The XCOM photon cross-section library was used. Variance reduction techniques were used to speed up the calculation and to considerably reduce the computer time. To obtain the dose rate distributions of the source in an unbounded liquid water phantom, the source was immersed at the center of a cube phantom of 100 cm3. Results: The obtained dose rate constant for the BEBIG 60Co source was 1.108±0.001 cGyh-1U-1, which is consistent with the values in the literature. The radial dose functions were compared with the values of the consensus data set in the literature, and they are consistent with the published data for this energy range. Conclusion: The dose rate constant is consistent with the results of Granero et al. and Selvam and Bhola within 1%. Dose rate data are compared to the GEANT4 and DOSRZnrc Monte Carlo codes. However, the radial dose function differs by up to 10% for points notably near the source on the transverse axis, because the high-energy photons from 60Co cause electronic disequilibrium at the interface between the source capsule and the liquid water for distances up to 1 cm.
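
The radial dose function computed in the abstract follows the TG-43 definition g(r) = [D(r)/G_L(r)] / [D(r0)/G_L(r0)] on the transverse axis, with G_L the line-source geometry function and r0 = 1 cm. A minimal sketch; the active length L below is an illustrative assumption, not the BEBIG source specification.

```python
import math

# Line-source geometry function on the transverse axis (theta = 90 deg):
# G_L(r, 90) = 2 * atan(L / 2r) / (L * r).
def g_line(r_cm, L_cm):
    return 2.0 * math.atan(L_cm / (2.0 * r_cm)) / (L_cm * r_cm)

# TG-43 radial dose function from transverse-axis dose rates:
def radial_dose_function(r_cm, dose_rate, dose_rate_r0, r0_cm=1.0, L_cm=0.35):
    return (dose_rate / g_line(r_cm, L_cm)) / (dose_rate_r0 / g_line(r0_cm, L_cm))

print(radial_dose_function(1.0, 5.0, 5.0))  # 1.0 by definition at r = r0
```

Dividing out G_L removes the purely geometric inverse-square-like falloff, so g(r) isolates attenuation and scatter in the medium, which is where the near-source 60Co disequilibrium the abstract discusses shows up.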

  13. Understanding radiation damage on sub-cellular scale using RADAMOL simulation tool

    NASA Astrophysics Data System (ADS)

    Štěpán, Václav; Davídková, Marie

    2016-11-01

    We present an overview of the biophysical model RADAMOL developed as a Monte Carlo simulation tool for physical, physico-chemical and chemical stages of ionizing radiation action. Direct and indirect radiation damage by 10 keV electrons, and protons and alpha particles with energies from 1 MeV up to 30 MeV to a free DNA oligomer or DNA in the complex with lac repressor protein is analyzed. The role of radiation type and energy, oxygen concentration and DNA interaction with proteins on yields and distributions of primary biomolecular damage is demonstrated and discussed.

  14. Comparison of internal dose estimates obtained using organ-level, voxel S value, and Monte Carlo techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grimes, Joshua, E-mail: grimes.joshua@mayo.edu; Celler, Anna

    2014-09-15

    Purpose: The authors' objective was to compare internal dose estimates obtained using the Organ Level Dose Assessment with Exponential Modeling (OLINDA/EXM) software, the voxel S value technique, and Monte Carlo simulation. Monte Carlo dose estimates were used as the reference standard to assess the impact of patient-specific anatomy on the final dose estimate. Methods: Six patients injected with 99mTc-hydrazinonicotinamide-Tyr3-octreotide were included in this study. A hybrid planar/SPECT imaging protocol was used to estimate 99mTc time-integrated activity coefficients (TIACs) for kidneys, liver, spleen, and tumors. Additionally, TIACs were predicted for 131I, 177Lu, and 90Y assuming the same biological half-lives as the 99mTc-labeled tracer. The TIACs were used as input for OLINDA/EXM for organ-level dose calculation, and voxel-level dosimetry was performed using the voxel S value method and Monte Carlo simulation. Dose estimates for 99mTc, 131I, 177Lu, and 90Y distributions were evaluated by comparing (i) organ-level S values corresponding to each method, (ii) total tumor and organ doses, (iii) differences in right and left kidney doses, and (iv) voxelized dose distributions calculated by Monte Carlo and the voxel S value technique. Results: The S values for all investigated radionuclides used by OLINDA/EXM and the corresponding patient-specific S values calculated by Monte Carlo agreed within 2.3% on average for self-irradiation, and differed by as much as 105% for cross-organ irradiation. Total organ doses calculated by OLINDA/EXM and the voxel S value technique agreed with Monte Carlo results within approximately ±7%. Differences between right and left kidney doses determined by Monte Carlo were as high as 73%. Comparison of the Monte Carlo and voxel S value dose distributions showed that each method produced similar dose volume histograms, with a minimum dose covering 90% of the volume (D90) agreeing within ±3% on average. Conclusions: Several aspects of OLINDA/EXM dose calculation were compared with patient-specific dose estimates obtained using Monte Carlo. Differences in patient anatomy led to large differences in cross-organ doses. However, total organ doses were still in good agreement since most of the deposited dose is due to self-irradiation. Comparison of voxelized doses calculated by Monte Carlo and the voxel S value technique showed that the 3D dose distributions produced by the respective methods are nearly identical.
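
The voxel S value technique compared above amounts to convolving the cumulated-activity map with a precomputed voxel S kernel (dose to a voxel per decay in a nearby voxel). A 1-D toy with a symmetric 3-voxel kernel keeps the bookkeeping obvious; the kernel values are illustrative, not a published S-value set.

```python
# 1-D voxel S value convolution: dose[i] = sum_k activity[i-k+half] * kernel[k].
# In practice this is a 3-D convolution with a radionuclide-specific kernel.
def voxel_s_dose(activity, kernel):
    half, n = len(kernel) // 2, len(activity)
    return [sum(activity[i - k + half] * kernel[k]
                for k in range(len(kernel))
                if 0 <= i - k + half < n)
            for i in range(n)]

activity = [0.0, 0.0, 1.0e6, 0.0, 0.0]   # decays per voxel (single hot voxel)
kernel = [1.0e-4, 8.0e-4, 1.0e-4]        # Gy per decay, symmetric (illustrative)
print(voxel_s_dose(activity, kernel))    # peak in the source voxel, spillover
```

Because the kernel is precomputed in a homogeneous medium, this method reproduces Monte Carlo well for self-irradiation but cannot track patient-specific cross-organ geometry, matching the abstract's findings.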

  15. [Study of the influence of cellular phones and personal computers on schoolchildren's health: hygienic aspects].

    PubMed

    Chernenkov, Iu V; Gumeniuk, O I

    2009-01-01

    The paper presents the results of studying the impact of using cellular phones and personal computers on the health status of 277 Saratov schoolchildren (mean age 13.2 +/- 2.3 years). About 80% of the adolescents have been ascertained to use cellular phones and computers mainly for game purposes. The active users of cellular phones and computers show a high aggressiveness, anxiety, hostility, and social stress, low stress resistance, and susceptibility to arterial hypotension. The negative influence of cellular phones and computers on the schoolchildren's health increases with the increased duration and frequency of their use.

  16. Monte Carlo simulation of Ising models by multispin coding on a vector computer

    NASA Astrophysics Data System (ADS)

    Wansleben, Stephan; Zabolitzky, John G.; Kalle, Claus

    1984-11-01

    Rebbi's efficient multispin coding algorithm for Ising models is combined with the use of the vector computer CDC Cyber 205. A speed of 21.2 million updates per second is reached. This is comparable to that obtained by special-purpose computers.

  17. Dosimetric verification of IMRT treatment planning using Monte Carlo simulations for prostate cancer

    NASA Astrophysics Data System (ADS)

    Yang, J.; Li, J.; Chen, L.; Price, R.; McNeeley, S.; Qin, L.; Wang, L.; Xiong, W.; Ma, C.-M.

    2005-03-01

    The purpose of this work is to investigate the accuracy of dose calculation of a commercial treatment planning system (Corvus, Normos Corp., Sewickley, PA). In this study, 30 prostate intensity-modulated radiotherapy (IMRT) treatment plans from the commercial treatment planning system were recalculated using the Monte Carlo method. Dose-volume histograms and isodose distributions were compared. Other quantities such as minimum dose to the target (Dmin), the dose received by 98% of the target volume (D98), dose at the isocentre (Diso), mean target dose (Dmean) and the maximum critical structure dose (Dmax) were also evaluated based on our clinical criteria. For coplanar plans, the dose differences between Monte Carlo and the commercial treatment planning system with and without heterogeneity correction were not significant. The differences in the isocentre dose between the commercial treatment planning system and Monte Carlo simulations were less than 3% for all coplanar cases. The differences on D98 were less than 2% on average. The differences in the mean dose to the target between the commercial system and Monte Carlo results were within 3%. The differences in the maximum bladder dose were within 3% for most cases. The maximum dose differences for the rectum were less than 4% for all the cases. For non-coplanar plans, the difference in the minimum target dose between the treatment planning system and Monte Carlo calculations was up to 9% if the heterogeneity correction was not applied in Corvus. This was caused by the excessive attenuation of the non-coplanar beams by the femurs. When the heterogeneity correction was applied in Corvus, the differences were reduced significantly. These results suggest that heterogeneity correction should be used in dose calculation for prostate cancer with non-coplanar beam arrangements.

  18. WE-AB-204-11: Development of a Nuclear Medicine Dosimetry Module for the GPU-Based Monte Carlo Code ARCHER

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, T; Lin, H; Xu, X

    Purpose: To develop a nuclear medicine dosimetry module for the GPU-based Monte Carlo code ARCHER. Methods: We have developed a nuclear medicine dosimetry module for the fast Monte Carlo code ARCHER. The coupled electron-photon Monte Carlo transport kernel included in ARCHER is built upon the Dose Planning Method code (DPM). The developed module manages the radioactive decay simulation by consecutively tracking several types of radiation on a per-disintegration basis using the statistical sampling method. Optimization techniques such as persistent threads and prefetching are studied and implemented. The developed module is verified against the VIDA code, which is based on the Geant4 toolkit and has previously been verified against OLINDA/EXM. A voxelized geometry is used in the preliminary test: a sphere made of ICRP soft tissue is surrounded by a box filled with water. A uniform activity distribution of I-131 is assumed in the sphere. Results: The self-absorption dose factors (mGy/MBqs) of the sphere with varying diameters are calculated by ARCHER and VIDA, respectively. ARCHER's results are in agreement with VIDA's, which were obtained from a previous publication. VIDA takes hours of CPU time to finish the computation, while ARCHER takes 4.31 seconds for the 12.4-cm uniform activity sphere case. For a fairer CPU-GPU comparison, more effort will be made to eliminate the algorithmic differences. Conclusion: The coupled electron-photon Monte Carlo code ARCHER has been extended to radioactive decay simulation for nuclear medicine dosimetry. The developed code exhibits good performance in our preliminary test. The GPU-based Monte Carlo code is developed with grant support from the National Institute of Biomedical Imaging and Bioengineering through an R01 grant (R01EB015478).

  19. MCNP (Monte Carlo Neutron Photon) capabilities for nuclear well logging calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forster, R.A.; Little, R.C.; Briesmeister, J.F.

    The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. The general-purpose continuous-energy Monte Carlo code MCNP (Monte Carlo Neutron Photon), part of the LARTCS, provides a computational predictive capability for many applications of interest to the nuclear well logging community. The generalized three-dimensional geometry of MCNP is well suited for borehole-tool models. SABRINA, another component of the LARTCS, is a graphics code that can be used to interactively create a complex MCNP geometry. Users can define many source and tally characteristics with standard MCNP features. The time-dependent capability of the code is essential when modeling pulsed sources. Problems with neutrons, photons, and electrons as either single particles or coupled particles can be calculated with MCNP. The physics of neutron and photon transport and interactions is modeled in detail using the latest available cross-section data. A rich collection of variance reduction features can greatly increase the efficiency of a calculation. MCNP is written in FORTRAN 77 and has been run on a variety of computer systems from scientific workstations to supercomputers. The next production version of MCNP will include features such as continuous-energy electron transport and a multitasking option. Areas of ongoing research of interest to the well logging community include angle biasing, adaptive Monte Carlo, improved discrete ordinates capabilities, and discrete ordinates/Monte Carlo hybrid development. Los Alamos has requested approval by the Department of Energy to create a Radiation Transport Computational Facility under their User Facility Program to increase external interactions with industry, universities, and other government organizations. 21 refs.

  20. MCNP capabilities for nuclear well logging calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forster, R.A.; Little, R.C.; Briesmeister, J.F.

The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. This paper discusses how the general-purpose continuous-energy Monte Carlo code MCNP (Monte Carlo Neutron Photon), part of the LARTCS, provides a computational predictive capability for many applications of interest to the nuclear well logging community. The generalized three-dimensional geometry of MCNP is well suited for borehole-tool models. SABRINA, another component of the LARTCS, is a graphics code that can be used to interactively create a complex MCNP geometry. Users can define many source and tally characteristics with standard MCNP features. The time-dependent capability of the code is essential when modeling pulsed sources. Problems with neutrons, photons, and electrons as either single particles or coupled particles can be calculated with MCNP. The physics of neutron and photon transport and interactions is modeled in detail using the latest available cross-section data.

  1. An atomic and molecular fluid model for efficient edge-plasma transport simulations at high densities

    NASA Astrophysics Data System (ADS)

    Rognlien, Thomas; Rensink, Marvin

    2016-10-01

Transport simulations for the edge plasma of tokamaks and other magnetic fusion devices require the coupling of the plasma with recycling or injected neutral gas. Various neutral models are used for this purpose, e.g., atomic fluid models, Monte Carlo particle models, transition/escape probability methods, and semi-analytic models. While the Monte Carlo method is generally viewed as the most accurate, it is time consuming, and it becomes even more demanding for device simulations at the high densities and sizes typical of fusion power plants, because the neutral collisional mean free path becomes very small. Here we examine the behavior of an extended fluid neutral model for hydrogen that includes both atoms and molecules, and which easily incorporates nonlinear neutral-neutral collision effects. In addition to the strong charge exchange between hydrogen atoms and ions, elastic scattering is included among all species. Comparisons are made with the DEGAS 2 Monte Carlo code. Work performed for U.S. DoE by LLNL under Contract DE-AC52-07NA27344.

  2. Comparison of UWCC MOX fuel measurements to MCNP-REN calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abhold, M.; Baker, M.; Jie, R.

    1998-12-31

The development of neutron coincidence counting has greatly improved the accuracy and versatility of neutron-based techniques to assay fissile materials. Today, the shift register analyzer connected to either a passive or active neutron detector is widely used by both domestic and international safeguards organizations. The continued development of these techniques and detectors makes extensive use of predictions of detector response obtained through Monte Carlo techniques in conjunction with the point reactor model. Unfortunately, the point reactor model, as it is currently used, fails to accurately predict detector response in highly multiplying media such as mixed-oxide (MOX) light water reactor fuel assemblies. For this reason, efforts have been made to modify the currently used Monte Carlo codes and to develop new analytical methods so that this model is not required to predict detector response. The authors describe their efforts to modify a widely used Monte Carlo code for this purpose and also compare calculational results with experimental measurements.

  3. Monte Carlo treatment planning with modulated electron radiotherapy: framework development and application

    NASA Astrophysics Data System (ADS)

    Alexander, Andrew William

Within the field of medical physics, Monte Carlo radiation transport simulations are considered the most accurate method for determining dose distributions in patients. The McGill Monte Carlo treatment planning system (MMCTP) provides a flexible software environment for integrating Monte Carlo simulations with current and new treatment modalities. Energy- and intensity-modulated electron radiotherapy (MERT) is a promising developing treatment modality with the fundamental capability to enhance the dosimetry of superficial targets. An objective of this work is to advance the research and development of MERT with the end goal of clinical use. To this end, we present the MMCTP system with an integrated toolkit for MERT planning and delivery of MERT fields. Delivery is achieved using an automated "few leaf electron collimator" (FLEC) and a controller. Aside from the MERT planning toolkit, the MMCTP system required numerous add-ons to perform the complex task of large-scale autonomous Monte Carlo simulations. The first was a DICOM import filter, followed by the implementation of DOSXYZnrc as a dose calculation engine and by logic methods for submitting and updating the status of Monte Carlo simulations. Within this work we validated the MMCTP system with a head and neck Monte Carlo recalculation study performed by a medical dosimetrist. The impact of MMCTP lies in the fact that it allows for systematic, platform-independent, large-scale Monte Carlo dose calculations for different treatment sites and treatment modalities. In addition to the MERT planning tools, various optimization algorithms were created external to MMCTP. The algorithms produce MERT treatment plans based on dose-volume constraints, employing Monte Carlo pre-generated patient-specific kernels. The Monte Carlo kernels are generated from patient-specific Monte Carlo dose distributions within MMCTP.
The structure of the MERT planning toolkit software and optimization algorithms are demonstrated. We investigated the clinical significance of MERT on spinal irradiation, breast boost irradiation, and a head and neck sarcoma cancer site using several parameters to analyze the treatment plans. Finally, we investigated the idea of mixed beam photon and electron treatment planning. Photon optimization treatment planning tools were included within the MERT planning toolkit for the purpose of mixed beam optimization. In conclusion, this thesis work has resulted in the development of an advanced framework for photon and electron Monte Carlo treatment planning studies and the development of an inverse planning system for photon, electron or mixed beam radiotherapy (MBRT). The justification and validation of this work is found within the results of the planning studies, which have demonstrated dosimetric advantages to using MERT or MBRT in comparison to clinical treatment alternatives.

  4. SU-E-T-222: Computational Optimization of Monte Carlo Simulation On 4D Treatment Planning Using the Cloud Computing Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chow, J

Purpose: This study evaluated the efficiency of 4D lung radiation treatment planning using Monte Carlo simulation on the cloud. The EGSnrc Monte Carlo code was used for dose calculation on the 4D-CT image set. Methods: A 4D lung radiation treatment plan was created by the DOSCTP linked to the cloud, based on the Amazon Elastic Compute Cloud platform. Dose calculation was carried out by Monte Carlo simulation on the 4D-CT image set on the cloud, and results were sent to the FFD4D image deformation program for dose reconstruction. The dependence of computing time on the number of compute nodes was optimized with variations of the number of CT image sets in the breathing cycle and the dose reconstruction time of the FFD4D. Results: The dependence of computing time on the number of compute nodes was affected by the diminishing return of adding nodes to the Monte Carlo simulation. Moreover, the performance of the 4D treatment planning could be optimized by using fewer than 10 compute nodes on the cloud. The effects of the number of image sets and the dose reconstruction time on the dependence of computing time on the number of nodes were not significant when more than 15 compute nodes were used in the Monte Carlo simulations. Conclusion: The issue of long computing times in 4D treatment planning, which requires Monte Carlo dose calculations in all CT image sets in the breathing cycle, can be solved using cloud computing technology. It is concluded that the optimal number of compute nodes for the simulation is between 5 and 15, since in that range the dependence of computing time on the number of nodes is significant.
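The diminishing return described in the results can be sketched with a simple fixed-overhead scaling model: total planning time is a serial term (e.g., dose reconstruction) plus Monte Carlo work divided across nodes. The serial and parallel times below are hypothetical placeholders, not figures from the study.

```python
# Hedged sketch: Amdahl-style model of total planning time vs. compute nodes.
# t_serial (e.g., FFD4D dose reconstruction) and t_mc_total (total Monte Carlo
# work) are invented values for illustration only.

def plan_time(nodes, t_serial=600.0, t_mc_total=9000.0):
    """Total wall time (s): fixed serial work plus MC work split across nodes."""
    return t_serial + t_mc_total / nodes

times = {n: plan_time(n) for n in (1, 5, 10, 15, 20)}
# Marginal gain per added node shrinks as the serial term starts to dominate.
gain_1_to_5 = times[1] - times[5]     # large speedup early on
gain_15_to_20 = times[15] - times[20]  # small speedup past ~15 nodes
```

Under this toy model, the jump from 1 to 5 nodes saves far more time than the jump from 15 to 20, mirroring the study's observation that adding nodes beyond 10-15 yields little benefit.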

  5. SU-E-T-481: Dosimetric Effects of Tissue Heterogeneity in Proton Therapy: Monte Carlo Simulation and Experimental Study Using Animal Tissue Phantoms.

    PubMed

    Liu, Y; Zheng, Y

    2012-06-01

Accurate determination of the proton dosimetric effect of tissue heterogeneity is critical in proton therapy. Proton beams have a finite range, and consequently tissue heterogeneity plays a more critical role in proton therapy. The purpose of this study is to investigate the tissue heterogeneity effect in proton dosimetry based on anatomy-based Monte Carlo simulation using animal tissues. Animal tissues including a pig head and bulk beef were used in this study. Both the pig head and the beef were scanned using a GE CT scanner with 1.25 mm slice thickness. A treatment plan was created using the CMS XiO treatment planning system (TPS) with a single spread-out Bragg peak (SOBP) proton beam. Radiochromic films were placed at the distal falloff region. Image guidance was used to align the phantom before proton beams were delivered according to the treatment plan. The same two CT sets were converted to a Monte Carlo simulation model. The Monte Carlo dose calculations with and without tissue composition were compared to TPS calculations and measurements. Based on the preliminary comparison, at the center of the SOBP plane, the Monte Carlo dose without tissue composition agreed generally well with the TPS calculation. In the distal falloff region, the dose difference was large, and about a 2 mm isodose line shift was observed when tissue composition was taken into account. A detailed comparison of dose distributions between Monte Carlo simulation, TPS calculations, and measurements is underway. Accurate proton dose calculations are challenging in proton treatment planning for heterogeneous tissues. Tissue heterogeneity and tissue composition may lead to isodose line shifts of up to a few millimeters in the distal falloff region. By simulating detailed particle transport and energy deposition, Monte Carlo simulations provide a verification method for proton dose calculation where inhomogeneous tissues are present. © 2012 American Association of Physicists in Medicine.

  6. SU-F-T-657: In-Room Neutron Dose From High Energy Photon Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christ, D; Ding, G

Purpose: To estimate neutron dose inside the treatment room from photodisintegration events in high energy photon beams using Monte Carlo simulations and experimental measurements. Methods: The Monte Carlo code MCNP6 was used for the simulations. An Eberline ESP-1 Smart Portable Neutron Detector was used to measure neutron dose. A water phantom was centered at isocenter on the treatment couch, and the detector was placed near the phantom. A Varian 2100EX linear accelerator delivered an 18 MV open-field photon beam to the phantom at 400 MU/min, and a camera captured the detector readings. The experimental setup was modeled in the Monte Carlo simulation. The source was modeled for two extreme cases: a) a hemispherical photon source emitting from the target and b) a cone source with the angle of the primary collimator cone. The model includes the target, primary collimator, flattening filter, secondary collimators, water phantom, detector, and concrete walls. Energy deposition tallies were scored for neutrons in the detector and for photons at the center of the phantom. Results: For an 18 MV beam with an open 10 cm by 10 cm field and the gantry at 180°, the Monte Carlo simulations predict the neutron dose in the detector to be 0.11% of the photon dose in the water phantom for case a) and 0.01% for case b). The measured neutron dose is 0.04% of the photon dose. Considering the range of neutron dose predicted by the Monte Carlo simulations, the calculated results are in good agreement with the measurements. Conclusion: We calculated in-room neutron dose using Monte Carlo techniques, and the predicted neutron dose is confirmed by experimental measurements. If the source is remodeled as an electron beam hitting the target, for a more accurate representation of the bremsstrahlung fluence, the Monte Carlo simulations could be used to help in shielding designs.

  7. In-silico analysis on biofabricating vascular networks using kinetic Monte Carlo simulations.

    PubMed

    Sun, Yi; Yang, Xiaofeng; Wang, Qi

    2014-03-01

We present a computational modeling approach to study the fusion of multicellular aggregate systems in a novel scaffold-less biofabrication process, known as 'bioprinting'. In this novel technology, live multicellular aggregates are used as fundamental building blocks to make tissues or organs (collectively known as bio-constructs) via the layer-by-layer deposition technique or other methods; the printed bio-constructs embedded in maturogens, consisting of nutrient-rich bio-compatible hydrogels, are then placed in bioreactors to undergo the cellular aggregate fusion process to form the desired functional bio-structures. Our approach reported here is an agent-based modeling method, which uses the kinetic Monte Carlo (KMC) algorithm to evolve the cellular system on a lattice. In this method, the cells and the hydrogel media in which the cells are embedded are coarse-grained to material points on a three-dimensional (3D) lattice, where the cell-cell and cell-medium interactions are quantified by adhesion and cohesion energies. In a multicellular aggregate system with a fixed number of cells and a fixed amount of hydrogel media, where the effects of cell differentiation, proliferation and death are tactically neglected, the interaction energy is primarily dictated by the interfacial energy between cell and cell as well as between cell and medium particles on the lattice, based on the differential adhesion hypothesis. By using transition state theory to track the time evolution of the multicellular system while minimizing the interfacial energy, KMC is shown to be an efficient time-dependent simulation tool to study the evolution of the multicellular aggregate system. In this study, numerical experiments are presented to simulate fusion and cell sorting during the biofabrication process of vascular networks, in which the bio-constructs are fabricated via engineering designs.
The results predict the feasibility of fabricating the vascular structures via the bioprinting technology and demonstrate the morphological development process during cellular aggregate fusion in various engineering designed structures. The study also reveals that cell sorting will perhaps not significantly impact the final fabricated products, should the maturation process be well-controlled in bioprinting.
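The lattice picture in this abstract can be illustrated with a much-simplified 2D toy model: cell and medium sites on a grid, an interfacial energy charged per unlike neighbor pair (differential adhesion), and energy-lowering neighbor swaps. A Metropolis-style exchange stands in here for the authors' 3D kinetic Monte Carlo algorithm, and the energy and temperature values are hypothetical.

```python
import math
import random

# Toy 2D sketch of interfacial-energy minimization on a lattice, in the spirit
# of the differential adhesion hypothesis. NOT the authors' KMC implementation:
# a simple Metropolis swap is used, and E_CM and beta are invented.

CELL, MEDIUM = 1, 0
E_CM = 1.0  # cell-medium interfacial energy per contact (arbitrary units)

def interface_energy(grid):
    """Sum E_CM over all horizontally/vertically adjacent unlike site pairs."""
    n = len(grid)
    e = 0.0
    for i in range(n):
        for j in range(n):
            for di, dj in ((0, 1), (1, 0)):
                ni, nj = i + di, j + dj
                if ni < n and nj < n and grid[i][j] != grid[ni][nj]:
                    e += E_CM
    return e

def metropolis_step(grid, rng, beta=2.0):
    """Attempt to swap one random site with a random lattice neighbor."""
    n = len(grid)
    i, j = rng.randrange(n), rng.randrange(n)
    di, dj = rng.choice(((0, 1), (1, 0), (0, -1), (-1, 0)))
    ni, nj = i + di, j + dj
    if not (0 <= ni < n and 0 <= nj < n) or grid[i][j] == grid[ni][nj]:
        return
    e_old = interface_energy(grid)
    grid[i][j], grid[ni][nj] = grid[ni][nj], grid[i][j]
    d_e = interface_energy(grid) - e_old
    if d_e > 0 and rng.random() >= math.exp(-beta * d_e):
        grid[i][j], grid[ni][nj] = grid[ni][nj], grid[i][j]  # reject the move

# Scattered cells expose more cell-medium interface than a compact aggregate,
# so swaps that round up the aggregate are energetically favored.
rng = random.Random(0)
grid = [[MEDIUM] * 8 for _ in range(8)]
for i, j in ((0, 0), (2, 5), (5, 2), (7, 7)):
    grid[i][j] = CELL
e_scattered = interface_energy(grid)

compact = [[MEDIUM] * 8 for _ in range(8)]
for i, j in ((3, 3), (3, 4), (4, 3), (4, 4)):
    compact[i][j] = CELL
e_compact = interface_energy(compact)  # lower: fewer unlike contacts
```

Swaps conserve the number of cells, so the dynamics only rearrange the aggregate, which is the essential feature behind fusion and cell sorting in the full model.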

  8. Evaluated teletherapy source library

    DOEpatents

    Cox, Lawrence J.; Schach Von Wittenau, Alexis E.

    2000-01-01

The Evaluated Teletherapy Source Library (ETSL) is a system of hardware and software that provides for maintenance of a library of useful phase space descriptions (PSDs) of teletherapy sources used in radiation therapy for cancer treatment. The PSDs are designed to be used by PEREGRINE, the all-particle Monte Carlo dose calculation system. ETSL also stores other relevant information such as monitor unit factors (MUFs) for use with the PSDs, results of PEREGRINE calculations using the PSDs, clinical calibration measurements, and geometry descriptions sufficient for calculational purposes. Not all of this information is directly needed by PEREGRINE. ETSL is also capable of acting as a repository for the Monte Carlo simulation history files from which the generic PSDs are derived.

  9. Simulation of Auger electron emission from nanometer-size gold targets using the Geant4 Monte Carlo simulation toolkit

    NASA Astrophysics Data System (ADS)

    Incerti, S.; Suerfu, B.; Xu, J.; Ivantchenko, V.; Mantero, A.; Brown, J. M. C.; Bernal, M. A.; Francis, Z.; Karamitros, M.; Tran, H. N.

    2016-04-01

A revised atomic deexcitation framework for the Geant4 general purpose Monte Carlo toolkit, capable of simulating full Auger deexcitation cascades, was implemented in the June 2015 release (version 10.2 Beta). An overview of this refined framework and testing of its capabilities is presented for the irradiation of gold nanoparticles (NP) with keV photon and MeV proton beams. The resultant energy spectra of secondary particles created within, and escaping from, the NP are analyzed and discussed. It is anticipated that this new functionality will improve and increase the use of Geant4 in medical physics, radiobiology, nanomedicine research, and other low energy physics fields.

  10. Time dependent worldwide distribution of atmospheric neutrons and of their products. I, II, III.

    NASA Technical Reports Server (NTRS)

    Merker, M.; Light, E. S.; Verschell, H. J.; Mendell, R. B.; Korff, S. A.

    1973-01-01

    Review of the experimental results obtained in a series of measurements of the fast neutron cosmic ray spectrum by means of high-altitude balloons and aircraft. These results serve as a basis for checking a Monte Carlo calculation of the entire neutron distribution and its products. A calculation of neutron production and transport in the earth's atmosphere is then discussed for the purpose of providing a detailed description of the morphology of secondary neutron components. Finally, an analysis of neutron observations during solar particle events is presented. The Monte Carlo output is used to estimate the contribution of flare particles to fluctuations in the steady state neutron distributions.

  11. Probabilistic structural analysis using a general purpose finite element program

    NASA Astrophysics Data System (ADS)

    Riha, D. S.; Millwater, H. R.; Thacker, B. H.

    1992-07-01

This paper presents an accurate and efficient method to predict the probabilistic response of structural quantities, such as stress, displacement, natural frequencies, and buckling loads, by combining the capabilities of MSC/NASTRAN, including design sensitivity analysis, and fast probability integration. Two probabilistic structural analysis examples have been performed and verified by comparison with Monte Carlo simulation of the analytical solution. The first example consists of a cantilevered plate with several point loads. The second example is a probabilistic buckling analysis of a simply supported composite plate under in-plane loading. The coupling of MSC/NASTRAN and fast probability integration is shown to be orders of magnitude more efficient than Monte Carlo simulation, with excellent accuracy.
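The brute-force Monte Carlo baseline used here for verification can be sketched as follows: sample random structural inputs and count exceedances of a response limit. The cantilever geometry, load, and modulus statistics below are hypothetical illustrations, not the paper's MSC/NASTRAN examples.

```python
import random

# Hedged sketch of a brute-force Monte Carlo check of a probabilistic
# structural response, of the kind used to verify fast probability
# integration. All numbers are invented placeholders.

L_BEAM = 1.0     # beam length (m)
I_AREA = 8.0e-7  # second moment of area (m^4)

def tip_deflection(p_load, e_mod):
    """Euler-Bernoulli tip deflection of a cantilever under an end load."""
    return p_load * L_BEAM**3 / (3.0 * e_mod * I_AREA)

def prob_exceed(limit, n=20_000, seed=1):
    """Monte Carlo estimate of P(deflection > limit) for random P and E."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        p_load = rng.gauss(1000.0, 100.0)  # end load (N), ~10% scatter
        e_mod = rng.gauss(70.0e9, 3.5e9)   # Young's modulus (Pa), ~5% scatter
        if tip_deflection(p_load, e_mod) > limit:
            hits += 1
    return hits / n

# A limit near the mean deflection (~5.95 mm) yields a probability near 0.5;
# a looser limit yields a smaller exceedance probability.
p_tight = prob_exceed(0.00595)
p_loose = prob_exceed(0.00700)
```

The cost of such sampling, tens of thousands of full response evaluations, is exactly why fast probability integration is orders of magnitude more efficient when each evaluation is a finite element solve.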

  12. Correlated Production and Analog Transport of Fission Neutrons and Photons using Fission Models FREYA, FIFRELIN and the Monte Carlo Code TRIPOLI-4® .

    NASA Astrophysics Data System (ADS)

    Verbeke, Jérôme M.; Petit, Odile; Chebboubi, Abdelhazize; Litaize, Olivier

    2018-01-01

Fission modeling in general-purpose Monte Carlo transport codes often relies on average nuclear data provided by international evaluation libraries. As such, only average fission multiplicities are available, and correlations between fission neutrons and photons are missing. Whereas uncorrelated fission physics is usually sufficient for standard reactor core and radiation shielding calculations, correlated fission secondaries are required for specialized nuclear instrumentation and detector modeling. For coincidence counting detector optimization, for instance, precise simulation of fission neutrons and photons that remain correlated in time from birth to detection is essential. New developments were recently integrated into the Monte Carlo transport code TRIPOLI-4 to model fission physics more precisely, giving access to event-by-event fission data from two different fission models: FREYA and FIFRELIN. TRIPOLI-4 simulations can now be performed either by connecting via an API to the LLNL fission library including FREYA, or by reading external fission event data files produced by FIFRELIN beforehand. These new capabilities enable us to easily compare results from Monte Carlo transport calculations using the two fission models in a nuclear instrumentation application. In the first part of this paper, the broad underlying principles of the two fission models are recalled. We then present experimental measurements of neutron angular correlations for 252Cf(sf) and 240Pu(sf). The correlations were measured for several neutron kinetic energy thresholds. In the latter part of the paper, simulation results are compared to experimental data. Spontaneous fissions in 252Cf and 240Pu are modeled by FREYA or FIFRELIN. Emitted neutrons and photons are subsequently transported to an array of scintillators by TRIPOLI-4 in analog mode to preserve their correlations.
Angular correlations between fission neutrons obtained independently from these TRIPOLI-4 simulations, using either FREYA or FIFRELIN, are compared to experimental results. For 240Pu(sf), the measured correlations were used to tune the model parameters.

  13. TU-F-CAMPUS-T-05: A Cloud-Based Monte Carlo Dose Calculation for Electron Cutout Factors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, T; Bush, K

Purpose: For electron cutouts of smaller sizes, it is necessary to verify electron cutout factors due to perturbations in electron scattering. Often, this requires a physical measurement using a small ion chamber, diode, or film. The purpose of this study is to develop a fast Monte Carlo based dose calculation framework that requires only a smart phone photograph of the cutout and specification of the SSD and energy to determine the electron cutout factor, with the ultimate goal of making this cloud-based calculation widely available to the medical physics community. Methods: The algorithm uses a pattern recognition technique to identify the corners of the cutout in the photograph as shown in Figure 1. It then corrects for variations in perspective, scaling, and translation of the photograph introduced by the user's positioning of the camera. Blob detection is used to identify the portions of the cutout which comprise the aperture and the portions which are cutout material. This information is then used to define the physical densities of the voxels used in the Monte Carlo dose calculation algorithm, as shown in Figure 2, and to select a particle source from a pre-computed library of phase spaces scored above the cutout. The electron cutout factor is obtained by taking the ratio of the maximum dose delivered with the cutout in place to the dose delivered under calibration/reference conditions. Results: The algorithm has been shown to successfully identify all necessary features of the electron cutout to perform the calculation. Subsequent testing will be performed to compare the Monte Carlo results with physical measurements. Conclusion: A simple, cloud-based method of calculating electron cutout factors could eliminate the need for physical measurements and substantially reduce the time required to properly assure accurate dose delivery.

  14. Assessment of Different Strategies to Determine MAP-specific Cellular Immune Responses in Cattle

    USDA-ARS?s Scientific Manuscript database

    Assessment of cellular immunity in cattle against Mycobacterium avium ssp. paratuberculosis (MAP) by established methods remains unsatisfactory for diagnostic purposes. Recent studies conclude that analysis of T-cell subset responsiveness may improve diagnostic outcome. Aim of this study was to iden...

  15. FCS diffusion laws in two-phase lipid membranes: determination of domain mean size by experiments and Monte Carlo simulations.

    PubMed

    Favard, Cyril; Wenger, Jérôme; Lenne, Pierre-François; Rigneault, Hervé

    2011-03-02

    Many efforts have been undertaken over the last few decades to characterize the diffusion process in model and cellular lipid membranes. One of the techniques developed for this purpose, fluorescence correlation spectroscopy (FCS), has proved to be a very efficient approach, especially if the analysis is extended to measurements on different spatial scales (referred to as FCS diffusion laws). In this work, we examine the relevance of FCS diffusion laws for probing the behavior of a pure lipid and a lipid mixture at temperatures below, within and above the phase transitions, both experimentally and numerically. The accuracy of the microscopic description of the lipid mixtures found here extends previous work to a more complex model in which the geometry is unknown and the molecular motion is driven only by the thermodynamic parameters of the system itself. For multilamellar vesicles of both pure lipid and lipid mixtures, the FCS diffusion laws recorded at different temperatures exhibit large deviations from pure Brownian motion and reveal the existence of nanodomains. The variation of the mean size of these domains with temperature is in perfect correlation with the enthalpy fluctuation. This study highlights the advantages of using FCS diffusion laws in complex lipid systems to describe their temporal and spatial structure. Copyright © 2011 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  16. Monte Carlo Simulations for Homeland Security Using Anthropomorphic Phantoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burns, Kimberly A.

A radiological dispersion device (RDD) is a device which deliberately releases radioactive material for the purpose of causing terror or harm. In the event that a dirty bomb is detonated, there may be airborne radioactive material that can be inhaled, as well as material that settles on individuals, leading to external contamination.

  17. Effects of Missing Data Methods in Structural Equation Modeling with Nonnormal Longitudinal Data

    ERIC Educational Resources Information Center

    Shin, Tacksoo; Davison, Mark L.; Long, Jeffrey D.

    2009-01-01

    The purpose of this study is to investigate the effects of missing data techniques in longitudinal studies under diverse conditions. A Monte Carlo simulation examined the performance of 3 missing data methods in latent growth modeling: listwise deletion (LD), maximum likelihood estimation using the expectation and maximization algorithm with a…

  18. Comparison of IRT Likelihood Ratio Test and Logistic Regression DIF Detection Procedures

    ERIC Educational Resources Information Center

    Atar, Burcu; Kamata, Akihito

    2011-01-01

    The Type I error rates and the power of IRT likelihood ratio test and cumulative logit ordinal logistic regression procedures in detecting differential item functioning (DIF) for polytomously scored items were investigated in this Monte Carlo simulation study. For this purpose, 54 simulation conditions (combinations of 3 sample sizes, 2 sample…

  19. Interrater Reliability Estimators Commonly Used in Scoring Language Assessments: A Monte Carlo Investigation of Estimator Accuracy

    ERIC Educational Resources Information Center

    Morgan, Grant B.; Zhu, Min; Johnson, Robert L.; Hodge, Kari J.

    2014-01-01

    Common estimators of interrater reliability include Pearson product-moment correlation coefficients, Spearman rank-order correlations, and the generalizability coefficient. The purpose of this study was to examine the accuracy of estimators of interrater reliability when varying the true reliability, number of scale categories, and number of…

  20. Approximating Integrals Using Probability

    ERIC Educational Resources Information Center

    Maruszewski, Richard F., Jr.; Caudle, Kyle A.

    2005-01-01

As part of a discussion on Monte Carlo methods, the article outlines how to use probability expectations to approximate the value of a definite integral. The purpose of this paper is to elaborate on this technique and then to show several examples using Visual Basic as a programming tool. It is an interesting method because it combines two branches of…
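The expectation trick described here is short enough to sketch directly: for f on [a, b], the integral equals (b - a) * E[f(U)] with U uniform on [a, b], so a sample mean of f at uniform random points estimates the integral. Python stands in below for the article's Visual Basic examples.

```python
import random

# Monte Carlo integration via the expectation identity
#   integral_a^b f(x) dx = (b - a) * E[f(U)],  U ~ Uniform(a, b).
# The sample mean of f at n uniform points converges at rate 1/sqrt(n).

def mc_integral(f, a, b, n=100_000, seed=42):
    rng = random.Random(seed)
    total = sum(f(rng.uniform(a, b)) for _ in range(n))
    return (b - a) * total / n

# Example: integral of x^2 on [0, 3] is exactly 9.
est = mc_integral(lambda x: x * x, 0.0, 3.0)
```

The 1/sqrt(n) convergence is slow compared with quadrature in one dimension, which is why the method is usually taught as a bridge between probability and calculus rather than as a practical 1D integrator.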

  1. Extensions of the MCNP5 and TRIPOLI4 Monte Carlo Codes for Transient Reactor Analysis

    NASA Astrophysics Data System (ADS)

    Hoogenboom, J. Eduard; Sjenitzer, Bart L.

    2014-06-01

To simulate reactor transients for safety analysis with the Monte Carlo method, the generation and decay of delayed neutron precursors has been implemented in the MCNP5 and TRIPOLI4 general-purpose Monte Carlo codes. Important new variance reduction techniques, such as forced decay of precursors in each time interval and the branchless collision method, are included to obtain reasonable statistics for the power production per time interval. Simulation of practical reactor transients also requires the feedback effect from the thermal-hydraulics to be included. This requires coupling the Monte Carlo code with a thermal-hydraulics (TH) code, which provides the temperature distribution in the reactor, affecting the neutron transport via the cross-section data. The TH code also provides the coolant density distribution in the reactor, directly influencing the neutron transport. Different techniques for this coupling are discussed. As a demonstration, a 3x3 mini fuel assembly with a moving control rod is considered for MCNP5, and a mini core consisting of 3x3 PWR fuel assemblies with control rods and burnable poisons for TRIPOLI4. Results are shown for reactor transients due to control rod movement or withdrawal. The TRIPOLI4 transient calculation is started at low power and includes thermal-hydraulic feedback. The power rises about 10 decades and finally stabilises at a much higher level than the initial one. The examples demonstrate that the modified Monte Carlo codes are capable of performing correct transient calculations, taking into account all geometrical and cross-section detail.

  2. SU-E-T-561: Monte Carlo-Based Organ Dose Reconstruction Using Pre-Contoured Human Model for Hodgkins Lymphoma Patients Treated by Cobalt-60 External Beam Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jung, J; Pelletier, C; Lee, C

Purpose: Organ doses for Hodgkin's lymphoma patients treated with cobalt-60 radiation were estimated using an anthropomorphic model and Monte Carlo modeling. Methods: A cobalt-60 treatment unit modeled in the BEAMnrc Monte Carlo code was used to produce phase space data. The Monte Carlo simulation was verified against percent depth dose measurements in water at various field sizes. Radiation transport through the lung blocks was modeled by adjusting the weights of the phase space data. We imported a pre-contoured adult female hybrid model and generated a treatment plan. The adjusted phase space data and the human model were imported into the XVMC Monte Carlo code for dose calculation. The organ mean doses were estimated and dose volume histograms were plotted. Results: The percent depth dose agreement between measurement and calculation in the water phantom was within 2% for all field sizes. The mean organ doses of the heart, left breast, right breast, and spleen for the selected case were 44.3, 24.1, 14.6 and 3.4 Gy, respectively, with a midline prescription dose of 40.0 Gy. Conclusion: Organ doses were estimated for the patient group whose three-dimensional images are not available. This development may open the door to more accurate dose reconstruction and estimates of uncertainties in secondary cancer risk for Hodgkin's lymphoma patients. This work was partially supported by the intramural research program of the National Institutes of Health, National Cancer Institute, Division of Cancer Epidemiology and Genetics.

  3. Backscatter factors and mass energy-absorption coefficient ratios for diagnostic radiology dosimetry

    NASA Astrophysics Data System (ADS)

    Benmakhlouf, Hamza; Bouchard, Hugo; Fransson, Annette; Andreo, Pedro

    2011-11-01

    Backscatter factors, B, and water-to-air mass energy-absorption coefficient ratios, (μ_en/ρ)_w,air, for the determination of the surface dose in diagnostic radiology were calculated using Monte Carlo simulations. The main purpose was to extend the range of available data to the beam qualities used in modern x-ray techniques, particularly in interventional radiology. A comprehensive database for mono-energetic photons between 4 and 150 keV and different field sizes was created for a 15 cm thick water phantom. Backscattered spectra were calculated with the PENELOPE Monte Carlo system, scoring track-length fluence differential in energy with negligible statistical uncertainty; using the Monte Carlo computed spectra, B factors and (μ_en/ρ)_w,air were then calculated numerically for each energy. Weighted averaging procedures were subsequently used to convolve incident clinical spectra with the mono-energetic data. The method was benchmarked against full Monte Carlo calculations for incident clinical spectra, yielding differences within 0.3-0.6%. The technique enables the calculation of B and (μ_en/ρ)_w,air for any incident spectrum without further time-consuming Monte Carlo simulations. The adequacy of the extended dosimetry data for a broader range of clinical qualities than those currently available, while keeping consistency with existing data, was confirmed through detailed comparisons. Mono-energetic and spectrum-averaged values were compared with published data, including those in ICRU Report 74 and IAEA TRS-457, finding average differences of 0.6%. Results are provided in comprehensive tables appropriate for clinical use. Additional qualities can easily be calculated using a purpose-built GUI in conjunction with software to generate incident photon spectra.
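The weighted-averaging step can be sketched as follows. This assumes a kerma-type weighting of the mono-energetic backscatter factors over the incident spectrum, which may differ in detail from the procedure the authors actually used; the function name and bin representation are illustrative.

```python
def spectrum_averaged_B(fluence, energy, muen_air, B_mono):
    """Kerma-weighted average of mono-energetic backscatter factors.

    fluence[i]  : photon fluence in energy bin i of the incident spectrum
    energy[i]   : bin energy (keV)
    muen_air[i] : mass energy-absorption coefficient of air at energy[i]
    B_mono[i]   : pre-computed mono-energetic backscatter factor

    Each bin's weight is its contribution to the air kerma of the
    incident beam: fluence * energy * (mu_en/rho)_air.
    """
    kerma = [f * e * m for f, e, m in zip(fluence, energy, muen_air)]
    return sum(k * b for k, b in zip(kerma, B_mono)) / sum(kerma)
```

Because the mono-energetic B(E) database is pre-computed, evaluating a new clinical spectrum reduces to this cheap weighted sum, with no further transport simulation.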

  4. Insight into particle production mechanisms via angular correlations of identified particles in pp collisions at √s = 7 TeV

    NASA Astrophysics Data System (ADS)

    Adam, J.; Adamová, D.; Aggarwal, M. M.; Aglieri Rinella, G.; Agnello, M.; Agrawal, N.; Ahammed, Z.; Ahmad, S.; Ahn, S. U.; Aiola, S.; Akindinov, A.; Alam, S. N.; Albuquerque, D. S. D.; Aleksandrov, D.; Alessandro, B.; Alexandre, D.; Alfaro Molina, R.; Alici, A.; Alkin, A.; Alme, J.; Alt, T.; Altinpinar, S.; Altsybeev, I.; Alves Garcia Prado, C.; An, M.; Andrei, C.; Andrews, H. A.; Andronic, A.; Anguelov, V.; Anson, C.; Antičić, T.; Antinori, F.; Antonioli, P.; Anwar, R.; Aphecetche, L.; Appelshäuser, H.; Arcelli, S.; Arnaldi, R.; Arnold, O. W.; Arsene, I. C.; Arslandok, M.; Audurier, B.; Augustinus, A.; Averbeck, R.; Azmi, M. D.; Badalà, A.; Baek, Y. W.; Bagnasco, S.; Bailhache, R.; Bala, R.; Baldisseri, A.; Baral, R. C.; Barbano, A. M.; Barbera, R.; Barile, F.; Barioglio, L.; Barnaföldi, G. G.; Barnby, L. S.; Barret, V.; Bartalini, P.; Barth, K.; Bartke, J.; Bartsch, E.; Basile, M.; Bastid, N.; Basu, S.; Bathen, B.; Batigne, G.; Batista Camejo, A.; Batyunya, B.; Batzing, P. C.; Bearden, I. G.; Beck, H.; Bedda, C.; Behera, N. K.; Belikov, I.; Bellini, F.; Bello Martinez, H.; Bellwied, R.; Beltran, L. G. E.; Belyaev, V.; Bencedi, G.; Beole, S.; Bercuci, A.; Berdnikov, Y.; Berenyi, D.; Bertens, R. A.; Berzano, D.; Betev, L.; Bhasin, A.; Bhat, I. R.; Bhati, A. K.; Bhattacharjee, B.; Bhom, J.; Bianchi, L.; Bianchi, N.; Bianchin, C.; Bielčík, J.; Bielčíková, J.; Bilandzic, A.; Biro, G.; Biswas, R.; Biswas, S.; Blair, J. T.; Blau, D.; Blume, C.; Bock, F.; Bogdanov, A.; Boldizsár, L.; Bombara, M.; Bonora, M.; Book, J.; Borel, H.; Borissov, A.; Borri, M.; Botta, E.; Bourjau, C.; Braun-Munzinger, P.; Bregant, M.; Broker, T. A.; Browning, T. A.; Broz, M.; Brucken, E. J.; Bruna, E.; Bruno, G. E.; Budnikov, D.; Buesching, H.; Bufalino, S.; Buhler, P.; Buitron, S. A. I.; Buncic, P.; Busch, O.; Buthelezi, Z.; Butt, J. B.; Buxton, J. T.; Cabala, J.; Caffarri, D.; Caines, H.; Caliva, A.; Calvo Villar, E.; Camerini, P.; Capon, A. 
A.; Carena, F.; Carena, W.; Carnesecchi, F.; Castillo Castellanos, J.; Castro, A. J.; Casula, E. A. R.; Ceballos Sanchez, C.; Cerello, P.; Cerkala, J.; Chang, B.; Chapeland, S.; Chartier, M.; Charvet, J. L.; Chattopadhyay, S.; Chattopadhyay, S.; Chauvin, A.; Cherney, M.; Cheshkov, C.; Cheynis, B.; Chibante Barroso, V.; Chinellato, D. D.; Cho, S.; Chochula, P.; Choi, K.; Chojnacki, M.; Choudhury, S.; Christakoglou, P.; Christensen, C. H.; Christiansen, P.; Chujo, T.; Chung, S. U.; Cicalo, C.; Cifarelli, L.; Cindolo, F.; Cleymans, J.; Colamaria, F.; Colella, D.; Collu, A.; Colocci, M.; Conesa Balbastre, G.; del Valle, Z. Conesa; Connors, M. E.; Contreras, J. G.; Cormier, T. M.; Corrales Morales, Y.; Cortés Maldonado, I.; Cortese, P.; Cosentino, M. R.; Costa, F.; Crkovská, J.; Crochet, P.; Cruz Albino, R.; Cuautle, E.; Cunqueiro, L.; Dahms, T.; Dainese, A.; Danisch, M. C.; Danu, A.; Das, D.; Das, I.; Das, S.; Dash, A.; Dash, S.; De, S.; De Caro, A.; de Cataldo, G.; de Conti, C.; de Cuveland, J.; De Falco, A.; De Gruttola, D.; De Marco, N.; De Pasquale, S.; De Souza, R. D.; Degenhardt, H. F.; Deisting, A.; Deloff, A.; Deplano, C.; Dhankher, P.; Di Bari, D.; Di Mauro, A.; Di Nezza, P.; Di Ruzza, B.; Corchero, M. A. Diaz; Dietel, T.; Dillenseger, P.; Divià, R.; Djuvsland, Ø.; Dobrin, A.; Domenicis Gimenez, D.; Dönigus, B.; Dordic, O.; Drozhzhova, T.; Dubey, A. K.; Dubla, A.; Ducroux, L.; Duggal, A. K.; Dupieux, P.; Ehlers, R. J.; Elia, D.; Endress, E.; Engel, H.; Epple, E.; Erazmus, B.; Erhardt, F.; Espagnon, B.; Esumi, S.; Eulisse, G.; Eum, J.; Evans, D.; Evdokimov, S.; Fabbietti, L.; Fabris, D.; Faivre, J.; Fantoni, A.; Fasel, M.; Feldkamp, L.; Feliciello, A.; Feofilov, G.; Ferencei, J.; Fernández Téllez, A.; Ferreiro, E. G.; Ferretti, A.; Festanti, A.; Feuillard, V. J. G.; Figiel, J.; Figueredo, M. A. S.; Filchagin, S.; Finogeev, D.; Fionda, F. M.; Fiore, E. 
M.; Floris, M.; Foertsch, S.; Foka, P.; Fokin, S.; Fragiacomo, E.; Francescon, A.; Francisco, A.; Frankenfeld, U.; Fronze, G. G.; Fuchs, U.; Furget, C.; Furs, A.; Girard, M. Fusco; Gaardhøje, J. J.; Gagliardi, M.; Gago, A. M.; Gajdosova, K.; Gallio, M.; Galvan, C. D.; Gangadharan, D. R.; Ganoti, P.; Gao, C.; Garabatos, C.; Garcia-Solis, E.; Garg, K.; Garg, P.; Gargiulo, C.; Gasik, P.; Gauger, E. F.; Ducati, M. B. Gay; Germain, M.; Ghosh, P.; Ghosh, S. K.; Gianotti, P.; Giubellino, P.; Giubilato, P.; Gladysz-Dziadus, E.; Glässel, P.; Goméz Coral, D. M.; Gomez Ramirez, A.; Gonzalez, A. S.; Gonzalez, V.; González-Zamora, P.; Gorbunov, S.; Görlich, L.; Gotovac, S.; Grabski, V.; Graczykowski, L. K.; Graham, K. L.; Greiner, L.; Grelli, A.; Grigoras, C.; Grigoriev, V.; Grigoryan, A.; Grigoryan, S.; Grion, N.; Gronefeld, J. M.; Grosa, F.; Grosse-Oetringhaus, J. F.; Grosso, R.; Gruber, L.; Grull, F. R.; Guber, F.; Guernane, R.; Guerzoni, B.; Gulbrandsen, K.; Gunji, T.; Gupta, A.; Gupta, R.; Guzman, I. B.; Haake, R.; Hadjidakis, C.; Hamagaki, H.; Hamar, G.; Hamon, J. C.; Harris, J. W.; Harton, A.; Hatzifotiadou, D.; Hayashi, S.; Heckel, S. T.; Hellbär, E.; Helstrup, H.; Herghelegiu, A.; Herrera Corral, G.; Herrmann, F.; Hess, B. A.; Hetland, K. F.; Hillemanns, H.; Hippolyte, B.; Hladky, J.; Horak, D.; Hosokawa, R.; Hristov, P.; Hughes, C.; Humanic, T. J.; Hussain, N.; Hussain, T.; Hutter, D.; Hwang, D. S.; Ilkaev, R.; Inaba, M.; Ippolitov, M.; Irfan, M.; Isakov, V.; Islam, M. S.; Ivanov, M.; Ivanov, V.; Izucheev, V.; Jacak, B.; Jacazio, N.; Jacobs, P. M.; Jadhav, M. B.; Jadlovska, S.; Jadlovsky, J.; Jahnke, C.; Jakubowska, M. J.; Janik, M. A.; Jayarathna, P. H. S. Y.; Jena, C.; Jena, S.; Jercic, M.; Bustamante, R. T. Jimenez; Jones, P. G.; Jusko, A.; Kalinak, P.; Kalweit, A.; Kang, J. H.; Kaplin, V.; Kar, S.; Uysal, A. Karasu; Karavichev, O.; Karavicheva, T.; Karayan, L.; Karpechev, E.; Kebschull, U.; Keidel, R.; Keijdener, D. L. 
D.; Keil, M.; Mohisin Khan, M.; Khan, P.; Khan, S. A.; Khanzadeev, A.; Kharlov, Y.; Khatun, A.; Khuntia, A.; Kielbowicz, M. M.; Kileng, B.; Kim, D. W.; Kim, D. J.; Kim, D.; Kim, H.; Kim, J. S.; Kim, J.; Kim, M.; Kim, M.; Kim, S.; Kim, T.; Kirsch, S.; Kisel, I.; Kiselev, S.; Kisiel, A.; Kiss, G.; Klay, J. L.; Klein, C.; Klein, J.; Klein-Bösing, C.; Klewin, S.; Kluge, A.; Knichel, M. L.; Knospe, A. G.; Kobdaj, C.; Kofarago, M.; Kollegger, T.; Kolojvari, A.; Kondratiev, V.; Kondratyeva, N.; Kondratyuk, E.; Konevskikh, A.; Kopcik, M.; Kour, M.; Kouzinopoulos, C.; Kovalenko, O.; Kovalenko, V.; Kowalski, M.; Meethaleveedu, G. Koyithatta; Králik, I.; Kravčáková, A.; Krivda, M.; Krizek, F.; Kryshen, E.; Krzewicki, M.; Kubera, A. M.; Kučera, V.; Kuhn, C.; Kuijer, P. G.; Kumar, A.; Kumar, J.; Kumar, L.; Kumar, S.; Kundu, S.; Kurashvili, P.; Kurepin, A.; Kurepin, A. B.; Kuryakin, A.; Kushpil, S.; Kweon, M. J.; Kwon, Y.; La Pointe, S. L.; La Rocca, P.; Lagana Fernandes, C.; Lakomov, I.; Langoy, R.; Lapidus, K.; Lara, C.; Lardeux, A.; Lattuca, A.; Laudi, E.; Lavicka, R.; Lazaridis, L.; Lea, R.; Leardini, L.; Lee, S.; Lehas, F.; Lehner, S.; Lehrbach, J.; Lemmon, R. C.; Lenti, V.; Leogrande, E.; León Monzón, I.; Lévai, P.; Li, S.; Li, X.; Lien, J.; Lietava, R.; Lindal, S.; Lindenstruth, V.; Lippmann, C.; Lisa, M. A.; Litichevskyi, V.; Ljunggren, H. M.; Llope, W. J.; Lodato, D. F.; Loenne, P. I.; Loginov, V.; Loizides, C.; Loncar, P.; Lopez, X.; Torres, E. López; Lowe, A.; Luettig, P.; Lunardon, M.; Luparello, G.; Lupi, M.; Lutz, T. H.; Maevskaya, A.; Mager, M.; Mahajan, S.; Mahmood, S. M.; Maire, A.; Majka, R. D.; Malaev, M.; Cervantes, I. Maldonado; Malinina, L.; Mal'Kevich, D.; Malzacher, P.; Mamonov, A.; Manko, V.; Manso, F.; Manzari, V.; Mao, Y.; Marchisone, M.; Mareš, J.; Margagliotti, G. V.; Margotti, A.; Margutti, J.; Marín, A.; Markert, C.; Marquard, M.; Martin, N. A.; Martinengo, P.; Martínez, M. I.; Martínez García, G.; Pedreira, M. 
Martinez; Mas, A.; Masciocchi, S.; Masera, M.; Masoni, A.; Mastroserio, A.; Mathis, A. M.; Matyja, A.; Mayer, C.; Mazer, J.; Mazzilli, M.; Mazzoni, M. A.; Meddi, F.; Melikyan, Y.; Menchaca-Rocha, A.; Meninno, E.; Mercado Pérez, J.; Meres, M.; Mhlanga, S.; Miake, Y.; Mieskolainen, M. M.; Mikhaylov, K.; Milano, L.; Milosevic, J.; Mischke, A.; Mishra, A. N.; Mishra, T.; Miśkowiec, D.; Mitra, J.; Mitu, C. M.; Mohammadi, N.; Mohanty, B.; Molnar, L.; Montes, E.; De Godoy, D. A. Moreira; Moreno, L. A. P.; Moretto, S.; Morreale, A.; Morsch, A.; Muccifora, V.; Mudnic, E.; Mühlheim, D.; Muhuri, S.; Mukherjee, M.; Mulligan, J. D.; Munhoz, M. G.; Münning, K.; Munzer, R. H.; Murakami, H.; Murray, S.; Musa, L.; Musinsky, J.; Myers, C. J.; Naik, B.; Nair, R.; Nandi, B. K.; Nania, R.; Nappi, E.; Naru, M. U.; Natal da Luz, H.; Nattrass, C.; Navarro, S. R.; Nayak, K.; Nayak, R.; Nayak, T. K.; Nazarenko, S.; Nedosekin, A.; Negrao De Oliveira, R. A.; Nellen, L.; Nesbo, S. V.; Ng, F.; Nicassio, M.; Niculescu, M.; Niedziela, J.; Nielsen, B. S.; Nikolaev, S.; Nikulin, S.; Nikulin, V.; Noferini, F.; Nomokonov, P.; Nooren, G.; Noris, J. C. C.; Norman, J.; Nyanin, A.; Nystrand, J.; Oeschler, H.; Oh, S.; Ohlson, A.; Okubo, T.; Olah, L.; Oleniacz, J.; Oliveira Da Silva, A. C.; Oliver, M. H.; Onderwaater, J.; Oppedisano, C.; Orava, R.; Oravec, M.; Ortiz Velasquez, A.; Oskarsson, A.; Otwinowski, J.; Oyama, K.; Ozdemir, M.; Pachmayer, Y.; Pacik, V.; Pagano, D.; Pagano, P.; Paić, G.; Pal, S. K.; Palni, P.; Pan, J.; Pandey, A. K.; Panebianco, S.; Papikyan, V.; Pappalardo, G. S.; Pareek, P.; Park, J.; Park, W. J.; Parmar, S.; Passfeld, A.; Paticchio, V.; Patra, R. N.; Paul, B.; Pei, H.; Peitzmann, T.; Peng, X.; Pereira, L. G.; Pereira Da Costa, H.; Peresunko, D.; Perez Lezama, E.; Peskov, V.; Pestov, Y.; Petráček, V.; Petrov, V.; Petrovici, M.; Petta, C.; Pezzi, R. P.; Piano, S.; Pikna, M.; Pillot, P.; Pimentel, L. O. D. L.; Pinazza, O.; Pinsky, L.; Piyarathna, D. 
B.; Płoskoń, M.; Planinic, M.; Pluta, J.; Pochybova, S.; Podesta-Lerma, P. L. M.; Poghosyan, M. G.; Polichtchouk, B.; Poljak, N.; Poonsawat, W.; Pop, A.; Poppenborg, H.; Porteboeuf-Houssais, S.; Porter, J.; Pospisil, J.; Pozdniakov, V.; Prasad, S. K.; Preghenella, R.; Prino, F.; Pruneau, C. A.; Pshenichnov, I.; Puccio, M.; Puddu, G.; Pujahari, P.; Punin, V.; Putschke, J.; Qvigstad, H.; Rachevski, A.; Raha, S.; Rajput, S.; Rak, J.; Rakotozafindrabe, A.; Ramello, L.; Rami, F.; Rana, D. B.; Raniwala, R.; Raniwala, S.; Räsänen, S. S.; Rascanu, B. T.; Rathee, D.; Ratza, V.; Ravasenga, I.; Read, K. F.; Redlich, K.; Rehman, A.; Reichelt, P.; Reidt, F.; Ren, X.; Renfordt, R.; Reolon, A. R.; Reshetin, A.; Reygers, K.; Riabov, V.; Ricci, R. A.; Richert, T.; Richter, M.; Riedler, P.; Riegler, W.; Riggi, F.; Ristea, C.; Rodríguez Cahuantzi, M.; Røed, K.; Rogochaya, E.; Rohr, D.; Röhrich, D.; Ronchetti, F.; Ronflette, L.; Rosnet, P.; Rossi, A.; Roukoutakis, F.; Roy, A.; Roy, C.; Roy, P.; Rubio Montero, A. J.; Rui, R.; Russo, R.; Ryabinkin, E.; Ryabov, Y.; Rybicki, A.; Saarinen, S.; Sadhu, S.; Sadovsky, S.; Šafařík, K.; Sahlmuller, B.; Sahoo, B.; Sahoo, P.; Sahoo, R.; Sahoo, S.; Sahu, P. K.; Saini, J.; Sakai, S.; Saleh, M. A.; Salzwedel, J.; Sambyal, S.; Samsonov, V.; Sandoval, A.; Sarkar, D.; Sarkar, N.; Sarma, P.; Sas, M. H. P.; Scapparone, E.; Scarlassara, F.; Scharenberg, R. P.; Schiaua, C.; Schicker, R.; Schmidt, C.; Schmidt, H. R.; Schmidt, M. O.; Schmidt, M.; Schukraft, J.; Schutz, Y.; Schwarz, K.; Schweda, K.; Scioli, G.; Scomparin, E.; Scott, R.; Šefčík, M.; Seger, J. E.; Sekiguchi, Y.; Sekihata, D.; Selyuzhenkov, I.; Senosi, K.; Senyukov, S.; Serradilla, E.; Sett, P.; Sevcenco, A.; Shabanov, A.; Shabetai, A.; Shadura, O.; Shahoyan, R.; Shangaraev, A.; Sharma, A.; Sharma, A.; Sharma, M.; Sharma, M.; Sharma, N.; Sheikh, A. I.; Shigaki, K.; Shou, Q.; Shtejer, K.; Sibiriak, Y.; Siddhanta, S.; Sielewicz, K. 
M.; Siemiarczuk, T.; Silvermyr, D.; Silvestre, C.; Simatovic, G.; Simonetti, G.; Singaraju, R.; Singh, R.; Singhal, V.; Sinha, T.; Sitar, B.; Sitta, M.; Skaali, T. B.; Slupecki, M.; Smirnov, N.; Snellings, R. J. M.; Snellman, T. W.; Song, J.; Song, M.; Soramel, F.; Sorensen, S.; Sozzi, F.; Spiriti, E.; Sputowska, I.; Srivastava, B. K.; Stachel, J.; Stan, I.; Stankus, P.; Stenlund, E.; Stiller, J. H.; Stocco, D.; Strmen, P.; Suaide, A. A. P.; Sugitate, T.; Suire, C.; Suleymanov, M.; Suljic, M.; Sultanov, R.; Šumbera, M.; Sumowidagdo, S.; Suzuki, K.; Swain, S.; Szabo, A.; Szarka, I.; Szczepankiewicz, A.; Szymanski, M.; Tabassam, U.; Takahashi, J.; Tambave, G. J.; Tanaka, N.; Tarhini, M.; Tariq, M.; Tarzila, M. G.; Tauro, A.; Muñoz, G. Tejeda; Telesca, A.; Terasaki, K.; Terrevoli, C.; Teyssier, B.; Thakur, D.; Thomas, D.; Tieulent, R.; Tikhonov, A.; Timmins, A. R.; Toia, A.; Tripathy, S.; Trogolo, S.; Trombetta, G.; Trubnikov, V.; Trzaska, W. H.; Trzeciak, B. A.; Tsuji, T.; Tumkin, A.; Turrisi, R.; Tveter, T. S.; Ullaland, K.; Umaka, E. N.; Uras, A.; Usai, G. L.; Utrobicic, A.; Vala, M.; Van Der Maarel, J.; Van Hoorne, J. W.; van Leeuwen, M.; Vanat, T.; Vande Vyvre, P.; Varga, D.; Vargas, A.; Vargyas, M.; Varma, R.; Vasileiou, M.; Vasiliev, A.; Vauthier, A.; Vázquez Doce, O.; Vechernin, V.; Veen, A. M.; Velure, A.; Vercellin, E.; Limón, S. Vergara; Vernet, R.; Vértesi, R.; Vickovic, L.; Vigolo, S.; Viinikainen, J.; Vilakazi, Z.; Villalobos Baillie, O.; Villatoro Tello, A.; Vinogradov, A.; Vinogradov, L.; Virgili, T.; Vislavicius, V.; Vodopyanov, A.; Völkl, M. A.; Voloshin, K.; Voloshin, S. A.; Volpe, G.; von Haller, B.; Vorobyev, I.; Voscek, D.; Vranic, D.; Vrláková, J.; Wagner, B.; Wagner, J.; Wang, H.; Wang, M.; Watanabe, D.; Watanabe, Y.; Weber, M.; Weber, S. G.; Weiser, D. F.; Wessels, J. P.; Westerhoff, U.; Whitehead, A. M.; Wiechula, J.; Wikne, J.; Wilk, G.; Wilkinson, J.; Willems, G. A.; Williams, M. C. S.; Windelband, B.; Witt, W. 
E.; Yalcin, S.; Yang, P.; Yano, S.; Yin, Z.; Yokoyama, H.; Yoo, I.-K.; Yoon, J. H.; Yurchenko, V.; Zaccolo, V.; Zaman, A.; Zampolli, C.; Zanoli, H. J. C.; Zaporozhets, S.; Zardoshti, N.; Zarochentsev, A.; Závada, P.; Zaviyalov, N.; Zbroszczyk, H.; Zhalov, M.; Zhang, H.; Zhang, X.; Zhang, Y.; Zhang, C.; Zhang, Z.; Zhao, C.; Zhigareva, N.; Zhou, D.; Zhou, Y.; Zhou, Z.; Zhu, H.; Zhu, J.; Zhu, X.; Zichichi, A.; Zimmermann, A.; Zimmermann, M. B.; Zimmermann, S.; Zinovjev, G.; Zmeskal, J.

    2017-08-01

    Two-particle angular correlations were measured in pp collisions at √s = 7 TeV for pions, kaons, protons, and lambdas, for all particle/anti-particle combinations in the pair. Data for mesons exhibit an expected peak dominated by effects associated with mini-jets and are well reproduced by general purpose Monte Carlo generators. However, for baryon-baryon and anti-baryon-anti-baryon pairs, where both particles have the same baryon number, a near-side anti-correlation structure is observed instead of a peak. This effect is interpreted in the context of baryon production mechanisms in the fragmentation process. It currently presents a challenge to Monte Carlo models and its origin remains an open question.

  5. Wang-Landau method for calculating Rényi entropies in finite-temperature quantum Monte Carlo simulations.

    PubMed

    Inglis, Stephen; Melko, Roger G

    2013-01-01

    We implement a Wang-Landau sampling technique in quantum Monte Carlo (QMC) simulations for the purpose of calculating the Rényi entanglement entropies and associated mutual information. The algorithm converges an estimate for an analog to the density of states for stochastic series expansion QMC, allowing a direct calculation of Rényi entropies without explicit thermodynamic integration. We benchmark results for the mutual information on two-dimensional (2D) isotropic and anisotropic Heisenberg models, a 2D transverse field Ising model, and a three-dimensional Heisenberg model, confirming a critical scaling of the mutual information in cases with a finite-temperature transition. We discuss the benefits and limitations of broad sampling techniques compared to standard importance sampling methods.
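The broad-sampling idea behind the paper can be illustrated with classical Wang-Landau sampling on a small 1D Ising chain; this is a deliberately simplified stand-in for the stochastic-series-expansion QMC version described above, with hypothetical function name and parameters.

```python
import math
import random

def wang_landau_1d_ising(L=8, f_final=1e-4, flatness=0.8):
    """Wang-Landau estimate of ln g(E) for a periodic 1D Ising chain.

    Broad sampling: spin flips are accepted with min(1, g(E)/g(E')),
    driving a flat histogram in energy instead of importance-sampling
    the Boltzmann distribution; g is refined by a shrinking factor ln_f.
    """
    spins = [random.choice((-1, 1)) for _ in range(L)]
    E = -sum(spins[i] * spins[(i + 1) % L] for i in range(L))
    ln_g, hist = {}, {}
    ln_f = 1.0
    while ln_f > f_final:
        for _ in range(1000 * L):
            i = random.randrange(L)
            dE = 2 * spins[i] * (spins[i - 1] + spins[(i + 1) % L])
            E_new = E + dE
            # accept with ratio of current density-of-states estimates
            if random.random() < math.exp(ln_g.get(E, 0.0) - ln_g.get(E_new, 0.0)):
                spins[i] *= -1
                E = E_new
            ln_g[E] = ln_g.get(E, 0.0) + ln_f
            hist[E] = hist.get(E, 0) + 1
        # halve the refinement once the visit histogram is roughly flat
        if min(hist.values()) > flatness * (sum(hist.values()) / len(hist)):
            ln_f /= 2.0
            hist = {}
    return ln_g
```

For L=8 the exact degeneracies are known (g(-8)=2, g(-4)=56, ...), so the estimated ln g differences can be checked directly.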

  6. Self-learning Monte Carlo method and cumulative update in fermion systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Junwei; Shen, Huitao; Qi, Yang

    2017-06-07

    In this study, we develop the self-learning Monte Carlo (SLMC) method, a general-purpose numerical method recently introduced to simulate many-body systems, for studying interacting fermion systems. Our method uses a highly efficient update algorithm, which we design and dub “cumulative update”, to generate new candidate configurations in the Markov chain based on a self-learned bosonic effective model. From a general analysis and a numerical study of the double exchange model as an example, we find that the SLMC with cumulative update drastically reduces the computational cost of the simulation while remaining statistically exact. Remarkably, its computational complexity is far less than that of the conventional algorithm with local updates.
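The accept/reject structure that keeps SLMC statistically exact can be sketched with a toy 1D example: a cheap "true" weight stands in for the expensive fermionic model, and a Gaussian stands in for the self-learned effective model. The long proposal driven by the effective model plays the role of the cumulative update; all names and models here are illustrative, not the paper's.

```python
import math
import random

def slmc_chain(n_samples, n_eff_steps=20, step=1.0):
    """Toy self-learning Monte Carlo on a 1D distribution.

    True weight  W(x)     ~ exp(-x**2/2 - 0.1*x**4)  (stand-in for the costly model)
    Effective    W_eff(x) ~ exp(-x**2/2)             (self-learned surrogate)

    A long proposal is generated by n_eff_steps of Metropolis on the cheap
    effective model; the final global acceptance with ratio
    (W(x')/W(x)) * (W_eff(x)/W_eff(x')) restores exactness.
    """
    ln_w = lambda x: -0.5 * x * x - 0.1 * x ** 4
    ln_w_eff = lambda x: -0.5 * x * x
    x, samples = 0.0, []
    for _ in range(n_samples):
        y = x
        for _ in range(n_eff_steps):  # proposal driven by the effective model only
            y_new = y + random.uniform(-step, step)
            if math.log(random.random() + 1e-300) < ln_w_eff(y_new) - ln_w_eff(y):
                y = y_new
        ln_acc = (ln_w(y) - ln_w(x)) - (ln_w_eff(y) - ln_w_eff(x))
        if math.log(random.random() + 1e-300) < ln_acc:
            x = y
        samples.append(x)
    return samples
```

Because repeated Metropolis steps form a kernel reversible with respect to W_eff, the proposal-density ratio reduces to W_eff(x)/W_eff(y), which is what the global acceptance uses.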

  7. Simulation of Auger electron emission from nanometer-size gold targets using the Geant4 Monte Carlo simulation toolkit

    DOE PAGES

    Incerti, S.; Suerfu, B.; Xu, J.; ...

    2016-02-16

    We report that a revised atomic deexcitation framework for the Geant4 general-purpose Monte Carlo toolkit, capable of simulating full Auger deexcitation cascades, was implemented in the June 2015 release (version 10.2 Beta). An overview of this refined framework and testing of its capabilities are presented for the irradiation of gold nanoparticles (NP) with keV photon and MeV proton beams. The resulting energy spectra of secondary particles created within, and escaping from, the NP are analyzed and discussed. It is anticipated that this new functionality will improve and increase the use of Geant4 in medical physics, radiobiology, nanomedicine research and other low-energy physics fields.

  8. Simulation of atomic diffusion in the Fcc NiAl system: A kinetic Monte Carlo study

    DOE PAGES

    Alfonso, Dominic R.; Tafen, De Nyago

    2015-04-28

    The atomic diffusion in fcc NiAl binary alloys was studied by kinetic Monte Carlo simulation. The environment-dependent hopping barriers were computed using a pair interaction model whose parameters were fitted to relevant data derived from electronic structure calculations. Long-time diffusivities were calculated and the effect of composition change on the tracer diffusion coefficients was analyzed. The results indicate that this variation has a noticeable impact on the atomic diffusivities: a reduction in the mobility of both Ni and Al is demonstrated with increasing Al content. The pair interactions between atoms were then examined for the purpose of understanding the predicted trends.
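The event-selection core of a rejection-free (BKL-style) kinetic Monte Carlo step, of the kind used for such hop-barrier simulations, can be sketched generically; this is not the authors' code, and the interface is hypothetical.

```python
import math
import random

def kmc_step(rates, t):
    """One rejection-free (BKL) kinetic Monte Carlo step.

    rates : hop rates k_i = nu0 * exp(-E_i / kT) for every possible atom
            jump, with E_i the environment-dependent barrier
    t     : current simulated time

    Returns the index of the selected event and the advanced clock.
    """
    total = sum(rates)
    r = random.random() * total
    acc, chosen = 0.0, 0
    for i, k in enumerate(rates):  # pick event i with probability k_i / total
        acc += k
        if r < acc:
            chosen = i
            break
    # exponential waiting time with mean 1/total (1 - u avoids log(0))
    dt = -math.log(1.0 - random.random()) / total
    return chosen, t + dt
```

Over many steps, each event is chosen in proportion to its rate, and the clock advances by physically meaningful exponential waiting times.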

  9. Effective Thermal Conductivity of Spherical Particulate Nanocomposites: Comparison with Theoretical Models, Monte Carlo Simulations and Experiments

    NASA Astrophysics Data System (ADS)

    Machrafi, Hatim; Lebon, Georgy

    2014-11-01

    The purpose of this work is to study heat conduction in systems composed of spherical micro- and nanoparticles dispersed in a bulk matrix. Special emphasis is put on the dependence of the effective heat conductivity on selected parameters such as the size and density of the particles and the interface interaction with the matrix. This is achieved by combining the effective medium approximation with extended irreversible thermodynamics, whose main feature is to elevate the heat flux vector to the status of an independent variable. The model is illustrated by three examples: Silicon-Germanium, Silica-epoxy-resin and Copper-Silicon systems. Predictions of our model are in good agreement with other theoretical models, Monte Carlo simulations and experimental data.
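For context, the classical Maxwell(-Garnett) effective-medium result for spherical inclusions, which size-dependent models of this kind generalize, is a one-line formula. This is not the authors' model: it neglects the interface (Kapitza) resistance and the non-local heat-flux effects their approach captures.

```python
def maxwell_eff_conductivity(k_m, k_p, phi):
    """Classical Maxwell(-Garnett) effective thermal conductivity for
    spherical particles of conductivity k_p at volume fraction phi,
    embedded in a matrix of conductivity k_m.
    """
    num = k_p + 2.0 * k_m + 2.0 * phi * (k_p - k_m)
    den = k_p + 2.0 * k_m - phi * (k_p - k_m)
    return k_m * num / den
```

The formula interpolates correctly between the two limits: it returns k_m at phi = 0 and k_p at phi = 1.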

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Y; Liu, B; Liang, B

    Purpose: The current CyberKnife treatment planning system (TPS) provides two dose calculation algorithms: Ray-tracing and Monte Carlo. The Ray-tracing algorithm is fast but less accurate, and it cannot handle the irregular fields made possible by the multi-leaf collimator recently introduced with the CyberKnife M6 system. The Monte Carlo method has well-known accuracy, but the current version still takes a long time to finish dose calculations. The purpose of this work is to develop a GPU-based fast convolution/superposition (C/S) dose engine for the CyberKnife system that achieves both accuracy and efficiency. Methods: The TERMA distribution from a poly-energetic source was calculated in the beam's-eye-view coordinate system, which is GPU friendly and has linear complexity. The dose distribution was then computed by inversely collecting the energy depositions from all TERMA points along 192 collapsed-cone directions. The EGSnrc user code was used to pre-calculate energy deposition kernels (EDKs) for a series of mono-energetic photons. The energy spectrum was reconstructed from the measured tissue maximum ratio (TMR) curve, and the TERMA-averaged cumulative kernels were then calculated. Beam hardening parameters and intensity profiles were optimized using measurement data from the CyberKnife system. Results: The differences between measured and calculated TMR are less than 1% for all collimators except in the build-up regions. The calculated profiles also show good agreement with the measured doses, within 1% except in the penumbra regions. The developed C/S dose engine was also used to evaluate four clinical CyberKnife treatment plans; compared against the Monte Carlo method, it showed better dose calculation accuracy than the Ray-tracing algorithm for heterogeneous cases. The dose calculation takes several seconds per beam, depending on the collimator size and the dose calculation grid. Conclusion: A GPU-based C/S dose engine has been developed for the CyberKnife system; it was shown to be efficient and accurate for clinical purposes and can be easily implemented in the TPS.
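The primary TERMA pass described above can be illustrated in a heavily simplified form for a homogeneous medium and a mono-energetic beam; the function name and interface are hypothetical, and the real engine handles poly-energetic spectra and heterogeneous density along each ray.

```python
import math

def terma_along_ray(psi0, mu_over_rho, rho, depths):
    """TERMA (total energy released per unit mass) along a primary ray.

    psi0        : incident energy fluence (MeV/cm^2)
    mu_over_rho : mass attenuation coefficient (cm^2/g)
    rho         : density (g/cm^3)
    depths      : depths along the ray (cm)

    T(d) = (mu/rho) * psi0 * exp(-mu*d), i.e. the attenuated primary
    energy fluence converted to energy released per unit mass.
    """
    mu = mu_over_rho * rho
    return [mu_over_rho * psi0 * math.exp(-mu * d) for d in depths]
```

In a C/S engine, these TERMA values are then convolved with the energy deposition kernels along the collapsed-cone directions to obtain dose.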

  11. Nanoshells for photothermal therapy: a Monte-Carlo based numerical study of their design tolerance

    PubMed Central

    Grosges, Thomas; Barchiesi, Dominique; Kessentini, Sameh; Gréhan, Gérard; de la Chapelle, Marc Lamy

    2011-01-01

    The optimization of coated metallic nanoparticles and nanoshells is a current challenge for biological applications, especially for cancer photothermal therapy, given both the continuous improvement of their fabrication and the increasing requirement for efficiency. The efficiency of the coupling between the illumination and such nanostructures for burning purposes depends unevenly on their geometrical parameters (radius, thickness of the shell) and material parameters (permittivities, which depend on the illumination wavelength). Through a Monte-Carlo method, we propose a numerical study of such nanodevices to evaluate the tolerances (or uncertainty) on these parameters, given a threshold of efficiency, and thereby facilitate the design of nanoparticles. The results help to identify the relevant parameters of the engineering process on which the absorbed energy is most dependent. The Monte-Carlo method confirms that the best burning efficiency is obtained for hollow nanospheres and exhibits the sensitivity of the absorbed electromagnetic energy as a function of each parameter. The proposed method is general and could be applied in the design and development of new embedded coated nanomaterials used in biomedical applications. PMID:21698021
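The tolerance-evaluation loop can be sketched generically: perturb each design parameter within its tolerance band and count how often the merit function stays above the efficiency threshold. The function names and the merit model are placeholders, not the authors' electromagnetic model.

```python
import random

def tolerance_yield(model, nominal, tolerances, threshold, n=10000):
    """Monte-Carlo tolerance analysis for a nanoshell-like design.

    model      : function(params dict) -> scalar merit (e.g. absorbed energy)
    nominal    : nominal parameter values (e.g. core radius, shell thickness)
    tolerances : +/- absolute tolerance on each parameter
    threshold  : minimum acceptable merit

    Returns the fraction of randomly perturbed designs that still meet
    the threshold: the larger it is, the looser the fabrication spec.
    """
    hits = 0
    for _ in range(n):
        p = {k: v + random.uniform(-tolerances[k], tolerances[k])
             for k, v in nominal.items()}
        if model(p) >= threshold:
            hits += 1
    return hits / n
```

Sweeping the tolerance of one parameter at a time while holding the others fixed reveals which parameters the absorbed energy is most sensitive to.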

  12. Radiative transfer and spectroscopic databases: A line-sampling Monte Carlo approach

    NASA Astrophysics Data System (ADS)

    Galtier, Mathieu; Blanco, Stéphane; Dauchet, Jérémi; El Hafi, Mouna; Eymet, Vincent; Fournier, Richard; Roger, Maxime; Spiesser, Christophe; Terrée, Guillaume

    2016-03-01

    Dealing with molecular-state transitions for radiative transfer purposes involves two successive steps that both reach the complexity level at which physicists start thinking about statistical approaches: (1) constructing line-shaped absorption spectra from very numerous state transitions, and (2) integrating over optical-path domains. For the first time, we show here how these steps can be addressed simultaneously using the null-collision concept. This opens the door to the design of Monte Carlo codes directly estimating radiative transfer observables from spectroscopic databases. The intermediate step of producing accurate high-resolution absorption spectra is no longer required. A Monte Carlo algorithm is proposed and applied to six one-dimensional test cases. It allows the computation of spectrally integrated intensities (over 25 cm-1 bands or the full IR range) in a few seconds, regardless of the retained database and line model. Free parameters must be selected, however, and they affect the convergence. A first possible selection is provided in full detail. We observe that this selection is highly satisfactory for quite distinct atmospheric and combustion configurations, but a more systematic exploration is still in progress.
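The null-collision trick at the heart of this approach can be sketched for a 1D spatially varying absorption coefficient: free paths are drawn with a constant majorant, and each tentative collision is accepted as real with the ratio of the true to the majorant coefficient. This is a generic sketch, not the authors' line-sampling code.

```python
import math
import random

def null_collision_path(k_of_x, k_hat, x0=0.0, x_max=1e9):
    """Sample the position of the next *true* collision in a medium with
    spatially varying absorption coefficient k(x).

    k_hat must satisfy k_hat >= k(x) everywhere. Free paths are drawn as
    if the medium were homogeneous with coefficient k_hat; each tentative
    collision is real with probability k(x)/k_hat, otherwise it is a
    fictitious (null) collision and the walk continues.
    """
    x = x0
    while True:
        x += -math.log(1.0 - random.random()) / k_hat  # path under the majorant
        if x >= x_max:
            return None                                 # escaped the domain
        if random.random() < k_of_x(x) / k_hat:         # real vs. null collision
            return x
```

For a homogeneous medium the procedure reduces exactly to exponential free-path sampling with mean 1/k, which provides a simple consistency check.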

  13. Angular dependence of the nanoDot OSL dosimeter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kerns, James R.; Kry, Stephen F.; Sahoo, Narayan

    Purpose: Optically stimulated luminescent detectors (OSLDs) are quickly gaining popularity as passive dosimeters, with applications in medicine for linac output calibration verification, brachytherapy source verification, treatment plan quality assurance, and clinical dose measurements. With such wide applications, these dosimeters must be characterized for numerous factors affecting their response. The most abundant commercial OSLD is the InLight/OSL system from Landauer, Inc. The purpose of this study was to examine the angular dependence of the nanoDot dosimeter, which is part of the InLight system. Methods: Relative dosimeter response data were taken at several angles in 6 and 18 MV photon beams, as well as a clinical proton beam. These measurements were done within a phantom at a depth beyond the build-up region. To verify the observed angular dependence, additional measurements were conducted as well as Monte Carlo simulations in MCNPX. Results: When irradiated with the incident photon beams parallel to the plane of the dosimeter, the nanoDot response was 4% lower at 6 MV and 3% lower at 18 MV than the response when irradiated with the incident beam normal to the plane of the dosimeter. Monte Carlo simulations at 6 MV showed similar results to the experimental values. Examination of the results in Monte Carlo suggests the cause as partial volume irradiation. In a clinical proton beam, no angular dependence was found. Conclusions: A nontrivial angular response of this OSLD was observed in photon beams. This factor may need to be accounted for when evaluating doses from photon beams incident from a variety of directions.

  14. Angular dependence of the nanoDot OSL dosimeter

    PubMed Central

    Kerns, James R.; Kry, Stephen F.; Sahoo, Narayan; Followill, David S.; Ibbott, Geoffrey S.

    2011-01-01

    Purpose: Optically stimulated luminescent detectors (OSLDs) are quickly gaining popularity as passive dosimeters, with applications in medicine for linac output calibration verification, brachytherapy source verification, treatment plan quality assurance, and clinical dose measurements. With such wide applications, these dosimeters must be characterized for numerous factors affecting their response. The most abundant commercial OSLD is the InLight/OSL system from Landauer, Inc. The purpose of this study was to examine the angular dependence of the nanoDot dosimeter, which is part of the InLight system. Methods: Relative dosimeter response data were taken at several angles in 6 and 18 MV photon beams, as well as a clinical proton beam. These measurements were done within a phantom at a depth beyond the build-up region. To verify the observed angular dependence, additional measurements were conducted as well as Monte Carlo simulations in MCNPX. Results: When irradiated with the incident photon beams parallel to the plane of the dosimeter, the nanoDot response was 4% lower at 6 MV and 3% lower at 18 MV than the response when irradiated with the incident beam normal to the plane of the dosimeter. Monte Carlo simulations at 6 MV showed similar results to the experimental values. Examination of the results in Monte Carlo suggests the cause as partial volume irradiation. In a clinical proton beam, no angular dependence was found. Conclusions: A nontrivial angular response of this OSLD was observed in photon beams. This factor may need to be accounted for when evaluating doses from photon beams incident from a variety of directions. PMID:21858992

  15. Monte Carlo calculations of k_Q, the beam quality conversion factor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muir, B. R.; Rogers, D. W. O.

    2010-11-15

    Purpose: To use EGSnrc Monte Carlo simulations to directly calculate beam quality conversion factors, k{sub Q}, for 32 cylindrical ionization chambers over a range of beam qualities and to quantify the effect of systematic uncertainties on Monte Carlo calculations of k{sub Q}. These factors are required to use the TG-51 or TRS-398 clinical dosimetry protocols for calibrating external radiotherapy beams. Methods: Ionization chambers are modeled either from blueprints or manufacturers' user's manuals. The dose-to-air in the chamber is calculated using the EGSnrc user-code egs{sub c}hamber using 11 different tabulated clinical photon spectra for the incident beams. The dose to amore » small volume of water is also calculated in the absence of the chamber at the midpoint of the chamber on its central axis. Using a simple equation, k{sub Q} is calculated from these quantities under the assumption that W/e is constant with energy and compared to TG-51 protocol and measured values. Results: Polynomial fits to the Monte Carlo calculated k{sub Q} factors as a function of beam quality expressed as %dd(10){sub x} and TPR{sub 10}{sup 20} are given for each ionization chamber. Differences are explained between Monte Carlo calculated values and values from the TG-51 protocol or calculated using the computer program used for TG-51 calculations. Systematic uncertainties in calculated k{sub Q} values are analyzed and amount to a maximum of one standard deviation uncertainty of 0.99% if one assumes that photon cross-section uncertainties are uncorrelated and 0.63% if they are assumed correlated. The largest components of the uncertainty are the constancy of W/e and the uncertainty in the cross-section for photons in water. Conclusions: It is now possible to calculate k{sub Q} directly using Monte Carlo simulations. Monte Carlo calculations for most ionization chambers give results which are comparable to TG-51 values. 
Discrepancies can be explained using individual Monte Carlo calculations of various correction factors which are more accurate than previously used values. For small ionization chambers with central electrodes composed of high-Z materials, the effect of the central electrode is much larger than that for the aluminum electrodes in Farmer chambers.
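The "simple equation" relating the two Monte Carlo dose ratios to k{sub Q}, and the subsequent polynomial fit against beam quality, can be sketched as follows. All numerical values are illustrative placeholders rather than the paper's chamber data, and `beam_quality_factor` is a hypothetical helper name.

```python
import numpy as np

# Under the assumption that W/e is constant with energy, the beam quality
# conversion factor reduces to a ratio of Monte Carlo calculated doses:
#   k_Q = [D_water / D_air]_Q / [D_water / D_air]_Q0
def beam_quality_factor(dw_q, dair_q, dw_q0, dair_q0):
    return (dw_q / dair_q) / (dw_q0 / dair_q0)

# Illustrative doses (arbitrary units) for a test quality Q and Co-60 (Q0).
kq = beam_quality_factor(dw_q=1.000, dair_q=0.902, dw_q0=1.000, dair_q0=0.890)

# Polynomial fit of k_Q versus beam quality %dd(10)_x, as reported per chamber.
ddx = np.array([63.0, 66.0, 71.0, 81.0, 93.0])           # %dd(10)_x values
kq_vals = np.array([0.999, 0.996, 0.990, 0.975, 0.960])  # made-up k_Q values
coeffs = np.polyfit(ddx, kq_vals, 2)                     # quadratic fit
kq_at_67 = np.polyval(coeffs, 67.0)                      # k_Q at %dd(10)_x = 67
```

A clinic would evaluate such a fit at its measured beam quality instead of looking up tabulated protocol values.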

  16. Monte Carlo dose calculations of beta-emitting sources for intravascular brachytherapy: a comparison between EGS4, EGSnrc, and MCNP.

    PubMed

    Wang, R; Li, X A

    2001-02-01

    The dose parameters for the beta-particle emitting 90Sr/90Y source for intravascular brachytherapy (IVBT) have been calculated by different investigators. At large distances from the source, noticeable differences are seen in these parameters calculated using different Monte Carlo codes. The purpose of this work is to quantify and to understand these differences. We have compared a series of calculations using the EGS4, EGSnrc, and MCNP Monte Carlo codes. Data calculated and compared include the depth dose curve for a broad parallel beam of electrons, and radial dose distributions for point electron sources (monoenergetic or polyenergetic) and for a real 90Sr/90Y source. For the 90Sr/90Y source, the doses at the reference position (2 mm radial distance) calculated by the three codes agree within 2%. However, the differences between the doses calculated by the three codes can be over 20% in the radial distance range of interest in IVBT. The difference increases with radial distance from the source, and reaches 30% at the tail of the dose curve. These differences may be partially attributed to the different multiple scattering theories and Monte Carlo models for electron transport adopted in the three codes. Doses calculated by the EGSnrc code are more accurate than those by EGS4. The two calculations agree within 5% for radial distances <6 mm.
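The comparison described above amounts to a pointwise relative difference between radial dose tables from two codes. A minimal sketch, with placeholder dose values chosen only to mimic the reported behaviour (about 2% agreement near the 2 mm reference position, growing toward the tail):

```python
import numpy as np

# Two hypothetical radial dose curves (arbitrary units) for the same source,
# as might be produced by two different Monte Carlo codes.
r_mm = np.array([2.0, 3.0, 4.0, 5.0, 6.0, 7.0])
dose_a = np.array([1.00, 0.55, 0.28, 0.12, 0.045, 0.012])
dose_b = np.array([1.01, 0.56, 0.30, 0.135, 0.053, 0.0156])

# Percent difference of code B relative to code A at each radial distance.
pct_diff = 100.0 * (dose_b - dose_a) / dose_a
```

With these placeholder values the difference grows monotonically with distance, echoing the observation that discrepancies are largest at the tail of the dose curve.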

  17. The structure of PX3 (X = Cl, Br, I) molecular liquids from X-ray diffraction, molecular dynamics simulations, and reverse Monte Carlo modeling.

    PubMed

    Pothoczki, Szilvia; Temleitner, László; Pusztai, László

    2014-02-07

    Synchrotron X-ray diffraction measurements have been conducted on liquid phosphorus trichloride, tribromide, and triiodide. Molecular Dynamics simulations for these molecular liquids were performed with a dual purpose: (1) to establish whether existing intermolecular potential functions can provide a picture that is consistent with diffraction data and (2) to generate reliable starting configurations for subsequent Reverse Monte Carlo modelling. Structural models (i.e., sets of coordinates of thousands of atoms) that were fully consistent with experimental diffraction information, within errors, have been prepared by means of the Reverse Monte Carlo method. Comparison with reference systems, generated by hard sphere-like Monte Carlo simulations, was also carried out to demonstrate the extent to which simple space filling effects determine the structure of the liquids (and thus also to estimate the information content of the measured data). Total scattering structure factors, partial radial distribution functions and orientational correlations as a function of distance between the molecular centres have been calculated from the models. In general, more or less antiparallel arrangements of the primary molecular axes are found to be the most favourable orientation of two neighbouring molecules. In liquid PBr3, electrostatic interactions seem to play a more important role in determining intermolecular correlations than in the other two liquids; molecular arrangements in both PCl3 and PI3 are largely driven by steric effects.
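The core Reverse Monte Carlo move-accept loop can be sketched in miniature: perturb one atom, recompute the model's pair-distance histogram, and accept the move if agreement with the "experimental" target improves (with a small Metropolis-style tolerance). This toy uses a random 40-atom configuration as a stand-in target; a real RMC run fits thousands of atoms to measured structure factors.

```python
import numpy as np

rng = np.random.default_rng(0)

def pair_histogram(pos, bins):
    """Normalised histogram of all interatomic distances."""
    d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    d = d[np.triu_indices(len(pos), k=1)]
    h, _ = np.histogram(d, bins=bins)
    return h / h.sum()

n_atoms, box = 40, 10.0
bins = np.linspace(0.0, box, 21)
target = pair_histogram(rng.uniform(0, box, (n_atoms, 3)), bins)  # stand-in "data"

pos = rng.uniform(0, box, (n_atoms, 3))
chi2_start = chi2 = np.sum((pair_histogram(pos, bins) - target) ** 2)
for _ in range(2000):
    i = rng.integers(n_atoms)
    old = pos[i].copy()
    pos[i] = np.clip(pos[i] + rng.normal(scale=0.3, size=3), 0.0, box)
    new_chi2 = np.sum((pair_histogram(pos, bins) - target) ** 2)
    if new_chi2 < chi2 or rng.random() < np.exp((chi2 - new_chi2) / 1e-6):
        chi2 = new_chi2          # accept the move
    else:
        pos[i] = old             # reject: restore the old position
```

The loop drives the model histogram toward the target, which is the essence of fitting a configuration to diffraction data "within errors".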

  18. Percentage depth dose evaluation in heterogeneous media using thermoluminescent dosimetry

    PubMed Central

    da Rosa, L.A.R.; Campos, L.T.; Alves, V.G.L.; Batista, D.V.S.; Facure, A.

    2010-01-01

    The purpose of this study is to investigate the influence of lung heterogeneity inside a soft tissue phantom on percentage depth dose (PDD). PDD curves were obtained experimentally using LiF:Mg,Ti (TLD‐100) thermoluminescent detectors and applying the Eclipse treatment planning system algorithms Batho, modified Batho (M‐Batho or BMod), equivalent TAR (E‐TAR or EQTAR), and anisotropic analytical algorithm (AAA) for a 15 MV photon beam and field sizes of 1×1, 2×2, 5×5, and 10×10 cm2. Monte Carlo simulations were performed using the DOSRZnrc user code of EGSnrc. The experimental results agree with Monte Carlo simulations for all irradiation field sizes. Comparisons with Monte Carlo calculations show that the AAA algorithm provides the best simulations of PDD curves for all field sizes investigated. However, even this algorithm cannot accurately predict PDD values in the lung for field sizes of 1×1 and 2×2 cm2. An overdosage in the lung of about 40% and 20% is calculated by the AAA algorithm close to the soft tissue/lung interface for 1×1 and 2×2 cm2 field sizes, respectively. It was demonstrated that differences of 100% between Monte Carlo results and the responses of the Batho, modified Batho, and equivalent TAR algorithms may exist inside the lung region for the 1×1 cm2 field. PACS number: 87.55.kd
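Percentage depth dose is simply each depth's dose normalised to the maximum along the central axis, PDD(d) = 100 · D(d)/D(d_max). A sketch with an illustrative megavoltage-like curve (the values are not the paper's TLD measurements):

```python
import numpy as np

# Illustrative central-axis dose readings versus depth for a megavoltage beam.
depth_cm = np.array([0.5, 1.0, 2.0, 3.0, 5.0, 10.0, 15.0, 20.0])
dose = np.array([0.55, 0.80, 0.97, 1.00, 0.94, 0.77, 0.62, 0.50])

# Normalise to the depth of maximum dose to obtain the PDD curve.
pdd = 100.0 * dose / dose.max()
```

Comparing such normalised curves, rather than raw readings, is what allows TLD, treatment planning algorithms, and Monte Carlo to be placed on a common footing.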

  19. SU-F-T-111: Investigation of the Attila Deterministic Solver as a Supplement to Monte Carlo for Calculating Out-Of-Field Radiotherapy Dose

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mille, M; Lee, C; Failla, G

    Purpose: To use the Attila deterministic solver as a supplement to Monte Carlo for calculating out-of-field organ dose in support of epidemiological studies looking at the risks of second cancers. Supplemental dosimetry tools are needed to speed up dose calculations for studies involving large-scale patient cohorts. Methods: Attila is a multi-group discrete ordinates code which can solve the 3D photon-electron coupled linear Boltzmann radiation transport equation on a finite-element mesh. Dose is computed by multiplying the calculated particle flux in each mesh element by a medium-specific energy deposition cross-section. The out-of-field dosimetry capability of Attila is investigated by comparing average organ dose to that which is calculated by Monte Carlo simulation. The test scenario consists of a 6 MV external beam treatment of a female patient with a tumor in the left breast. The patient is simulated by a whole-body adult reference female computational phantom. Monte Carlo simulations were performed using MCNP6 and XVMC. Attila can export a tetrahedral mesh for MCNP6, allowing for a direct comparison between the two codes. The Attila and Monte Carlo methods were also compared in terms of calculation speed and complexity of simulation setup. A key prerequisite for this work was the modeling of a Varian Clinac 2100 linear accelerator. Results: The solid mesh of the torso part of the adult female phantom for the Attila calculation was prepared using the CAD software SpaceClaim. Preliminary calculations suggest that Attila is user-friendly software which shows great promise for our intended application. Computational performance is related to the number of tetrahedral elements included in the Attila calculation. Conclusion: Attila is being explored as a supplement to the conventional Monte Carlo radiation transport approach for performing retrospective patient dosimetry. The goal is for the dosimetry to be sufficiently accurate for use in retrospective epidemiological investigations.

  20. Fast multipurpose Monte Carlo simulation for proton therapy using multi- and many-core CPU architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Souris, Kevin, E-mail: kevin.souris@uclouvain.be; Lee, John Aldo; Sterpin, Edmond

    2016-04-15

    Purpose: Accuracy in proton therapy treatment planning can be improved using Monte Carlo (MC) simulations. However, the long computation time of such methods hinders their use in clinical routine. This work aims to develop a fast multipurpose Monte Carlo simulation tool for proton therapy using massively parallel central processing unit (CPU) architectures. Methods: A new Monte Carlo code, called MCsquare (many-core Monte Carlo), has been designed and optimized for the last generation of Intel Xeon processors and Intel Xeon Phi coprocessors. These massively parallel architectures offer the flexibility and the computational power suitable for MC methods. The class-II condensed history algorithm of MCsquare provides a fast and yet accurate method of simulating heavy charged particles such as protons, deuterons, and alphas inside voxelized geometries. Hard ionizations, with energy losses above a user-specified threshold, are simulated individually, while soft events are regrouped in a multiple scattering theory. Elastic and inelastic nuclear interactions are sampled from ICRU 63 differential cross sections, thereby allowing for the computation of prompt gamma emission profiles. MCsquare has been benchmarked against the GATE/GEANT4 Monte Carlo application for homogeneous and heterogeneous geometries. Results: Comparisons with GATE/GEANT4 for various geometries show deviations within 2%–1 mm. In spite of the limited memory bandwidth of the coprocessor, simulation time is below 25 s for 10{sup 7} primary 200 MeV protons in average soft tissues using all Xeon Phi and CPU resources embedded in a single desktop unit. Conclusions: MCsquare exploits the flexibility of CPU architectures to provide a multipurpose MC simulation tool. Optimized code enables the use of accurate MC calculation within a reasonable computation time, adequate for clinical practice. MCsquare also simulates prompt gamma emission and can thus also be used for in vivo range verification.
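The class-II condensed-history idea mentioned above, treating individual energy losses above a user-specified threshold as discrete "hard" ionizations and grouping sub-threshold losses into a continuous term, can be illustrated with a toy loss spectrum. The exponential spectrum and 100 keV threshold are assumptions for illustration, not MCsquare internals:

```python
import numpy as np

rng = np.random.default_rng(1)

threshold_keV = 100.0
losses_keV = rng.exponential(scale=30.0, size=10_000)  # toy energy-loss spectrum

# Hard ionizations (above threshold) would be simulated one by one;
# soft events (below threshold) are lumped into a grouped/continuous term.
hard = losses_keV[losses_keV >= threshold_keV]
soft = losses_keV[losses_keV < threshold_keV]
soft_mean_loss = soft.sum() / soft.size   # grouped mean loss per soft event
```

Energy bookkeeping is unchanged by the split: hard and soft losses together still sum to the total, which is why the threshold is purely a speed/accuracy trade-off.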

  1. SU-E-T-171: Evaluation of the Analytical Anisotropic Algorithm in a Small Finger Joint Phantom Using Monte Carlo Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chow, J; Owrangi, A; Jiang, R

    2014-06-01

    Purpose: This study investigated the performance of the anisotropic analytical algorithm (AAA) in radiotherapy dose calculation for a small finger joint. Monte Carlo simulation (EGSnrc code) was used in this dosimetric evaluation. Methods: A heterogeneous finger joint phantom containing a vertical water layer (bone joint or cartilage) sandwiched between two bones with dimensions 2 × 2 × 2 cm{sup 3} was irradiated by 6 MV photon beams (field size = 4 × 4 cm{sup 2}). The central beam axis was along the length of the bone joint and the isocenter was set to the center of the joint. The joint width and beam angle were varied from 0.5–2 mm and 0°–15°, respectively. Depth doses were calculated using the AAA and DOSXYZnrc. For dosimetric comparison and normalization, dose calculations were repeated in a water phantom using the same beam geometry. Results: Our AAA and Monte Carlo results showed that the AAA underestimated the joint doses by 10%–20%, and could not predict the variation of joint dose with changes of joint width and beam angle. The bone dose enhancement calculated by the AAA was lower than that from Monte Carlo, and the depth of maximum dose for the phantom was smaller than that for the water phantom. From the Monte Carlo results, the joint dose decreased as the joint width increased, reflecting that the smaller the joint width, the more bone scatter contributed to the depth dose. Moreover, the joint dose was found to decrease slightly with an increase of beam angle. Conclusion: The AAA could not handle variations of joint dose well with changes of joint width and beam angle in our finger joint phantom. Monte Carlo results showed that the joint dose decreased with increase of joint width and beam angle. This dosimetry comparison should be useful to radiation staff in radiotherapy involving small bone joints.

  2. Accurate Monte Carlo simulations for nozzle design, commissioning and quality assurance for a proton radiation therapy facility.

    PubMed

    Paganetti, H; Jiang, H; Lee, S Y; Kooy, H M

    2004-07-01

    Monte Carlo dosimetry calculations are essential methods in radiation therapy. To take full advantage of this tool, the beam delivery system has to be simulated in detail and the initial beam parameters have to be known accurately. The modeling of the beam delivery system itself opens various areas where Monte Carlo calculations prove extremely helpful, such as for design and commissioning of a therapy facility as well as for quality assurance verification. The gantry treatment nozzles at the Northeast Proton Therapy Center (NPTC) at Massachusetts General Hospital (MGH) were modeled in detail using the GEANT4.5.2 Monte Carlo code. For this purpose, various novel solutions for simulating irregular shaped objects in the beam path, like contoured scatterers, patient apertures or patient compensators, were found. The four-dimensional, in time and space, simulation of moving parts, such as the modulator wheel, was implemented. Further, the appropriate physics models and cross sections for proton therapy applications were defined. We present comparisons between measured data and simulations. These show that by modeling the treatment nozzle with millimeter accuracy, it is possible to reproduce measured dose distributions with an accuracy in range and modulation width, in the case of a spread-out Bragg peak (SOBP), of better than 1 mm. The excellent agreement demonstrates that the simulations can even be used to generate beam data for commissioning treatment planning systems. The Monte Carlo nozzle model was used to study mechanical optimization in terms of scattered radiation and secondary radiation in the design of the nozzles. We present simulations on the neutron background. Further, the Monte Carlo calculations supported commissioning efforts in understanding the sensitivity of beam characteristics and how these influence the dose delivered. 
We present the sensitivity of dose distributions in water with respect to various beam parameters and geometrical misalignments. This allows the definition of tolerances for quality assurance and the design of quality assurance procedures.

  3. Modeling the Hyperdistribution of Item Parameters To Improve the Accuracy of Recovery in Estimation Procedures.

    ERIC Educational Resources Information Center

    Matthews-Lopez, Joy L.; Hombo, Catherine M.

    The purpose of this study was to examine the recovery of item parameters in simulated Automatic Item Generation (AIG) conditions, using Markov chain Monte Carlo (MCMC) estimation methods to attempt to recover the generating distributions. To do this, variability in item and ability parameters was manipulated. Realistic AIG conditions were…

  4. Kinetic Monte Carlo Simulations and Molecular Conductance Measurements of the Bacterial Decaheme Cytochrome MtrF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Byun, H. S.; Pirbadian, S.; Nakano, Aiichiro

    2014-09-05

    Microorganisms overcome the considerable hurdle of respiring extracellular solid substrates by deploying large multiheme cytochrome complexes that form 20 nanometer conduits to traffic electrons through the periplasm and across the cellular outer membrane. Here we report the first kinetic Monte Carlo simulations and single-molecule scanning tunneling microscopy (STM) measurements of the Shewanella oneidensis MR-1 outer membrane decaheme cytochrome MtrF, which can perform the final electron transfer step from cells to minerals and microbial fuel cell anodes. We find that the calculated electron transport rate through MtrF is consistent with previously reported in vitro measurements of the Shewanella Mtr complex, as well as in vivo respiration rates on electrode surfaces assuming a reasonable (experimentally verified) coverage of cytochromes on the cell surface. The simulations also reveal a rich phase diagram in the overall electron occupation density of the hemes as a function of electron injection and ejection rates. Single molecule tunneling spectroscopy confirms MtrF's ability to mediate electron transport between an STM tip and an underlying Au(111) surface, but at rates higher than expected from previously calculated heme-heme electron transfer rates for solvated molecules.
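The kinetic Monte Carlo picture described here, with electrons injected at one end of a heme chain, hopping between singly-occupied sites, and ejected at the other end, can be sketched with a Gillespie-style loop. The rate constants below are arbitrary illustrative values, not the heme-to-heme rates computed for MtrF:

```python
import numpy as np

rng = np.random.default_rng(2)

n_hemes, k_in, k_out, k_hop = 10, 1e6, 1e6, 1e7   # rates in s^-1 (illustrative)
occ = np.zeros(n_hemes, dtype=bool)               # heme occupation (exclusion)
t, ejected = 0.0, 0

for _ in range(50_000):
    events, rates = [], []
    if not occ[0]:
        events.append(("in", 0)); rates.append(k_in)      # electron injection
    if occ[-1]:
        events.append(("out", 0)); rates.append(k_out)    # electron ejection
    for i in range(n_hemes - 1):                          # nearest-neighbour hops
        if occ[i] and not occ[i + 1]:
            events.append(("fwd", i)); rates.append(k_hop)
        if occ[i + 1] and not occ[i]:
            events.append(("bwd", i)); rates.append(k_hop)
    total = sum(rates)
    t += rng.exponential(1.0 / total)                     # Gillespie time step
    kind, i = events[rng.choice(len(events), p=np.array(rates) / total)]
    if kind == "in":
        occ[0] = True
    elif kind == "out":
        occ[-1] = False; ejected += 1
    elif kind == "fwd":
        occ[i], occ[i + 1] = False, True
    else:
        occ[i + 1], occ[i] = False, True

flux = ejected / t   # steady-state transport rate (electrons per second)
```

Sweeping `k_in` and `k_out` in such a model is what maps out the phase diagram of heme occupation density mentioned in the abstract.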

  5. SU-F-T-149: Development of the Monte Carlo Simulation Platform Using Geant4 for Designing Heavy Ion Therapy Beam Nozzle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shin, Jae-ik; Yoo, SeungHoon; Cho, Sungho

    Purpose: A significant issue in particle therapy with protons and carbon ions is accurate dose delivery from the beam line to the patient. For designing such a complex delivery system, Monte Carlo simulation can be used to model the various physical interactions in scatterers and filters. In this report, we present the development of a Monte Carlo simulation platform, based on Geant4, to help design a prototype particle therapy nozzle. We also show the prototype design of the particle therapy beam nozzle for the Korea Heavy Ion Medical Accelerator (KHIMA) project at the Korea Institute of Radiological and Medical Sciences (KIRAMS) in the Republic of Korea. Methods: We developed a simulation platform for a particle therapy beam nozzle using Geant4. In this platform, a prototype nozzle for a carbon scanning system was designed. For comparison with theoretical beam optics, the lateral beam profile at the isocenter was compared with the Monte Carlo simulation result. From this analysis, we can estimate the beam spot properties of the KHIMA system and implement spot size optimization for our spot scanning system. Results: To characterize the scanning system, various combinations of the spot size from the accelerator with the ridge filter and beam monitor were tested as a simple design for the KHIMA dose delivery system. Conclusion: In this report, we presented part of the simulation platform and its characteristics study. This study is ongoing, with the aim of developing a simulation platform that includes the beam nozzle and a dose verification tool linked to the treatment planning system. Results will be presented as soon as they become available.

  6. SU-E-T-569: Neutron Shielding Calculation Using Analytical and Multi-Monte Carlo Method for Proton Therapy Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cho, S; Shin, E H; Kim, J

    2015-06-15

    Purpose: To evaluate the shielding wall design that protects patients, staff, and members of the general public from secondary neutrons, using a simple analytic solution and multiple transport codes (MCNPX, ANISN, and FLUKA). Methods: Analytical and multi-code calculations were performed for the proton facility (Sumitomo Heavy Industries, Ltd.) at Samsung Medical Center in Korea. The NCRP-144 analytical evaluation method, which produces conservative estimates of the dose equivalent values for the shielding, was used for the analytical evaluations. Then, the radiation transport was simulated with the Monte Carlo codes. The neutron dose at each evaluation point is obtained as the product of the simulated fluence and the neutron dose coefficients introduced in ICRP-74. Results: The evaluation points at the accelerator control room and the control room entrance are mainly influenced by the location of the proton beam loss. The neutron dose equivalent at the accelerator control room evaluation point is 0.651, 1.530, 0.912, and 0.943 mSv/yr, and at the entrance of the cyclotron room it is 0.465, 0.790, 0.522, and 0.453 mSv/yr, as calculated by the NCRP-144 formalism, ANISN, FLUKA, and MCNPX, respectively. Most of the MCNPX and FLUKA results, which used the complicated geometry, were smaller than the ANISN results. Conclusion: The neutron shielding for a proton therapy facility has been evaluated by the analytic model and multiple transport codes. We confirmed that the shielding protects the areas accessible to people when the proton facility is operated.
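The dose evaluation step described in Methods, multiplying the simulated neutron fluence at an evaluation point by fluence-to-dose-equivalent conversion coefficients, can be sketched as a fold over an energy-binned spectrum. All numbers are placeholders, not ICRP-74 coefficients or the facility's values:

```python
import numpy as np

# Simulated neutron fluence spectrum at an evaluation point (n/cm^2 per year)
# and fluence-to-dose-equivalent conversion coefficients (pSv·cm^2).
energy_MeV = np.array([1e-8, 1e-6, 1e-3, 1.0, 10.0, 100.0])
fluence = np.array([2e3, 1e3, 5e2, 8e1, 2e1, 5e0])
coeff_pSv_cm2 = np.array([10.0, 12.0, 25.0, 400.0, 500.0, 300.0])

# Dose equivalent = sum over energy bins of fluence x coefficient (pSv to mSv).
dose_mSv_per_yr = np.sum(fluence * coeff_pSv_cm2) * 1e-9
```

In a real evaluation, the fluence spectrum comes from the transport calculation and the coefficients from the ICRP-74 tables for the relevant irradiation geometry.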

  7. Benchmarking and validation of a Geant4-SHADOW Monte Carlo simulation for dose calculations in microbeam radiation therapy.

    PubMed

    Cornelius, Iwan; Guatelli, Susanna; Fournier, Pauline; Crosbie, Jeffrey C; Sanchez Del Rio, Manuel; Bräuer-Krisch, Elke; Rosenfeld, Anatoly; Lerch, Michael

    2014-05-01

    Microbeam radiation therapy (MRT) is a synchrotron-based radiotherapy modality that uses high-intensity beams of spatially fractionated radiation to treat tumours. The rapid evolution of MRT towards clinical trials demands accurate treatment planning systems (TPS), as well as independent tools for the verification of TPS calculated dose distributions in order to ensure patient safety and treatment efficacy. Monte Carlo computer simulation represents the most accurate method of dose calculation in patient geometries and is best suited for the purpose of TPS verification. A Monte Carlo model of the ID17 biomedical beamline at the European Synchrotron Radiation Facility has been developed, including recent modifications, using the Geant4 Monte Carlo toolkit interfaced with the SHADOW X-ray optics and ray-tracing libraries. The code was benchmarked by simulating dose profiles in water-equivalent phantoms subject to irradiation by broad-beam (without spatial fractionation) and microbeam (with spatial fractionation) fields, and comparing against those calculated with a previous model of the beamline developed using the PENELOPE code. Validation against additional experimental dose profiles in water-equivalent phantoms subject to broad-beam irradiation was also performed. Good agreement between codes was observed, with the exception of out-of-field doses and toward the field edge for larger field sizes. Microbeam results showed good agreement between both codes and experimental results within uncertainties. Results of the experimental validation showed agreement for different beamline configurations. The asymmetry in the out-of-field dose profiles due to polarization effects was also investigated, yielding important information for the treatment planning process in MRT. This work represents an important step in the development of a Monte Carlo-based independent verification tool for treatment planning in MRT.

  8. SU-F-T-148: Are the Approximations in Analytic Semi-Empirical Dose Calculation Algorithms for Intensity Modulated Proton Therapy for Complex Heterogeneities of Head and Neck Clinically Significant?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yepes, P; UT MD Anderson Cancer Center, Houston, TX; Titt, U

    2016-06-15

    Purpose: Evaluate the differences in dose distributions between the proton analytic semi-empirical dose calculation algorithm used in the clinic and Monte Carlo calculations for a sample of 50 head-and-neck (H&N) patients, and estimate the potential clinical significance of the differences. Methods: A cohort of 50 H&N patients, treated at the University of Texas Cancer Center with Intensity Modulated Proton Therapy (IMPT), was selected for evaluation of the clinical significance of approximations in computed dose distributions. The H&N site was selected because of the highly inhomogeneous nature of the anatomy. The Fast Dose Calculator (FDC), a fast track-repeating accelerated Monte Carlo algorithm for proton therapy, was utilized for the calculation of the dose distributions delivered during treatment. Because of its short processing time, FDC allows for the processing of large cohorts of patients. FDC has been validated against GEANT4, a full Monte Carlo system, and against measurements in water and in inhomogeneous phantoms. A gamma-index analysis, DVHs, EUDs, and TCPs and NTCPs computed using published models were utilized to evaluate the differences between the treatment planning system (TPS) and FDC. Results: The Monte Carlo results systematically predict a lower dose delivered in the target. The observed differences can be as large as 8 Gy and should have a clinical impact. Gamma analysis also showed significant differences between the two approaches, especially for the target volumes. Conclusion: Monte Carlo calculation with fast algorithms is practical and should be considered for the clinic, at least as a treatment plan verification tool.
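The gamma-index analysis used to compare TPS and FDC dose distributions combines a dose-difference criterion with a distance-to-agreement criterion; a point passes when gamma <= 1. A minimal 1D sketch with 3%/3 mm criteria and illustrative Gaussian-like profiles (not patient data):

```python
import numpy as np

def gamma_index(x_ref, d_ref, x_eval, d_eval, dd=0.03, dta_mm=3.0):
    """1D global gamma: minimum combined dose/distance metric per ref point."""
    d_norm = dd * d_ref.max()                 # global dose criterion
    gammas = []
    for xr, dr in zip(x_ref, d_ref):
        g = np.sqrt(((x_eval - xr) / dta_mm) ** 2 +
                    ((d_eval - dr) / d_norm) ** 2)
        gammas.append(g.min())
    return np.array(gammas)

x = np.linspace(0.0, 100.0, 201)                    # position (mm)
d_tps = np.exp(-((x - 50.0) / 20.0) ** 2)           # "TPS" profile
d_mc = 1.02 * np.exp(-((x - 50.5) / 20.0) ** 2)     # scaled/shifted "MC" profile

g = gamma_index(x, d_tps, x, d_mc)
pass_rate = float(np.mean(g <= 1.0))
```

Here the 2% rescaling plus 0.5 mm shift stays within the 3%/3 mm tolerance; larger systematic differences, like the target underdose reported above, would drive the pass rate down.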

  9. Insight into particle production mechanisms via angular correlations of identified particles in pp collisions at $$\\sqrt{\\mathrm{s}}=7$$ TeV

    DOE PAGES

    Adam, J.; Adamová, D.; Aggarwal, M. M.; ...

    2017-08-24

    We measured two-particle angular correlations in pp collisions at √s=7 TeV for pions, kaons, protons, and lambdas, for all particle/anti-particle combinations in the pair. Data for mesons exhibit an expected peak dominated by effects associated with mini-jets and are well reproduced by general purpose Monte Carlo generators. However, for baryon–baryon and anti-baryon–anti-baryon pairs, where both particles have the same baryon number, a near-side anti-correlation structure is observed instead of a peak. This effect is interpreted in the context of baryon production mechanisms in the fragmentation process. It currently presents a challenge to Monte Carlo models and its origin remains an open question.

  10. The design and performance characteristics of a cellular logic 3-D image classification processor

    NASA Astrophysics Data System (ADS)

    Ankeney, L. A.

    1981-04-01

    The introduction of high resolution scanning laser radar systems which are capable of collecting range and reflectivity images, is predicted to significantly influence the development of processors capable of performing autonomous target classification tasks. Actively sensed range images are shown to be superior to passively collected infrared images in both image stability and information content. An illustrated tutorial introduces cellular logic (neighborhood) transformations and two and three dimensional erosion and dilation operations which are used for noise filters and geometric shape measurement. A unique 'cookbook' approach to selecting a sequence of neighborhood transformations suitable for object measurement is developed and related to false alarm rate and algorithm effectiveness measures. The cookbook design approach is used to develop an algorithm to classify objects based upon their 3-D geometrical features. A Monte Carlo performance analysis is used to demonstrate the utility of the design approach by characterizing the ability of the algorithm to classify randomly positioned three dimensional objects in the presence of additive noise, scale variations, and other forms of image distortion.

  11. Heavy vehicle driver workload assessment. Task 7B, in-cab text message system and cellular phone use by heavy vehicle drivers in a part-task driving simulator

    DOT National Transportation Integrated Search

    This report contains the results of a simulator study conducted to serve as a supplement to a National Highway Traffic Safety Administration (NHTSA) heavy vehicle driver workload field study. Its purpose was the evaluation of effects of cellular phon...

  12. A Simple Microscopy Assay to Teach the Processes of Phagocytosis and Exocytosis

    ERIC Educational Resources Information Center

    Gray, Ross; Gray, Andrew; Fite, Jessica L.; Jordan, Renee; Stark, Sarah; Naylor, Kari

    2012-01-01

    Phagocytosis and exocytosis are two cellular processes involving membrane dynamics. While it is easy to understand the purpose of these processes, it can be extremely difficult for students to comprehend the actual mechanisms. As membrane dynamics play a significant role in many cellular processes ranging from cell signaling to cell division to…

  13. The Interpretation of Cellular Transport Graphics by Students with Low and High Prior Knowledge

    ERIC Educational Resources Information Center

    Cook, Michelle; Carter, Glenda; Wiebe, Eric N.

    2008-01-01

    The purpose of this study was to examine how prior knowledge of cellular transport influenced how high school students in the USA viewed and interpreted graphic representations of this topic. The participants were Advanced Placement Biology students (n = 65); each participant had previously taken a biology course in high school. After assessing…

  14. Comparison of Humoral and Cellular Immune Responses to Inactivated Swine Influenza Virus Vaccine in Weaned Pigs

    USDA-ARS?s Scientific Manuscript database

    Purpose: To evaluate and compare humoral and cellular immune responses to inactivated swine influenza virus (SIV) vaccine. Methods: Fifty 3-week-old weaned pigs from a herd free of SIV and PRRSV were randomly divided into the non-vaccinated control group and vaccinated group containing 25 pigs each....

  15. Safety Solutions and Differences in Motor Vehicle Drivers Who Use Cellular Telephones

    ERIC Educational Resources Information Center

    Eidelman, James Andrew

    2013-01-01

    The purpose of the proposed study was to determine whether there is a correlation between attitudes toward the use of handheld cellular telephone devices while driving and the use of such devices while driving, with particular interest in the role of gender, age, marital status, parental status, age of children, having a disability, level of…

  16. Layer Anti-Ferromagnetism on Bilayer Honeycomb Lattice

    PubMed Central

    Tao, Hong-Shuai; Chen, Yao-Hua; Lin, Heng-Fu; Liu, Hai-Di; Liu, Wu-Ming

    2014-01-01

    Bilayer honeycomb lattice, with inter-layer tunneling energy, has a parabolic dispersion relation, and the inter-layer hopping can cause the charge imbalance between two sublattices. Here, we investigate the metal-insulator and magnetic phase transitions on the strongly correlated bilayer honeycomb lattice by cellular dynamical mean-field theory combined with continuous time quantum Monte Carlo method. The procedures of magnetic spontaneous symmetry breaking on dimer and non-dimer sites are different, causing a novel phase transition between normal anti-ferromagnet and layer anti-ferromagnet. The whole phase diagrams about the magnetism, temperature, interaction and inter-layer hopping are obtained. Finally, we propose an experimental protocol to observe these phenomena in future optical lattice experiments. PMID:24947369

  17. Transparent metal model study of the use of a cellular growth front to form aligned monotectic composite materials

    NASA Technical Reports Server (NTRS)

    Kaukler, William F.

    1988-01-01

    The purpose of this work was to resolve a scientific controversy in the understanding of how second phase particles become aligned during unidirectional growth of a monotectic alloy. A second aspect was to make the first systematic in-situ observations of the solidification behavior of a monotectic alloy during cellular growth. This research provides the first systematic transparent model study of cellular solidification. An interface stability diagram was developed for the planar-to-cellular transition of the succinonitrile glycerol (SNG) system. A method was developed utilizing Fourier Transform Infrared Spectroscopy which allows quantitative compositional analysis of directionally solidified SNG along the growth axis. To determine the influence of a cellular growth front on alignment for directionally solidified monotectic alloys, the planar and cellular growth morphology was observed in-situ for SNG between 8 and 17 percent glycerol and over a range of G/R spanning more than two orders of magnitude.

  18. A Simulation Study Comparison of Bayesian Estimation with Conventional Methods for Estimating Unknown Change Points

    ERIC Educational Resources Information Center

    Wang, Lijuan; McArdle, John J.

    2008-01-01

    The main purpose of this research is to evaluate the performance of a Bayesian approach for estimating unknown change points using Monte Carlo simulations. The univariate and bivariate unknown change point mixed models were presented and the basic idea of the Bayesian approach for estimating the models was discussed. The performance of Bayesian…

  19. Rotation to a Partially Specified Target Matrix in Exploratory Factor Analysis: How Many Targets?

    ERIC Educational Resources Information Center

    Myers, Nicholas D.; Ahn, Soyeon; Jin, Ying

    2013-01-01

    The purpose of this study was to explore the influence of the number of targets specified on the quality of exploratory factor analysis solutions with a complex underlying structure and incomplete substantive measurement theory. Three Monte Carlo studies were performed based on the ratio of the number of observed variables to the number of…

  20. Dose response of alanine detectors irradiated with carbon ion beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herrmann, Rochus; Jaekel, Oliver; Palmans, Hugo

    Purpose: The dose response of the alanine detector shows a dependence on particle energy and type when irradiated with ion beams. The purpose of this study is to investigate the response behavior of the alanine detector in clinical carbon ion beams and to compare the results to model predictions. Methods: Alanine detectors were irradiated with carbon ions with energies of 89-400 MeV/u. The relative effectiveness of alanine was measured in this regime. Pristine and spread-out Bragg peak depth-dose curves were measured with alanine dosimeters. The track-structure-based alanine response model developed by Hansen and Olsen was implemented in the Monte Carlo code FLUKA, and calculations were compared to experimental results. Results: Calculations of the relative effectiveness deviate by less than 5% from the measured values for monoenergetic beams. Measured depth-dose curves deviate from predictions in the peak region, most pronouncedly at the distal edge of the peak. Conclusions: The model and its implementation show good overall agreement for quasi-monoenergetic measurements. Deviations in depth-dose measurements are mainly attributed to uncertainties in the detector geometry implemented in the Monte Carlo simulations.

  1. Monte Carlo modeling of a conventional X-ray computed tomography scanner for gel dosimetry purposes.

    PubMed

    Hayati, Homa; Mesbahi, Asghar; Nazarpoor, Mahmood

    2016-01-01

    Our purpose in the current study was to model an X-ray CT scanner with the Monte Carlo (MC) method for gel dosimetry. In this study, a conventional CT scanner with one detector array was modeled using the MCNPX MC code. The MC-calculated photon fluence in the detector arrays was used for image reconstruction of a simple water phantom as well as of a polyacrylamide polymer gel (PAG) used for radiation therapy. Image reconstruction was performed with the filtered back-projection method using a Hann filter and spline interpolation. Using the MC results, we obtained the dose-response curve for images of irradiated gel at different absorbed doses. A spatial resolution of about 2 mm was found for our simulated MC model. The MC-based CT images of the PAG gel showed a reliable increase in CT number with increasing absorbed dose for the studied gel. Our results also showed that the current MC model of a CT scanner can be used for further studies of the parameters that influence the usability and reliability of results, such as the photon energy spectra and exposure techniques, in X-ray CT gel dosimetry.
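
The filtering stage of the reconstruction described above, a ramp filter apodized with a Hann window, can be sketched in a few lines. This is a generic illustration of that one step only (not the authors' code); the sampling convention and the exact window normalization are assumptions.

```python
import numpy as np

def hann_ramp(n):
    """Frequency response of a ramp filter apodized by a Hann window, for an n-sample projection."""
    f = np.fft.fftfreq(n)                                      # cycles/sample, incl. negative freqs
    window = 0.5 * (1 + np.cos(np.pi * f / np.abs(f).max()))   # Hann: 1 at DC, 0 at Nyquist
    return np.abs(f) * window                                  # ramp * window

def filter_projection(proj):
    """Apply the windowed ramp filter to one projection (the FBP pre-filtering step)."""
    H = hann_ramp(len(proj))
    return np.real(np.fft.ifft(np.fft.fft(proj) * H))

flat = filter_projection(np.ones(64))   # a uniform projection: the ramp removes the DC term
```

Because the ramp filter is zero at DC, a constant projection filters to (numerically) zero, which is the expected behavior before back-projection.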

  2. SU-E-T-254: Optimization of GATE and PHITS Monte Carlo Code Parameters for Uniform Scanning Proton Beam Based On Simulation with FLUKA General-Purpose Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurosu, K; Department of Medical Physics & Engineering, Osaka University Graduate School of Medicine, Osaka; Takashina, M

    Purpose: Monte Carlo codes are becoming important tools for proton beam dosimetry. However, the relationships between the customizing parameters and the percentage depth dose (PDD) in the GATE and PHITS codes have not been reported; here they are studied in terms of PDD and proton range, compared against the FLUKA code and experimental data. Methods: The beam delivery system of the Indiana University Health Proton Therapy Center was modeled for the uniform scanning beam in FLUKA and transferred identically into GATE and PHITS. This computational model was built from the blueprint and validated against the commissioning data. The three parameters evaluated are the maximum step size, the cut-off energy, and the physics and transport model. The dependence of the PDDs on the customizing parameters was compared with the published results of previous studies. Results: The optimal parameters for the simulation of the whole beam delivery system were defined by referring to the calculation results obtained with each parameter. Although the PDDs from FLUKA and the experimental data show good agreement, those of GATE and PHITS obtained with our optimal parameters show a minor discrepancy. The measured proton range R90 was 269.37 mm, compared to calculated ranges of 269.63 mm, 268.96 mm, and 270.85 mm with FLUKA, GATE, and PHITS, respectively. Conclusion: We evaluated the dependence of the PDDs obtained with the GATE and PHITS general-purpose Monte Carlo codes on the customizing parameters, using the whole computational model of the treatment nozzle, and then defined the optimal simulation parameters from the calculation results. The physics model, particle transport mechanics, and the different geometry-based descriptions need accurate customization in the three simulation codes to agree with experimental data for artifact-free Monte Carlo simulation.
    This study was supported by Grants-in-Aid for Cancer Research (H22-3rd Term Cancer Control-General-043) from the Ministry of Health, Labor and Welfare of Japan, Grants-in-Aid for Scientific Research (No. 23791419), and the JSPS Core-to-Core program (No. 23003). The authors have no conflict of interest.
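
The range metric R90 quoted in this record (the depth at which the dose falls to 90% of its maximum on the distal side of the Bragg peak) can be extracted from a depth-dose curve by linear interpolation. A minimal sketch, not tied to any of the codes in the study:

```python
import numpy as np

def distal_r90(depth_mm, dose):
    """Depth on the distal side of the peak where dose falls to 90% of its maximum."""
    depth_mm = np.asarray(depth_mm, float)
    dose = np.asarray(dose, float)
    i_peak = int(np.argmax(dose))
    target = 0.9 * dose[i_peak]
    for i in range(i_peak, len(dose) - 1):          # walk distally from the peak
        if dose[i] >= target >= dose[i + 1]:
            # linear interpolation between the bracketing samples
            frac = (dose[i] - target) / (dose[i] - dose[i + 1])
            return depth_mm[i] + frac * (depth_mm[i + 1] - depth_mm[i])
    raise ValueError("dose never drops below 90% distal to the peak")

# Toy 5-point curve: peak at depth 2 mm, 90% level crossed between 2 and 3 mm
r90 = distal_r90([0, 1, 2, 3, 4], [50, 80, 100, 60, 20])   # 2.25 mm
```

With measured or simulated PDDs sampled finely enough, differences in R90 between codes (as in the sub-millimeter comparison above) come straight out of this interpolation.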

  3. SU-E-T-553: Monte Carlo Calculation of Proton Bragg Peak Displacements in the Presence of Al2O3:C Dosimeters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, L; Yang, F

    2015-06-15

    Purpose: The application of optically stimulated luminescence dosimeters (OSLDs) may be extended to clinical investigations verifying irradiated doses in small-animal models. In proton beams, accurate positioning of the Bragg peak is essential for tumor targeting. The purpose of this study was to estimate the displacement of a pristine Bragg peak when an Al2O3:C nanodot (Landauer, Inc.) is placed on the surface of a water phantom, and to evaluate the corresponding changes in dose. Methods: Clinical proton pencil beam simulations were carried out using TOPAS, a Monte Carlo platform layered on top of GEANT4. Point-shaped beams with no energy spread were modeled for energies of 100, 150, 200, and 250 MeV. Dose scoring for 100,000 particle histories was conducted within a water phantom (20 cm × 20 cm irradiated area, 40 cm depth) with its surface placed 214.5 cm from the source. The modeled nanodot had a 4 mm radius and 0.2 mm thickness. Results: A comparative analysis of Monte Carlo depth-dose profiles modeled for these proton pencil beams did not demonstrate an energy dependence of the Bragg peak shift. The shifts in Bragg peak depth for water phantoms modeled with a nanodot on the phantom surface ranged from 2.7 to 3.2 mm. In all cases, the Bragg peaks were shifted closer to the irradiation source. The peak dose in phantoms with an OSLD remained unchanged, with percent dose differences of less than 0.55% compared to phantom doses without the nanodot. Conclusion: Monte Carlo calculations show that the presence of OSLD nanodots in proton beam therapy will not change the position of a pristine Bragg peak by more than 3 mm. Although a 3.0 mm shift will not have a detrimental effect in patients receiving proton therapy, this effect may not be negligible in dose verification measurements for mouse models at lower proton beam energies.
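
For context on where a pristine Bragg peak sits, the proton range in water can be estimated from beam energy with the Bragg-Kleeman rule R ≈ αE^p, and an upstream object shifts the peak toward the surface by roughly its water-equivalent thickness. The coefficients below are common literature values, not taken from this study:

```python
def bragg_kleeman_range_cm(E_MeV, alpha=2.2e-3, p=1.77):
    """Approximate proton range in water (cm) via the Bragg-Kleeman rule R = alpha * E^p."""
    return alpha * E_MeV ** p

# A thin upstream object shifts the peak toward the source by about its
# water-equivalent thickness: WET ~ physical thickness x relative stopping power.
r150 = bragg_kleeman_range_cm(150.0)   # roughly 15-16 cm
r200 = bragg_kleeman_range_cm(200.0)   # roughly 26 cm
```

This back-of-the-envelope range also explains why a fixed peak shift matters more at lower energies, where the range itself is only a few centimeters.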

  4. A GPU-accelerated Monte Carlo dose calculation platform and its application toward validating an MRI-guided radiation therapy beam model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yuhe; Mazur, Thomas R.; Green, Olga

    Purpose: The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. Methods: PENELOPE was first translated from FORTRAN to C++ and the result was confirmed to produce equivalent results to the original code. The C++ code was then adapted to CUDA in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gPENELOPE as both an accurate and rapid dose validation system. A set of experimental measurements were performed on the MRIdian system to examine the accuracy of both the head model and gPENELOPE. Ultimately, gPENELOPE was applied toward independent validation of patient doses calculated by MRIdian’s KMC. Results: An acceleration factor of 152 was achieved in comparison to the original single-thread FORTRAN implementation with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). Conclusions: A Monte Carlo simulation platform was developed based on a GPU-accelerated version of PENELOPE. This platform was used to validate that both the vendor-provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. 
    Future applications of this platform will include dose validation and accumulation, IMRT optimization, and dosimetry system modeling for next generation MR-IGRT systems.
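
The 2%/2 mm gamma criterion used for the comparison above can be illustrated with a minimal 1D global gamma-index implementation. This is a generic sketch of the metric, not the evaluation code used in the study:

```python
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dd=0.02, dta_mm=2.0):
    """Global 1D gamma index: dd = fractional dose tolerance, dta_mm = distance-to-agreement."""
    norm = dd * d_ref.max()                 # global dose normalisation
    gam = np.empty_like(d_ref)
    for i in range(len(x_ref)):
        dose_term = (d_eval - d_ref[i]) / norm
        dist_term = (x_eval - x_ref[i]) / dta_mm
        # gamma = minimum combined dose/distance deviation over all evaluated points
        gam[i] = np.sqrt(dose_term ** 2 + dist_term ** 2).min()
    return gam

x = np.linspace(0.0, 10.0, 101)             # positions in mm
d = np.exp(-((x - 5.0) ** 2))               # toy dose profile
passing_rate = (gamma_1d(x, d, x, d) <= 1.0).mean()   # identical profiles pass everywhere
```

A point passes when gamma ≤ 1, and the reported passing rate is simply the fraction of reference points that do.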

  5. A GPU-accelerated Monte Carlo dose calculation platform and its application toward validating an MRI-guided radiation therapy beam model

    PubMed Central

    Wang, Yuhe; Mazur, Thomas R.; Green, Olga; Hu, Yanle; Li, Hua; Rodriguez, Vivian; Wooten, H. Omar; Yang, Deshan; Zhao, Tianyu; Mutic, Sasa; Li, H. Harold

    2016-01-01

    Purpose: The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on penelope and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. Methods: penelope was first translated from fortran to c++ and the result was confirmed to produce equivalent results to the original code. The c++ code was then adapted to cuda in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gpenelope highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gpenelope as both an accurate and rapid dose validation system. A set of experimental measurements were performed on the MRIdian system to examine the accuracy of both the head model and gpenelope. Ultimately, gpenelope was applied toward independent validation of patient doses calculated by MRIdian’s kmc. Results: An acceleration factor of 152 was achieved in comparison to the original single-thread fortran implementation with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gpenelope with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). Conclusions: A Monte Carlo simulation platform was developed based on a GPU-accelerated version of penelope. This platform was used to validate that both the vendor-provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. 
    Future applications of this platform will include dose validation and accumulation, IMRT optimization, and dosimetry system modeling for next generation MR-IGRT systems. PMID:27370123

  6. A Multi-Purpose, Detector-Based Photometric Calibration System for Luminous Intensity, Illuminance and Luminance

    NASA Astrophysics Data System (ADS)

    Lam, Brenda H. S.; Yang, Steven S. L.; Chau, Y. C.

    2018-02-01

    A multi-purpose detector-based calibration system for luminous intensity, illuminance and luminance has been developed at the Standards and Calibration Laboratory (SCL) of the Government of the Hong Kong Special Administrative Region. In this paper, the measurement system and methods are described. The measurement models and contributory uncertainties were validated using the Guide to the Expression of Uncertainty in Measurement (GUM) framework and Supplement 1 to the GUM (propagation of distributions using a Monte Carlo method), in accordance with JCGM 100:2008 and JCGM 101:2008, at the intended precision level.
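
The JCGM 101:2008 "propagation of distributions" approach mentioned above amounts to Monte Carlo sampling of the measurement model. As a hedged illustration (the quantities, values, and distributions below are invented, not SCL's actual model), consider the inverse-square model E = I/d² relating luminous intensity to illuminance:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000

# Hypothetical input quantities (illustrative values only)
I = rng.normal(1000.0, 5.0, N)          # luminous intensity in cd, Gaussian u = 5 cd
d = rng.uniform(2.4995, 2.5005, N)      # distance in m, rectangular (type B) uncertainty

E = I / d ** 2                          # measurement model: illuminance in lux
E_mean = E.mean()
u_E = E.std(ddof=1)                     # standard uncertainty from the output distribution
ci_lo, ci_hi = np.percentile(E, [2.5, 97.5])   # 95% coverage interval
```

Unlike the first-order GUM law of propagation, this approach carries the full input distributions through the model, so the coverage interval needs no linearity or normality assumptions.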

  7. Dosimetric evaluation of internal shielding in a high dose rate skin applicator

    PubMed Central

    Granero, Domingo; Perez-Calatayud, Jose; Carmona, Vicente; Pujades, M Carmen; Ballester, Facundo

    2011-01-01

    Purpose The Valencia HDR applicators are accessories of the microSelectron HDR afterloading system (Nucletron), shaped as truncated cones. The base of the cone is either 2 or 3 cm in diameter. They are intended to treat skin lesions, with a typical prescription depth of 3 mm. In patients with eyelid lesions, internal shielding is very useful to reduce the dose to the ocular globe. The purpose of this work was to evaluate the dose enhancement from potential backscatter and electron contamination due to the shielding. Material and methods Two methods were used: a) Monte Carlo simulation, performed with the GEANT4 code, in which the 2 cm Valencia applicator was placed on the surface of a water phantom with a 2 mm lead slab located at 3 mm depth; b) radiochromic EBT films, used to verify the Monte Carlo results, positioned at 1.5, 3, 5 and 7 mm depth inside the phantom. Two irradiations, with and without the lead shielding slab, were carried out. Results The Monte Carlo results showed that, due to the backscatter component from the lead, the dose level rose to about 200% over a depth range of 0.5 mm. Under the lead, the dose level was enhanced to about 130% over a depth range of 1 mm. Two millimeters of lead reduce the dose under the slab by about 60%. These results agree with the film measurements within uncertainties. Conclusions The use of 2 mm internal lead shielding in eyelid skin treatments with the Valencia applicators was evaluated using MC methods and EBT film dosimetry. The minimum bolus thickness needed above and below the shielding was 0.5 mm and 1 mm respectively, and the shielding reduced the absorbed dose delivered to the ocular globe by about 60%. PMID:27877198

  8. Accelerated GPU based SPECT Monte Carlo simulations.

    PubMed

    Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris

    2016-06-07

    Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT), as it is a reliable technique for simulating very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: (99m)Tc, (111)In and (131)I, using a low-energy high-resolution (LEHR) collimator, a medium-energy general-purpose (MEGP) collimator and a high-energy general-purpose (HEGP) collimator, respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE Infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between the GATE and GGEMS platforms, derived from the same cylindrical phantom acquisition, was between 18 and 27 for the different radioisotopes. Moreover, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor of up to 71. 
The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational efficiency of SPECT imaging simulations.

  9. SU-F-T-63: Dosimetric Relevance of the Valencia and Leipzig HDR Applicators Plastic Cap

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Granero, D; Candela-Juan, C; Vijande, J

    Purpose: Utilization of HDR brachytherapy for treatment of skin lesions using collimated applicators, such as the Valencia or Leipzig, is increasing. These applicators are made of cup-shaped tungsten in order to focus the radiation onto the lesion and to protect nearby tissues. They have an attachable plastic cap that removes secondary electrons generated in the applicator and flattens the treatment surface. The purpose of this study is to examine the dosimetric impact of this cap, and the effect if the cap is not in place during HDR fraction delivery. Methods: Monte Carlo simulations were performed using the code Geant4 for the Valencia and Leipzig applicators. Dose rate distributions were obtained for the applicators with and without the plastic cap. An experimental study using EBT3 radiochromic film was performed to verify the Monte Carlo results. Results: The Monte Carlo simulations show that the absorbed dose in the first millimeter of skin can increase by up to 180% for the Valencia applicator if the plastic cap is absent, and by up to 1500% for the Leipzig applicators. At greater depths the dose increase is smaller, about 10–15%. Conclusion: Important differences were found when the plastic cap of the applicators is absent during treatment, producing an overdosage of the skin. Users should follow a checklist to verify, before each HDR fraction delivery, that the plastic cap is in place on the applicator. This work was supported in part by Generalitat Valenciana under Project PROMETEOII/2013/010, by the Spanish Government under Project No. FIS2013-42156, and by a research agreement with Elekta Brachytherapy, Veenendaal, The Netherlands.

  10. 49 CFR 220.301 - Purpose and application.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ..., DEPARTMENT OF TRANSPORTATION RAILROAD COMMUNICATIONS Electronic Devices § 220.301 Purpose and application. (a... or cellular phones) and laptop computers. (b) The applicability of this subpart is governed by § 220.3; this subpart, however, does not affect the use of working wireless communications pursuant to...

  11. 49 CFR 220.301 - Purpose and application.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ..., DEPARTMENT OF TRANSPORTATION RAILROAD COMMUNICATIONS Electronic Devices § 220.301 Purpose and application. (a... or cellular phones) and laptop computers. (b) The applicability of this subpart is governed by § 220.3; this subpart, however, does not affect the use of working wireless communications pursuant to...

  12. 49 CFR 220.301 - Purpose and application.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., DEPARTMENT OF TRANSPORTATION RAILROAD COMMUNICATIONS Electronic Devices § 220.301 Purpose and application. (a... or cellular phones) and laptop computers. (b) The applicability of this subpart is governed by § 220.3; this subpart, however, does not affect the use of working wireless communications pursuant to...

  13. 49 CFR 220.301 - Purpose and application.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., DEPARTMENT OF TRANSPORTATION RAILROAD COMMUNICATIONS Electronic Devices § 220.301 Purpose and application. (a... or cellular phones) and laptop computers. (b) The applicability of this subpart is governed by § 220.3; this subpart, however, does not affect the use of working wireless communications pursuant to...

  14. 49 CFR 220.301 - Purpose and application.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., DEPARTMENT OF TRANSPORTATION RAILROAD COMMUNICATIONS Electronic Devices § 220.301 Purpose and application. (a... or cellular phones) and laptop computers. (b) The applicability of this subpart is governed by § 220.3; this subpart, however, does not affect the use of working wireless communications pursuant to...

  15. SU-F-T-444: Quality Improvement Review of Radiation Therapy Treatment Planning in the Presence of Dental Implants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parenica, H; Ford, J; Mavroidis, P

    Purpose: To quantify and compare the effect of metallic dental implants (MDI) on dose distributions calculated using the Collapsed Cone Convolution Superposition (CCCS) algorithm or a Monte Carlo algorithm (with and without correcting for the density of the MDI). Methods: Seven patients previously treated in the head and neck region were included in this study. The MDI and the streaking artifacts on the CT images were carefully contoured. For each patient a plan was optimized and calculated using the Pinnacle3 treatment planning system (TPS). For each patient two dose calculations were performed: a) with the densities of the MDI and CT artifacts overridden (12 g/cc and 1 g/cc, respectively) and b) without density overrides. The plans were then exported to the Monaco TPS and recalculated using a Monte Carlo dose calculation algorithm. The changes in dose to PTVs and surrounding regions of interest (ROIs) were examined between all plans. Results: The Monte Carlo dose calculation indicated that PTVs received 6% lower dose than the CCCS algorithm predicted. In some cases, the Monte Carlo algorithm indicated that surrounding ROIs received a higher dose (up to a factor of 2). Conclusion: Not properly accounting for dental implants can impact both the high-dose regions (PTV) and the low-dose regions (OAR). This study implies that if the MDI and the artifacts are not appropriately contoured and given the correct density, there is a potentially significant impact on PTV coverage and OAR maximum doses.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Titt, U; Suzuki, K

    Purpose: The PTCH is preparing the ocular proton beam nozzle for clinical use. Currently, commissioning measurements are being performed using films, diodes and ionization chambers. In parallel, a Monte Carlo model of the beam line was created for integration into the automated Monte Carlo treatment plan computation system, MC{sup 2}. This work aims to compare Monte Carlo predictions to measured proton doses in order to validate the Monte Carlo model. Methods: A complete model of the double-scattering ocular beam line has been created and is capable of simulating proton beams with a comprehensive set of beam-modifying devices, including eleven different range modulator wheels. Simulated doses in water were scored and compared to ion chamber measurements of depth doses, lateral dose profiles extracted from half-beam-block exposures of films, and diode measurements of lateral penumbrae at various depths. Results: All comparisons resulted in an average relative entrance dose difference of less than 3% and a peak dose difference of less than 2%. All range differences were smaller than 0.2 mm. The differences in the lateral beam profiles were smaller than 0.2 mm, and the differences in the penumbrae were all smaller than 0.4%. Conclusion: All available data show excellent agreement between simulations and measurements. More measurements will have to be performed in order to completely and systematically validate the model. Besides simulating and measuring PDDs and lateral profiles of all remaining range modulator wheels, the absolute dosimetry factors, in terms of number of source protons per monitor unit, have to be determined.

  17. Effectiveness of Conceptual Change Text Oriented Instruction on Students' Understanding of Cellular Respiration Concepts.

    ERIC Educational Resources Information Center

    Cakir, Ozlem S.; Yuruk, Nejla; Geban, Omer

    The purpose of the study is to compare the effectiveness of conceptual change text oriented instruction and traditional instruction on students' understanding of cellular respiration concepts and their attitudes toward biology as a school subject. The sample of this study consisted of 84 eleventh-grade students from the 4 classes of a high school.…

  18. A Hybrid Monte Carlo importance sampling of rare events in Turbulence and in Turbulent Models

    NASA Astrophysics Data System (ADS)

    Margazoglou, Georgios; Biferale, Luca; Grauer, Rainer; Jansen, Karl; Mesterhazy, David; Rosenow, Tillmann; Tripiccione, Raffaele

    2017-11-01

    Extreme and rare events are a challenging topic in the field of turbulence. Investigating such instances with traditional numerical tools turns out to be a notoriously difficult task, as they fail to systematically sample the fluctuations around them. We instead propose that an importance sampling Monte Carlo method can selectively highlight extreme events in remote areas of phase space and induce their occurrence. We present a new computational approach, based on the path integral formulation of stochastic dynamics, and employ an accelerated Hybrid Monte Carlo (HMC) algorithm for this purpose. Through the paradigm of the stochastic one-dimensional Burgers equation, subject to random noise that is white in time and power-law correlated in Fourier space, we prove our concept and benchmark our results against standard CFD methods. Furthermore, we present our first results of constrained sampling around saddle-point instanton configurations (optimal fluctuations). The research leading to these results has received funding from the EU Horizon 2020 research and innovation programme under Grant Agreement No. 642069, and from the EU Seventh Framework Programme (FP7/2007-2013) under ERC Grant Agreement No. 339032.
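
The Hybrid Monte Carlo algorithm named above alternates momentum refreshment, leapfrog integration of Hamiltonian dynamics, and a Metropolis accept/reject step. A minimal 1D sketch on a standard-normal target is shown below; this is illustrative only (the paper applies HMC to a discretized Burgers action, not to this toy density):

```python
import numpy as np

def hmc_sample(logp, grad, x0, n, eps=0.2, n_leap=15, seed=0):
    """Minimal 1D Hybrid Monte Carlo: leapfrog integration plus Metropolis correction."""
    rng = np.random.default_rng(seed)
    x, out = x0, np.empty(n)
    for i in range(n):
        p = rng.normal()                       # refresh momentum
        xn, pn = x, p
        pn += 0.5 * eps * grad(xn)             # initial half-step kick
        for _ in range(n_leap - 1):
            xn += eps * pn                     # drift
            pn += eps * grad(xn)               # kick
        xn += eps * pn
        pn += 0.5 * eps * grad(xn)             # final half-step kick
        # accept/reject on the change in total energy H = -logp(x) + p^2/2
        dH = (logp(xn) - 0.5 * pn * pn) - (logp(x) - 0.5 * p * p)
        if np.log(rng.uniform()) < dH:
            x = xn
        out[i] = x
    return out

# Standard normal target: logp(z) = -z^2/2, grad logp(z) = -z
samples = hmc_sample(lambda z: -0.5 * z * z, lambda z: -z, 0.0, 5000)
```

Because leapfrog integration is symplectic, the energy error per trajectory stays small and acceptance rates remain high, which is what makes HMC attractive for high-dimensional path-integral sampling.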

  19. DXRaySMCS: a user-friendly interface developed for prediction of diagnostic radiology X-ray spectra produced by Monte Carlo (MCNP-4C) simulation.

    PubMed

    Bahreyni Toossi, M T; Moradi, H; Zare, H

    2008-01-01

    In this work, the general-purpose Monte Carlo N-Particle radiation transport computer code (MCNP-4C) was used for the simulation of X-ray spectra in diagnostic radiology. The electron's path in the target was followed until its energy was reduced to 10 keV. A user-friendly interface named 'diagnostic X-ray spectra by Monte Carlo simulation (DXRaySMCS)' was developed to facilitate the application of the MCNP-4C code for diagnostic radiology spectrum prediction. The program provides a user-friendly interface for: (i) modifying the MCNP input file, (ii) launching the MCNP program to simulate electron and photon transport and (iii) processing the MCNP output file to yield a summary of the results (relative photon number per energy bin). In this article, the development and characteristics of DXRaySMCS are outlined. As part of the validation process, output spectra for 46 diagnostic radiology system settings produced by DXRaySMCS were compared with the corresponding IPEM Report 78 spectra. Generally, there is good agreement between the two sets of spectra. No statistically significant differences were observed between the IPEM78 reported spectra and the simulated spectra generated in this study.
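
For a rough sense of what such a "relative photon number per energy bin" output looks like, Kramers' thin-target rule gives the unfiltered bremsstrahlung spectrum as N(E) ∝ (E0 − E)/E. This crude analytical sketch is a stand-in for, not a reproduction of, the MCNP simulation; the 10 keV lower bound echoes the electron-transport cutoff mentioned above:

```python
import numpy as np

def kramers_spectrum(kvp, n_bins=40, e_min=10.0):
    """Relative unfiltered bremsstrahlung spectrum via Kramers' rule, N(E) ~ (E0 - E)/E."""
    edges = np.linspace(e_min, kvp, n_bins + 1)     # energy bin edges in keV
    centers = 0.5 * (edges[:-1] + edges[1:])
    n = (kvp - centers) / centers                   # Kramers photon number per bin
    return centers, n / n.sum()                     # normalised to unit total

E, N = kramers_spectrum(80.0)   # 80 kVp tube, 10-80 keV photons
```

A real diagnostic spectrum additionally carries characteristic lines and inherent/added filtration, which is exactly what a full MCNP transport calculation captures and this rule does not.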

  20. Track structure modeling in liquid water: A review of the Geant4-DNA very low energy extension of the Geant4 Monte Carlo simulation toolkit.

    PubMed

    Bernal, M A; Bordage, M C; Brown, J M C; Davídková, M; Delage, E; El Bitar, Z; Enger, S A; Francis, Z; Guatelli, S; Ivanchenko, V N; Karamitros, M; Kyriakou, I; Maigne, L; Meylan, S; Murakami, K; Okada, S; Payno, H; Perrot, Y; Petrovic, I; Pham, Q T; Ristic-Fira, A; Sasaki, T; Štěpán, V; Tran, H N; Villagrasa, C; Incerti, S

    2015-12-01

    Understanding the fundamental mechanisms involved in the induction of biological damage by ionizing radiation remains a major challenge of today's radiobiology research. The Monte Carlo simulation of physical, physicochemical and chemical processes involved may provide a powerful tool for the simulation of early damage induction. The Geant4-DNA extension of the general purpose Monte Carlo Geant4 simulation toolkit aims to provide the scientific community with an open source access platform for the mechanistic simulation of such early damage. This paper presents the most recent review of the Geant4-DNA extension, as available to Geant4 users since June 2015 (release 10.2 Beta). In particular, the review includes the description of new physical models for the description of electron elastic and inelastic interactions in liquid water, as well as new examples dedicated to the simulation of physicochemical and chemical stages of water radiolysis. Several implementations of geometrical models of biological targets are presented as well, and the list of Geant4-DNA examples is described. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  1. Benchmarking study of the MCNP code against cold critical experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sitaraman, S.

    1991-01-01

    The purpose of this study was to benchmark the widely used Monte Carlo code MCNP against a set of cold critical experiments, with a view to using the code as a means of independently verifying the performance of faster but less accurate Monte Carlo and deterministic codes. The experiments simulated consisted of both fast and thermal criticals, as well as fuel in a variety of chemical forms. A standard set of benchmark cold critical experiments was modeled. These included the two fast experiments, GODIVA and JEZEBEL, the TRX metallic uranium thermal experiments, the Babcock and Wilcox oxide and mixed oxide experiments, and the Oak Ridge National Laboratory (ORNL) and Pacific Northwest Laboratory (PNL) nitrate solution experiments. The principal case studied was a small critical experiment that was performed with boiling water reactor bundles.

  2. The MCNP Simulation of a PGNAA System at TRR-1/M1

    NASA Astrophysics Data System (ADS)

    Sangaroon, S.; Ratanatongchai, W.; Picha, R.; Khaweerat, S.; Channuie, J.

    2017-06-01

    The prompt-gamma neutron activation analysis (PGNAA) system has been installed at the Thai Research Reactor-1/Modified 1 (TRR-1/M1) since 1999. The purpose of the system is elemental and isotopic analysis. The system mainly consists of a series of moderators and collimators, neutron and gamma-ray shielding, and an HPGe detector. In this work, the performance of the system is evaluated both experimentally and by the Monte Carlo method using the Monte Carlo N-Particle transport code. The flux ratios (Φthermal/Φepithermal and Φthermal/Φfast) and the thermal neutron flux have been obtained. Prompt gamma rays from a Portland cement sample have been simulated. The simulation contributes significantly to upgrading the PGNAA station for use in various applications.

  3. Features of MCNP6 Relevant to Medical Radiation Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, H. Grady III; Goorley, John T.

    2012-08-29

    MCNP (Monte Carlo N-Particle) is a general-purpose Monte Carlo code for simulating the transport of neutrons, photons, electrons, positrons, and more recently other fundamental particles and heavy ions. Over many years MCNP has found a wide range of applications in many different fields, including medical radiation physics. In this presentation we will describe and illustrate a number of significant recently-developed features in the current version of the code, MCNP6, having particular utility for medical physics. Among these are major extensions of the ability to simulate large, complex geometries, improvement in memory requirements and speed for large lattices, introduction of mesh-based isotopic reaction tallies, advances in radiography simulation, expanded variance-reduction capabilities, especially for pulse-height tallies, and a large number of enhancements in photon/electron transport.

  4. Solving the master equation without kinetic Monte Carlo: Tensor train approximations for a CO oxidation model

    NASA Astrophysics Data System (ADS)

    Gelß, Patrick; Matera, Sebastian; Schütte, Christof

    2016-06-01

    In multiscale modeling of heterogeneous catalytic processes, one crucial point is the solution of a Markovian master equation describing the stochastic reaction kinetics. Usually, this is too high-dimensional to be solved with standard numerical techniques and one has to rely on sampling approaches based on the kinetic Monte Carlo method. In this study we break the curse of dimensionality for the direct solution of the Markovian master equation by exploiting the Tensor Train Format for this purpose. The performance of the approach is demonstrated on a first principles based, reduced model for the CO oxidation on the RuO2(110) surface. We investigate the complexity for increasing system size and for various reaction conditions. The advantage over the stochastic simulation approach is illustrated by a problem with increased stiffness.
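    As a point of reference for the kinetic Monte Carlo approach that the tensor-train solver is compared against, the sketch below contrasts a direct (deterministic) integration of a master equation with Gillespie-type sampling for a deliberately tiny toy model: a single adsorption site with hypothetical rates `ka` and `kd`. This is not the paper's RuO2(110) model, whose state space is far too large for such a direct treatment; that size gap is exactly the "curse of dimensionality" the abstract refers to.

```python
import random

# Toy two-state master equation: adsorption (rate ka) and desorption (rate kd)
# of a molecule on a single surface site. State 0 = empty, 1 = occupied.
ka, kd = 2.0, 1.0

def direct_solve(t_end, dt=1e-4):
    # Forward-Euler integration of dp/dt = Q p for the two-state chain.
    # Feasible here only because the state space has two elements.
    p = [1.0, 0.0]
    t = 0.0
    while t < t_end:
        flow = ka * p[0] - kd * p[1]
        p = [p[0] - flow * dt, p[1] + flow * dt]
        t += dt
    return p

def gillespie(t_end, n_traj=20000, seed=1):
    # Kinetic Monte Carlo (Gillespie) estimate of the occupation probability:
    # sample exponential waiting times and flip the state at each event.
    rng = random.Random(seed)
    occupied = 0
    for _ in range(n_traj):
        t, state = 0.0, 0
        while True:
            rate = ka if state == 0 else kd
            t += rng.expovariate(rate)
            if t > t_end:
                break
            state = 1 - state
        occupied += state
    return occupied / n_traj
```

    Both estimators converge to the stationary occupation ka/(ka+kd) = 2/3; the direct solve is exact up to discretization error, while the kMC estimate carries sampling noise that shrinks only as the square root of the number of trajectories.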

  5. The problem of measurement model misspecification in behavioral and organizational research and some recommended solutions.

    PubMed

    MacKenzie, Scott B; Podsakoff, Philip M; Jarvis, Cheryl Burke

    2005-07-01

    The purpose of this study was to review the distinction between formative- and reflective-indicator measurement models, articulate a set of criteria for deciding whether measures are formative or reflective, illustrate some commonly researched constructs that have formative indicators, empirically test the effects of measurement model misspecification using a Monte Carlo simulation, and recommend new scale development procedures for latent constructs with formative indicators. Results of the Monte Carlo simulation indicated that measurement model misspecification can inflate unstandardized structural parameter estimates by as much as 400% or deflate them by as much as 80% and lead to Type I or Type II errors of inference, depending on whether the exogenous or the endogenous latent construct is misspecified. Implications of this research are discussed. Copyright 2005 APA, all rights reserved.

  6. Modeling of Radiotherapy Linac Source Terms Using ARCHER Monte Carlo Code: Performance Comparison for GPU and MIC Parallel Computing Devices

    NASA Astrophysics Data System (ADS)

    Lin, Hui; Liu, Tianyu; Su, Lin; Bednarz, Bryan; Caracappa, Peter; Xu, X. George

    2017-09-01

    Monte Carlo (MC) simulation is well recognized as the most accurate method for radiation dose calculations. For radiotherapy applications, accurate modelling of the source term, i.e. the clinical linear accelerator, is critical to the simulation. The purpose of this paper is to perform source modelling, examine the accuracy and performance of the models on Intel Many Integrated Core coprocessors (aka Xeon Phi) and Nvidia GPUs using ARCHER, and explore potential optimization methods. Phase space-based source modelling has been implemented. Good agreement was found in a TomoTherapy prostate patient case and a TrueBeam breast case. In terms of performance, the whole simulation for the prostate plan and the breast plan took about 173 s and 73 s, respectively, at 1% statistical error.

  7. Comparison of experimental proton-induced fluorescence spectra for a selection of thin high-Z samples with Geant4 Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Incerti, S.; Barberet, Ph.; Dévès, G.; Michelet, C.; Francis, Z.; Ivantchenko, V.; Mantero, A.; El Bitar, Z.; Bernal, M. A.; Tran, H. N.; Karamitros, M.; Seznec, H.

    2015-09-01

    The general purpose Geant4 Monte Carlo simulation toolkit is able to simulate radiative and non-radiative atomic de-excitation processes such as fluorescence and Auger electron emission, occurring after interaction of incident ionising radiation with target atomic electrons. In this paper, we evaluate the Geant4 modelling capability for the simulation of fluorescence spectra induced by 1.5 MeV proton irradiation of thin high-Z foils (Fe, GdF3, Pt, Au) with potential interest for nanotechnologies and life sciences. Simulation results are compared to measurements performed at the Centre d'Etudes Nucléaires de Bordeaux-Gradignan AIFIRA nanobeam line irradiation facility in France. Simulation and experimental conditions are described and the influence of Geant4 electromagnetic physics models is discussed.

  8. Monte Carlo simulations of {sup 3}He ion physical characteristics in a water phantom and evaluation of radiobiological effectiveness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taleei, Reza; Guan, Fada; Peeler, Chris

    Purpose: {sup 3}He ions may hold great potential for clinical therapy because of both their physical and biological properties. In this study, the authors investigated the physical properties, i.e., the depth-dose curves from primary and secondary particles, and the energy distributions of helium ({sup 3}He) ions. A relative biological effectiveness (RBE) model was applied to assess the biological effectiveness on survival of multiple cell lines. Methods: In light of the lack of experimental measurements and cross sections, the authors used Monte Carlo methods to study the energy deposition of {sup 3}He ions. The transport of {sup 3}He ions in water was simulated by using three Monte Carlo codes—FLUKA, GEANT4, and MCNPX—for incident beams with Gaussian energy distributions with average energies of 527 and 699 MeV and a full width at half maximum of 3.3 MeV in both cases. The RBE of each was evaluated by using the repair-misrepair-fixation model. In all of the simulations with each of the three Monte Carlo codes, the same geometry and primary beam parameters were used. Results: Energy deposition as a function of depth and energy spectra with high resolution was calculated on the central axis of the beam. Secondary proton dose from the primary {sup 3}He beams was predicted quite differently by the three Monte Carlo systems. The predictions differed by as much as a factor of 2. Microdosimetric parameters such as dose mean lineal energy (y{sub D}), frequency mean lineal energy (y{sub F}), and frequency mean specific energy (z{sub F}) were used to characterize the radiation beam quality at four depths of the Bragg curve. Calculated RBE values were close to 1 at the entrance, reached on average 1.8 and 1.6 for prostate and head and neck cancer cell lines at the Bragg peak for both energies, but showed some variations between the different Monte Carlo codes. Conclusions: Although the Monte Carlo codes provided different results in energy deposition and especially in secondary particle production (most of the differences between the three codes were observed close to the Bragg peak, where the energy spectrum broadens), the results in terms of RBE were generally similar.

  9. Commissioning of a Varian Clinac iX 6 MV photon beam using Monte Carlo simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dirgayussa, I Gde Eka, E-mail: ekadirgayussa@gmail.com; Yani, Sitti; Haryanto, Freddy, E-mail: freddy@fi.itb.ac.id

    2015-09-30

    Monte Carlo modelling of a linear accelerator is the first and most important step in Monte Carlo dose calculations in radiotherapy. Monte Carlo is considered today to be the most accurate and detailed calculation method in different fields of medical physics. In this research, we developed a photon beam model of a Varian Clinac iX 6 MV equipped with a Millennium MLC120 for dose calculation purposes, using the BEAMnrc/DOSXYZnrc Monte Carlo system based on the underlying EGSnrc particle transport code. The Monte Carlo commissioning of the linac head was divided into two stages: designing the head model using BEAMnrc and characterizing it using BEAMDP, then analyzing the differences between simulation and measurement data using DOSXYZnrc. In the first step, to reduce simulation time, the virtual treatment head was built in two parts (a patient-dependent component and a patient-independent component). The incident electron energy was varied over 6.1, 6.2, 6.3, 6.4, and 6.6 MeV, and the FWHM (full width at half maximum) of the source was 1 mm. The phase-space file from the virtual model was characterized using BEAMDP. The results of the MC calculations using DOSXYZnrc in a water phantom, percent depth doses (PDDs) and beam profiles at 10 cm depth, were compared with measurements. Commissioning was considered complete when the difference between measured and calculated relative depth-dose data along the central axis and dose profiles at 10 cm depth was ≤ 5%. The effect of beam width on percentage depth doses and beam profiles was studied. Results of the virtual model were in close agreement with measurements at an incident electron energy of 6.4 MeV. Our results showed that the photon beam width could be tuned using the large-field beam profile at the depth of maximum dose. The Monte Carlo model developed in this study accurately represents the Varian Clinac iX with the Millennium MLC 120 and can be used for reliable patient dose calculations. In this commissioning process, the dose-difference criteria for PDDs and dose profiles were achieved using an incident electron energy of 6.4 MeV.

  10. SU-E-T-519: Investigation of the CyberKnife MultiPlan Monte Carlo Dose Calculation Using EBT3 Film Absolute Dosimetry for Delivery in a Heterogeneous Thorax Phantom

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lamberto, M; Chen, H; Huang, K

    2015-06-15

    Purpose: To characterize the CyberKnife (CK) robotic system’s dosimetric accuracy in delivering MultiPlan’s Monte Carlo dose calculations, using EBT3 radiochromic film inserted in a thorax phantom. Methods: The CIRS XSight Lung Tracking (XLT) Phantom (model 10823) was used in this study with custom-cut EBT3 film inserted in the horizontal (coronal) plane inside the lung-tissue-equivalent phantom. CK MultiPlan v3.5.3 with the Monte Carlo dose calculation algorithm (1.5 mm grid size, 2% statistical uncertainty) was used to calculate a clinical plan for a 25-mm lung tumor lesion, as contoured by the physician, which was then imported onto the XLT phantom CT. Using the same film batch, the net OD to dose calibration curve was obtained using CK with the 60 mm fixed cone by delivering 0–800 cGy. The test films (n=3) were irradiated using 325 cGy to the prescription point. Films were scanned 48 hours after irradiation using an Epson v700 scanner (48-bit color scan, extracted red channel only, 96 dpi). Percent absolute dose and relative isodose distribution differences relative to the planned dose were quantified using an in-house QA software program. The MultiPlan Monte Carlo dose calculation was validated using radiochromic film (EBT3) dosimetry and gamma index criteria of 3%/3mm and 2%/2mm for absolute dose and relative isodose distribution comparisons. Results: EBT3 film measurements of the patient plans calculated with Monte Carlo in MultiPlan resulted in an absolute dose passing rate of 99.6±0.4% for the gamma index of 3%/3mm, 10% dose threshold, and 95.6±4.4% for the 2%/2mm, 10% threshold criteria. The measured central-axis absolute dose was within 1.2% (329.0±2.5 cGy) of the Monte Carlo planned dose (325.0±6.5 cGy) for that same point. Conclusion: MultiPlan’s Monte Carlo dose calculation was validated using EBT3 film absolute dosimetry for delivery in a heterogeneous thorax phantom.

  11. Optimization of GATE and PHITS Monte Carlo code parameters for uniform scanning proton beam based on simulation with FLUKA general-purpose code

    NASA Astrophysics Data System (ADS)

    Kurosu, Keita; Takashina, Masaaki; Koizumi, Masahiko; Das, Indra J.; Moskvin, Vadim P.

    2014-10-01

    Although the three general-purpose Monte Carlo (MC) simulation tools Geant4, FLUKA, and PHITS have been used extensively, differences in their calculation results have been reported. The major causes are the implementation of the physical models and the preset values of the ionization potential or the definition of the maximum step size. In order to achieve artifact-free MC simulation, an optimized parameter list for each simulation system is required. Several authors have already proposed optimized lists, but those studies were performed with a simple system such as a water phantom only. Since particle beams undergo transport, interaction, and electromagnetic processes during beam delivery, establishing an optimized parameter list for the whole beam delivery system is therefore of major importance. The purpose of this study was to determine the optimized parameter lists for GATE and PHITS using a computational model of a proton treatment nozzle. The simulation was performed with a broad scanning proton beam. The influence of the customizable parameters on the percentage depth dose (PDD) profile and the proton range was investigated by comparison with the FLUKA results, and the optimal parameters were then determined. The PDD profile and the proton range obtained with our optimized parameter lists showed different characteristics from the results obtained with the simple system. This leads to the conclusion that the physical models, particle transport mechanics, and different geometry-based descriptions need accurate customization when planning computational experiments for artifact-free MC simulation.

  12. Dynamical analysis of cellular ageing by modeling of gene regulatory network based attractor landscape.

    PubMed

    Chong, Ket Hing; Zhang, Xiaomeng; Zheng, Jie

    2018-01-01

    Ageing is a natural phenomenon that is inherently complex and remains a mystery. A conceptual model of the cellular ageing landscape has been proposed for computational studies of ageing; however, a quantitative model of this landscape is still lacking. This study aims to investigate the mechanism of cellular ageing in a theoretical model using the framework of Waddington's epigenetic landscape. We construct an ageing gene regulatory network (GRN) consisting of the core cell cycle regulatory genes (including p53). A model parameter (activation rate) is used as a measure of the accumulation of DNA damage. Using bifurcation diagrams to estimate the parameter values that lead to multi-stability, we obtained a conceptual model capturing three distinct stable steady states (or attractors) corresponding to homeostasis, cell cycle arrest, and senescence or apoptosis. In addition, we applied a Monte Carlo computational method to quantify the potential landscape, which displays: I) one homeostasis attractor for low accumulation of DNA damage; II) two attractors, for cell cycle arrest and senescence (or apoptosis), in response to high accumulation of DNA damage. In the framework of Waddington's epigenetic landscape, the process of ageing can thus be characterized by state transitions from landscape I to II. By in silico perturbations, we identified the potential landscape of a perturbed network (inactivation of p53), and thereby demonstrated the emergence of a cancer attractor. The simulated dynamics of the perturbed network display a landscape with four basins of attraction: homeostasis, cell cycle arrest, senescence (or apoptosis), and cancer. Our analysis also showed that for the same perturbed network with low DNA damage, the landscape displays only the homeostasis attractor. The mechanistic model offers theoretical insights that can facilitate the discovery of potential strategies for network medicine of ageing-related diseases such as cancer.
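    The Monte Carlo quantification of a potential landscape can be illustrated on a one-dimensional bistable system, a generic stand-in for the paper's GRN (whose equations are not reproduced here): sample a long noisy trajectory and take U(x) = -ln P(x) from the stationary histogram, so each basin of attraction appears as a well in U. The double-well potential, noise amplitude, and grid below are illustrative choices.

```python
import math
import random

def quasi_potential(steps=200000, dt=0.01, noise=0.5, seed=2, bins=40):
    # Langevin simulation on the double-well potential V(x) = x^4/4 - x^2/2,
    # a generic stand-in for a bistable gene circuit. The quasi-potential is
    # estimated as U(x) = -ln P(x) from the stationary histogram of x.
    rng = random.Random(seed)
    lo, hi = -2.0, 2.0
    hist = [0] * bins
    x = 1.0
    for _ in range(steps):
        drift = -(x ** 3 - x)                      # -dV/dx
        x += drift * dt + noise * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        x = max(lo, min(hi - 1e-9, x))             # clamp to the grid
        hist[int((x - lo) / (hi - lo) * bins)] += 1
    total = sum(hist)
    return [(-math.log(h / total) if h else float("inf")) for h in hist]
```

    With this setup the two attractors near x = ±1 show up as wells of U that are lower than the barrier region near x = 0, which is the one-dimensional analogue of the basins-of-attraction picture in the abstract.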

  13. TH-A-19A-08: Intel Xeon Phi Implementation of a Fast Multi-Purpose Monte Carlo Simulation for Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Souris, K; Lee, J; Sterpin, E

    2014-06-15

    Purpose: Recent studies have demonstrated the capability of graphics processing units (GPUs) to compute dose distributions using Monte Carlo (MC) methods within clinical time constraints. However, GPUs have a rigid vectorial architecture that favors the implementation of simplified particle transport algorithms, adapted to specific tasks. Our new, fast, and multipurpose MC code, named MCsquare, runs on Intel Xeon Phi coprocessors. This technology offers 60 independent cores, and therefore more flexibility to implement fast and yet generic MC functionalities, such as prompt gamma simulations. Methods: MCsquare implements several models and hence allows users to make their own tradeoff between speed and accuracy. A 200 MeV proton beam is simulated in a heterogeneous phantom using Geant4 and two configurations of MCsquare. The first one is the most conservative and accurate. The method of fictitious interactions handles the interfaces, and secondary charged particles emitted in nuclear interactions are fully simulated. The second, faster configuration simplifies interface crossings and simulates only secondary protons after nuclear interaction events. Integral depth-dose and transversal profiles are compared to those of Geant4. Moreover, the production profile of prompt gammas is compared to PENH results. Results: Integral depth-dose and transversal profiles computed by MCsquare and Geant4 are within 3%. The production of secondaries from nuclear interactions is slightly inaccurate at interfaces for the fastest configuration of MCsquare, but this is unlikely to have any clinical impact. The computation time varies between 90 seconds for the most conservative settings and merely 59 seconds in the fastest configuration. Finally, prompt gamma profiles are also in very good agreement with PENH results. Conclusion: Our new, fast, and multi-purpose Monte Carlo code simulates prompt gammas and calculates dose distributions in less than a minute, which complies with clinical time constraints. It has been successfully validated with Geant4. This work has been financially supported by InVivoIGT, a public/private partnership between UCL and IBA.

  14. Writing Treatment for Aphasia: A Texting Approach

    ERIC Educational Resources Information Center

    Beeson, Pelagie M.; Higginson, Kristina; Rising, Kindle

    2013-01-01

    Purpose: Treatment studies have documented the therapeutic and functional value of lexical writing treatment for individuals with severe aphasia. The purpose of this study was to determine whether such retraining could be accomplished using the typing feature of a cellular telephone, with the ultimate goal of using text messaging for…

  15. MO-FG-BRA-01: 4D Monte Carlo Simulations for Verification of Dose Delivered to a Moving Anatomy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gholampourkashi, S; Cygler, J E.; The Ottawa Hospital Cancer Centre, Ottawa, ON

    Purpose: To validate 4D Monte Carlo (MC) simulations of dose delivery by an Elekta Agility linear accelerator to a moving phantom. Methods: Monte Carlo simulations were performed using the 4DdefDOSXYZnrc/EGSnrc user code, which samples a new geometry for each incident particle and calculates the dose in a continuously moving anatomy. A Quasar respiratory motion phantom with a lung insert containing a 3 cm diameter tumor was used for dose measurements on an Elekta Agility linac with the phantom in stationary and moving states. Dose to the center of the tumor was measured using calibrated EBT3 film and the RADPOS 4D dosimetry system. A VMAT plan covering the tumor was created on the static CT scan of the phantom using Monaco V.5.10.02. A validated BEAMnrc model of our Elekta Agility linac was used for Monte Carlo simulations on stationary and moving anatomies. To compare the planned and delivered doses, linac log files recorded during measurements were used for the simulations. For 4D simulations, deformation vectors that modeled the rigid translation of the lung insert were generated as input to the 4DdefDOSXYZnrc code, together with the phantom motion trace recorded with RADPOS during the measurements. Results: Monte Carlo simulations and film measurements were found to agree within 2mm/2% for 97.7% of points in the film in the static phantom and 95.5% in the moving phantom. Dose values based on film and RADPOS measurements are within 2% of each other and within 2σ of experimental uncertainties with respect to simulations. Conclusion: Our 4D Monte Carlo simulation using the defDOSXYZnrc code accurately calculates dose delivered to a moving anatomy. Future work will focus on further investigation of VMAT delivery on a moving phantom to improve the agreement between simulation and measurements, as well as establishing the accuracy of our method in a deforming anatomy. This work was supported by the Ontario Consortium of Adaptive Interventions in Radiation Oncology (OCAIRO), funded by the Ontario Research Fund Research Excellence program.

  16. Versatile Analysis of Single-Molecule Tracking Data by Comprehensive Testing against Monte Carlo Simulations

    PubMed Central

    Wieser, Stefan; Axmann, Markus; Schütz, Gerhard J.

    2008-01-01

    We propose here an approach for the analysis of single-molecule trajectories which is based on a comprehensive comparison of an experimental data set with multiple Monte Carlo simulations of the diffusion process. It allows quantitative data analysis, particularly whenever analytical treatment of a model is infeasible. Simulations are performed on a discrete parameter space and compared with the experimental results by a nonparametric statistical test. The method provides a matrix of p-values that assess the probability for having observed the experimental data at each setting of the model parameters. We show the testing approach for three typical situations observed in the cellular plasma membrane: i), free Brownian motion of the tracer, ii), hop diffusion of the tracer in a periodic meshwork of squares, and iii), transient binding of the tracer to slowly diffusing structures. By plotting the p-value as a function of the model parameters, one can easily identify the most consistent parameter settings but also recover mutual dependencies and ambiguities which are difficult to determine by standard fitting routines. Finally, we used the test to reanalyze previous data obtained on the diffusion of the glycosylphosphatidylinositol-protein CD59 in the plasma membrane of the human T24 cell line. PMID:18805933
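    The core idea above, scoring an experimental observable against an ensemble of Monte Carlo simulations at each candidate parameter setting, can be sketched for the simplest of the three cases, free 2-D Brownian motion. The trajectory length, time step, and the use of the mean square displacement as the test statistic are illustrative simplifications, not the nonparametric statistic used in the paper.

```python
import random

def msd(traj):
    # Mean square displacement at unit lag for a 2-D trajectory.
    return sum((x2 - x1) ** 2 + (y2 - y1) ** 2
               for (x1, y1), (x2, y2) in zip(traj, traj[1:])) / (len(traj) - 1)

def simulate(D, n_steps, dt, rng):
    # Free 2-D Brownian motion: per-axis displacements ~ N(0, 2 D dt).
    s = (2 * D * dt) ** 0.5
    x = y = 0.0
    traj = [(0.0, 0.0)]
    for _ in range(n_steps):
        x += rng.gauss(0.0, s)
        y += rng.gauss(0.0, s)
        traj.append((x, y))
    return traj

def p_value(obs_msd, D, n_steps, dt, n_sim=2000, seed=3):
    # Two-sided Monte Carlo p-value: how often does a trajectory simulated
    # at diffusion constant D give an MSD at least as extreme as observed?
    rng = random.Random(seed)
    sims = [msd(simulate(D, n_steps, dt, rng)) for _ in range(n_sim)]
    below = sum(1 for m in sims if m <= obs_msd) / n_sim
    return 2 * min(below, 1 - below)
```

    Evaluating `p_value` over a grid of candidate D values yields the matrix of p-values described in the abstract: parameter settings consistent with the data keep a high p-value, while inconsistent settings are rejected.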

  17. A monte carlo study of restricted diffusion: Implications for diffusion MRI of prostate cancer.

    PubMed

    Gilani, Nima; Malcolm, Paul; Johnson, Glyn

    2017-04-01

    Diffusion MRI is used frequently to assess prostate cancer. The prostate consists of cellular tissue surrounding fluid filled ducts. Here, the diffusion properties of the ductal fluid alone were studied. Monte Carlo simulations were used to investigate ductal residence times to determine whether ducts can be regarded as forming a separate compartment and whether ductal radius could determine the Apparent Diffusion Coefficient (ADC) of the ductal fluid. Random walks were simulated in cavities. Average residence times were estimated for permeable cavities. Signal reductions resulting from application of a Stejskal-Tanner pulse sequence were calculated in impermeable cavities. Simulations were repeated for cavities of different radii and different diffusion times. Residence times are at least comparable with diffusion times even in relatively high grade tumors. ADCs asymptotically approach theoretical limiting values. At large radii and short diffusion times, ADCs are similar to free diffusion. At small radii and long diffusion times, ADCs are reduced toward zero, and kurtosis approaches a value of -1.2. Restricted diffusion in cavities of similar sizes to prostate ducts may reduce ductal ADCs. This may contribute to reductions in total ADC seen in prostate cancer. Magn Reson Med 77:1671-1677, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
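    A minimal sketch of the restricted-diffusion effect described above: random walkers confined to an impermeable sphere show a squared displacement that saturates, so the apparent diffusion coefficient falls as the diffusion time grows. The radius, step size, and walker counts are arbitrary illustrative values, and rejecting steps at the wall is a crude stand-in for a proper reflecting boundary; the paper's Stejskal-Tanner signal calculation is not reproduced here.

```python
import random

def restricted_walk(radius, n_steps, step=0.05, seed=0):
    # 3-D random walk inside an impermeable sphere: steps that would exit
    # the cavity are rejected (crude approximation of a reflecting wall).
    rng = random.Random(seed)
    x = y = z = 0.0
    for _ in range(n_steps):
        dx, dy, dz = rng.gauss(0, step), rng.gauss(0, step), rng.gauss(0, step)
        if (x + dx) ** 2 + (y + dy) ** 2 + (z + dz) ** 2 <= radius ** 2:
            x, y, z = x + dx, y + dy, z + dz
    return x * x + y * y + z * z          # squared net displacement

def apparent_D(radius, n_steps, n_walkers=300, step=0.05):
    # ADC proxy: mean squared displacement per unit diffusion "time"
    # (number of steps); for free diffusion this equals 3 * step**2.
    total = sum(restricted_walk(radius, n_steps, step, seed=s)
                for s in range(n_walkers))
    return total / n_walkers / n_steps
```

    At short diffusion times the walkers rarely feel the wall and the apparent D is close to the free value; at long times the squared displacement is capped by the cavity size, so the apparent D falls toward zero, mirroring the ADC behaviour reported in the abstract.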

  18. Spin glass model for dynamics of cell reprogramming

    NASA Astrophysics Data System (ADS)

    Pusuluri, Sai Teja; Lang, Alex H.; Mehta, Pankaj; Castillo, Horacio E.

    2015-03-01

    Recent experiments show that differentiated cells can be reprogrammed to become pluripotent stem cells. The possible cell fates can be modeled as attractors in a dynamical system, the ``epigenetic landscape.'' Both cellular differentiation and reprogramming can be described in the landscape picture as motion from one attractor to another attractor. We perform Monte Carlo simulations in a simple model of the landscape. This model is based on spin glass theory and it can be used to construct a simulated epigenetic landscape starting from the experimental genomic data. We re-analyse data from several cell reprogramming experiments and compare with our simulation results. We find that the model can reproduce some of the main features of the dynamics of cell reprogramming.
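    A toy version of this attractor picture can be written down with a Hopfield-type spin model, where stored ±1 patterns play the role of cell fates and zero-temperature Monte Carlo updates drive a perturbed state back to the nearest attractor. The network size, number of patterns, and simple Hebbian couplings below are illustrative choices, not the model fitted to genomic data in the paper.

```python
import random

def hopfield_relax(patterns, start, sweeps=20, seed=1):
    # Zero-temperature Monte Carlo dynamics: visit spins in random order and
    # align each with its local field until the state settles in an attractor.
    rng = random.Random(seed)
    n = len(start)
    # Hebbian couplings J_ij = (1/n) * sum over stored patterns of xi_i * xi_j
    J = [[0.0 if i == j else sum(p[i] * p[j] for p in patterns) / n
          for j in range(n)] for i in range(n)]
    s = list(start)
    for _ in range(sweeps):
        for i in rng.sample(range(n), n):
            h = sum(J[i][j] * s[j] for j in range(n))
            if h != 0:
                s[i] = 1 if h > 0 else -1
    return s

def overlap(a, b):
    # Normalized overlap between two configurations; 1.0 means identical.
    return sum(x * y for x, y in zip(a, b)) / len(a)
```

    Starting from a stored pattern with a minority of spins flipped (a "partially reprogrammed" state), the dynamics typically restores the original pattern, which is the discrete analogue of relaxation into a basin of the epigenetic landscape.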

  19. TILDA-V: A full-differential code for proton tracking in biological matter

    DOE PAGES

    Quinto, M. A.; Monti, J. M.; Week, Philippe F.; ...

    2015-09-07

    Understanding the radiation-induced effects at the cellular level is of prime importance for predicting the fate of irradiated biological organisms. Thus, whether it is in radiobiology to identify the critical DNA lesions or in medicine to adapt radio-therapeutic protocols, an accurate knowledge of the numerous interactions induced by charged particles in living matter is required. Monte Carlo track-structure simulations represent the most suitable and powerful tools, in particular for modelling the full slowing-down of ionizing particles in biological matter. Furthermore, most of the existing codes are based on semi-empirical cross sections and use water as a surrogate for biological matter.

  20. In situ generation of ultrafast transient "acid spikes" in the 10B(n,α)7Li radiolysis of water

    NASA Astrophysics Data System (ADS)

    Islam, Muhammad Mainul; Kanike, Vanaja; Meesungnoen, Jintana; Lertnaisat, Phantira; Katsumura, Yosuke; Jay-Gerin, Jean-Paul

    2018-02-01

    Monte Carlo track chemistry simulations of the 10B(n,α)7Li radiolysis of water show that the in situ formation of H3O+ by the two He and Li recoiling ions renders the native track regions temporarily very acidic. For these irradiating ions, the pH remains near 0 at times less than ∼100 ps after which the system gradually returns to neutral pH at ∼0.1 ms. These 'acid spikes' have never been invoked in water or in a cellular environment exposed to densely ionizing radiations. The question of their implications in boron neutron capture therapy and, more generally, in hadrontherapy, is discussed briefly.

  1. MO-FG-BRA-03: A Monte-Carlo Study of Cellular Dosimetry of Radioactive Gold-Palladium Nanoparticles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Y; Michaud, F; Fortin, M

    Purpose: Radioactive gold-palladium nanoparticles ({sup 103}Pd:Pd@Au NPs) are being developed for prostate cancer brachytherapy. Photons emitted by the radioisotope palladium (photon energy: 20.1 and 23.0 keV), interacting with the gold coating of the NPs, lead to an enhanced energy distribution in the nucleus. Here, a simple cellular model was studied using a detailed track-structure method. Methods: Geant4-DNA was used with Auger electrons enabled. The biological cell was modeled as a sphere of radius r=5 µm immersed in a fluid containing a large number of NPs at different concentrations (S=1, 2.15, 5.1, 17.2 mg-Au/g-H2O). The nucleus was modeled as a concentric sphere (r=3 µm). The thickness of the gold coating on the {sup 103}Pd core was 15 nm, 20 nm and 25 nm, respectively. A scenario of NP diffusion was investigated, where S=5.1 mg-Au/g-H2O outside the cell and S=1 mg-Au/g-H2O in the cytoplasm. 10{sup 10} {sup 103}Pd decays were simulated for each combination of NP concentration and gold coating. Results: A uniform increase in energy deposition (Edep) is observed in the cell nucleus, and the energy enhancement ratio (EER) is 1.16, 1.22 and 1.3 for the 15 nm, 20 nm and 25 nm gold coatings, respectively. Edep at the center of the nucleus is increased by a factor of 1.47, 2.51 and 5.54 when the NP concentration in the cytoplasm increases from 1 mg-Au/g-H2O to 2.15, 5.10 and 17.2 mg-Au/g-H2O, respectively. When NPs diffuse into the cytoplasm, the mean value of Edep in the nucleus increases from 0.42 to 1.13 MeV per 10{sup 9} decays (GBq-second) of {sup 103}Pd and the maximum value increases from 0.54 to 2.5 MeV per GBq-second. Conclusion: These results suggest that {sup 103}Pd:Pd@Au NPs constitute a promising nanotherapeutic agent. Ongoing studies use transmission electron microscopy (TEM) images of prostate cancer.

  2. A Monte Carlo analysis of breast screening randomized trials.

    PubMed

    Zamora, Luis I; Forastero, Cristina; Guirado, Damián; Lallena, Antonio M

    2016-12-01

    To analyze breast screening randomized trials with a Monte Carlo simulation tool. A simulation tool previously developed to simulate breast screening programmes was adapted for this purpose. The history of women participating in the trials was simulated, including a model for survival after local treatment of invasive cancers. Distributions of the time gained due to screening detection relative to symptomatic detection and of the overall screening sensitivity were used as inputs. Several randomized controlled trials were simulated. Except for the age range of the women involved, all simulations used the same population characteristics, which made it possible to analyze their external validity. The relative risks obtained were compared to those quoted for the trials, whose internal validity was addressed by further investigating the reasons for the disagreements observed. The Monte Carlo simulations produce results that are in good agreement with most of the randomized trials analyzed, thus indicating their methodological quality and external validity. A reduction of breast cancer mortality of around 20% appears to be a reasonable value according to the results of the trials that are methodologically correct. Discrepancies observed with the Canada I and II trials may be attributed to low mammography quality and some methodological problems. The Kopparberg trial appears to show low methodological quality. Monte Carlo simulations are a powerful tool for investigating breast screening randomized controlled trials, helping to establish those whose results are reliable enough to be extrapolated to other populations, and to design trial strategies and, eventually, adapt them during their development. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
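    The logic of such a trial simulation can be caricatured in a few lines: give each simulated woman a chance of disease, let screening reduce the case-fatality probability through earlier detection, and read off the relative risk between arms. All rates below are made-up round numbers for illustration only; the actual tool models individual histories, time gained by screening, and post-treatment survival rather than a single fatality multiplier.

```python
import random

def simulate_trial(n_per_arm=200000, p_cancer=0.05, p_fatal=0.3,
                   screening_benefit=0.2, seed=7):
    # Toy two-arm trial: in the screened arm, earlier detection multiplies
    # the case-fatality probability by (1 - screening_benefit).
    rng = random.Random(seed)
    deaths = {}
    for arm, benefit in (("control", 0.0), ("screen", screening_benefit)):
        p_death = p_fatal * (1.0 - benefit)
        deaths[arm] = sum(1 for _ in range(n_per_arm)
                          if rng.random() < p_cancer and rng.random() < p_death)
    return deaths["screen"] / deaths["control"]   # estimated relative risk
```

    Even this caricature reproduces the key statistical point of the abstract: with realistic event counts, the estimated relative risk scatters around its true value, so trial-to-trial variation alone can explain part of the spread among published mortality reductions.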

  3. SU-E-T-405: Evaluation of the Raystation Electron Monte Carlo Algorithm for Varian Linear Accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sansourekidou, P; Allen, C

    2015-06-15

    Purpose: To evaluate the Raystation v4.51 Electron Monte Carlo algorithm for Varian Trilogy, IX and 2100 series linear accelerators and commission it for clinical use. Methods: Seventy-two water and forty air scans were acquired with a water tank in the form of profiles and depth doses, as requested by the vendor. Data were imported into the Rayphysics beam modeling module. The energy spectrum was modeled using seven parameters, contamination photons using five parameters, and the source phase space using six parameters. Calculations were performed in clinical version 4.51, and percent depth dose curves and profiles were extracted and compared to the water tank measurements. Sensitivity tests were performed for all parameters. Grid size and particle histories were evaluated per energy for statistical uncertainty performance. Results: Model accuracy for air profiles is poor in the shoulder and penumbra regions. However, model accuracy for water scans is acceptable: all energies and cones are within 2%/2mm for 90% of the points evaluated. Source phase space parameters have a cumulative effect. To achieve distributions with a satisfactory smoothness level, a 0.1 cm grid and 3,000,000 particle histories were used for commissioning calculations. Calculation time was approximately 3 hours per energy. Conclusion: Raystation electron Monte Carlo is acceptable for clinical use for the Varian accelerators listed. Results are inferior to Elekta Electron Monte Carlo modeling. Known issues were reported to Raysearch and will be resolved in upcoming releases. Auto-modeling is limited to open cone depth dose curves and needs expansion.

  4. Applications of Massive Mathematical Computations

    DTIC Science & Technology

    1990-04-01

    particles from the first principles of QCD. This problem is under intensive numerical study using special purpose parallel supercomputers in...several places around the world. The method used here is Monte Carlo integration on a fixed 3-D plus time lattice. Reliable results are still years...mathematical and theoretical physics, but its most promising applications are in the numerical realization of QCD computations. Our programs for the solution

  5. SPAMCART: a code for smoothed particle Monte Carlo radiative transfer

    NASA Astrophysics Data System (ADS)

    Lomax, O.; Whitworth, A. P.

    2016-10-01

    We present a code for generating synthetic spectral energy distributions and intensity maps from smoothed particle hydrodynamics simulation snapshots. The code is based on the Lucy Monte Carlo radiative transfer method, i.e. it follows discrete luminosity packets as they propagate through a density field, and then uses their trajectories to compute the radiative equilibrium temperature of the ambient dust. The sources can be extended and/or embedded, and discrete and/or diffuse. The density is not mapped on to a grid, and therefore the calculation is performed at exactly the same resolution as the hydrodynamics. We present two example calculations using this method. First, we demonstrate that the code strictly adheres to Kirchhoff's law of radiation. Second, we present synthetic intensity maps and spectra of an embedded protostellar multiple system. The algorithm uses data structures that are already constructed for other purposes in modern particle codes. It is therefore relatively simple to implement.

  6. Solving the master equation without kinetic Monte Carlo: Tensor train approximations for a CO oxidation model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gelß, Patrick, E-mail: p.gelss@fu-berlin.de; Matera, Sebastian, E-mail: matera@math.fu-berlin.de; Schütte, Christof, E-mail: schuette@mi.fu-berlin.de

    2016-06-01

    In multiscale modeling of heterogeneous catalytic processes, one crucial point is the solution of a Markovian master equation describing the stochastic reaction kinetics. Usually, this is too high-dimensional to be solved with standard numerical techniques, and one has to rely on sampling approaches based on the kinetic Monte Carlo method. In this study we break the curse of dimensionality for the direct solution of the Markovian master equation by exploiting the Tensor Train Format for this purpose. The performance of the approach is demonstrated on a first-principles based, reduced model for the CO oxidation on the RuO{sub 2}(110) surface. We investigate the complexity for increasing system size and for various reaction conditions. The advantage over the stochastic simulation approach is illustrated by a problem with increased stiffness.
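For readers unfamiliar with the sampling approach that the tensor-train method replaces, a minimal kinetic Monte Carlo (Gillespie SSA) sketch for a toy, well-mixed CO-oxidation-like scheme might look like the following. The reactions and rate constants are illustrative assumptions, not the paper's first-principles RuO2(110) lattice model:

```python
import math
import random

def gillespie_toy(n_sites=50, t_end=5.0, k_co=1.0, k_o=1.0, k_rx=2.0, seed=1):
    """Gillespie stochastic simulation of a toy CO-oxidation-like scheme
    on n_sites adsorption sites (well-mixed simplification of a lattice model).
    Reactions:
      empty -> CO         rate k_co * n_empty
      empty -> O          rate k_o  * n_empty
      CO + O -> CO2(gas)  rate k_rx * n_co * n_o   (frees two sites)
    Returns (final time, number of CO2 molecules produced)."""
    rng = random.Random(seed)
    n_co = n_o = n_co2 = 0
    t = 0.0
    while t < t_end:
        n_empty = n_sites - n_co - n_o
        rates = [k_co * n_empty, k_o * n_empty, k_rx * n_co * n_o]
        total = sum(rates)
        if total == 0.0:          # no reaction possible: absorbing state
            break
        t += rng.expovariate(total)       # exponentially distributed waiting time
        r = rng.uniform(0.0, total)       # pick a reaction proportional to its rate
        if r < rates[0]:
            n_co += 1
        elif r < rates[0] + rates[1]:
            n_o += 1
        else:
            n_co, n_o, n_co2 = n_co - 1, n_o - 1, n_co2 + 1
    return t, n_co2
```

Each run samples one trajectory of the master equation; expectations must be estimated by averaging many runs, which is exactly the sampling cost the tensor-train solution avoids.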

  7. Determination of efficiency of an aged HPGe detector for gaseous sources by self absorption correction and point source methods

    NASA Astrophysics Data System (ADS)

    Sarangapani, R.; Jose, M. T.; Srinivasan, T. K.; Venkatraman, B.

    2017-07-01

    Methods for the determination of the efficiency of an aged high purity germanium (HPGe) detector for gaseous sources have been presented in the paper. X-ray radiography of the detector has been performed to obtain detector dimensions for computational purposes. The dead layer thickness of the HPGe detector has been ascertained from experiments and Monte Carlo computations. Experimental work with standard point and liquid sources in several cylindrical geometries has been undertaken to obtain the energy-dependent efficiency. Monte Carlo simulations have been performed for computing efficiencies for point, liquid and gaseous sources. Self-absorption correction factors have been obtained using mathematical equations for volume sources and MCNP simulations. Self-absorption correction and point source methods have been used to estimate the efficiency for gaseous sources. The efficiencies determined from the present work have been used to estimate the activity of a cover gas sample from a fast reactor.
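The self-absorption correction mentioned above can be illustrated with the standard analytic factor for a volume source attenuating along a single axis toward the detector; the function names and the single-axis attenuation assumption are ours, a sketch rather than the paper's exact MCNP-based procedure:

```python
import math

def self_absorption_factor(mu, h):
    """Analytic self-absorption correction for a volume source of height h (cm)
    with linear attenuation coefficient mu (1/cm), assuming attenuation acts
    only along the detector axis: (1 - exp(-mu*h)) / (mu*h).
    As mu*h -> 0 the factor tends to 1 (negligible self-absorption)."""
    x = mu * h
    if x < 1e-9:
        return 1.0
    return (1.0 - math.exp(-x)) / x

def corrected_efficiency(reference_eff, mu, h):
    """Transfer a reference (e.g. point-source-derived) efficiency to a
    self-absorbing volume source (sketch)."""
    return reference_eff * self_absorption_factor(mu, h)
```

For a gas, mu is small, so the factor is close to unity; for dense liquid sources it can be a substantial correction.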

  8. Two-Dimensional Spatial Imaging of Charge Transport in Germanium Crystals at Cryogenic Temperatures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moffatt, Robert

    2016-03-01

    In this dissertation, I describe a novel apparatus for studying the transport of charge in semiconductors at cryogenic temperatures. The motivation to conduct this experiment originated from an asymmetry observed between the behavior of electrons and holes in the germanium detector crystals used by the Cryogenic Dark Matter Search (CDMS). This asymmetry is a consequence of the anisotropic propagation of electrons in germanium at cryogenic temperatures. To better model our detectors, we incorporated this effect into our Monte Carlo simulations of charge transport. The purpose of the experiment described in this dissertation is to test those models in detail. Our measurements have allowed us to discover a shortcoming in our most recent Monte Carlo simulations of electrons in germanium. This discovery would not have been possible without the measurement of the full, two-dimensional charge distribution, which our experimental apparatus has allowed for the first time at cryogenic temperatures.

  9. Elastic constants of hcp 4He: Path-integral Monte Carlo results versus experiment

    NASA Astrophysics Data System (ADS)

    Ardila, Luis Aldemar Peña; Vitiello, Silvio A.; de Koning, Maurice

    2011-09-01

    The elastic constants of hcp 4He are computed using the path-integral Monte Carlo (PIMC) method. The stiffness coefficients are obtained by imposing different distortions to a periodic cell containing 180 atoms, followed by measurement of the elements of the corresponding stress tensor. For this purpose an appropriate path-integral expression for the stress tensor observable is derived and implemented into the pimc++ package. In addition to allowing the determination of the elastic stiffness constants, this development also opens the way to an explicit atomistic determination of the Peierls stress for dislocation motion using the PIMC technique. A comparison of the results to available experimental data shows an overall good agreement of the density dependence of the elastic constants, with the single exception of C13. Additional calculations for the bcc phase, on the other hand, show good agreement for all elastic constants.
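The stiffness-from-stress procedure described above can be sketched as a finite-difference derivative of the measured stress with respect to the imposed cell distortion (Voigt notation); here `stress_fn` stands in for the PIMC stress measurement and is an assumption of this sketch:

```python
def elastic_constant(stress_fn, i, j, eps=1e-3):
    """Central finite-difference estimate of a stiffness coefficient
    C_ij = d(sigma_i)/d(epsilon_j) in Voigt notation (i, j in 0..5).
    stress_fn(strain) must return the 6-component stress produced by a
    6-component applied strain; in practice it would wrap a (PIMC or other)
    stress-tensor measurement on the distorted cell."""
    strain_p = [0.0] * 6
    strain_m = [0.0] * 6
    strain_p[j] = +eps          # distort the cell in the +j direction
    strain_m[j] = -eps          # and in the -j direction
    sp = stress_fn(strain_p)
    sm = stress_fn(strain_m)
    return (sp[i] - sm[i]) / (2.0 * eps)
```

With a simulated stress observable, the statistical error of each stress evaluation propagates into C_ij, so eps must be large enough that the stress difference exceeds the noise.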

  10. Monte Carlo analysis of megavoltage x-ray interaction-induced signal and noise in cadmium tungstate detectors for cargo container inspection

    NASA Astrophysics Data System (ADS)

    Kim, J.; Park, J.; Kim, J.; Kim, D. W.; Yun, S.; Lim, C. H.; Kim, H. K.

    2016-11-01

    For the purpose of designing an x-ray detector system for cargo container inspection, we have investigated the energy-absorption signal and noise in CdWO4 detectors for megavoltage x-ray photons. We describe the signal and noise measures, such as quantum efficiency, average energy absorption, Swank noise factor, and detective quantum efficiency (DQE), in terms of energy moments of absorbed energy distributions (AEDs) in a detector. The AED is determined by using a Monte Carlo simulation. The results show that the signal-related measures increase with detector thickness. However, the improvement of Swank noise factor with increasing thickness is weak, and this energy-absorption noise characteristic dominates the DQE performance. The energy-absorption noise mainly limits the signal-to-noise performance of CdWO4 detectors operated at megavoltage x-ray beam.
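The moment-based Swank factor and zero-frequency DQE mentioned above follow standard definitions; a minimal sketch, with the absorbed-energy distribution supplied as discrete (energy, probability) pairs:

```python
def swank_factor(energies, probs):
    """Swank (information) factor from an absorbed-energy distribution (AED):
    A_S = M1^2 / (M0 * M2), where M_n is the n-th moment of the AED.
    A delta-function AED gives A_S = 1; broader AEDs give A_S < 1."""
    m0 = sum(probs)
    m1 = sum(e * p for e, p in zip(energies, probs))
    m2 = sum(e * e * p for e, p in zip(energies, probs))
    return m1 * m1 / (m0 * m2)

def dqe_zero(quantum_efficiency, energies, probs):
    """Zero-frequency DQE of an energy-integrating detector (sketch):
    DQE(0) = eta * A_S, with eta the quantum (interaction) efficiency and
    the AED taken over interacting photons."""
    return quantum_efficiency * swank_factor(energies, probs)
```

This makes the abstract's point concrete: thickness raises eta, but if the AED stays broad (energy-absorption noise), A_S, and hence DQE(0), remains limited.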

  11. Estimation of absorbed doses from paediatric cone-beam CT scans: MOSFET measurements and Monte Carlo simulations.

    PubMed

    Kim, Sangroh; Yoshizumi, Terry T; Toncheva, Greta; Frush, Donald P; Yin, Fang-Fang

    2010-03-01

    The purpose of this study was to establish a dose estimation tool with Monte Carlo (MC) simulations. A 5-y-old paediatric anthropomorphic phantom was computed tomography (CT) scanned to create a voxelised phantom and used as an input for the abdominal cone-beam CT in a BEAMnrc/EGSnrc MC system. An X-ray tube model of the Varian On-Board Imager® was built in the MC system. To validate the model, the absorbed doses at each organ location for standard-dose and low-dose modes were measured in the physical phantom with MOSFET detectors; effective doses were also calculated. The MC simulation results were comparable to the MOSFET measurements. This voxelised phantom approach could produce a more accurate dose estimation than the stylised phantom method. This model can be easily applied to multi-detector CT dosimetry.

  12. Portable multi-node LQCD Monte Carlo simulations using OpenACC

    NASA Astrophysics Data System (ADS)

    Bonati, Claudio; Calore, Enrico; D'Elia, Massimo; Mesiti, Michele; Negro, Francesco; Sanfilippo, Francesco; Schifano, Sebastiano Fabio; Silvi, Giorgio; Tripiccione, Raffaele

    This paper describes a state-of-the-art parallel Lattice QCD Monte Carlo code for staggered fermions, purposely designed to be portable across different computer architectures, including GPUs and commodity CPUs. Portability is achieved using the OpenACC parallel programming model, used to develop a code that can be compiled for several processor architectures. The paper focuses on parallelization across multiple computing nodes, using OpenACC to manage parallelism within each node and OpenMPI to manage parallelism among nodes. We first discuss the available strategies for maximizing performance, then describe selected relevant details of the code, and finally measure the performance and scaling that we are able to achieve. The work focuses mainly on GPUs, which offer significantly higher performance for this application, but also compares with results measured on other processors.

  13. An analytical model of leakage neutron equivalent dose for passively-scattered proton radiotherapy and validation with measurements.

    PubMed

    Schneider, Christopher; Newhauser, Wayne; Farah, Jad

    2015-05-18

    Exposure to stray neutrons increases the risk of second cancer development after proton therapy. Previously reported analytical models of this exposure were difficult to configure and had not been investigated below 100 MeV proton energy. The purposes of this study were to test an analytical model of neutron equivalent dose per therapeutic absorbed dose (H/D) at 75 MeV and to improve the model by reducing the number of configuration parameters and making it continuous in proton energy from 100 to 250 MeV. To develop the analytical model, we used previously published H/D values in water from Monte Carlo simulations of a general-purpose beamline for proton energies from 100 to 250 MeV. We also configured and tested the model on in-air neutron equivalent doses measured for a 75 MeV ocular beamline. Predicted H/D values from the analytical model and Monte Carlo agreed well from 100 to 250 MeV (10% average difference). Predicted H/D values from the analytical model also agreed well with measurements at 75 MeV (15% average difference). The results indicate that analytical models can give fast, reliable calculations of neutron exposure after proton therapy. This ability is absent in treatment planning systems but vital to second cancer risk estimation.

  14. Technical Note: Dosimetry of Leipzig and Valencia applicators without the plastic cap

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Granero, D., E-mail: dgranero@eresa.com; Candela-Juan, C.; Vijande, J.

    2016-05-15

    Purpose: High dose rate (HDR) brachytherapy for treatment of small skin lesions using the Leipzig and Valencia applicators is a widely used technique. These applicators are equipped with an attachable plastic cap to be placed during fraction delivery to ensure electronic equilibrium and to prevent secondary electrons from reaching the skin surface. The purpose of this study is to report on the dosimetric impact of the cap being absent during HDR fraction delivery, which has not been explored previously in the literature. Methods: GEANT4 Monte Carlo simulations (version 10.0) have been performed for the Leipzig and Valencia applicators with and without the plastic cap. In order to validate the Monte Carlo simulations, experimental measurements using radiochromic films have been done. Results: The dose absorbed within 1 mm of the skin surface increases by 1500% for the Leipzig applicators and by 180% for the Valencia applicators. Deeper than 1 mm, the overdosage flattens to about a 10% increase. Conclusions: Differences between treating with and without the plastic cap are significant. Users must always check that the plastic cap is in place before any treatment in order to avoid overdosage of the skin. Prior to skin HDR fraction delivery, the timeout checklist should include verification of the cap placement.

  15. Cellular signaling identifiability analysis: a case study.

    PubMed

    Roper, Ryan T; Pia Saccomani, Maria; Vicini, Paolo

    2010-05-21

    Two primary purposes for mathematical modeling in cell biology are (1) simulation for making predictions of experimental outcomes and (2) parameter estimation for drawing inferences from experimental data about unobserved aspects of biological systems. While the former purpose has become common in the biological sciences, the latter is less common, particularly when studying cellular and subcellular phenomena such as signaling, the focus of the current study. Data are difficult to obtain at this level. Therefore, even models of only modest complexity can contain parameters for which the available data are insufficient for estimation. In the present study, we use a set of published cellular signaling models to address issues related to global parameter identifiability. That is, we address the following question: assuming known time courses for some model variables, which parameters is it theoretically impossible to estimate, even with continuous, noise-free data? Following an introduction to this problem and its relevance, we perform a full identifiability analysis on a set of cellular signaling models using DAISY (Differential Algebra for the Identifiability of SYstems). We use our analysis to bring to light important issues related to parameter identifiability in ordinary differential equation (ODE) models. We contend that this is, as yet, an under-appreciated issue in biological modeling and, more particularly, cell biology. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  16. SU-F-I-53: Coded Aperture Coherent Scatter Spectral Imaging of the Breast: A Monte Carlo Evaluation of Absorbed Dose

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, R; Lakshmanan, M; Fong, G

    Purpose: Coherent scatter based imaging has shown improved contrast and molecular specificity over conventional digital mammography; however, the biological risks have not been quantified due to a lack of accurate information on absorbed dose. This study intends to characterize the dose distribution and average glandular dose from coded aperture coherent scatter spectral imaging of the breast. The dose deposited in the breast from this new diagnostic imaging modality has not yet been quantitatively evaluated. Here, various digitized anthropomorphic phantoms are tested in a Monte Carlo simulation to evaluate the absorbed dose distribution and average glandular dose using clinically feasible scan protocols. Methods: Geant4 Monte Carlo radiation transport simulation software is used to replicate the coded aperture coherent scatter spectral imaging system. Energy sensitive, photon counting detectors are used to characterize the x-ray beam spectra for various imaging protocols. This input spectrum is cross-validated with results from XSPECT, a commercially available application that yields x-ray tube specific spectra for the operating parameters employed. XSPECT is also used to determine the appropriate number of photons emitted per mAs of tube current at a given kVp tube potential. With the implementation of the XCAT digital anthropomorphic breast phantom library, a variety of breast sizes with differing anatomical structure are evaluated. Simulations were performed with and without compression of the breast for dose comparison. Results: Through the Monte Carlo evaluation of a diverse population of breast types imaged under real-world scan conditions, a clinically relevant average glandular dose for this new imaging modality is extrapolated. Conclusion: With access to the physical coherent scatter imaging system used in the simulation, the results of this Monte Carlo study may be used to directly influence the future development of the modality to keep breast dose to a minimum while still maintaining clinically viable image quality.

  17. SU-F-T-281: Monte Carlo Investigation of Sources of Dosimetric Discrepancies with 2D Arrays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Afifi, M; Deiab, N; El-Farrash, A

    2016-06-15

    Purpose: Intensity modulated radiation therapy (IMRT) poses a number of challenges for properly measuring commissioning data and for quality assurance (QA). Understanding the limitations and use of dosimeters to measure these dose distributions is critical to safe IMRT implementation. In this work, we used Monte Carlo simulations to investigate the possible sources of discrepancy between our measurements with a 2D array system and our dose calculations using our treatment planning system (TPS). Material and Methods: The MCBEAM and MCSIM Monte Carlo codes were used for treatment head simulation and phantom dose calculation. Accurate modeling of a 6MV beam from a Varian Trilogy machine was verified by comparing simulated and measured percentage depth doses and profiles. The dose distribution inside the 2D array was calculated using Monte Carlo simulations and our TPS. Cross profiles for different field sizes were then compared with actual measurements for 0° and 90° gantry angle setups. Through this analysis and comparison, we tried to determine the differences and quantify a possible angular calibration factor. Results: Minimal discrepancies were seen between the simulated and measured profiles at the 0° gantry angle for all studied field sizes (4×4cm{sup 2}, 10×10cm{sup 2}, 15×15cm{sup 2}, and 20×20cm{sup 2}). Discrepancies between our measurements and calculations increased dramatically for the cross beam profiles at the 90° gantry angle. This can be ascribed mainly to the different attenuation caused by the layer of electronics at the base behind the ion chambers in the 2D array; the degree of attenuation varies with the angle of beam incidence. Correction factors were implemented to correct the errors. Conclusion: Monte Carlo modeling of the 2D arrays and the derivation of angular dependence correction factors will allow for improved accuracy of the device for IMRT QA.

  18. SU-E-T-467: Implementation of Monte Carlo Dose Calculation for a Multileaf Collimator Equipped Robotic Radiotherapy System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, JS; Fan, J; Ma, C-M

    Purpose: To improve treatment efficiency and capabilities for full-body treatment, a robotic radiosurgery system has been equipped with a multileaf collimator (MLC) to extend its accuracy and precision to radiation therapy. The goal of this work is to model the MLC and include it in the Monte Carlo patient dose calculation. Methods: The radiation source and the MLC were carefully modeled to account for the effects of the source size, collimator scattering, leaf transmission and leaf end shape. A source model was built based on the output factors, percentage depth dose curves and lateral dose profiles measured in a water phantom. MLC leaf shape, leaf end design and leaf tilt for minimizing the interleaf leakage, and their effects on beam fluence and energy spectrum, were all considered in the calculation. Transmission/leakage was added to the fluence based on the transmission factors of the leaf and the leaf end. The transmitted photon energy was tuned to account for beam hardening effects. The results calculated with the Monte Carlo implementation were compared with measurements in a homogeneous water phantom and in inhomogeneous phantoms with slab lung or bone material for 4 square fields and 9 irregularly shaped fields. Results: The calculated output factors agree with the measured ones within 1% for different field sizes. The calculated dose distributions in the phantoms show good agreement with measurements using diode detectors and films. The dose difference is within 2% inside the field and the distance to agreement is within 2 mm in the penumbra region. The gamma passing rate is more than 95% with 2%/2mm criteria for all the test cases. Conclusion: Implementation of Monte Carlo dose calculation for an MLC-equipped robotic radiosurgery system was completed successfully. The accuracy of Monte Carlo dose calculation with the MLC is clinically acceptable. This work was supported by Accuray Inc.

  19. Monte Carlo method for photon heating using temperature-dependent optical properties.

    PubMed

    Slade, Adam Broadbent; Aguilar, Guillermo

    2015-02-01

    The Monte Carlo method for photon transport is often used to predict the volumetric heating that an optical source will induce inside a tissue or material. This method relies on constant (with respect to temperature) optical properties, specifically the coefficients of scattering and absorption. In reality, optical coefficients are typically temperature-dependent, leading to error in simulation results. The purpose of this study is to develop a method that can incorporate variable properties and accurately simulate systems where the temperature will greatly vary, such as in the case of laser-thawing of frozen tissues. A numerical simulation was developed that utilizes the Monte Carlo method for photon transport to simulate the thermal response of a system with temperature-dependent optical and thermal properties. This was done by combining traditional Monte Carlo photon transport with a heat transfer simulation to provide a feedback loop that selects local properties based on current temperatures, for each moment in time. Additionally, photon steps are segmented to accurately obtain path lengths within a homogeneous (but not isothermal) material. Validation of the simulation was done using comparisons to established Monte Carlo simulations using constant properties, and a comparison to the Beer-Lambert law for temperature-variable properties. The simulation is able to accurately predict the thermal response of a system whose properties can vary with temperature. The difference in results between variable-property and constant-property methods for the representative system of laser-heated silicon can become larger than 100 K. This simulation will return more accurate results of optical irradiation absorption in a material which undergoes a large change in temperature. This increased accuracy in simulated results leads to better thermal predictions in living tissues and can provide enhanced planning and improved experimental and procedural outcomes.
Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
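The feedback loop described above, in which local optical properties are re-selected from current temperatures at each time step, can be sketched in one dimension with absorption only; `mu_a_of_T`, the bin geometry, and the crude temperature update are assumptions of this sketch, not the paper's tissue model:

```python
import math
import random

def heat_1d(mu_a_of_T, depth=1.0, n_bins=20, n_photons=2000,
            n_steps=5, heat_per_photon=1.0, seed=0):
    """Monte Carlo photon heating with temperature-dependent absorption (sketch).
    Absorption-only 1D transport: each photon is absorbed at an exponentially
    distributed optical depth, traversed piecewise through bins whose local
    absorption coefficient mu_a depends on the bin's current temperature;
    deposited energy raises that bin's temperature before the next time step.
    mu_a_of_T is a user-supplied model of mu_a(T) in 1/[depth units]."""
    rng = random.Random(seed)
    dz = depth / n_bins
    temps = [0.0] * n_bins
    for _ in range(n_steps):
        deposited = [0.0] * n_bins
        for _ in range(n_photons):
            tau = -math.log(1.0 - rng.random())   # sampled optical depth
            for b in range(n_bins):
                step_tau = mu_a_of_T(temps[b]) * dz   # local property lookup
                if tau <= step_tau:                    # absorbed in this bin
                    deposited[b] += heat_per_photon
                    break
                tau -= step_tau                        # traverse bin, continue
            # photons with remaining tau exit the slab undeposited
        # crude feedback: temperature rise proportional to local deposition
        for b in range(n_bins):
            temps[b] += deposited[b] / n_photons
    return temps
```

With a constant `mu_a_of_T`, the deposition profile reproduces Beer-Lambert attenuation; a temperature-dependent coefficient makes later time steps see a different absorption profile, which is the effect the paper quantifies.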

  20. Convergence Time and Phase Transition in a Non-monotonic Family of Probabilistic Cellular Automata

    NASA Astrophysics Data System (ADS)

    Ramos, A. D.; Leite, A.

    2017-08-01

    In dynamical systems, some of the most important questions are related to phase transitions and convergence time. We consider a one-dimensional probabilistic cellular automaton whose components assume two possible states, zero and one, and interact with their two nearest neighbors at each time step. Under the local interaction, if a component is in the same state as its two neighbors, it does not change its state. In the other cases, a component in state zero turns into a one with probability α, and a component in state one turns into a zero with probability 1-β. For certain values of α and β, we show that the process will always converge weakly to δ0, the measure concentrated on the configuration where all the components are zeros. Moreover, the mean time of this convergence is finite, and we describe an upper bound in this case, which is a linear function of the initial distribution. We also demonstrate an application of our results to the percolation PCA. Finally, we use mean-field approximation and Monte Carlo simulations to show coexistence of three distinct behaviours for some values of the parameters α and β.
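The local rule described above is easy to simulate directly; a minimal sketch, with an alternating initial configuration and parameter values chosen for illustration (not the paper's bounds):

```python
import random

def pca_step(config, alpha, beta, rng):
    """One synchronous update of the probabilistic cellular automaton with
    periodic boundaries: a cell agreeing with both nearest neighbours keeps
    its state; otherwise 0 -> 1 with probability alpha and 1 -> 0 with
    probability 1 - beta."""
    n = len(config)
    new = [0] * n
    for i in range(n):
        left, me, right = config[i - 1], config[i], config[(i + 1) % n]
        if left == me == right:
            new[i] = me
        elif me == 0:
            new[i] = 1 if rng.random() < alpha else 0
        else:
            new[i] = 0 if rng.random() < 1 - beta else 1
    return new

def time_to_all_zeros(n=30, alpha=0.05, beta=0.2, max_steps=10000, seed=3):
    """Monte Carlo estimate of the hitting time of the all-zeros configuration
    from an alternating start (illustrative parameters in a regime where ones
    tend to die out); returns None if not absorbed within max_steps."""
    rng = random.Random(seed)
    config = [i % 2 for i in range(n)]
    for t in range(max_steps):
        if not any(config):
            return t
        config = pca_step(config, alpha, beta, rng)
    return None
```

Note that both homogeneous configurations (all zeros and all ones) are invariant under the rule, which is why convergence to δ0 depends on α and β and on the initial distribution.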

  1. 135La as an Auger-electron emitter for targeted internal radiotherapy

    NASA Astrophysics Data System (ADS)

    Fonslet, J.; Lee, B. Q.; Tran, T. A.; Siragusa, M.; Jensen, M.; Kibédi, T.; E Stuchbery, A.; Severin, G. W.

    2018-01-01

    135La has favorable nuclear and chemical properties for Auger-based targeted internal radiotherapy. Here we present detailed investigations of the production, emissions, and dosimetry related to 135La therapy. 135La was produced by 16.5 MeV proton irradiation of metallic natBa on a medical cyclotron, and was isolated and purified by trap-and-release on weak cation-exchange resin. The average production rate was 407 ± 19 MBq µA-1 (saturation activity), and the radionuclidic purity was 98% at 20 h post irradiation. Chemical separation recovered > 98% of the 135La with an effective molar activity of 70 ± 20 GBq µmol-1. To better assess cellular and organ dosimetry of this nuclide, we have calculated the x-ray and Auger emission spectra using a Monte Carlo model accounting for effects of multiple vacancies during the Auger cascade. The generated Auger spectrum was used to calculate cellular S-factors. 135La was produced with high specific activity, reactivity, radionuclidic purity, and yield. The emission spectrum and the dosimetry are favorable for internal radionuclide therapy.
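The quoted production rate is a saturation activity, so the activity at end of bombardment follows the standard activation build-up A = A_sat · I · (1 − e^(−λt)). A small sketch, assuming a 135La half-life of roughly 19.5 h (an approximate literature value, taken here as an assumption):

```python
import math

T_HALF_H = 19.5  # assumed 135La half-life in hours (approximate)

def produced_activity(a_sat_per_uA, current_uA, t_irr_h, t_half_h=T_HALF_H):
    """End-of-bombardment activity from the standard activation build-up:
    A = A_sat * I * (1 - exp(-lambda * t)), where a_sat_per_uA is the
    saturation activity per unit beam current (e.g. MBq/µA), current_uA the
    beam current, and t_irr_h the irradiation time in hours."""
    lam = math.log(2.0) / t_half_h     # decay constant
    return a_sat_per_uA * current_uA * (1.0 - math.exp(-lam * t_irr_h))
```

For short irradiations the yield grows almost linearly with time; it then saturates as production and decay balance.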

  2. DESHARKY: automatic design of metabolic pathways for optimal cell growth.

    PubMed

    Rodrigo, Guillermo; Carrera, Javier; Prather, Kristala Jones; Jaramillo, Alfonso

    2008-11-01

    Biological synthesis or remediation of organic compounds using living organisms, particularly bacteria and yeast, has been promoted because it reduces costs with respect to the non-living chemical approach. Computational frameworks can profit from the previous knowledge stored in large databases of compounds, enzymes and reactions. In addition, cell behavior can be studied by modeling the cellular context. We have implemented a Monte Carlo algorithm (DESHARKY) that finds a metabolic pathway from a target compound by exploring a database of enzymatic reactions. DESHARKY outputs a biochemical route to the host metabolism together with its impact on the cellular context, using mathematical models of the cell resources and metabolism. Furthermore, we provide the amino acid sequences for the enzymes involved in the route that are phylogenetically closest to the considered organism. We provide examples of designed metabolic pathways with their genetic load characterizations. Here, we have used Escherichia coli as the host organism. Our bioinformatic tool can be applied to biodegradation or biosynthesis, and its performance scales with the database size. Software, a tutorial and examples are freely available and open source at http://soft.synth-bio.org/desharky.html

  3. An efficient Monte Carlo-based algorithm for scatter correction in keV cone-beam CT

    NASA Astrophysics Data System (ADS)

    Poludniowski, G.; Evans, P. M.; Hansen, V. N.; Webb, S.

    2009-06-01

    A new method is proposed for scatter-correction of cone-beam CT images. A coarse reconstruction is used in initial iteration steps. Modelling of the x-ray tube spectra and detector response are included in the algorithm. Photon diffusion inside the imaging subject is calculated using the Monte Carlo method. Photon scoring at the detector is calculated using forced detection to a fixed set of node points. The scatter profiles are then obtained by linear interpolation. The algorithm is referred to as the coarse reconstruction and fixed detection (CRFD) technique. Scatter predictions are quantitatively validated against a widely used general-purpose Monte Carlo code: BEAMnrc/EGSnrc (NRCC, Canada). Agreement is excellent. The CRFD algorithm was applied to projection data acquired with a Synergy XVI CBCT unit (Elekta Limited, Crawley, UK), using RANDO and Catphan phantoms (The Phantom Laboratory, Salem NY, USA). The algorithm was shown to be effective in removing scatter-induced artefacts from CBCT images, and took as little as 2 min on a desktop PC. Image uniformity was greatly improved as was CT-number accuracy in reconstructions. This latter improvement was less marked where the expected CT-number of a material was very different to the background material in which it was embedded.

  4. Theory for the three-dimensional Mercedes-Benz model of water.

    PubMed

    Bizjak, Alan; Urbic, Tomaz; Vlachy, Vojko; Dill, Ken A

    2009-11-21

    The two-dimensional Mercedes-Benz (MB) model of water has been widely studied, both by Monte Carlo simulations and by integral equation methods. Here, we study the three-dimensional (3D) MB model. We treat water as spheres that interact through Lennard-Jones potentials and through a tetrahedral Gaussian hydrogen bonding function. As the "right answer," we perform isothermal-isobaric Monte Carlo simulations on the 3D MB model for different pressures and temperatures. The purpose of this work is to develop and test Wertheim's Ornstein-Zernike integral equation and thermodynamic perturbation theories. The two analytical approaches are orders of magnitude more efficient than the Monte Carlo simulations. The ultimate goal is to find statistical mechanical theories that can efficiently predict the properties of orientationally complex molecules, such as water. Also, here, the 3D MB model simply serves as a useful workbench for testing such analytical approaches. For hot water, the analytical theories give accurate agreement with the computer simulations. For cold water, the agreement is not as good. Nevertheless, these approaches are qualitatively consistent with energies, volumes, heat capacities, compressibilities, and thermal expansion coefficients versus temperature and pressure. Such analytical approaches offer a promising route to a better understanding of water and also the aqueous solvation.
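The pair interaction described above (a Lennard-Jones core plus a Gaussian hydrogen-bond reward) can be sketched in its simpler 2D-model form; the 3D model generalizes the angular part to tetrahedral arms. Parameter values follow common reduced-unit choices for the MB model and are used here as assumptions:

```python
import math

def gaussian(x, sigma):
    """Unnormalized Gaussian, exp(-x^2 / (2 sigma^2))."""
    return math.exp(-x * x / (2.0 * sigma * sigma))

def mb_pair_energy(r, cos_i, cos_j,
                   eps_lj=0.1, sigma_lj=0.7,
                   eps_hb=1.0, r_hb=1.0, sigma=0.085):
    """Illustrative Mercedes-Benz-style pair energy (2D-model form).
    Lennard-Jones core plus a Gaussian hydrogen-bond term that is strongest
    when the separation equals r_hb and one arm of molecule i points along
    the intermolecular axis toward j (cos_i ~ +1) while an arm of j points
    back along it (cos_j ~ -1). Reduced units; parameters are assumptions."""
    sr6 = (sigma_lj / r) ** 6
    u_lj = 4.0 * eps_lj * (sr6 * sr6 - sr6)
    u_hb = -eps_hb * gaussian(r - r_hb, sigma) \
                   * gaussian(cos_i - 1.0, sigma) \
                   * gaussian(cos_j + 1.0, sigma)
    return u_lj + u_hb
```

The narrow Gaussian widths make the hydrogen bond highly directional, which is what produces water-like anomalies in both the 2D and 3D versions of the model.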

  5. Theory for the three-dimensional Mercedes-Benz model of water

    PubMed Central

    Bizjak, Alan; Urbic, Tomaz; Vlachy, Vojko; Dill, Ken A.

    2009-01-01

    The two-dimensional Mercedes-Benz (MB) model of water has been widely studied, both by Monte Carlo simulations and by integral equation methods. Here, we study the three-dimensional (3D) MB model. We treat water as spheres that interact through Lennard-Jones potentials and through a tetrahedral Gaussian hydrogen bonding function. As the “right answer,” we perform isothermal-isobaric Monte Carlo simulations on the 3D MB model for different pressures and temperatures. The purpose of this work is to develop and test Wertheim’s Ornstein–Zernike integral equation and thermodynamic perturbation theories. The two analytical approaches are orders of magnitude more efficient than the Monte Carlo simulations. The ultimate goal is to find statistical mechanical theories that can efficiently predict the properties of orientationally complex molecules, such as water. Also, here, the 3D MB model simply serves as a useful workbench for testing such analytical approaches. For hot water, the analytical theories give accurate agreement with the computer simulations. For cold water, the agreement is not as good. Nevertheless, these approaches are qualitatively consistent with energies, volumes, heat capacities, compressibilities, and thermal expansion coefficients versus temperature and pressure. Such analytical approaches offer a promising route to a better understanding of water and also the aqueous solvation. PMID:19929057

  6. Theory for the three-dimensional Mercedes-Benz model of water

    NASA Astrophysics Data System (ADS)

    Bizjak, Alan; Urbic, Tomaz; Vlachy, Vojko; Dill, Ken A.

    2009-11-01

    The two-dimensional Mercedes-Benz (MB) model of water has been widely studied, both by Monte Carlo simulations and by integral equation methods. Here, we study the three-dimensional (3D) MB model. We treat water as spheres that interact through Lennard-Jones potentials and through a tetrahedral Gaussian hydrogen bonding function. As the "right answer," we perform isothermal-isobaric Monte Carlo simulations on the 3D MB model for different pressures and temperatures. The purpose of this work is to develop and test Wertheim's Ornstein-Zernike integral equation and thermodynamic perturbation theories. The two analytical approaches are orders of magnitude more efficient than the Monte Carlo simulations. The ultimate goal is to find statistical mechanical theories that can efficiently predict the properties of orientationally complex molecules, such as water. Also, here, the 3D MB model simply serves as a useful workbench for testing such analytical approaches. For hot water, the analytical theories give accurate agreement with the computer simulations. For cold water, the agreement is not as good. Nevertheless, these approaches are qualitatively consistent with energies, volumes, heat capacities, compressibilities, and thermal expansion coefficients versus temperature and pressure. Such analytical approaches offer a promising route to a better understanding of water and also the aqueous solvation.

  7. [Evaluation of Organ Dose Estimation from Indices of CT Dose Using Dose Index Registry].

    PubMed

    Iriuchijima, Akiko; Fukushima, Yasuhiro; Ogura, Akio

Direct measurement of each patient's organ dose from computed tomography (CT) is not possible. Most methods for estimating patient organ dose use Monte Carlo simulation with dedicated software. However, dedicated software is too expensive for small-scale hospitals, so not every hospital can estimate organ dose this way. The purpose of this study was to evaluate a simple method of organ dose estimation using common indices of CT dose. The Monte Carlo simulation software Radimetrics (Bayer) was used for calculating organ dose and analyzing the relationship between indices of CT dose and organ dose. Multidetector CT scanners from two manufacturers were compared (LightSpeed VCT, GE Healthcare; SOMATOM Definition Flash, Siemens Healthcare). Using stored patient data from Radimetrics, the relationships between indices of CT dose and organ dose were expressed as formulas for estimating organ dose. The accuracy of the estimation method was assessed against the results of Monte Carlo simulation using Bland-Altman plots. In the results, SSDE was a feasible index for estimating organ dose in almost all organs because it reflects each patient's size. The differences in organ dose between estimation and simulation were within 23%. In conclusion, our method of estimating organ dose from indices of CT dose is convenient for clinical use, with reasonable accuracy.
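The estimation approach described, fitting a linear formula from a CT dose index such as SSDE to the simulated organ dose and then checking agreement with Bland-Altman statistics, can be sketched with synthetic data; the coefficients and noise level below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic cohort: SSDE (mGy) against simulated organ dose (mGy); the
# linear relationship and noise level are invented for illustration
ssde = rng.uniform(5.0, 25.0, size=50)
organ_dose = 1.1 * ssde + 0.5 + rng.normal(0.0, 0.4, size=50)

# Per-organ estimation formula: organ_dose ≈ a * SSDE + b
a, b = np.polyfit(ssde, organ_dose, 1)
estimated = a * ssde + b

# Bland-Altman statistics: mean difference (bias) and limits of agreement
diff = estimated - organ_dose
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)
assert abs(a - 1.1) < 0.1
```

In the study, one such formula per organ replaces a full Monte Carlo run, which is what makes the method practical for hospitals without dedicated software.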

  8. Development of reversible jump Markov Chain Monte Carlo algorithm in the Bayesian mixture modeling for microarray data in Indonesia

    NASA Astrophysics Data System (ADS)

    Astuti, Ani Budi; Iriawan, Nur; Irhamah, Kuswanto, Heri

    2017-12-01

Bayesian mixture modeling requires identifying the most appropriate number of mixture components, so that the resulting mixture model fits the data through a data-driven concept. Reversible Jump Markov Chain Monte Carlo (RJMCMC) combines the reversible jump (RJ) concept with Markov Chain Monte Carlo (MCMC), and has been used by several researchers to identify the number of mixture components when that number is not known with certainty. In its application, RJMCMC uses the birth/death and split-merge concepts with six types of moves: w updating, θ updating, z updating, hyperparameter β updating, split-merge of components, and birth/death of empty components. The RJMCMC algorithm needs to be developed according to the case under observation. The purpose of this study is to assess the performance of the developed RJMCMC algorithm in identifying the unknown number of mixture components in Bayesian mixture modeling for microarray data in Indonesia. The results show that the developed RJMCMC algorithm can properly identify the number of mixture components in a Bayesian normal mixture model for the Indonesian microarray data, where the number of mixture components is not known with certainty.
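A heavily simplified sketch of the birth/death idea on the number of components k: the toy target below is a truncated Poisson rather than a real mixture posterior, and the parameter-matching transformation and Jacobian term of a full RJMCMC step are deliberately omitted, so this shows only the move structure:

```python
import math
import random

random.seed(1)

def target(k, lam=3.5):
    """Toy, un-normalised target over the number of components k:
    a Poisson(3.5) pmf truncated to 1 <= k <= 10."""
    if not 1 <= k <= 10:
        return 0.0
    return math.exp(-lam) * lam ** k / math.factorial(k)

k = 3
counts = [0] * 11
for _ in range(200_000):
    proposal = k + random.choice((-1, 1))   # birth (+1) or death (-1)
    if random.random() < min(1.0, target(proposal) / target(k)):
        k = proposal
    counts[k] += 1

# The chain should spend most time near the mode of the target (k = 3)
assert counts[3] > counts[1]
```

A real RJMCMC sampler accepts such moves with an extra factor accounting for the proposed component parameters and the Jacobian of the dimension-matching map; only the birth/death bookkeeping over k is shown here.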

  9. Bayes factors for the linear ballistic accumulator model of decision-making.

    PubMed

    Evans, Nathan J; Brown, Scott D

    2018-04-01

    Evidence accumulation models of decision-making have led to advances in several different areas of psychology. These models provide a way to integrate response time and accuracy data, and to describe performance in terms of latent cognitive processes. Testing important psychological hypotheses using cognitive models requires a method to make inferences about different versions of the models which assume different parameters to cause observed effects. The task of model-based inference using noisy data is difficult, and has proven especially problematic with current model selection methods based on parameter estimation. We provide a method for computing Bayes factors through Monte-Carlo integration for the linear ballistic accumulator (LBA; Brown and Heathcote, 2008), a widely used evidence accumulation model. Bayes factors are used frequently for inference with simpler statistical models, and they do not require parameter estimation. In order to overcome the computational burden of estimating Bayes factors via brute force integration, we exploit general purpose graphical processing units; we provide free code for this. This approach allows estimation of Bayes factors via Monte-Carlo integration within a practical time frame. We demonstrate the method using both simulated and real data. We investigate the stability of the Monte-Carlo approximation, and the LBA's inferential properties, in simulation studies.
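The brute-force strategy the abstract describes, estimating a marginal likelihood by averaging the likelihood over draws from the prior and then forming the Bayes factor, can be sketched for a toy normal model; the LBA likelihood itself is far more expensive, which is why the authors offload the integration to GPUs:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy data: y ~ Normal(mu, 1), generated with mu = 2
y = rng.normal(2.0, 1.0, size=20)

def log_lik(mu):
    """Log likelihood of y under Normal(mu, 1)."""
    return -0.5 * np.sum((y - mu) ** 2) - 0.5 * y.size * np.log(2.0 * np.pi)

# Marginal likelihood of the model "mu ~ Normal(0, 1)" by brute-force
# Monte Carlo integration over the prior
draws = rng.normal(0.0, 1.0, size=20_000)
marginal = np.mean([np.exp(log_lik(mu)) for mu in draws])

# Bayes factor of the free-mu model against the point null mu = 0
bf = marginal / np.exp(log_lik(0.0))
assert bf > 1.0   # data drawn with mu = 2 favour the free-mu model
```

Note that no parameter estimation is involved: the Bayes factor comes entirely from integrating the likelihood against the prior, which is the property the paper exploits.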

  10. Comparative analysis of 11 different radioisotopes for palliative treatment of bone metastases by computational methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guerra Liberal, Francisco D. C., E-mail: meb12020@fe.up.pt, E-mail: adriana-tavares@msn.com; Tavares, Adriana Alexandre S., E-mail: meb12020@fe.up.pt, E-mail: adriana-tavares@msn.com; Tavares, João Manuel R. S., E-mail: tavares@fe.up.pt

Purpose: Throughout the years, the palliative treatment of bone metastases using bone-seeking radiotracers has been part of the therapeutic resources used in oncology, but the choice of which bone-seeking agent to use is not consensual across sites, and limited data are available comparing the characteristics of each radioisotope. Computational simulation is a simple and practical method to study and to compare a variety of radioisotopes for different medical applications, including the palliative treatment of bone metastases. This study aims to evaluate and compare 11 different radioisotopes currently in use or under research for the palliative treatment of bone metastases using computational methods. Methods: Computational models were used to estimate the percentage of deoxyribonucleic acid (DNA) damage (fast Monte Carlo damage algorithm), the probability of correct DNA repair (Monte Carlo excision repair algorithm), and the radiation-induced cellular effects (virtual cell radiobiology algorithm) post-irradiation with selected particles emitted by phosphorus-32 (32P), strontium-89 (89Sr), yttrium-90 (90Y), tin-117m (117mSn), samarium-153 (153Sm), holmium-166 (166Ho), thulium-170 (170Tm), lutetium-177 (177Lu), rhenium-186 (186Re), rhenium-188 (188Re), and radium-223 (223Ra). Results: 223Ra alpha particles, 177Lu beta minus particles, and 170Tm beta minus particles induced the highest cell death of all investigated particles and radioisotopes. The cell survival fraction measured post-irradiation with beta minus particles emitted by 89Sr and 153Sm, two of the most frequently used radionuclides in routine clinical palliative treatment of bone metastases, was higher than with 177Lu beta minus particles and 223Ra alpha particles.
Conclusions: 223Ra and 177Lu hold the highest potential for palliative treatment of bone metastases of all radioisotopes compared in this study. Data reported here may prompt future in vitro and in vivo experiments comparing different radionuclides for palliative treatment of bone metastases, raise the need for careful rethinking of the current widespread clinical use of 89Sr and 153Sm, and perhaps strengthen the use of 223Ra and 177Lu in the palliative treatment of bone metastases.
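For intuition about survival-fraction comparisons of this kind, here is a sketch using the standard linear-quadratic model rather than the paper's virtual cell radiobiology algorithm; the parameter values are illustrative, not fitted to any radionuclide:

```python
import math

def survival_fraction(dose_gy, alpha, beta):
    """Linear-quadratic cell survival model: S = exp(-alpha*D - beta*D^2)."""
    return math.exp(-alpha * dose_gy - beta * dose_gy ** 2)

# Illustrative parameters: densely ionising alpha emitters act mostly
# through the linear term, so survival drops faster per unit dose
low_let = survival_fraction(2.0, alpha=0.2, beta=0.05)
high_let = survival_fraction(2.0, alpha=1.0, beta=0.0)
assert high_let < low_let
```

A lower survival fraction at the same absorbed dose is what translates into the higher relative biological effectiveness reported for 223Ra alpha particles.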

  11. SU-E-T-110: Development of An Independent, Monte Carlo, Dose Calculation, Quality Assurance Tool for Clinical Trials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faught, A; University of Texas Health Science Center Houston, Graduate School of Biomedical Sciences, Houston, TX; Davidson, S

    2014-06-01

Purpose: To commission a multiple-source Monte Carlo model of Elekta linear accelerator beams of nominal energies 6 MV and 10 MV. Methods: A three-source Monte Carlo model of Elekta 6 and 10 MV therapeutic x-ray beams was developed. Energy spectra of two photon sources, corresponding to primary photons created in the target and scattered photons originating in the linear accelerator head, were determined by an optimization process that fit the relative fluence of 0.25 MeV energy bins to the product of Fatigue-Life and Fermi functions to match calculated percent depth dose (PDD) data with that measured in a water tank for a 10×10 cm2 field. Off-axis effects were modeled by a 3rd-degree polynomial describing the off-axis half-value layer as a function of off-axis angle, and by fitting the off-axis fluence to a piecewise linear function to match calculated dose profiles with measured dose profiles for a 40×40 cm2 field. The model was validated by comparing calculated PDDs and dose profiles for field sizes ranging from 3×3 cm2 to 30×30 cm2 to those obtained from measurements. A benchmarking study compared calculated data to measurements for IMRT plans delivered to anthropomorphic phantoms. Results: Along the central axis of the beam, 99.6% and 99.7% of all data passed the 2%/2mm gamma criterion for the 6 and 10 MV models, respectively. Dose profiles at depths from dmax through 25 cm agreed with measured data for 99.4% and 99.6% of data tested for the 6 and 10 MV models, respectively. A comparison of calculated dose to film measurement in a head and neck phantom showed an average of 85.3% and 90.5% of pixels passing a 3%/2mm gamma criterion for the 6 and 10 MV models, respectively.
Conclusion: A Monte Carlo multiple-source model for Elekta 6 and 10 MV therapeutic x-ray beams has been developed as a quality assurance tool for clinical trials.
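The 2%/2mm gamma criterion used for validation can be sketched in one dimension; this is a global gamma on a toy depth-dose curve, whereas real implementations interpolate the evaluated distribution much more finely:

```python
import numpy as np

def gamma_1d(ref, ev, spacing_mm=1.0, dose_tol=0.02, dist_tol_mm=2.0):
    """1-D global gamma index (2%/2 mm by default): for each reference
    point, minimise sqrt((dose diff/tol)^2 + (distance/tol)^2) over the
    evaluated distribution."""
    x = np.arange(ref.size) * spacing_mm
    gam = np.empty(ref.size)
    for i in range(ref.size):
        dose_term = (ev - ref[i]) / (dose_tol * ref.max())
        dist_term = (x - x[i]) / dist_tol_mm
        gam[i] = np.sqrt(dose_term ** 2 + dist_term ** 2).min()
    return gam

ref = np.linspace(1.0, 0.2, 41)   # toy depth-dose curve, 1 mm spacing
ev = 1.01 * ref                   # evaluated dose, uniformly 1% high
passing = (gamma_1d(ref, ev) <= 1.0).mean() * 100.0
assert passing == 100.0
```

A point passes when gamma is at most 1, i.e. when the evaluated distribution comes within the combined dose and distance tolerance; the percentages quoted in the abstract are the fraction of points satisfying this.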

  12. Communities of Practice or Communities of Coping?: Employee Compliance among CSRs in Israeli Call Centres

    ERIC Educational Resources Information Center

    Raz, Aviad E.

    2007-01-01

    Purpose: The purpose of this paper is to describe and analyse the formation of CoPs (communities of practice) in three call centres of cellular communication operating companies in Israel. Design/methodology/approach: This study is based on a qualitative methodology including observations, interviews and textual analysis. Findings: In all three…

  13. A comprehensive review of metal-induced cellular transformation studies.

    PubMed

    Chen, Qiao Yi; Costa, Max

    2017-09-15

In vitro transformation assays not only serve practical purposes in screening for potential carcinogenic substances in the food, drug, and cosmetic industries, but more importantly, they provide a means of understanding the critical biological processes behind in vivo cancer development. In resemblance to cancer cells in vivo, successfully transformed cells display loss of contact inhibition, gain of anchorage-independent growth, resistance to proper cell cycle regulation such as apoptosis, a faster proliferation rate, potential for cellular invasion, and the ability to form tumors in experimental animals. Cells purposely transformed using metal exposures enable researchers to examine molecular changes, dissect various stages of tumor formation, and ultimately elucidate the mode of action of metal-induced cancer. For practical purposes, this review specifically focuses on studies incorporating As-, Cd-, Cr-, and Ni-induced cell transformation. Through investigating and comparing an extensive list of studies using various methods of metal-induced transformation, this review serves to bridge an information gap and provide a guide for avoiding procedural discrepancies as well as maximizing experimental efficiency. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. qpure: A Tool to Estimate Tumor Cellularity from Genome-Wide Single-Nucleotide Polymorphism Profiles

    PubMed Central

    Song, Sarah; Nones, Katia; Miller, David; Harliwong, Ivon; Kassahn, Karin S.; Pinese, Mark; Pajic, Marina; Gill, Anthony J.; Johns, Amber L.; Anderson, Matthew; Holmes, Oliver; Leonard, Conrad; Taylor, Darrin; Wood, Scott; Xu, Qinying; Newell, Felicity; Cowley, Mark J.; Wu, Jianmin; Wilson, Peter; Fink, Lynn; Biankin, Andrew V.; Waddell, Nic; Grimmond, Sean M.; Pearson, John V.

    2012-01-01

Tumour cellularity, the relative proportion of tumour and normal cells in a sample, affects the sensitivity of mutation detection, copy number analysis, cancer gene expression and methylation profiling. Tumour cellularity is traditionally estimated by pathological review of sectioned specimens; however this method is both subjective and prone to error due to heterogeneity within lesions and cellularity differences between the sample viewed during pathological review and the tissue used for research purposes. In this paper we describe a statistical model to estimate tumour cellularity from SNP array profiles of paired tumour and normal samples using shifts in SNP allele frequency at regions of loss of heterozygosity (LOH) in the tumour. We also provide qpure, a software implementation of the method. Our experiments showed that there is a medium correlation of 0.42 (P-value = 0.0001) between tumour cellularity estimated by qpure and pathology review. Interestingly, there is a high correlation of 0.87 (P-value < 2.2e-16) between cellularity estimates by qpure and deep Ion Torrent sequencing of known somatic KRAS mutations, and a weaker correlation of 0.32 (P-value = 0.004) between Ion Torrent sequencing and pathology review. This suggests that qpure may be a more accurate predictor of tumour cellularity than pathology review. qpure can be downloaded from https://sourceforge.net/projects/qpure/. PMID:23049875
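A heavily simplified version of the allele-shift idea, not qpure's actual statistical model: at a region where the tumour has lost one allele, the observed B-allele frequency (BAF) moves away from the diploid value of 0.5 by an amount determined by the tumour cell fraction c:

```python
def cellularity_from_baf(baf):
    """Tumour cellularity c from the B-allele frequency at an LOH region,
    under a simplified model: normal cells are diploid AB, tumour cells
    have lost the B allele and keep one A copy, so
        BAF = (1 - c) / (2 - c)   =>   c = (1 - 2*BAF) / (1 - BAF)."""
    return (1.0 - 2.0 * baf) / (1.0 - baf)

assert cellularity_from_baf(0.5) == 0.0   # no shift: pure normal sample
assert cellularity_from_baf(0.0) == 1.0   # B allele absent: pure tumour
```

qpure fits a statistical model across many heterozygous SNPs in LOH regions rather than inverting a single BAF value, but the direction of the shift is the same.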

  15. Stochastic hybrid systems for studying biochemical processes.

    PubMed

    Singh, Abhyudai; Hespanha, João P

    2010-11-13

    Many protein and mRNA species occur at low molecular counts within cells, and hence are subject to large stochastic fluctuations in copy numbers over time. Development of computationally tractable frameworks for modelling stochastic fluctuations in population counts is essential to understand how noise at the cellular level affects biological function and phenotype. We show that stochastic hybrid systems (SHSs) provide a convenient framework for modelling the time evolution of population counts of different chemical species involved in a set of biochemical reactions. We illustrate recently developed techniques that allow fast computations of the statistical moments of the population count, without having to run computationally expensive Monte Carlo simulations of the biochemical reactions. Finally, we review different examples from the literature that illustrate the benefits of using SHSs for modelling biochemical processes.
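The moment-computation idea, obtaining statistics directly from moment equations instead of averaging many Monte Carlo runs, can be sketched for a simple birth-death process whose mean equation closes exactly; rates and step size are illustrative:

```python
# Birth-death process: production at rate k, first-order degradation at
# rate g. The mean obeys the closed moment equation dE[x]/dt = k - g*E[x],
# so the mean can be integrated directly, with no stochastic simulation.
k, g = 10.0, 1.0
dt, steps = 0.001, 10_000   # integrate to t = 10 (ten mean lifetimes)
mean = 0.0
for _ in range(steps):
    mean += dt * (k - g * mean)

assert abs(mean - k / g) < 0.01   # steady-state mean is k/g
```

For nonlinear reactions the moment equations do not close, which is where the SHS moment-closure techniques reviewed in the paper come in.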

  16. Multiscale dynamics of biological cells with chemotactic interactions: From a discrete stochastic model to a continuous description

    NASA Astrophysics Data System (ADS)

    Alber, Mark; Chen, Nan; Glimm, Tilmann; Lushnikov, Pavel M.

    2006-05-01

    The cellular Potts model (CPM) has been used for simulating various biological phenomena such as differential adhesion, fruiting body formation of the slime mold Dictyostelium discoideum, angiogenesis, cancer invasion, chondrogenesis in embryonic vertebrate limbs, and many others. We derive a continuous limit of a discrete one-dimensional CPM with the chemotactic interactions between cells in the form of a Fokker-Planck equation for the evolution of the cell probability density function. This equation is then reduced to the classical macroscopic Keller-Segel model. In particular, all coefficients of the Keller-Segel model are obtained from parameters of the CPM. Theoretical results are verified numerically by comparing Monte Carlo simulations for the CPM with numerics for the Keller-Segel model.
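The discrete-to-continuum correspondence can be illustrated with a minimal 1-D biased random walk standing in for the CPM dynamics (step size and bias are invented): the empirical mean displacement matches the drift that a Keller-Segel-type advection term would predict:

```python
import random

random.seed(0)

# 1-D biased random walk: each step is +h with probability p, -h otherwise.
# In the continuum limit the mean position drifts by (2p - 1) * h per step,
# the kind of advection term that survives in the Keller-Segel description.
h, p = 0.1, 0.6
steps, walkers = 500, 500
positions = []
for _ in range(walkers):
    x = 0.0
    for _ in range(steps):
        x += h if random.random() < p else -h
    positions.append(x)

mean_x = sum(positions) / walkers
expected = (2.0 * p - 1.0) * h * steps   # predicted drift over the walk
assert abs(mean_x - expected) < 1.0
```

The paper's derivation does this rigorously via a Fokker-Planck equation for the cell probability density, with the chemotactic bias supplying the drift coefficient.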

  17. SU-E-T-344: Validation and Clinical Experience of Eclipse Electron Monte Carlo Algorithm (EMC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pokharel, S; Rana, S

    2014-06-01

Purpose: The purpose of this study is to validate the Eclipse Electron Monte Carlo (EMC) algorithm for routine clinical use. Methods: The PTW inhomogeneity phantom (T40037), with different combinations of heterogeneous slabs, was CT-scanned with a Philips Brilliance 16-slice scanner. The phantom contains blocks of Rando Alderson materials mimicking lung, polystyrene (tissue), PTFE (bone), and PMMA. The phantom has a 30×30×2.5 cm base plate with 2 cm recesses to insert inhomogeneities. The detector systems used in this study were diodes, TLDs, and Gafchromic EBT2 films. The diodes and TLDs were included in the CT scans. The CT sets were transferred to the Eclipse treatment planning system. Several plans were created with the Eclipse Monte Carlo (EMC) algorithm 11.0.21. Measurements were carried out on a Varian TrueBeam machine for energies from 6-22 MeV. Results: The measured and calculated doses agreed very well for tissue-like media. The agreement was reasonable in the presence of lung inhomogeneity. The point dose agreement was within 3.5%, and the gamma passing rate at 3%/3mm was greater than 93% except for 6 MeV (85%). The disagreement can reach as high as 10% in the presence of bone inhomogeneity. This is because Eclipse reports dose to the medium, as opposed to dose to water as in conventional calculation engines. Conclusion: Care must be taken when using the Varian Eclipse EMC algorithm for routine clinical dose calculation. The algorithm does not report dose to water, on which most clinical experience is based; rather, it reports dose to medium directly. In the presence of inhomogeneity such as bone, the dose discrepancy can be as high as 10% or even more, depending on the location of the normalization point or volume. As radiation oncology is an empirical science, care must be taken before using EMC-reported monitor units for clinical use.

  18. SU-F-T-656: Monte Carlo Study On Air Activation Around a Medical Electron Linac

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horst, F; GSI Helmholtz Centre for Heavy Ion Research, Darmstadt; Fehrenbacher, G

Purpose: In high energy photon therapy, several radiation protection issues result from photonuclear reactions. The activation of air - directly by photonuclear reactions as well as indirectly by capture of photoneutrons generated inside the linac head - is a major point of concern for the medical staff. The purpose of this study was to estimate the annual effective dose to medical workers due to activated air around a medical high energy electron linac by means of Monte Carlo simulations. Methods: The treatment head of a Varian Clinac in 18 MV-X mode as well as the surrounding concrete bunker were modeled and the radiation transport was simulated using the Monte Carlo code FLUKA, starting from the primary electron striking the bremsstrahlung target. The activation yields in air from photo-disintegration of O-16 and N-14 nuclei as well as from neutron capture on Ar-40 nuclei were obtained from the simulations. The activation build-up, radioactive decay and air ventilation were studied using a mathematical model. The annual effective dose to workers was estimated by using published isotope-specific conversion factors. Results: The oxygen and nitrogen activation yields were found to be field-size dependent, in contrast to the argon activation yield. The impact of the treatment room ventilation on the different air activation products was investigated and quantified. An estimate with very conservative assumptions gave an annual effective dose to workers of < 1 mSv/a. Conclusion: From the results of this study it can be concluded that the contribution of air activation to the radiation exposure of medical workers should be negligible in modern photon therapy, especially when compared to the dose due to prompt neutrons and the activation of heavy solid materials such as the jaws and the collimators inside the linac head.
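The build-up, decay and ventilation model described can be sketched as a single balance equation. The production rate below is invented; the O-15 half-life (the product of O-16 photodisintegration) and the ventilation rate are typical values, not taken from the paper:

```python
import math

# Airborne activation product balance: production rate P (atoms/s), decay
# constant lam (1/s) and ventilation removal vent (air changes/s) give
#   dN/dt = P - (lam + vent)*N  =>  N(t) = P/(lam+vent) * (1 - e^(-(lam+vent)t))
def activity(t, P, lam, vent):
    n = P / (lam + vent) * (1.0 - math.exp(-(lam + vent) * t))
    return lam * n   # activity in Bq is lam * N

P = 1.0e6                     # illustrative production rate, atoms/s
lam = math.log(2) / 122.2     # decay constant of O-15 (122.2 s half-life)
vent = 6.0 / 3600.0           # 6 air changes per hour

# Ventilation lowers the saturation activity relative to a sealed room
assert activity(1.0e6, P, lam, vent) < activity(1.0e6, P, lam, 0.0)
```

Short-lived products like O-15 are limited mainly by their own decay, while longer-lived ones such as Ar-41 are suppressed much more strongly by ventilation, which is the kind of effect the study quantifies per isotope.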

  19. Theoretical substantiation of biological efficacy enhancement for β-delayed particle decay {sup 9}C beam: A Monte Carlo study in combination with analysis with the local effect model approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tian, Liheng; Yan, Yuanlin; Ma, Yuanyuan

Purpose: To improve the efficacy of heavy ion therapy, a β-delayed particle decay 9C beam has been proposed as a double irradiation source for cancer therapy. The authors' previous experiment showed that relative biological effectiveness (RBE) values at depths around the Bragg peak of a 9C beam were enhanced compared to its stable counterpart, the 12C beam. The purpose of this study was to explore the nature of this biological efficacy enhancement theoretically. Methods: A Monte Carlo simulation study was conducted. First, a simplified cell model was established so as to form a tumor tissue. Subsequently, the tumor tissue was imported into the Monte Carlo simulation software package GATE, and the tumor cells were virtually irradiated with comparable 9C and 12C beams, respectively, in the simulations. The transportation and particle deposition data of the 9C and 12C beams, derived from the GATE simulations, were analyzed with the authors' local effect model implementation so as to deduce cell survival fractions. Results: The particles emitted from the decay of deposited 9C particles around a cell nucleus increased the dose delivered to the nucleus and elicited clustered damage around the secondary particles' trajectories. Therefore, compared to the 12C beam, the RBE value of the 9C beam increased at depths around the Bragg peak. Conclusions: Collectively, the increased local doses and clustered damage due to the decay particles emitted from deposited 9C particles led to the RBE enhancement relative to the 12C beam. Thus, the enhanced RBE effect of a 9C beam for a simplified tumor model was shown theoretically in this study.

  20. TH-AB-BRA-07: PENELOPE-Based GPU-Accelerated Dose Calculation System Applied to MRI-Guided Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Y; Mazur, T; Green, O

Purpose: The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. Methods: We first translated PENELOPE from FORTRAN to C++ and validated that the translation produced equivalent results. Then we adapted the C++ code to CUDA in a workflow optimized for GPU architecture. We expanded upon the original code to include voxelized transport boosted by Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, we incorporated the vendor-provided MRIdian head model into the code. We performed a set of experimental measurements on MRIdian to examine the accuracy of both the head model and gPENELOPE, and then applied gPENELOPE toward independent validation of patient doses calculated by MRIdian's KMC. Results: We achieve an average acceleration factor of 152 compared to the original single-thread FORTRAN implementation, with the original accuracy preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1) and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). Conclusions: We developed a Monte Carlo simulation platform based on a GPU-accelerated version of PENELOPE. We validated that both the vendor-provided head model and the fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria.
Future applications of this platform will include dose validation and accumulation, IMRT optimization, and dosimetry system modeling for next generation MR-IGRT systems.

  1. Modeling the Production of Beta-Delayed Gamma Rays for the Detection of Special Nuclear Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hall, J M; Pruet, J A; Brown, D A

    2005-02-14

The objective of this LDRD project was to develop one or more models for the production of beta-delayed gamma rays following neutron-induced fission of a special nuclear material (SNM) and to define a standardized formatting scheme which will allow them to be incorporated into some of the modern, general-purpose Monte Carlo transport codes currently being used to simulate inspection techniques proposed for detecting fissionable material hidden in sea-going cargo containers. In this report, we will describe a Monte Carlo model for beta-delayed gamma-ray emission following the fission of SNM that can accommodate arbitrary time-dependent fission rates and photon collection histories. The model involves direct sampling of the independent fission yield distributions of the system, the branching ratios for decay of individual fission products, and spectral distributions representing photon emission from each fission product and for each decay mode. While computationally intensive, it will be shown that this model can provide reasonably detailed estimates of the spectra that would be recorded by an arbitrary spectrometer and may prove quite useful in assessing the quality of evaluated data libraries and identifying gaps in the libraries. The accuracy of the model will be illustrated by comparing calculated and experimental spectra from the decay of short-lived fission products following the reactions 235U(nth, f) and 239Pu(nth, f). For general-purpose transport calculations, where a detailed consideration of the large number of individual gamma-ray transitions in a spectrum may not be necessary, it will be shown that a simple parameterization of the gamma-ray source function can be defined which provides high-quality average spectral distributions that should suffice for calculations describing photons being transported through thick attenuating media.
Finally, a proposal for ENDF-compatible formats that describe each of the models and allow for their straightforward use in Monte Carlo codes will be presented.

  2. SU-F-T-125: Radial Dose Distributions From Carbon Ions of Therapeutic Energies Calculated with Geant4-DNA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vassiliev, O

Purpose: Radial dose distribution D(r) is the dose as a function of lateral distance from the path of a heavy charged particle. Its main application is in modelling of biological effects of heavy ions, including applications to hadron therapy. It is the main physical parameter of a broad group of radiobiological models known as the amorphous track models. Our purpose was to calculate D(r) with Monte Carlo for carbon ions of therapeutic energies, find a simple formula for D(r), and fit it to the Monte Carlo data. Methods: All calculations were performed with the Geant4-DNA code, for carbon ion energies from 10 to 400 MeV/u (ranges in water: ∼0.4 mm to 27 cm). The spatial resolution of the dose distribution in the lateral direction was 1 nm. The electron tracking cut-off energy was 11 eV (ionization threshold). The maximum lateral distance considered was 10 µm. Over this distance, D(r) decreases with distance by eight orders of magnitude. Results: All calculated radial dose distributions had a similar shape dominated by the well-known inverse square dependence on the distance. Deviations from the inverse square law were observed close to the beam path (r < 10 nm) and at large distances (r > 1 µm). At small and large distances D(r) decreased, respectively, slower and faster than the inverse square of distance. A formula for D(r) consistent with this behavior was found and fitted to the Monte Carlo data. The accuracy of the fit was better than 10% for all distances considered. Conclusion: We have generated a set of radial dose distributions for carbon ions that covers the entire range of therapeutic energies, for distances from the ion path of up to 10 µm. The latter distance is sufficient for most applications because the dose beyond 10 µm is extremely low.
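The dominant inverse-square behaviour reported above can be sketched with a log-log fit to synthetic data; the prefactor and radial range are illustrative, not the paper's fitted formula:

```python
import numpy as np

# Synthetic radial dose with a pure 1/r^2 fall-off, the behaviour that
# dominates the calculated distributions; prefactor and range are invented
r = np.logspace(1.0, 3.0, 50)   # 10 nm to 1000 nm
d = 5.0e3 / r ** 2

# On a log-log plot an inverse-square law is a straight line of slope -2
slope, intercept = np.polyfit(np.log(r), np.log(d), 1)
assert abs(slope + 2.0) < 1e-10
```

The paper's fitted formula adds correction terms to this baseline so that the slower fall-off near the track core (r < 10 nm) and the faster fall-off beyond 1 µm are also captured.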

  3. SU-F-T-166: On the Nature of the Background Visible Light Observed in Fiber Optic Dosimetry of Proton Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Darafsheh, A; Kassaee, A; Finlay, J

Purpose: The nature of the background visible light observed during fiber optic dosimetry of proton beams, i.e., whether or not it is due to Cherenkov radiation, has been debated in the literature recently. In this work, experimentally and by means of Monte Carlo simulations, we shed light on this problem and investigated the nature of the background visible light observed in fiber optics irradiated with proton beams. Methods: A bare silica optical fiber was embedded in tissue-mimicking phantoms and irradiated with clinical proton beams with energies of 100–225 MeV at the Roberts Proton Therapy Center. Luminescence spectroscopy was performed with a CCD-coupled spectrograph to analyze in detail the emission spectrum of the fiber tip across the visible range of 400–700 nm. Monte Carlo simulation was performed using the FLUKA Monte Carlo code to simulate Cherenkov light and ionizing radiation dose deposition in the fiber. Results: The experimental spectra of the irradiated silica fiber show two distinct peaks at 450 and 650 nm, whose spectral shape is different from that of Cherenkov radiation. We believe that the nature of these peaks is connected to point defects of silica, including the oxygen-deficiency center (ODC) and the non-bridging oxygen hole center (NBOHC). Monte Carlo simulations confirmed the experimental observations that Cherenkov radiation cannot be solely responsible for such a signal. Conclusion: We showed that Cherenkov radiation is not the dominant visible signal observed in bare fiber optics irradiated with proton beams. We observed two distinct peaks at 450 and 650 nm whose nature is connected with point defects of the silica fiber, including the oxygen-deficiency center and the non-bridging oxygen hole center.

  4. The radiation fields around a proton therapy facility: A comparison of Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Ottaviano, G.; Picardi, L.; Pillon, M.; Ronsivalle, C.; Sandri, S.

    2014-02-01

A proton therapy test facility with an average beam current below 10 nA and an energy of up to 150 MeV is planned to be sited at the Frascati ENEA Research Center, in Italy. The accelerator is composed of a sequence of linear sections. The first is a commercial 7 MeV proton linac, from which the beam is injected into a SCDTL (Side Coupled Drift Tube Linac) structure that raises the energy to 52 MeV. A conventional CCL (Coupled Cavity Linac) with side coupling cavities then completes the accelerator. The linear structure has the important advantage that the main radiation losses during the acceleration process occur for protons with energy below 20 MeV, with a consequently low production of neutrons and secondary radiation. From the radiation protection point of view, the source of radiation for this facility is therefore almost completely located at the final target. Physical and geometrical models of the device have been developed and implemented in radiation transport computer codes based on the Monte Carlo method, with the aim of assessing the radiation field around the main source in support of the safety analysis. For this assessment, independent researchers used two different Monte Carlo codes, FLUKA (FLUktuierende KAskade) and MCNPX (Monte Carlo N-Particle eXtended). Both are general purpose tools for calculating particle transport and interactions with matter, covering an extended range of applications including proton beam analysis; nevertheless, each utilizes its own nuclear cross section libraries and specific physics models for particular particle types and energies. The models implemented in the codes are described and the results are presented. The differences between the two calculations are reported and discussed, pointing out the disadvantages and advantages of each code in this specific application.

  5. Dosimetric parameters of three new solid core I‐125 brachytherapy sources

    PubMed Central

    Solberg, Timothy D.; DeMarco, John J.; Hugo, Geoffrey; Wallace, Robert E.

    2002-01-01

Monte Carlo calculations and TLD measurements have been performed for the purpose of characterizing dosimetric properties of new commercially available brachytherapy sources. All sources tested consisted of a solid core, upon which a thin layer of I-125 has been adsorbed, encased within a titanium housing. The PharmaSeed BT‐125 source manufactured by Syncor is available in silver or palladium core configurations, while the ADVANTAGE source from IsoAid has silver only. Dosimetric properties, including the dose rate constant, radial dose function, and anisotropy characteristics, were determined according to the TG‐43 protocol. Additionally, the geometry function was calculated exactly using Monte Carlo and compared with both the point and line source approximations. The 1999 NIST standard was followed in determining air kerma strength. Dose rate constants were calculated to be 0.955±0.005, 0.967±0.005, and 0.962±0.005 cGy h−1 U−1 for the PharmaSeed BT‐125‐1, BT‐125‐2, and ADVANTAGE sources, respectively. TLD measurements were in excellent agreement with Monte Carlo calculations. The radial dose function, g(r), calculated to a distance of 10 cm, and the anisotropy function F(r, θ), calculated for radii from 0.5 to 7.0 cm, were similar among all source configurations. Anisotropy constants, ϕ¯an, were calculated to be 0.941, 0.944, and 0.960 for the three sources, respectively. All dosimetric parameters were found to be in close agreement with previously published data for similar source configurations. The MCNP Monte Carlo code appears to be ideally suited to low energy dosimetry applications. PACS number(s): 87.53.–j PMID:11958652
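The comparison of the exact geometry function with the point- and line-source approximations mentioned here rests on the standard TG-43 expressions. A sketch of the line-source geometry function G_L(r, θ) = β/(L·r·sinθ), with an illustrative 0.3 cm active length (the function names are ours):

```python
import math

def geometry_line(r_cm, theta, length_cm):
    """TG-43 line-source geometry function G_L(r, theta) = beta / (L r sin(theta)),
    where beta is the angle subtended by the active line at the point."""
    z = r_cm * math.cos(theta)          # coordinate along the source axis
    y = r_cm * math.sin(theta)          # perpendicular distance from the axis
    beta = abs(math.atan2(y, z - length_cm / 2.0)
               - math.atan2(y, z + length_cm / 2.0))
    return beta / (length_cm * y)

def geometry_point(r_cm):
    """Point-source approximation: inverse-square law."""
    return 1.0 / (r_cm * r_cm)

# Far from a 0.3 cm seed, the line source is indistinguishable from a point.
g_far = geometry_line(10.0, math.pi / 2.0, 0.3)
```

Close to the seed the two approximations diverge, which is exactly the regime where the exact Monte Carlo geometry function matters.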

  6. SU-E-T-455: Characterization of 3D Printed Materials for Proton Beam Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zou, W; Siderits, R; McKenna, M

    2014-06-01

Purpose: The widespread availability of low cost 3D printing technologies provides an alternative fabrication method for customized proton range modifying accessories such as compensators and boluses. However, the material properties of the printed object depend on the printing technology used. In order to facilitate the application of 3D printing in proton therapy, this study investigated the stopping power of several printed materials using both proton pencil beam measurements and Monte Carlo simulations. Methods: Five 3–4 cm cubes fabricated using three 3D printing technologies (selective laser sintering, fused-deposition modeling and stereolithography) on five printers were investigated. The cubes were scanned on a CT scanner, and the depth dose curves for a mono-energetic pencil beam passing through the material were measured using a large parallel plate ion chamber in a water tank. Each cube was measured from two directions (perpendicular and parallel to the printing plane) to evaluate the effects of the anisotropic material layout. The results were compared with GEANT4 Monte Carlo simulations using the manufacturer-specified material density and chemical composition data. Results: Compared with water, the range pullback produced by the printed blocks varied and corresponded well with the CT Hounsfield units of the materials. The measurement results were in agreement with the Monte Carlo simulations. However, depending on the technology, inhomogeneities existed in the printed cubes, as evidenced by the CT images. The effect of such inhomogeneity on the proton beam remains to be investigated. Conclusion: Blocks printed by three different 3D printing technologies were characterized for proton beams with measurements and Monte Carlo simulation. The effects of the printing technologies on proton range and stopping power were studied. The derived results can be applied when such devices are used in proton radiotherapy.

  7. EVALUATING THE SENSITIVITY OF RADIONUCLIDE DETECTORS FOR CONDUCTING A MARITIME ON-BOARD SEARCH USING MONTE CARLO SIMULATION IMPLEMENTED IN AVERT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S; Dave Dunn, D

The sensitivity of two specific types of radionuclide detectors for conducting an on-board search in the maritime environment was evaluated using Monte Carlo simulation implemented in AVERT®. AVERT®, short for the Automated Vulnerability Evaluation for Risk of Terrorism, is personal-computer-based vulnerability assessment software developed by the ARES Corporation. The detectors, a RadPack and a Personal Radiation Detector (PRD), were chosen from the class of Human Portable Radiation Detection Systems (HPRDS), which serve multiple purposes. In the maritime environment, there is a need to detect, localize, characterize, and identify radiological/nuclear (RN) material or weapons. The RadPack is a commercially available broad-area search device used for gamma and neutron detection. The PRD is chiefly used as a personal radiation protection device; it is also used to detect contraband radionuclides and to localize radionuclide sources. Neither device has the capacity to characterize or identify radionuclides. The principal aim of this study was to investigate the sensitivity of both the RadPack and the PRD while being used under controlled conditions in a simulated maritime environment for detecting hidden RN contraband. The detection distance varies with the source strength and the shielding present. The characterization parameters of the source are not indicated in this report, so the results summarized are relative. The Monte Carlo simulation results indicate the probability of detection of the RN source at given distances from the detector, as a function of transverse speed and instrument sensitivity for the specified RN source.

  8. Monte Carlo simulation of x-ray spectra in diagnostic radiology and mammography using MCNP4C

    NASA Astrophysics Data System (ADS)

    Ay, M. R.; Shahriari, M.; Sarkar, S.; Adib, M.; Zaidi, H.

    2004-11-01

The general purpose Monte Carlo N-particle radiation transport computer code (MCNP4C) was used for the simulation of x-ray spectra in diagnostic radiology and mammography. The electrons were transported until they slowed down and stopped in the target. Both bremsstrahlung and characteristic x-ray production were considered in this work. We focus on the simulation of various target/filter combinations to investigate the effect of tube voltage, target material and filter thickness on x-ray spectra in the diagnostic radiology and mammography energy ranges. The simulated x-ray spectra were compared with experimental measurements and spectra calculated according to IPEM report number 78. In addition, the anode heel effect and off-axis x-ray spectra were assessed for different anode angles and target materials, and the results were compared with EGS4-based Monte Carlo simulations and measured data. Quantitative evaluation of the differences between our Monte Carlo simulated and comparison spectra was performed using Student's t-test statistical analysis. Generally, there is good agreement between the simulated x-ray and comparison spectra, although there are systematic differences between the simulated and reference spectra, especially in the intensity of the K-characteristic x-rays. Nevertheless, no statistically significant differences were observed between the IPEM spectra and the simulated spectra. It has been shown that the difference between MCNP-simulated spectra and IPEM spectra in the low energy range results from an overestimation of characteristic photons following the normalization procedure. The transmission curves produced by MCNP4C show good agreement with the IPEM report, especially for tube voltages of 50 kV and 80 kV. The systematic discrepancy for higher tube voltages is the result of systematic differences between the corresponding spectra.
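As a generic illustration of the per-bin comparison a Student's t-test performs on two spectra sampled on the same energy grid (the bin values below are made up, not data from this study):

```python
import math

def paired_t_statistic(a, b):
    """Paired Student's t statistic across matching energy bins."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# Hypothetical per-bin photon fluences for a simulated and a reference spectrum.
simulated = [1.0, 2.0, 3.0, 4.0]
reference = [1.1, 1.9, 3.2, 3.8]
t_value = paired_t_statistic(simulated, reference)
```

A t statistic near zero (as here, where the per-bin differences cancel) is consistent with no systematic offset between the spectra; a large |t| would flag one.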

  9. Cellular responses to recurrent pentylenetetrazole-induced seizures in the adult zebrafish brain

    PubMed Central

    Duy, Phan Q; Berberoglu, Michael A; Beattie, Christine E; Hall, Charles W

    2017-01-01

    A seizure is a sustained increase in brain electrical activity that can result in loss of consciousness and injury. Understanding how the brain responds to seizures is important for development of new treatment strategies for epilepsy, a neurological condition characterized by recurrent and unprovoked seizures. Pharmacological induction of seizures in rodent models results in a myriad of cellular alterations, including inflammation, angiogenesis, and adult neurogenesis. The purpose of this study is to investigate the cellular responses to recurrent pentylenetetrazole seizures in the adult zebrafish brain. We subjected zebrafish to five once daily pentylenetetrazole induced seizures and characterized the cellular consequences of these seizures. In response to recurrent seizures, we found histologic evidence of vasodilatation, perivascular leukocyte egress and leukocyte proliferation suggesting seizure-induced acute CNS inflammation. We also found evidence of increased proliferation, neurogenesis, and reactive gliosis. Collectively, our results suggest that the cellular responses to seizures in the adult zebrafish brain are similar to those observed in mammalian brains. PMID:28238851

  10. Landauer in the Age of Synthetic Biology: Energy Consumption and Information Processing in Biochemical Networks

    NASA Astrophysics Data System (ADS)

    Mehta, Pankaj; Lang, Alex H.; Schwab, David J.

    2016-03-01

    A central goal of synthetic biology is to design sophisticated synthetic cellular circuits that can perform complex computations and information processing tasks in response to specific inputs. The tremendous advances in our ability to understand and manipulate cellular information processing networks raises several fundamental physics questions: How do the molecular components of cellular circuits exploit energy consumption to improve information processing? Can one utilize ideas from thermodynamics to improve the design of synthetic cellular circuits and modules? Here, we summarize recent theoretical work addressing these questions. Energy consumption in cellular circuits serves five basic purposes: (1) increasing specificity, (2) manipulating dynamics, (3) reducing variability, (4) amplifying signal, and (5) erasing memory. We demonstrate these ideas using several simple examples and discuss the implications of these theoretical ideas for the emerging field of synthetic biology. We conclude by discussing how it may be possible to overcome these limitations using "post-translational" synthetic biology that exploits reversible protein modification.

  11. TU-F-17A-08: The Relative Accuracy of 4D Dose Accumulation for Lung Radiotherapy Using Rigid Dose Projection Versus Dose Recalculation On Every Breathing Phase

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lamb, J; Lee, C; Tee, S

    2014-06-15

Purpose: To investigate the accuracy of 4D dose accumulation using projection of dose calculated on the end-exhalation, mid-ventilation, or average intensity breathing phase CT scan, versus dose accumulation performed using full Monte Carlo dose recalculation on every breathing phase. Methods: Radiotherapy plans were analyzed for 10 patients with stage I-II lung cancer planned using 4D-CT. SBRT plans were optimized using the dose calculated by a commercially-available Monte Carlo algorithm on the end-exhalation 4D-CT phase. 4D dose accumulations using deformable registration were performed with a commercially available tool that projected the planned dose onto every breathing phase without recalculation, as well as with a Monte Carlo recalculation of the dose on all breathing phases. The 3D planned dose (3D-EX), the 3D dose calculated on the average intensity image (3D-AVE), and the 4D accumulations of the dose calculated on the end-exhalation phase CT (4D-PR-EX), the mid-ventilation phase CT (4D-PR-MID), and the average intensity image (4D-PR-AVE), respectively, were compared against the accumulation of the Monte Carlo dose recalculated on every phase. Plan evaluation metrics relating to target volumes and critical structures relevant for lung SBRT were analyzed. Results: Plan evaluation metrics tabulated using 4D-PR-EX, 4D-PR-MID, and 4D-PR-AVE differed from those tabulated using Monte Carlo recalculation on every phase by an average of 0.14±0.70 Gy, -0.11±0.51 Gy, and 0.00±0.62 Gy, respectively. Deviations of between 8 and 13 Gy were observed between the 4D-MC calculations and both 3D methods for the proximal bronchial trees of 3 patients. Conclusions: 4D dose accumulation using projection without re-calculation may be sufficiently accurate compared to 4D dose accumulated from Monte Carlo recalculation on every phase, depending on institutional protocols. Use of 4D dose accumulation should be considered when evaluating normal tissue complication probabilities as well as in clinical situations where target volumes are directly inferior to mobile critical structures.

  12. Deterministic absorbed dose estimation in computed tomography using a discrete ordinates method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Norris, Edward T.; Liu, Xin, E-mail: xinliu@mst.edu; Hsieh, Jiang

Purpose: Organ dose estimation for a patient undergoing computed tomography (CT) scanning is very important. Although Monte Carlo methods are considered the gold standard in patient dose estimation, the computation time required is formidable for routine clinical calculations. Here, the authors investigate a deterministic method for estimating absorbed dose more efficiently. Methods: Compared with current Monte Carlo methods, a more efficient approach to estimating the absorbed dose is to solve the linear Boltzmann equation numerically. In this study, an axial CT scan was modeled with a software package, Denovo, which solved the linear Boltzmann equation using the discrete ordinates method. The CT scanning configuration included 16 x-ray source positions, beam collimators, flat filters, and bowtie filters. The phantom was the standard 32 cm CT dose index (CTDI) phantom. Four different Denovo simulations were performed with different simulation parameters, including the number of quadrature sets and the order of Legendre polynomial expansions. A Monte Carlo simulation was also performed for benchmarking the Denovo simulations. A quantitative comparison was made of the simulation results obtained by the Denovo and the Monte Carlo methods. Results: The difference between the simulation results of the discrete ordinates method and those of the Monte Carlo method was found to be small, with a root-mean-square difference of around 2.4%. It was found that the discrete ordinates method, with a higher order of Legendre polynomial expansions, underestimated the absorbed dose near the center of the phantom (i.e., the low dose region). Simulations with quadrature set 8 and the first order of the Legendre polynomial expansions proved to be the most efficient computation method in the authors' study. The single-thread computation time of the deterministic simulation with quadrature set 8 and the first order of the Legendre polynomial expansions was 21 min on a personal computer. Conclusions: The simulation results showed that the deterministic method can be effectively used to estimate the absorbed dose in a CTDI phantom. The accuracy of the discrete ordinates method was close to that of a Monte Carlo simulation, and the primary benefit of the discrete ordinates method lies in its rapid computation speed. It is expected that further optimization of this method in routine clinical CT dose estimation will improve its accuracy and speed.

  13. A new combined approach on Hurst exponent estimate and its applications in realized volatility

    NASA Astrophysics Data System (ADS)

    Luo, Yi; Huang, Yirong

    2018-02-01

The purpose of this paper is to propose a new estimator of the Hurst exponent based on the combined information of the conventional rescaled range methods. We demonstrate the superiority of the proposed estimator through Monte Carlo simulations and through applications to estimating the Hurst exponent of daily volatility series in the Chinese stock market. Moreover, we examine the impact of the choice of estimator and of structural breaks on the resulting Hurst exponent estimates.
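A minimal version of the conventional rescaled range (R/S) analysis that the proposed combined estimator builds on might look like this; the window sizes and the white-noise test series are our own illustrative choices, not the paper's procedure:

```python
import math
import random

def rescaled_range(series):
    """R/S statistic of one window: range of the cumulative deviations
    from the mean, divided by the standard deviation."""
    n = len(series)
    mean = sum(series) / n
    dev = [x - mean for x in series]
    cum, z = [], 0.0
    for d in dev:
        z += d
        cum.append(z)
    r = max(cum) - min(cum)
    s = math.sqrt(sum(d * d for d in dev) / n)
    return r / s

def hurst_rs(series, window_sizes):
    """Slope of log(mean R/S) against log(window size) = Hurst estimate."""
    xs, ys = [], []
    for w in window_sizes:
        chunks = [series[i:i + w] for i in range(0, len(series) - w + 1, w)]
        rs = [rescaled_range(c) for c in chunks]
        xs.append(math.log(w))
        ys.append(math.log(sum(rs) / len(rs)))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# White noise has no long memory, so the estimate should sit near H = 0.5
# (raw R/S is known to be biased slightly upward at small window sizes).
rng = random.Random(0)
white_noise = [rng.gauss(0.0, 1.0) for _ in range(4096)]
h = hurst_rs(white_noise, [16, 32, 64, 128, 256])
```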

  14. The Mathematics of Mixing Things Up

    NASA Astrophysics Data System (ADS)

    Diaconis, Persi

    2011-08-01

    How long should a Markov chain Monte Carlo algorithm be run? Using examples from statistical physics (Ehrenfest urn, Ising model, hard discs) as well as card shuffling, this tutorial paper gives an overview of a body of mathematical results that can give useful answers to practitioners (viz: seven shuffles suffice for practical purposes). It points to new techniques (path coupling, geometric inequalities, and Harris recurrence). The discovery of phase transitions in mixing times (the cutoff phenomenon) is emphasized.
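For intuition, the card shuffle behind the "seven shuffles suffice" result is usually modeled by the Gilbert-Shannon-Reeds (GSR) process: cut the deck binomially, then drop cards with probability proportional to the remaining packet sizes. A sketch (not code from the paper):

```python
import random

def riffle(deck, rng):
    """One Gilbert-Shannon-Reeds riffle shuffle."""
    n = len(deck)
    cut = sum(rng.random() < 0.5 for _ in range(n))  # Binomial(n, 1/2) cut
    left, right = deck[:cut], deck[cut:]
    out = []
    while left or right:
        # Drop the next card from a packet with probability proportional
        # to that packet's current size.
        if rng.random() < len(left) / (len(left) + len(right)):
            out.append(left.pop(0))
        else:
            out.append(right.pop(0))
    return out

# Seven GSR riffles of a 52-card deck, the count highlighted in the paper.
rng = random.Random(7)
deck = list(range(52))
for _ in range(7):
    deck = riffle(deck, rng)
```

Measuring how fast the distribution of such permutations approaches uniform (in total variation distance) is exactly the mixing-time question the tutorial addresses.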

  15. An estimator for the standard deviation of a natural frequency. I.

    NASA Technical Reports Server (NTRS)

    Schiff, A. J.; Bogdanoff, J. L.

    1971-01-01

    A brief review of mean-square approximate systems is given. The case in which the masses are deterministic is considered first in the derivation of an estimator for the upper bound of the standard deviation of a natural frequency. Two examples presented include a two-degree-of-freedom system and a case in which the disorder in the springs is perfectly correlated. For purposes of comparison, a Monte Carlo simulation was done on a digital computer.
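A Monte Carlo comparison of the kind described, here for a hypothetical two-degree-of-freedom chain with unit masses and uniformly disordered springs, could be sketched as follows (the model and parameters are illustrative, not those of the paper):

```python
import math
import random

def natural_frequencies(k1, k2, m=1.0):
    """Natural frequencies of a ground-k1-m-k2-m chain: eigenvalues of
    M^-1 K with K = [[k1+k2, -k2], [-k2, k2]], M = m*I."""
    tr = (k1 + 2.0 * k2) / m
    det = (k1 * k2) / (m * m)
    disc = math.sqrt(tr * tr - 4.0 * det)
    return math.sqrt((tr - disc) / 2.0), math.sqrt((tr + disc) / 2.0)

def mc_std_first_frequency(n_samples, spread, seed=1):
    """Sample-standard deviation of the first natural frequency when both
    spring stiffnesses are disordered uniformly within +/- spread."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_samples):
        k1 = 1.0 + spread * rng.uniform(-1.0, 1.0)
        k2 = 1.0 + spread * rng.uniform(-1.0, 1.0)
        samples.append(natural_frequencies(k1, k2)[0])
    mean = sum(samples) / n_samples
    var = sum((w - mean) ** 2 for w in samples) / (n_samples - 1)
    return math.sqrt(var)

sigma = mc_std_first_frequency(500, 0.1)
```

Such a direct simulation gives the reference value against which a closed-form upper-bound estimator can be compared.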

  16. Air-kerma evaluation at the maze entrance of HDR brachytherapy facilities.

    PubMed

    Pujades, M C; Granero, D; Vijande, J; Ballester, F; Perez-Calatayud, J; Papagiannis, P; Siebert, F A

    2014-12-01

In the absence of procedures for evaluating the design of brachytherapy (BT) facilities for radiation protection purposes, the methodology used for external beam radiotherapy facilities is often adapted. The purpose of this study is to adapt the NCRP 151 methodology for estimating the air-kerma rate at the door in BT facilities. This methodology was checked against Monte Carlo (MC) techniques using the code Geant4. Five different facility designs were studied for (192)Ir and (60)Co HDR applications to account for several different bunker layouts. For the estimation of the lead thickness needed at the door, the use of transmission data for the real spectra at the door, instead of the spectra emitted by (192)Ir and (60)Co, reduces the lead thickness by a factor of five for (192)Ir and ten for (60)Co. This will significantly lighten the door and hence simplify construction and operating requirements for all bunkers. The adaptation proposed in this study to estimate the air-kerma rate at the door depends on the complexity of the maze: it provides good results for bunkers with a maze (i.e. similar to those used for linacs, for which the NCRP 151 methodology was developed) but fails for less conventional designs. For those facilities, a specific Monte Carlo study is in order for reasons of safety and cost-effectiveness.
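The door-shielding sizing implied here reduces to counting tenth-value layers (TVLs) for whichever spectrum is assumed at the door. The TVL values below are illustrative placeholders chosen to mirror the reported factor-of-five reduction for (192)Ir, not measured data:

```python
import math

def lead_thickness_mm(rate_at_door, design_limit, tvl_mm):
    """Lead thickness = (number of tenth-value layers) x TVL, where the
    number of TVLs is log10 of the required attenuation factor."""
    if rate_at_door <= design_limit:
        return 0.0
    n_tvl = math.log10(rate_at_door / design_limit)
    return n_tvl * tvl_mm

# Illustrative only: a softened, maze-scattered spectrum at the door has a
# much smaller TVL in lead than the primary (192)Ir emission spectrum.
primary = lead_thickness_mm(100.0, 10.0, tvl_mm=20.0)
scattered = lead_thickness_mm(100.0, 10.0, tvl_mm=4.0)
```

With the same attenuation requirement, the thickness scales directly with the assumed TVL, which is why using the real spectrum at the door lightens the door so dramatically.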

  17. SU-D-304-07: Application of Proton Boron Fusion Reaction to Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jung, J; Yoon, D; Shin, H

Purpose: We introduce a therapy method based on the proton-boron fusion reaction. The purpose of this study is to verify the theoretical validity of proton boron fusion therapy using Monte Carlo simulations. Methods: After boron has accumulated in the tumor region, protons delivered from outside the body can react with the boron in the tumor region. The boron induces an increase of the proton's maximum dose level, so that only the tumor cells are damaged more critically. In addition, a prompt gamma ray is emitted from the proton-boron reaction point. The effectiveness of proton boron fusion therapy (PBFT) was verified using Monte Carlo simulations. Results: We found that a dramatic increase, by more than half, of the proton's maximum dose level was induced by the boron in the tumor region. This increase occurred only when the proton's maximum dose point was located within the boron uptake region (BUR). In addition, the 719 keV prompt gamma ray peak produced by the proton boron fusion reaction was positively detected. Conclusion: This therapy method offers advantages such as the application of the Bragg peak to therapy, accurate targeting of the tumor, improved therapeutic effects, and monitoring of the treated region during treatment.

  18. WE-DE-202-02: Are Track Structure Simulations Truly Needed for Radiobiology at the Cellular and Tissue Levels?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, R.

Radiation therapy for the treatment of cancer has been established as a highly precise and effective way to eradicate a localized region of diseased tissue. To achieve further significant gains in the therapeutic ratio, we need to move towards biologically optimized treatment planning. To achieve this goal, we need to understand how the radiation-type-dependent patterns of induced energy depositions within the cell (physics) connect via molecular, cellular and tissue reactions to treatment outcomes such as tumor control and undesirable effects on normal tissue. Several computational biology approaches have been developed connecting physics to biology. Monte Carlo simulations are the most accurate method to calculate physical dose distributions at the nanometer scale; however, simulations at the DNA scale are slow, and repair processes are generally not simulated. Alternative models that rely on the random formation of individual DNA lesions within one or two turns of the DNA have been shown to reproduce the clusters of DNA lesions, including single strand breaks (SSBs) and double strand breaks (DSBs), without the need for detailed track structure simulations. Efficient computational simulations of initial DNA damage induction facilitate computational modeling of DNA repair and other molecular and cellular processes. Mechanistic, multiscale models provide a useful conceptual framework to test biological hypotheses and help connect fundamental information about track structure and dosimetry at the sub-cellular level to dose-response effects on larger scales. In this symposium we will learn about the current state of the art of computational approaches estimating radiation damage at the cellular and sub-cellular scale. How can understanding the physics interactions at the DNA level be used to predict biological outcome? We will discuss if and how such calculations are relevant to advance our understanding of radiation damage and its repair, or if the underlying biological processes are too complex for a mechanistic approach. Can computer simulations be used to guide future biological research? We will debate the feasibility of explaining biology from a physicist's perspective. Learning Objectives: Understand the potential applications and limitations of computational methods for dose-response modeling at the molecular, cellular and tissue levels. Learn about the mechanisms of action underlying the induction, repair and biological processing of damage to DNA and other constituents. Understand how effects and processes at one biological scale impact biological processes and outcomes on other scales. Funding: J. Schuemann, NCI/NIH grants; S. McMahon, European Commission FP7 (grant EC FP7 MC-IOF-623630)

  19. Correlation between CT numbers and tissue parameters needed for Monte Carlo simulations of clinical dose distributions

    NASA Astrophysics Data System (ADS)

    Schneider, Wilfried; Bortfeld, Thomas; Schlegel, Wolfgang

    2000-02-01

We describe a new method to convert CT numbers into the mass density and elemental weights of tissues required as input for dose calculations with Monte Carlo codes such as EGS4. As a first step, we calculate the CT numbers for 71 human tissues. To reduce the effort for the necessary fits of the CT numbers to mass density and elemental weights, we establish four sections on the CT number scale, each confined by selected tissues. Within each section, the mass density and elemental weights of the selected tissues are interpolated. For this purpose, functional relationships between the CT number and each of the tissue parameters, valid for media which are composed of only two components in varying proportions, are derived. Compared with conventional data fits, no loss of accuracy is incurred when using the interpolation functions. Assuming plausible values for the deviations of calculated and measured CT numbers, the mass density can be determined with an accuracy better than 0.04 g cm−3. The weights of phosphorus and calcium can be determined with maximum uncertainties of 1 or 2.3 percentage points (pp), respectively. Similar values can be achieved for hydrogen (0.8 pp) and nitrogen (3 pp). For carbon and oxygen weights, errors up to 14 pp can occur. The influence of the elemental weights on the results of Monte Carlo dose calculations is investigated and discussed.
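The scheme can be sketched as piecewise-linear interpolation of a tissue parameter between selected node tissues on the CT number scale. The node positions and density values below are illustrative stand-ins (roughly air, lung, soft tissue, bone), not the calibrated values from the paper:

```python
def density_from_ct(hu, nodes):
    """Piecewise-linear interpolation of mass density (g/cm^3) between
    selected node tissues, clamped at both ends of the CT number scale."""
    nodes = sorted(nodes)
    if hu <= nodes[0][0]:
        return nodes[0][1]
    for (h0, d0), (h1, d1) in zip(nodes, nodes[1:]):
        if hu <= h1:
            return d0 + (d1 - d0) * (hu - h0) / (h1 - h0)
    return nodes[-1][1]

# Illustrative (CT number, density) nodes: air, lung, water-like soft
# tissue, and a dense bone-like tissue.
NODES = [(-1000, 0.0012), (-500, 0.5), (0, 1.0), (1500, 1.92)]
```

The same interpolation pattern applies to each elemental weight, section by section, which is what keeps the number of fitted tissues small.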

  20. Evaluation of effective dose with chest digital tomosynthesis system using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Kim, Dohyeon; Jo, Byungdu; Lee, Youngjin; Park, Su-Jin; Lee, Dong-Hoon; Kim, Hee-Joung

    2015-03-01

Chest digital tomosynthesis (CDT) systems have recently been introduced and studied. This system offers the potential to be a substantial improvement over conventional chest radiography for lung nodule detection, and it reduces the radiation dose by using limited angles. The PC-based Monte Carlo program (PCXMC) simulation toolkit (STUK, Helsinki, Finland) is widely used to evaluate radiation dose in CDT systems. However, this toolkit has two significant limitations: it cannot describe a model for every individual patient, and it does not describe the exact x-ray beam spectrum. The Geant4 Application for Tomographic Emission (GATE) simulation, in contrast, can describe phantoms of various sizes for individual patients together with the proper x-ray spectrum. However, few studies have been conducted to evaluate effective dose in CDT systems with the GATE Monte Carlo simulation toolkit. The purpose of this study was to evaluate the effective dose in a virtual infant chest phantom for the posterior-anterior (PA) view in a CDT system using GATE simulation. We obtained the effective dose at different tube angles by applying the dose actor function in GATE, which is commonly used for medical radiation dosimetry. The results indicated that GATE simulation is useful for estimating the distribution of absorbed dose. Consequently, we obtained an acceptable distribution of effective dose at each projection. These results indicate that GATE simulation can be an alternative method for calculating effective dose in CDT applications.

  1. Validation of Cross Sections for Monte Carlo Simulation of the Photoelectric Effect

    NASA Astrophysics Data System (ADS)

    Han, Min Cheol; Kim, Han Sung; Pia, Maria Grazia; Basaglia, Tullio; Batič, Matej; Hoff, Gabriela; Kim, Chan Hyeong; Saracco, Paolo

    2016-04-01

    Several total and partial photoionization cross section calculations, based on both theoretical and empirical approaches, are quantitatively evaluated with statistical analyses using a large collection of experimental data retrieved from the literature to identify the state of the art for modeling the photoelectric effect in Monte Carlo particle transport. Some of the examined cross section models are available in general purpose Monte Carlo systems, while others have been implemented and subjected to validation tests for the first time to estimate whether they could improve the accuracy of particle transport codes. The validation process identifies Scofield's 1973 non-relativistic calculations, tabulated in the Evaluated Photon Data Library (EPDL), as the one best reproducing experimental measurements of total cross sections. Specialized total cross section models, some of which derive from more recent calculations, do not provide significant improvements. Scofield's non-relativistic calculations are not surpassed regarding the compatibility with experiment of K and L shell photoionization cross sections either, although in a few test cases Ebel's parameterization produces more accurate results close to absorption edges. Modifications to Biggs and Lighthill's parameterization implemented in Geant4 significantly reduce the accuracy of total cross sections at low energies with respect to its original formulation. The scarcity of suitable experimental data hinders a similar extensive analysis for the simulation of the photoelectron angular distribution, which is limited to a qualitative appraisal.

  2. Cellular phone use while driving at night.

    PubMed

    Vivoda, Jonathon M; Eby, David W; St Louis, Renée M; Kostyniuk, Lidia P

    2008-03-01

    Use of a cellular phone has been shown to negatively affect one's attention to the driving task, leading to an increase in crash risk. At any given daylight hour, about 6% of US drivers are actively talking on a hand-held cell phone. However, previous surveys have focused only on cell phone use during the day, while driving at night has been shown to be riskier than driving during the day. The purpose of the current study was to assess the rate of hand-held cellular phone use while driving at night, using specialized night vision equipment. In 2006, two statewide direct observation survey waves of nighttime cellular phone use were conducted in Indiana utilizing this equipment. Combined results of driver hand-held cellular phone use from both waves are presented in this manuscript. The rates of nighttime cell phone use were similar to results found in previous daytime studies. The overall rate of nighttime hand-held cellular phone use was 5.8 +/- 0.6%. Cellular phone use was highest for females and for younger drivers; in fact, the highest rate observed during the study (11.9%) was for 16- to 29-year-old females. The high level of cellular phone use found within the young age group, coupled with the increased crash risk associated with cellular phone use, with nighttime driving, and with young drivers in general, suggests that this issue may become an important transportation-related concern.

  3. Ground-state calculations of confined hydrogen molecule H2 using variational Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Doma, S. B.; El-Gammal, F. N.; Amer, A. A.

    2018-07-01

    The variational Monte Carlo method is used to evaluate the ground-state energy of a confined hydrogen molecule H2. We first considered the case of a hydrogen molecule confined by a hard prolate spheroidal cavity with the nuclei clamped at the foci (the on-focus case). The case of off-focus nuclei, in which the two nuclei are not clamped to the foci, is also studied; this case provides flexibility in the treatment of molecular properties by allowing an arbitrary size and shape for the confining spheroidal box. A simple chemical analysis concerning the catalytic role of enzymes is presented. An accurate trial wave function depending on many variational parameters is used for this purpose. The results obtained for the case of clamped foci exhibit good accuracy compared with the high-precision variational data published previously. In the case of off-focus nuclei, an improvement is obtained with respect to the most recent uncorrelated results in the literature.
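
    As a minimal illustration of the variational Monte Carlo machinery (Metropolis sampling of |psi|^2 and averaging of the local energy), the sketch below treats the unconfined hydrogen atom with a one-parameter trial function psi = exp(-a*r), not the confined H2 molecule of the paper; the confined-molecule calculation adds the spheroidal boundary condition and a far richer trial function.

```python
import numpy as np

rng = np.random.default_rng(42)

def local_energy(r, a):
    # Local energy E_L = (H psi)/psi for trial psi = exp(-a*r), atomic units
    return -0.5 * a * a + (a - 1.0) / r

def vmc_energy(a, n_steps=100_000, n_burn=10_000, step=0.5):
    """Metropolis sampling of |psi|^2; returns the mean local energy."""
    pos = np.array([0.5, 0.5, 0.5])
    r = np.linalg.norm(pos)
    total, count = 0.0, 0
    for i in range(n_steps):
        trial = pos + rng.uniform(-step, step, 3)
        r_t = np.linalg.norm(trial)
        # acceptance probability |psi(trial)/psi(pos)|^2 = exp(-2a(r_t - r))
        if rng.random() < np.exp(-2.0 * a * (r_t - r)):
            pos, r = trial, r_t
        if i >= n_burn:                 # discard equilibration steps
            total += local_energy(r, a)
            count += 1
    return total / count

E = vmc_energy(a=1.0)   # exact hydrogen ground state: -0.5 hartree
```

At a = 1.0 the local energy is constant, so the variance vanishes; for any other a the variational principle guarantees an energy above -0.5 hartree.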

  4. Monte Carlo simulation of x-ray buildup factors of lead and its applications in shielding of diagnostic x-ray facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kharrati, Hedi; Agrebi, Amel; Karaoui, Mohamed-Karim

    2007-04-15

    X-ray buildup factors of lead in broad-beam geometry for energies from 15 to 150 keV are determined using the general-purpose Monte Carlo N-Particle radiation transport code (MCNP4C). The buildup factor data are fitted to a modified three-parameter Archer et al. model so that broad-beam transmission can be computed for any tube potential/filter combination in the diagnostic energy range. An example of their use to compute the broad-beam transmission at 70, 100, 120, and 140 kVp is given. The calculated broad-beam transmission is compared to data derived from the literature, showing good agreement. Therefore, the combination of the buildup factor data determined here and a mathematical model to generate x-ray spectra provides a computationally based solution for broad-beam transmission through lead barriers in the shielding of x-ray facilities.
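
    The Archer et al. three-parameter transmission model mentioned above has a closed form that can also be inverted analytically for barrier thickness. The parameter values below are illustrative assumptions, not the fitted coefficients reported in the paper:

```python
import numpy as np

def archer_transmission(x, alpha, beta, gamma):
    """Archer et al. three-parameter broad-beam transmission model:
    T(x) = [(1 + beta/alpha) * exp(alpha*gamma*x) - beta/alpha]^(-1/gamma)."""
    return ((1.0 + beta / alpha) * np.exp(alpha * gamma * x)
            - beta / alpha) ** (-1.0 / gamma)

def barrier_thickness(T, alpha, beta, gamma):
    """Closed-form inversion: barrier thickness giving transmission T."""
    return np.log((T ** (-gamma) + beta / alpha)
                  / (1.0 + beta / alpha)) / (alpha * gamma)

# Illustrative parameters for a lead barrier (assumed values, thickness in mm)
alpha, beta, gamma = 2.35, 15.9, 0.76
x_mm = barrier_thickness(1e-2, alpha, beta, gamma)   # thickness for T = 1%
```

The round trip thickness -> transmission recovers the requested transmission exactly, which is what makes the fitted model convenient for shielding design.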

  5. Image-guided spatial localization of heterogeneous compartments for magnetic resonance

    PubMed Central

    An, Li; Shen, Jun

    2015-01-01

    Purpose: Image-guided SPectral Localization Achieved by Sensitivity Heterogeneity (SPLASH) allows rapid measurement of signals from irregularly shaped anatomical compartments without using phase encoding gradients. Here, the authors propose a novel method to address the issue of heterogeneous signal distribution within the localized compartments. Methods: Each compartment was subdivided into multiple subcompartments, and their spectra were solved for using Tikhonov regularization to enforce smoothness within each compartment. The spectrum of a given compartment was then generated by combining the spectra of its subcompartments. The proposed method was first tested using Monte Carlo simulations and then applied to reconstructing in vivo spectra from irregularly shaped ischemic stroke and normal tissue compartments. Results: Monte Carlo simulations demonstrate that the proposed regularized SPLASH method significantly reduces localization and metabolite quantification errors. In vivo results show that the intracompartment regularization yields a ∼40% reduction of error in metabolite quantification. Conclusions: The proposed method significantly reduces localization and metabolite quantification errors caused by intracompartment heterogeneous signal distribution. PMID:26328977
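
    The Tikhonov step can be sketched as a regularized least-squares solve. The encoding matrix, dimensions, and noise level below are assumptions chosen only to illustrate the smoothness penalty between adjacent subcompartments; the paper's actual sensitivity encoding differs.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed toy setup: 16 coil channels encode 8 subcompartment signals
# (one spectral point each) through a sensitivity matrix A.
n_channels, n_sub = 16, 8
A = rng.normal(size=(n_channels, n_sub))
x_true = 1.0 + 0.1 * rng.normal(size=n_sub)          # smooth within compartment
b = A @ x_true + 0.01 * rng.normal(size=n_channels)  # noisy channel data

# First-difference operator penalizing jumps between adjacent subcompartments
L = (np.eye(n_sub) - np.eye(n_sub, k=1))[:-1]

lam = 0.5
# Tikhonov-regularized solution: minimize ||A x - b||^2 + lam^2 ||L x||^2
x_hat = np.linalg.solve(A.T @ A + lam ** 2 * L.T @ L, A.T @ b)

# Compartment spectrum = combination (here simply the mean) of the
# subcompartment solutions
compartment_signal = x_hat.mean()
```

The normal-equation form used here is the standard closed-form Tikhonov solution; larger `lam` enforces stronger intracompartment smoothness at the cost of bias.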

  6. Benchmark solution for the Spencer-Lewis equation of electron transport theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ganapol, B.D.

    As integrated circuits become smaller, the shielding of these sensitive components against penetrating electrons becomes extremely critical. Monte Carlo methods have traditionally been the method of choice in shielding evaluations primarily because they can incorporate a wide variety of relevant physical processes. Recently, however, as a result of a more accurate numerical representation of the highly forward peaked scattering process, S_n methods for one-dimensional problems have been shown to be at least as cost-effective in comparison with Monte Carlo methods. With the development of these deterministic methods for electron transport, a need has arisen to assess the accuracy of proposed numerical algorithms and to ensure their proper coding. It is the purpose of this presentation to develop a benchmark to the Spencer-Lewis equation describing the transport of energetic electrons in solids. The solution will take advantage of the correspondence between the Spencer-Lewis equation and the transport equation describing one-group time-dependent neutron transport.

  7. Investigating the response of Micromegas detector to low-energy neutrons using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Khezripour, S.; Negarestani, A.; Rezaie, M. R.

    2017-08-01

    The Micromegas detector has recently been used for high-energy neutron (HEN) detection, but the aim of this research is to investigate the response of the Micromegas detector to low-energy neutrons (LEN). For this purpose, a Micromegas detector (with air, P10, BF3, 3He and an Ar/BF3 mixture) was optimized for the detection of 60 keV neutrons using the MCNP (Monte Carlo N-Particle) code. The simulation results show that the optimum thickness of the cathode is 1 mm and the optimum microgrid location is 100 μm above the anode. The output current of this detector for the Ar (3%) + BF3 (97%) mixture is greater than for the other fillings; this mixture is therefore considered the appropriate gas for the Micromegas neutron detector, providing an output current for 60 keV neutrons at the level of 97.8 nA per neutron. Consequently, this detector can be introduced as a LEN detector.

  8. Monte Carlo calculations for reporting patient organ doses from interventional radiology

    NASA Astrophysics Data System (ADS)

    Huo, Wanli; Feng, Mang; Pi, Yifei; Chen, Zhi; Gao, Yiming; Xu, X. George

    2017-09-01

    This paper describes a project to generate organ dose data for the purpose of extending the VirtualDose software from CT imaging to interventional radiology (IR) applications. A library of 23 mesh-based anthropometric patient phantoms was used in Monte Carlo simulations for the database calculations. Organ doses and effective doses of IR procedures with specific beam projection, field of view (FOV) and beam quality for all parts of the body were obtained. Comparing organ doses generated by VirtualDose-IR for different beam qualities, beam projections, patient ages and body mass indexes (BMIs), significant discrepancies were observed. Because IR procedures involve relatively long exposure times, IR doses depend on beam quality, beam direction and patient size. Therefore VirtualDose-IR, which is based on the latest anatomically realistic patient phantoms, can generate accurate doses for IR procedures, and it is suitable for clinical IR dose management as an effective tool to estimate patient doses and optimize IR treatment plans.

  9. Monte Carlo simulation studies on scintillation detectors and image reconstruction of brain-phantom tumors in TOFPET

    PubMed Central

    Mondal, Nagendra Nath

    2009-01-01

    This study presents Monte Carlo simulation (MCS) results on the detection efficiencies, spatial resolutions and resolving powers of time-of-flight (TOF) PET detector systems. Cerium-activated lutetium oxyorthosilicate (Lu2SiO5:Ce, in short LSO), barium fluoride (BaF2) and BriLanCe 380 (cerium-doped lanthanum tri-bromide, in short LaBr3) scintillation crystals are studied in view of their good time and energy resolutions and short decay times. The results of MCS based on GEANT show that the spatial resolution, detection efficiency and resolving power of LSO are better than those of BaF2 and LaBr3, although it possesses inferior time and energy resolutions. Instead of the conventional position reconstruction method, a newly established image reconstruction method (described in previous work) is applied to produce high-quality images. Validation is an important step to ensure that this imaging method fulfills its purposes; to this end, images of two tumors in a brain phantom were reconstructed. PMID:20098551

  10. Validation of a Monte Carlo simulation of the Inveon PET scanner using GATE

    NASA Astrophysics Data System (ADS)

    Lu, Lijun; Zhang, Houjin; Bian, Zhaoying; Ma, Jianhua; Feng, Qiangjin; Chen, Wufan

    2016-08-01

    The purpose of this study is to validate the application of the GATE (Geant4 Application for Tomographic Emission) Monte Carlo simulation toolkit for modeling the performance characteristics of the Siemens Inveon small animal PET system. The simulation results were validated against experimental/published data in accordance with the NEMA NU-4 2008 protocol for standardized evaluation of the spatial resolution, sensitivity, scatter fraction (SF) and noise equivalent count rate (NECR) of a preclinical PET system. Agreement within 18% was obtained between the radial, tangential and axial spatial resolutions of the simulated and experimental results. The simulated peak NECR of the mouse-size phantom agreed with the experimental result, while for the rat-size phantom the simulated value was higher than the experimental result. The simulated and experimental SFs of the mouse- and rat-size phantoms agreed to within 2%. These results demonstrate the feasibility of our GATE model to accurately simulate, within certain limits, all major performance characteristics of the Inveon PET system.

  11. Perturbation Biology: Inferring Signaling Networks in Cellular Systems

    PubMed Central

    Miller, Martin L.; Gauthier, Nicholas P.; Jing, Xiaohong; Kaushik, Poorvi; He, Qin; Mills, Gordon; Solit, David B.; Pratilas, Christine A.; Weigt, Martin; Braunstein, Alfredo; Pagnani, Andrea; Zecchina, Riccardo; Sander, Chris

    2013-01-01

    We present a powerful experimental-computational technology for inferring network models that predict the response of cells to perturbations, and that may be useful in the design of combinatorial therapy against cancer. The experiments are systematic series of perturbations of cancer cell lines by targeted drugs, singly or in combination. The response to perturbation is quantified in terms of relative changes in the measured levels of proteins, phospho-proteins and cellular phenotypes such as viability. Computational network models are derived de novo, i.e., without prior knowledge of signaling pathways, and are based on simple non-linear differential equations. The prohibitively large solution space of all possible network models is explored efficiently using a probabilistic algorithm, Belief Propagation (BP), which is three orders of magnitude faster than standard Monte Carlo methods. Explicit executable models are derived for a set of perturbation experiments in SKMEL-133 melanoma cell lines, which are resistant to the therapeutically important inhibitor of RAF kinase. The resulting network models reproduce and extend known pathway biology. They empower potential discoveries of new molecular interactions and predict efficacious novel drug perturbations, such as the inhibition of PLK1, which is verified experimentally. This technology is suitable for application to larger systems in diverse areas of molecular biology. PMID:24367245

  12. Behavior of optical properties of coagulated blood sample at 633 nm wavelength

    NASA Astrophysics Data System (ADS)

    Morales Cruzado, Beatriz; Vázquez y Montiel, Sergio; Delgado Atencio, José Alberto

    2011-03-01

    Determination of tissue optical parameters is fundamental for the application of light in either diagnostic or therapeutic procedures. However, in samples of biological tissue in vitro, the optical properties are modified by cellular death or cellular agglomeration, which cannot be avoided; these phenomena change the propagation of light within the biological sample. The optical properties of human blood were investigated in vitro at 633 nm using an optical setup that includes a double integrating sphere system. We measured the diffuse transmittance and diffuse reflectance of the blood sample and compared these physical properties with those obtained by the Monte Carlo Multi-Layered (MCML) program. The extraction of the optical parameters from the measurements, namely the absorption coefficient μa, the scattering coefficient μs and the anisotropy factor g, was carried out using a genetic algorithm, in which the search procedure is based on the evolution of a population through selection of the best individuals, evaluated by a function that compares the diffuse transmittance and diffuse reflectance of those individuals with the experimental values. The algorithm converges rapidly to the best individual, yielding the optical parameters of the sample. We compared our results with those obtained using other retrieval procedures and found that the scattering coefficient and the anisotropy factor change dramatically due to the formation of clusters.
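
    A minimal sketch of the genetic-algorithm retrieval, assuming a toy analytic forward model in place of the MCML Monte Carlo simulation: the function `forward` below is invented for illustration, and only the selection/crossover/mutation loop mirrors the described procedure.

```python
import numpy as np

rng = np.random.default_rng(7)

def forward(p):
    """Toy stand-in for the MCML forward model: maps (mu_a, mu_s, g) to
    (diffuse reflectance, diffuse transmittance). Invented formula; the
    real procedure runs a Monte Carlo simulation here."""
    mu_a, mu_s, g = p
    mu_s_red = mu_s * (1.0 - g)                    # reduced scattering
    rd = mu_s_red / (mu_s_red + 4.0 * mu_a)
    td = np.exp(-(mu_a + mu_s_red))
    return np.array([rd, td])

measured = forward(np.array([0.5, 20.0, 0.9]))     # synthetic "measurement"

def fitness(pop):
    # squared mismatch between simulated and measured Rd/Td (lower is better)
    return np.array([np.sum((forward(p) - measured) ** 2) for p in pop])

lo = np.array([0.01, 1.0, 0.5])
hi = np.array([2.0, 50.0, 0.99])
pop = rng.uniform(lo, hi, size=(60, 3))
best, best_fit = None, np.inf
for gen in range(200):
    f = fitness(pop)
    i = int(np.argmin(f))
    if f[i] < best_fit:                            # keep best-ever individual
        best, best_fit = pop[i].copy(), float(f[i])
    elite = pop[np.argsort(f)[:20]]                # selection of the best
    parents = elite[rng.integers(0, 20, size=(60, 2))]
    w = rng.random((60, 1))
    pop = w * parents[:, 0] + (1.0 - w) * parents[:, 1]  # blend crossover
    pop += rng.normal(0.0, 0.01, pop.shape) * (hi - lo)  # mutation
    pop = np.clip(pop, lo, hi)
```

With only two observables and three parameters the inverse problem is degenerate, so the GA matches the measured quantities rather than a unique parameter triple; the real retrieval uses richer measurements.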

  13. Interplay of bistable kinetics of gene expression during cellular growth

    NASA Astrophysics Data System (ADS)

    Zhdanov, Vladimir P.

    2009-02-01

    In cells, the bistable kinetics of gene expression can be observed on the level of (i) one gene with positive feedback between protein and mRNA production, (ii) two genes with negative mutual feedback between protein and mRNA production, or (iii) in more complex cases. We analyse the interplay of two genes of type (ii) governed by a gene of type (i) during cellular growth. In particular, using kinetic Monte Carlo simulations, we show that in the case where gene 1, operating in the bistable regime, regulates mutually inhibiting genes 2 and 3, also operating in the bistable regime, the latter genes may eventually be trapped either in the state with high transcriptional activity of gene 2 and low activity of gene 3 or in the state with high transcriptional activity of gene 3 and low activity of gene 2. The probability of reaching one of these states depends on the values of the model parameters; if genes 2 and 3 are kinetically equivalent, the probability is equal to 0.5. Thus, our model illustrates how different intracellular states can be chosen at random with predetermined probabilities. This type of gene expression kinetics may be behind complex processes occurring in cells, e.g., behind the choice of fate by stem cells.
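
    The kinetic Monte Carlo idea for the mutually repressing gene pair (type (ii)) can be sketched with a Gillespie simulation. The rates, Hill parameters, lumping of transcription and translation into one synthesis step, and omission of the governing type (i) gene are all simplifying assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def repressed_rate(x, alpha=10.0, k=20.0, n=4):
    """Synthesis rate of one protein, repressed by the other gene's
    protein copy number x via a Hill function (assumed parameters)."""
    return alpha / (1.0 + (x / k) ** n)

def gillespie_toggle(t_end=300.0, delta=0.1):
    """Kinetic Monte Carlo (Gillespie) run of two mutually repressing
    genes; returns the final protein copy numbers (p2, p3)."""
    p2, p3, t = 0, 0, 0.0
    while t < t_end:
        rates = np.array([
            repressed_rate(p3),   # synthesis of protein 2, repressed by p3
            repressed_rate(p2),   # synthesis of protein 3, repressed by p2
            delta * p2,           # degradation of protein 2
            delta * p3,           # degradation of protein 3
        ])
        total = rates.sum()
        t += rng.exponential(1.0 / total)        # time to next reaction
        r = rng.choice(4, p=rates / total)       # which reaction fires
        p2 += (r == 0) - (r == 2)
        p3 += (r == 1) - (r == 3)
    return p2, p3

# A single run settles into one of the two bistable states; over many runs
# with kinetically equivalent genes each state occurs with probability ~0.5.
p2, p3 = gillespie_toggle()
```

Running many trajectories and counting which protein dominates reproduces the random, parameter-dependent choice between the two states described in the abstract.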

  14. Memristor-based cellular nonlinear/neural network: design, analysis, and applications.

    PubMed

    Duan, Shukai; Hu, Xiaofang; Dong, Zhekang; Wang, Lidan; Mazumder, Pinaki

    2015-06-01

    Cellular nonlinear/neural network (CNN) has been recognized as a powerful massively parallel architecture capable of solving complex engineering problems by performing trillions of analog operations per second. The memristor was theoretically predicted in the late seventies, but it garnered nascent research interest due to the recent much-acclaimed discovery of nanocrossbar memories by engineers at the Hewlett-Packard Laboratory. The memristor is expected to be co-integrated with nanoscale CMOS technology to revolutionize conventional von Neumann as well as neuromorphic computing. In this paper, a compact CNN model based on memristors is presented along with its performance analysis and applications. In the new CNN design, the memristor bridge circuit acts as the synaptic circuit element and substitutes the complex multiplication circuit used in traditional CNN architectures. In addition, the negative differential resistance and nonlinear current-voltage characteristics of the memristor have been leveraged to replace the linear resistor in conventional CNNs. The proposed CNN design has several merits, for example, high density, nonvolatility, and programmability of synaptic weights. The proposed memristor-based CNN design operations for implementing several image processing functions are illustrated through simulation and contrasted with conventional CNNs. Monte-Carlo simulation has been used to demonstrate the behavior of the proposed CNN due to the variations in memristor synaptic weights.
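
    The closing Monte Carlo study of synaptic-weight variation can be sketched for a single CNN cell with an edge-detection feedforward template. The template, the 5% Gaussian weight spread, and the saturating output model are assumptions for illustration, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(5)

# Ideal feedforward edge-detection template (synaptic weights)
B = np.array([[-1.0, -1.0, -1.0],
              [-1.0,  8.0, -1.0],
              [-1.0, -1.0, -1.0]])

def cell_output(patch, weights):
    """Steady-state output of one cell: weighted input saturated to [-1, 1]."""
    return float(np.clip(np.sum(weights * patch), -1.0, 1.0))

edge_patch = np.array([[0.0, 0.0, 0.0],
                       [1.0, 1.0, 1.0],
                       [1.0, 1.0, 1.0]])   # an edge crossing the neighborhood
flat_patch = np.ones((3, 3))               # uniform region (no edge)

# Monte Carlo: perturb each memristive weight by 5% (Gaussian) and record
# the spread of the cell response for both inputs.
n_trials = 10_000
out_edge = np.empty(n_trials)
out_flat = np.empty(n_trials)
for k in range(n_trials):
    B_var = B * (1.0 + 0.05 * rng.normal(size=B.shape))
    out_edge[k] = cell_output(edge_patch, B_var)
    out_flat[k] = cell_output(flat_patch, B_var)
# the saturated edge response is robust to weight variation, while the
# flat-region response scatters around zero, quantifying the sensitivity
```

This is the kind of statistic a Monte Carlo study of device variability produces: a distribution of outputs per input pattern, from which robustness margins can be read off.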

  15. Biological characterization of a novel in vitro cell irradiator

    PubMed Central

    Fowler, Tyler L.; Fisher, Michael M.; Bailey, Alison M.; Bednarz, Bryan P.

    2017-01-01

    To evaluate the overall robustness of a novel cellular irradiator, we performed a series of well-characterized, dose-responsive assays to assess the consequences of DNA damage. We used a previously described novel irradiation system and a traditional 137Cs source to irradiate a cell line. The generation of reactive oxygen species was assessed using chloromethyl-H2DCFDA dye, the induction of DNA double strand breaks (DSBs) was observed using the comet assay, and the initiation of DNA break repair was assessed through γH2AX image cytometry. A high correlation between physical absorbed dose and biologic dose was seen for the production of intracellular reactive oxygen species, physical DNA double strand breaks, and modulation of the cellular double strand break repair pathway. The results compared favorably to irradiation with a traditional 137Cs source. The rapid, straightforward tests described form a reasonable approach for the biologic characterization of novel irradiators. These additional testing metrics go beyond standard physics testing, such as Monte Carlo simulation and thermoluminescent dosimeter evaluation, to confirm that a novel irradiator can produce the desired dose effects in vitro. Further, assessment of these biological metrics confirms that the physical handling of the cells during the irradiation process results in biologic effects that scale appropriately with dose. PMID:29232400

  16. Accuracy of Monte Carlo photon transport simulation in characterizing brachytherapy dosimeter energy-response artefacts.

    PubMed

    Das, R K; Li, Z; Perera, H; Williamson, J F

    1996-06-01

    Practical dosimeters in brachytherapy, such as thermoluminescent dosimeters (TLDs) and diodes, are usually calibrated against low-energy megavoltage beams. To measure absolute dose rate near a brachytherapy source, it is necessary to establish the energy response of the detector relative to that at the calibration energy. The purpose of this paper is to assess the accuracy of Monte Carlo photon transport (MCPT) simulation in modelling the absolute detector response as a function of detector geometry and photon energy. We exposed two different sizes of TLD-100 (LiF) chips and p-type silicon diode detectors to calibrated 60Co, HDR (192Ir) and superficial x-ray beams. For the Scanditronix electron-field diode, the relative detector response, defined as the measured detector reading per measured unit of air kerma, varied from 38.46 V cGy-1 (40 kVp beam) to 6.22 V cGy-1 (60Co beam). Similarly, for the large and small chips the same quantity varied from 2.08 to 3.02 nC cGy-1 and from 0.171 to 0.244 nC cGy-1, respectively. Monte Carlo simulation was used to calculate the absorbed dose to the active volume of the detector per unit air kerma. If the Monte Carlo simulation is accurate, then the absolute detector response, defined as the measured detector reading per unit dose absorbed by the active detector volume (with the absorbed dose computed by Monte Carlo simulation), should be constant across beam qualities. For the diode, the absolute response is 5.86 +/- 0.15 V cGy-1. For TLDs of size 3 x 3 x 1 mm3 the absolute response is 2.47 +/- 0.07 nC cGy-1, and for TLDs of 1 x 1 x 1 mm3 it is 0.201 +/- 0.008 nC cGy-1. From these results we conclude that the detector reading (for both TLDs and diodes) is directly proportional to the dose absorbed by the active volume of the detector and is independent of beam quality.

  17. TU-H-207A-02: Relative Importance of the Various Factors Influencing the Accuracy of Monte Carlo Simulated CT Dose Index

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marous, L; Muryn, J; Liptak, C

    2016-06-15

    Purpose: Monte Carlo simulation is a frequently used technique for assessing patient dose in CT. The accuracy of a Monte Carlo program is often validated using the standard CT dose index (CTDI) phantoms by comparing simulated and measured CTDI100. To achieve good agreement, many input parameters in the simulation (e.g., energy spectrum and effective beam width) need to be determined. However, not all the parameters are equally important. Our aim was to assess the relative importance of the various factors that influence the accuracy of simulated CTDI100. Methods: A Monte Carlo program previously validated for a clinical CT system was used to simulate CTDI100. For the standard CTDI phantoms (32 and 16 cm in diameter), CTDI100 values from the central and four peripheral locations at 70 and 120 kVp were first simulated using a set of reference input parameter values (treated as the truth). To emulate the situation in which the input parameter values used by the researcher may deviate from the truth, additional simulations were performed in which intentional errors were introduced into the input parameters, and their effects on simulated CTDI100 were analyzed. Results: At 38.4-mm collimation, errors in effective beam width of up to 5.0 mm had negligible effects on simulated CTDI100 (<1.0%). Likewise, errors in acrylic density of up to 0.01 g/cm³ resulted in small CTDI100 errors (<2.5%). In contrast, errors in spectral HVL produced more significant effects: slight deviations (±0.2 mm Al) produced errors up to 4.4%, whereas more extreme deviations (±1.4 mm Al) produced errors as high as 25.9%. Lastly, ignoring the CT table introduced errors up to 13.9%. Conclusion: Monte Carlo simulated CTDI100 is insensitive to errors in effective beam width and acrylic density, but sensitive to errors in spectral HVL. To obtain accurate results, the CT table should not be ignored.
    This work was supported by a Faculty Research and Development Award from Cleveland State University.

  18. The impact of low-Z and high-Z metal implants in IMRT: A Monte Carlo study of dose inaccuracies in commercial dose algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spadea, Maria Francesca, E-mail: mfspadea@unicz.it; Verburg, Joost Mathias; Seco, Joao

    2014-01-15

    Purpose: The aim of the study was to evaluate the dosimetric impact of low-Z and high-Z metallic implants on IMRT plans. Methods: Computed tomography (CT) scans of three patients were analyzed to study effects due to the presence of Titanium (low-Z), Platinum and Gold (high-Z) inserts. To eliminate artifacts in CT images, a sinogram-based metal artifact reduction algorithm was applied. IMRT dose calculations were performed on both the uncorrected and corrected images using a commercial planning system (convolution/superposition algorithm) and an in-house Monte Carlo platform. Dose differences between uncorrected and corrected datasets were computed and analyzed using the gamma index passing rate (Pγ<1), with 2 mm and 2% as the distance-to-agreement and dose-difference criteria, respectively. Beam-specific depth dose profiles across the metal were also examined. Results: Dose discrepancies between corrected and uncorrected datasets were not significant for the low-Z material. High-Z materials caused under-dosage of 20%–25% in the region surrounding the metal and over-dosage of 10%–15% downstream of the hardware. The gamma index test yielded Pγ<1 > 99% for all low-Z cases, while for high-Z cases it returned 91% < Pγ<1 < 99%. Analysis of the depth dose curve of a single beam for the low-Z cases revealed that, although the dose attenuation is altered inside the metal, it does not differ downstream of the insert. However, for high-Z metal implants the dose is increased by up to 10%–12% around the insert. In addition, the Monte Carlo method was more sensitive to the presence of metal inserts than the superposition/convolution algorithm. Conclusions: The reduction of metal artifacts in CT images is relevant for high-Z implants. In this case, the dose distribution should be calculated using Monte Carlo algorithms, given their superior accuracy in dose modeling in and around the metal. In addition, knowledge of the composition of the metal inserts significantly improves the accuracy of the Monte Carlo dose calculation.
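
    The gamma-index analysis with 2 mm / 2% criteria can be sketched in one dimension, following the general definition of Low et al.; the depth-dose profiles below are invented for illustration:

```python
import numpy as np

def gamma_1d(x_mm, dose_ref, dose_eval, dta_mm=2.0, dd_frac=0.02):
    """1-D gamma index: for each reference point, minimize
    sqrt((spatial offset / DTA)^2 + (dose difference / DD)^2) over the
    evaluated distribution. DD is taken relative to the global maximum."""
    dd = dd_frac * dose_ref.max()
    gamma = np.empty(len(x_mm))
    for i, (xi, di) in enumerate(zip(x_mm, dose_ref)):
        dist2 = ((x_mm - xi) / dta_mm) ** 2
        diff2 = ((dose_eval - di) / dd) ** 2
        gamma[i] = np.sqrt(np.min(dist2 + diff2))
    return gamma

# Invented depth-dose profiles: the evaluated dose is 1% high everywhere,
# which should pass a 2 mm / 2% criterion at every point.
x_mm = np.arange(0.0, 100.0, 1.0)
dose_ref = np.exp(-x_mm / 50.0)
dose_eval = 1.01 * dose_ref
pass_rate = float(np.mean(gamma_1d(x_mm, dose_ref, dose_eval) < 1.0))
```

The passing rate Pγ<1 reported in the abstract is exactly this fraction of points with gamma below one, computed over the 3-D dose grids.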

  19. TH-AB-207A-07: Radiation Dose Simulation for a Newly Proposed Dynamic Bowtie Filters for CT Using Fast Monte Carlo Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, T; Lin, H; Gao, Y

    Purpose: The dynamic bowtie filter is an innovative design capable of modulating the X-ray beam and balancing the flux in the detectors, and it introduces a new way of patient-specific CT scan optimization. This study demonstrates the feasibility of performing fast Monte Carlo dose calculation for a type of dynamic bowtie filter for cone-beam CT (Liu et al. 2014, PLoS ONE 9(7)) using MIC coprocessors. Methods: The dynamic bowtie filter in question consists of a highly attenuating bowtie component (HB) and a weakly attenuating bowtie (WB). The HB is filled with CeCl3 solution and its surface is defined by a transcendental equation. The WB is an elliptical cylinder filled with air and immersed in the HB. As the scanner rotates, the orientation of the WB remains fixed with respect to the static patient. In our Monte Carlo simulation, the HB was approximated by 576 boxes. The phantom was a voxelized elliptical cylinder composed of PMMA and surrounded by air (44 cm × 44 cm × 40 cm, 1000 × 1000 × 1 voxels). The dose to the PMMA phantom was tallied with 0.15% statistical uncertainty under a 100 kVp source. Two Monte Carlo codes, ARCHER and MCNP-6.1, were compared. Both used double precision, and compiler flags that may trade accuracy for speed were avoided. Results: The wall time of the simulation was 25.4 seconds with ARCHER on a 5110P MIC, 40 seconds on an X5650 CPU, and 523 seconds with multithreaded MCNP on the same CPU. The high performance of ARCHER is attributed to the parameterized geometry and vectorization of the program hotspots. Conclusion: The dynamic bowtie filter modeled in this study is able to effectively reduce the dynamic range of the detected signals for photon-counting detectors. With appropriate software optimization methods, accelerator-based (MIC and GPU) Monte Carlo dose engines show good performance and can contribute to patient-specific CT scan optimizations.

  20. Use of the FLUKA Monte Carlo code for 3D patient-specific dosimetry on PET-CT and SPECT-CT images*

    PubMed Central

    Botta, F; Mairani, A; Hobbs, R F; Vergara Gil, A; Pacilio, M; Parodi, K; Cremonesi, M; Coca Pérez, M A; Di Dia, A; Ferrari, M; Guerriero, F; Battistoni, G; Pedroli, G; Paganelli, G; Torres Aroche, L A; Sgouros, G

    2014-01-01

    Patient-specific absorbed dose calculation for nuclear medicine therapy is a topic of increasing interest. 3D dosimetry at the voxel level is one of the major improvements for the development of more accurate calculation techniques, as compared to the standard dosimetry at the organ level. This study aims to use the FLUKA Monte Carlo code to perform patient-specific 3D dosimetry through direct Monte Carlo simulation on PET-CT and SPECT-CT images. To this aim, dedicated routines were developed in the FLUKA environment. Two sets of simulations were performed on model and phantom images. Firstly, the correct handling of PET and SPECT images was tested under the assumption of a homogeneous water medium by comparing FLUKA results with those obtained with the voxel kernel convolution method and with other Monte Carlo-based tools developed for the same purpose (the EGS-based 3D-RD software and the MCNP5-based MCID). Afterwards, the correct integration of the PET/SPECT and CT information was tested by performing direct simulations on PET/CT images for both homogeneous (water) and non-homogeneous (water with air, lung and bone inserts) phantoms. Comparison was performed with the other Monte Carlo tools performing direct simulation as well. The absorbed dose maps were compared at the voxel level. In the case of homogeneous water, simulating 10^8 primary particles yielded a 2% average difference with respect to the kernel convolution method; this difference was lower than the statistical uncertainty affecting the FLUKA results. The agreement with the other tools was within 3–4%, partially ascribable to differences among the simulation algorithms. Including the CT-based density map, the average difference was always within 4% irrespective of the medium (water, air, bone), except for a maximum 6% value when comparing FLUKA and 3D-RD in air.
The results confirmed that the routines were properly developed, opening the way for the use of FLUKA for patient-specific, image-based dosimetry in nuclear medicine. PMID:24200697

  1. SU-E-J-60: Efficient Monte Carlo Dose Calculation On CPU-GPU Heterogeneous Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiao, K; Chen, D. Z; Hu, X. S

    Purpose: It is well-known that the performance of GPU-based Monte Carlo dose calculation implementations is bounded by memory bandwidth. One major cause of this bottleneck is the random memory writing patterns in dose deposition, which leads to several memory efficiency issues on GPU such as un-coalesced writing and atomic operations. We propose a new method to alleviate such issues on CPU-GPU heterogeneous systems, which achieves overall performance improvement for Monte Carlo dose calculation. Methods: Dose deposition is to accumulate dose into the voxels of a dose volume along the trajectories of radiation rays. Our idea is to partition this proceduremore » into the following three steps, which are fine-tuned for CPU or GPU: (1) each GPU thread writes dose results with location information to a buffer on GPU memory, which achieves fully-coalesced and atomic-free memory transactions; (2) the dose results in the buffer are transferred to CPU memory; (3) the dose volume is constructed from the dose buffer on CPU. We organize the processing of all radiation rays into streams. Since the steps within a stream use different hardware resources (i.e., GPU, DMA, CPU), we can overlap the execution of these steps for different streams by pipelining. Results: We evaluated our method using a Monte Carlo Convolution Superposition (MCCS) program and tested our implementation for various clinical cases on a heterogeneous system containing an Intel i7 quad-core CPU and an NVIDIA TITAN GPU. Comparing with a straightforward MCCS implementation on the same system (using both CPU and GPU for radiation ray tracing), our method gained 2-5X speedup without losing dose calculation accuracy. Conclusion: The results show that our new method improves the effective memory bandwidth and overall performance for MCCS on the CPU-GPU systems. Our proposed method can also be applied to accelerate other Monte Carlo dose calculation approaches. 
This research was supported in part by NSF under Grant CCF-1217906, and in part by a research contract from Sandia National Laboratories.
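The three-step buffered dose deposition described above can be sketched in plain Python. The function names and the (voxel_index, dose) record layout are illustrative assumptions, not taken from the paper; on the actual system, step 1 runs on the GPU and step 2 is a DMA transfer.

```python
def deposit_to_buffer(events):
    """Step 1 (GPU in the paper): record (voxel_index, dose) pairs
    sequentially -- sequential appends are coalesced and need no atomics."""
    return [(voxel, dose) for voxel, dose in events]

def accumulate_volume(buffer, n_voxels):
    """Step 3 (CPU in the paper): build the dose volume from the buffer.
    The random-access accumulation happens here, off the GPU."""
    volume = [0.0] * n_voxels
    for voxel, dose in buffer:
        volume[voxel] += dose
    return volume

# Step 2 (the buffer transfer) is simply the hand-off between the calls.
buf = deposit_to_buffer([(0, 1.0), (2, 0.5), (0, 0.25)])
dose_volume = accumulate_volume(buf, n_voxels=4)
print(dose_volume)  # [1.25, 0.0, 0.5, 0.0]
```

In the paper's scheme, many such buffers (one per stream) are in flight at once, so GPU deposition, DMA transfer, and CPU accumulation overlap in a pipeline.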

  2. SU-E-T-391: Assessment and Elimination of the Angular Dependence of the Response of the NanoDot OSLD System in MV Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lehmann, J; University of Sydney, Sydney; RMIT University, Melbourne

    2014-06-01

Purpose: Assess the angular dependence of the nanoDot OSLD system in MV X-ray beams at depth and mitigate this dependence for measurements in phantoms. Methods: Measurements for 6 MV photons at 3 cm and 10 cm depth and Monte Carlo simulations were performed. Two special holders were designed which allow a nanoDot dosimeter to be rotated around the center of its sensitive volume (5 mm diameter disk). The first holder positions the dosimeter disk perpendicular to the beam (en face) and then rotates it until the disk is parallel with the beam (edge on). This is referred to as Setup 1. The second holder positions the disk parallel to the beam (edge on) for all angles (Setup 2). Monte Carlo simulations using GEANT4 considered the detector and housing in detail based on microCT data. Results: An average drop in response of 1.4±0.7% (measurement) and 2.1±0.3% (Monte Carlo) for the 90° orientation compared to 0° was found for Setup 1. Monte Carlo simulations also showed a strong dependence of the effect on the composition of the sensitive layer. Assuming 100% active material (Al2O3) results in a 7% drop in response for 90° compared to 0°. Assuming the layer to be completely water results in a flat response (within the simulation uncertainty of about 1%). For Setup 2, measurements and Monte Carlo simulations found the angular dependence of the dosimeter to be below 1% and within the measurement uncertainty. Conclusion: The nanoDot dosimeter system exhibits a small angular dependence of approximately 2%. Changing the orientation of the dosimeter so that a coplanar beam arrangement always hits the detector material edge on reduces the angular dependence to within the measurement uncertainty of about 1%. This makes the dosimeter more attractive for phantom-based clinical measurements and audits with multiple coplanar beams.
The Australian Clinical Dosimetry Service is a joint initiative between the Australian Department of Health and the Australian Radiation Protection and Nuclear Safety Agency.

  3. Short Message Service (SMS) Texting Symbols: A Functional Analysis of 10,000 Cellular Phone Text Messages

    ERIC Educational Resources Information Center

    Beasley, Robert E.

    2009-01-01

    The purpose of this study was to investigate the use of symbolic expressions (e.g., "BTW," "LOL," "UR") in an SMS text messaging corpus consisting of over 10,000 text messages. More specifically, the purpose was to determine, not only how frequently these symbolic expressions are used, but how they are utilized in terms of the language functions…

  4. Are There Roles for Brain Cell Senescence in Aging and Neurodegenerative Disorders?

    PubMed Central

    Tan, Florence C. C.; Hutchison, Emmette R.; Eitan, Erez; Mattson, Mark P.

    2014-01-01

The term cellular senescence was introduced more than five decades ago to describe the state of growth arrest observed in aging cells. Since this initial discovery, the phenotypes associated with cellular senescence have expanded beyond growth arrest to include alterations in cellular metabolism, secreted cytokines, epigenetic regulation and protein expression. Recently, senescence has been shown to play an important role in vivo not only in relation to aging, but also during embryonic development. Thus, cellular senescence serves different purposes and comprises a wide range of distinct phenotypes across multiple cell types. Whether all cell types, including post-mitotic neurons, are capable of entering into a senescent state remains unclear. In this review we examine recent data that suggest that cellular senescence plays a role in brain aging and, notably, may not be limited to glia but may also extend to neurons. We suggest that there is a high level of similarity between some of the pathological changes that occur in the brain in Alzheimer’s and Parkinson’s diseases and those phenotypes observed in cellular senescence, leading us to propose that neurons and glia can exhibit hallmarks of senescence previously documented in peripheral tissues. PMID:25305051

  5. Are there roles for brain cell senescence in aging and neurodegenerative disorders?

    PubMed

    Tan, Florence C C; Hutchison, Emmette R; Eitan, Erez; Mattson, Mark P

    2014-12-01

The term cellular senescence was introduced more than five decades ago to describe the state of growth arrest observed in aging cells. Since this initial discovery, the phenotypes associated with cellular senescence have expanded beyond growth arrest to include alterations in cellular metabolism, secreted cytokines, epigenetic regulation and protein expression. Recently, senescence has been shown to play an important role in vivo not only in relation to aging, but also during embryonic development. Thus, cellular senescence serves different purposes and comprises a wide range of distinct phenotypes across multiple cell types. Whether all cell types, including post-mitotic neurons, are capable of entering into a senescent state remains unclear. In this review we examine recent data that suggest that cellular senescence plays a role in brain aging and, notably, may not be limited to glia but may also extend to neurons. We suggest that there is a high level of similarity between some of the pathological changes that occur in the brain in Alzheimer's and Parkinson's diseases and those phenotypes observed in cellular senescence, leading us to propose that neurons and glia can exhibit hallmarks of senescence previously documented in peripheral tissues.

  6. The mTOR inhibitor sirolimus suppresses renal, hepatic, and cardiac tissue cellular respiration.

    PubMed

    Albawardi, Alia; Almarzooqi, Saeeda; Saraswathiamma, Dhanya; Abdul-Kader, Hidaya Mohammed; Souid, Abdul-Kader; Alfazari, Ali S

    2015-01-01

The purpose of this in vitro study was to develop a useful biomarker (e.g., cellular respiration, or mitochondrial O2 consumption) for measuring activities of mTOR inhibitors. It measured the effects of commonly used immunosuppressants (sirolimus-rapamycin, tacrolimus, and cyclosporine) on cellular respiration in target tissues (kidney, liver, and heart) from C57BL/6 mice. The mammalian target of rapamycin (mTOR), a serine/threonine kinase that supports nutrient-dependent cell growth and survival, is known to control energy conversion processes within the mitochondria. Consistently, inhibitors of mTOR (e.g., rapamycin, also known as sirolimus or Rapamune®) have been shown to impair mitochondrial function. Inhibitors of the calcium-dependent serine/threonine phosphatase calcineurin (e.g., tacrolimus and cyclosporine), on the other hand, strictly prevent lymphokine production leading to a reduced T-cell function. Sirolimus (10 μM) inhibited renal (22%, P=0.002), hepatic (39%, P<0.001), and cardiac (42%, P=0.005) cellular respiration. Tacrolimus and cyclosporine had no or minimum effects on cellular respiration in these tissues. Thus, these results clearly demonstrate that impaired cellular respiration (bioenergetics) is a sensitive biomarker of the immunosuppressants that target mTOR.

  7. [Discriminating power of socio-demographic and psychological variables on addictive use of cellular phones among middle school students].

    PubMed

    Lee, Haejung; Kim, Myoung Soo; Son, Hyun Kyung; Ahn, Sukhee; Kim, Jung Soon; Kim, Young Hae

    2007-10-01

The purpose of this study was to examine the degrees of cellular phone usage among middle school students and to identify discriminating factors of addictive use of cellular phones among sociodemographic and psychological variables. From 123 middle schools in Busan, potential participants were identified through stratified random sampling, and 747 middle school students participated in the study. The data were collected from December 1, 2004 to December 30, 2004. Descriptive and discriminant analyses were used. Fifty-seven percent of the participants were male and 89.7% used cellular phones at school. The participants were grouped into three groups depending on the levels of cellular phone usage: addicted (n=117), dependent (n=418), and non-addicted (n=212). Within the three groups, two functions were produced and only one function was significant, discriminating the addiction group from the non-addiction group. Additional discriminant analysis with only two groups produced one function that classified 81.2% of the participants correctly into the two groups. Impulsiveness, anxiety, and stress were significant discriminating factors. Based on the findings of this study, developing intervention programs focusing on impulsiveness, anxiety, and stress to reduce the possible addictive use of cellular phones is suggested.

  8. Organ S values and effective doses for family members exposed to adult patients following I-131 treatment: A Monte Carlo simulation study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Han, Eun Young; Lee, Choonsik; Mcguire, Lynn

Purpose: To calculate organ S values (mGy/Bq-s) and effective doses per time-integrated activity (mSv/Bq-s) for pediatric and adult family members exposed to an adult male or female patient treated with I-131 using a series of hybrid computational phantoms coupled with a Monte Carlo radiation transport technique. Methods: A series of pediatric and adult hybrid computational phantoms were employed in the study. Three different exposure scenarios were considered: (1) standing face-to-face exposures between an adult patient and pediatric or adult family phantoms at five different separation distances; (2) an adult female patient holding her newborn child; and (3) a 1-yr-old child standing on the lap of an adult female patient. For the adult patient model, two different thyroid-related diseases were considered: hyperthyroidism and differentiated thyroid cancer (DTC), with corresponding internal distributions of I-131. A general-purpose Monte Carlo code, MCNPX v2.7, was used to perform the Monte Carlo radiation transport. Results: The S values show a strong dependency on age and organ location within the family phantoms at short distances. The S values and effective dose per time-integrated activity from the adult female patient phantom are relatively high at shorter distances and to younger family phantoms. At a distance of 1 m, effective doses per time-integrated activity are lower than those values based on the NRC (Nuclear Regulatory Commission) by a factor of 2 for both adult male and female patient phantoms. The S values to target organs from the hyperthyroid-patient source distribution strongly depend on the height of the exposed family phantom, so that their values rapidly decrease with decreasing height of the family phantom. Active marrow of the 10-yr-old phantom shows the highest S values among family phantoms for the DTC-patient source distribution.
In the exposure scenario of mother and baby, S values and effective doses per time-integrated activity to the newborn and 1-yr-old phantoms for a hyperthyroid-patient source are higher than the values for a DTC-patient source. Conclusions: The authors performed realistic assessments of I-131 organ S values and effective dose per time-integrated activity from adult patients treated for hyperthyroidism and DTC to family members. In addition, the authors’ study considers Monte Carlo simulated “mother and baby/child” exposure scenarios for the first time. Based on these results, the authors reconfirm the strong conservatism underlying the point-source method recommended by the US NRC. The authors recommend that various factors such as the type of the patient's disease, the age of family members, and the distance/posture between the patient and family members be carefully considered to provide realistic dose estimates for patient-to-family exposures.

  9. Driver hand-held cellular phone use: a four-year analysis.

    PubMed

    Eby, David W; Vivoda, Jonathon M; St Louis, Renée M

    2006-01-01

    The use of hand-held cellular (mobile) phones while driving has stirred more debate, passion, and research than perhaps any other traffic safety issue in the past several years. There is ample research showing that the use of either hand-held or hands-free cellular phones can lead to unsafe driving patterns. Whether or not these performance deficits increase the risk of crash is difficult to establish, but recent studies are beginning to suggest that cellular phone use elevates crash risk. The purpose of this study was to assess changes in the rate of hand-held cellular phone use by motor-vehicle drivers on a statewide level in Michigan. This study presents the results of 13 statewide surveys of cellular phone use over a 4-year period. Hand-held cellular phone use data were collected through direct observation while vehicles were stopped at intersections and freeway exit ramps. Data were weighted to be representative of all drivers traveling during daylight hours in Michigan. The study found that driver hand-held cellular phone use has more than doubled between 2001 and 2005, from 2.7% to 5.8%. This change represents an average increase of 0.78 percentage points per year. The 5.8% use rate observed in 2005 means that at any given daylight hour, around 36,550 drivers were conversing on cellular phones while driving on Michigan roadways. The trend line fitted to these data predicts that by the year 2010, driver hand-held cellular phone use will be around 8.6%, or 55,000 drivers at any given daylight hour. These results make it clear that cellular phone use while driving will continue to be an important traffic safety issue, and highlight the importance of continued attempts to generate new ways of alleviating this potential hazard.
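The reported average annual increase can be checked directly from the endpoint figures quoted in the abstract. This is a rough endpoint calculation, not the authors' trend fit over all 13 surveys:

```python
# Endpoint check of the reported trend (values from the abstract).
use_2001, use_2005 = 2.7, 5.8                       # percent of drivers
avg_increase = (use_2005 - use_2001) / (2005 - 2001)
print(avg_increase)  # ~0.78 percentage points per year, as reported

# A naive endpoint extrapolation lands above the paper's fitted 8.6%
# for 2010, which is why the authors fit all 13 surveys instead.
naive_2010 = use_2005 + avg_increase * (2010 - 2005)
print(naive_2010)    # ~9.7 percent
```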

  10. [Medical Applications of the PHITS Code I: Recent Improvements and Biological Dose Estimation Model].

    PubMed

    Sato, Tatsuhiko; Furuta, Takuya; Hashimoto, Shintaro; Kuga, Naoya

    2015-01-01

    PHITS is a general purpose Monte Carlo particle transport simulation code developed through the collaboration of several institutes mainly in Japan. It can analyze the motion of nearly all radiations over wide energy ranges in 3-dimensional matters. It has been used for various applications including medical physics. This paper reviews the recent improvements of the code, together with the biological dose estimation method developed on the basis of the microdosimetric function implemented in PHITS.

  11. Organization and use of a Software/Hardware Avionics Research Program (SHARP)

    NASA Technical Reports Server (NTRS)

    Karmarkar, J. S.; Kareemi, M. N.

    1975-01-01

    The organization and use is described of the software/hardware avionics research program (SHARP) developed to duplicate the automatic portion of the STOLAND simulator system, on a general-purpose computer system (i.e., IBM 360). The program's uses are: (1) to conduct comparative evaluation studies of current and proposed airborne and ground system concepts via single run or Monte Carlo simulation techniques, and (2) to provide a software tool for efficient algorithm evaluation and development for the STOLAND avionics computer.

  12. Monte Carlo Determination of Gamma Ray Exposure from a Homogeneous Ground Plane

    DTIC Science & Technology

    1990-03-01

Thesis presented to the Faculty of the School of Engineering of the Air Force Institute of Technology, Air University. Cross sections come from a standard ANISN-format library called FEWG1-85, a state-of-the-art cross-section library developed for the Defense Nuclear Agency, containing 37 neutron energy groups and 21 gamma-ray energy groups.

  13. CHARYBDIS: a black hole event generator

    NASA Astrophysics Data System (ADS)

    Harris, Christopher M.; Richardson, Peter; Webber, Bryan R.

    2003-08-01

    CHARYBDIS is an event generator which simulates the production and decay of miniature black holes at hadronic colliders as might be possible in certain extra dimension models. It interfaces via the Les Houches accord to general purpose Monte Carlo programs like HERWIG and PYTHIA which then perform the parton evolution and hadronization. The event generator includes the extra-dimensional `grey-body' effects as well as the change in the temperature of the black hole as the decay progresses. Various options for modelling the Planck-scale terminal decay are provided.

  14. Gauge-independent decoherence models for solids in external fields

    NASA Astrophysics Data System (ADS)

    Wismer, Michael S.; Yakovlev, Vladislav S.

    2018-04-01

    We demonstrate gauge-invariant modeling of an open system of electrons in a periodic potential interacting with an optical field. For this purpose, we adapt the covariant derivative to the case of mixed states and put forward a decoherence model that has simple analytical forms in the length and velocity gauges. We demonstrate our methods by calculating harmonic spectra in the strong-field regime and numerically verifying the equivalence of the deterministic master equation to the stochastic Monte Carlo wave-function method.

  15. Catch-slip bonds can be dispensable for motor force regulation during skeletal muscle contraction

    NASA Astrophysics Data System (ADS)

    Dong, Chenling; Chen, Bin

    2015-07-01

    It is intriguing how multiple molecular motors can perform coordinated and synchronous functions, which is essential in various cellular processes. Recent studies on skeletal muscle might have shed light on this issue, where rather precise motor force regulation was partly attributed to the specific stochastic features of a single attached myosin motor. Though attached motors can randomly detach from actin filaments either through an adenosine triphosphate (ATP) hydrolysis cycle or through "catch-slip bond" breaking, their respective contribution in motor force regulation has not been clarified. Here, through simulating a mechanical model of sarcomere with a coupled Monte Carlo method and finite element method, we find that the stochastic features of an ATP hydrolysis cycle can be sufficient while those of catch-slip bonds can be dispensable for motor force regulation.

  16. Monte Carlo simulation and film dosimetry for electron therapy in vicinity of a titanium mesh

    PubMed Central

    Rostampour, Masoumeh; Roayaei, Mahnaz

    2014-01-01

Titanium (Ti) mesh plates are used as a bone replacement in brain tumor surgeries. In the case of radiotherapy, these plates might interfere with the beam path. The purpose of this study is to evaluate the effect of titanium mesh on the dose distribution of electron fields. Simulations were performed using the Monte Carlo BEAMnrc and DOSXYZnrc codes for 6 and 10 MeV electron beams. In the Monte Carlo simulation, the shape of the titanium mesh was modeled as the one used in head and neck surgery, with a thickness of 0.055 cm. First, by simulation, the percentage depth dose was obtained with the titanium mesh present, and these values were then compared with the depth dose of a homogeneous phantom with no titanium mesh. In the experimental measurements, the values of depth dose with and without titanium mesh at various depths were measured. The experiments were performed using an RW3 phantom with GAFCHROMIC EBT2 film. The results of the experimental measurements were compared with the depth dose values obtained by simulation. In both the Monte Carlo simulation and the experimental measurements, the change in dose for the voxels immediately beyond the titanium mesh was evaluated. For this purpose, the ratio of the dose for the case with titanium to the case without titanium was calculated as a function of titanium depth. For the voxels before the titanium mesh there was always an increase of the dose, up to 13%, with respect to the same voxel with no titanium mesh, because of the increased backscattering from the titanium mesh. The results also showed that for the voxel right beyond the titanium mesh, there is an increased or decreased dose to soft tissues, depending on the depth of the titanium mesh. For the regions before the depth of maximum dose, there is an increase of the dose of up to 10% compared to the dose at the same depth in a homogeneous phantom.
Beyond the depth of maximum dose, there was a 16% decrease in dose. For both 6 and 10 MeV, before the titanium mesh, there was always an increase in dose. If the titanium mesh is placed in the buildup region, it causes an increase of the dose and could lead to overdose of the adjacent tissue, whereas if the titanium mesh is placed beyond the buildup region, it leads to a decrease in dose compared to homogeneous tissue. PACS number: 87.53.Bn PMID:25207397

  17. Estimating peak skin and eye lens dose from neuroperfusion examinations: Use of Monte Carlo based simulations and comparisons to CTDIvol, AAPM Report No. 111, and ImPACT dosimetry tool values

    PubMed Central

    Zhang, Di; Cagnon, Chris H.; Villablanca, J. Pablo; McCollough, Cynthia H.; Cody, Dianna D.; Zankl, Maria; Demarco, John J.; McNitt-Gray, Michael F.

    2013-01-01

    Purpose: CT neuroperfusion examinations are capable of delivering high radiation dose to the skin or lens of the eyes of a patient and can possibly cause deterministic radiation injury. The purpose of this study is to: (a) estimate peak skin dose and eye lens dose from CT neuroperfusion examinations based on several voxelized adult patient models of different head size and (b) investigate how well those doses can be approximated by some commonly used CT dose metrics or tools, such as CTDIvol, American Association of Physicists in Medicine (AAPM) Report No. 111 style peak dose measurements, and the ImPACT organ dose calculator spreadsheet. Methods: Monte Carlo simulation methods were used to estimate peak skin and eye lens dose on voxelized patient models, including GSF's Irene, Frank, Donna, and Golem, on four scanners from the major manufacturers at the widest collimation under all available tube potentials. Doses were reported on a per 100 mAs basis. CTDIvol measurements for a 16 cm CTDI phantom, AAPM Report No. 111 style peak dose measurements, and ImPACT calculations were performed for available scanners at all tube potentials. These were then compared with results from Monte Carlo simulations. Results: The dose variations across the different voxelized patient models were small. Dependent on the tube potential and scanner and patient model, CTDIvol values overestimated peak skin dose by 26%–65%, and overestimated eye lens dose by 33%–106%, when compared to Monte Carlo simulations. AAPM Report No. 111 style measurements were much closer to peak skin estimates ranging from a 14% underestimate to a 33% overestimate, and with eye lens dose estimates ranging from a 9% underestimate to a 66% overestimate. The ImPACT spreadsheet overestimated eye lens dose by 2%–82% relative to voxelized model simulations. Conclusions: CTDIvol consistently overestimates dose to eye lens and skin. The ImPACT tool also overestimated dose to eye lenses. 
As such they are still useful as a conservative predictor of dose for CT neuroperfusion studies. AAPM Report No. 111 style measurements are a better predictor of both peak skin and eye lens dose than CTDIvol and ImPACT for the patient models used in this study. It should be remembered that both the AAPM Report No. 111 peak dose metric and CTDIvol dose metric are dose indices and were not intended to represent actual organ doses. PMID:24007152

  18. From Stochastic Foam to Designed Structure: Balancing Cost and Performance of Cellular Metals

    PubMed Central

    Lehmhus, Dirk; Vesenjak, Matej

    2017-01-01

    Over the past two decades, a large number of metallic foams have been developed. In recent years research on this multi-functional material class has further intensified. However, despite their unique properties only a limited number of large-scale applications have emerged. One important reason for this sluggish uptake is their high cost. Many cellular metals require expensive raw materials, complex manufacturing procedures, or a combination thereof. Some attempts have been made to decrease costs by introducing novel foams based on cheaper components and new manufacturing procedures. However, this has often yielded materials with unreliable properties that inhibit utilization of their full potential. The resulting balance between cost and performance of cellular metals is probed in this editorial, which attempts to consider cost not in absolute figures, but in relation to performance. To approach such a distinction, an alternative classification of cellular metals is suggested which centers on structural aspects and the effort of realizing them. The range thus covered extends from fully stochastic foams to cellular structures designed-to-purpose. PMID:28786935

  19. Monte Carlo modeling of the MammoSite(Reg) treatments: Dose effects of air pockets

    NASA Astrophysics Data System (ADS)

    Huang, Yu-Huei Jessica

In the treatment of early-stage breast cancer, MammoSite® has been used as one of the partial breast irradiation techniques after breast-conserving surgery. The MammoSite® applicator is a single catheter with an inflatable balloon at its distal end that can be placed in the resected cavity (tumor bed). The treatment is performed by delivering the Ir-192 high-dose-rate source through the center lumen of the catheter by a remote afterloader while the balloon is inflated in the tumor bed cavity. In MammoSite® treatment, it has been found that air pockets occasionally exist and can be seen and measured in CT images. Experience has shown that about 90% of the patients have air pockets when imaged two days after the balloon placement. The criterion for the air pocket is a volume less than or equal to 10% of the planning target volume. The purpose of this study is to quantify dose errors occurring at the interface of the air pocket in MammoSite® treatments with Monte Carlo calculations, so that the dosimetric effects from the air pocket can be fully understood. Modern brachytherapy treatment planning systems typically consider patient anatomy as a homogeneous water medium, and incorrectly model lateral and backscatter radiation during treatment delivery. Heterogeneities complicate the problem and may result in overdosage to the tissue located near the medium interface. This becomes a problem in MammoSite® brachytherapy when an air pocket appears during the treatment. The resulting percentage dose difference near the air-tissue interface is hypothesized to be greater than 10% when comparing Monte Carlo N-Particle (version 5) with current treatment planning systems. The specific aims for this study are: (1) Validate Monte Carlo N-Particle (Version 5) source modeling. (2) Develop a phantom.
(3) Calculate phantom doses with Monte Carlo N-Particle (Version 5) and investigate dose differences between thermoluminescent dosimeter measurements, the treatment planning system, and Monte Carlo results. (4) Calculate dose differences for various treatment parameters. The results from the thermoluminescent dosimeter phantom measurements prove that with correct geometric and source models, the Monte Carlo method can be used to estimate homogeneous and heterogeneous doses in MammoSite® treatment. The resulting dose differences at various points of interest in the Monte Carlo calculations were presented and compared between different calculation methods. The air pocket doses were found to be underestimated by the treatment planning system. It was concluded that after correcting for the inverse square law, the underestimation error from the treatment planning system will be less than ±2.0% and ±3.5% at the air pocket surface and air pocket planning target volume, respectively, when compared with Monte Carlo N-Particle (version 5) results. If the skin surface is located close to the air pocket, the underestimation effect at the air pocket surface and air pocket planning target volume doses becomes less, because the air outside of the skin surface reduces the air pocket inhomogeneity effect. In order to maintain appropriate skin dose within tolerance, the skin surface criterion should be considered as the smallest thickness of breast tissue located between the air pocket and the skin surface; this thickness should be at least 5 mm. In conclusion, the air pocket outside the balloon had less than a 10% inhomogeneity effect in the situations studied. It is recommended that at least an inverse square correction be taken into consideration in order to relate clinical outcomes to actual delivered doses to the air pocket and surrounding tissues.
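The inverse-square correction invoked in the conclusion can be sketched as a point-source approximation. The function name and distances are illustrative assumptions, not the treatment planning system's algorithm:

```python
def inverse_square_correction(dose, r_planned_cm, r_actual_cm):
    """Rescale a point dose when the source-to-point distance changes,
    assuming a point source (illustrative sketch only)."""
    return dose * (r_planned_cm / r_actual_cm) ** 2

# A point planned at 1.0 cm from the source that actually sits at
# 2.0 cm receives one quarter of the planned dose:
print(inverse_square_correction(100.0, 1.0, 2.0))  # 25.0
```

In the study, applying this geometric correction before comparing planning-system and Monte Carlo doses is what reduces the residual underestimation to within ±2.0% and ±3.5% at the two locations.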

  20. Teen driver cell phone blocker.

    DOT National Transportation Integrated Search

    2012-02-01

This study was a randomized control intervention to measure the effectiveness of a cellular phone control device that communicates with the vehicles of teen drivers to deny them access to their phone while driving, for the purpose of reducing dist...

  1. Calculation of out-of-field dose distribution in carbon-ion radiotherapy by Monte Carlo simulation.

    PubMed

    Yonai, Shunsuke; Matsufuji, Naruhiro; Namba, Masao

    2012-08-01

Recent radiotherapy technologies, including carbon-ion radiotherapy, can improve the dose concentration in the target volume, thereby reducing not only side effects in organs at risk but also the secondary cancer risk within or near the irradiation field. However, secondary cancer risk in the low-dose region is considered to be non-negligible, especially for younger patients. To achieve a dose estimation of the whole body of each patient receiving carbon-ion radiotherapy, which is essential for risk assessment and epidemiological studies, Monte Carlo simulation plays an important role because the treatment planning system can provide dose distribution only in/near the irradiation field and the measured data are limited. However, validation of Monte Carlo simulations is necessary. The primary purpose of this study was to establish a calculation method using the Monte Carlo code to estimate the dose and quality factor in the body and to validate the proposed method by comparison with experimental data. Furthermore, we show the distributions of dose equivalent in a phantom and identify the partial contribution of each radiation type. We proposed a calculation method based on a Monte Carlo simulation using the PHITS code to estimate absorbed dose, dose equivalent, and dose-averaged quality factor by using the Q(L)-L relationship based on the ICRP 60 recommendation. The values obtained by this method in modeling the passive beam line at the Heavy-Ion Medical Accelerator in Chiba were compared with our previously measured data. It was shown that our calculation model can estimate the measured values within a factor of 2, which includes not only the uncertainty of this calculation method but also uncertainties in the assumptions of the geometrical modeling and in the PHITS code. We also showed the differences in the doses and the partial contributions of each radiation type between passive and active carbon-ion beams using this calculation method.
These results indicated that it is essentially important to include the dose by secondary neutrons in the assessment of the secondary cancer risk of patients receiving carbon-ion radiotherapy with active as well as passive beams. We established a calculation method with a Monte Carlo simulation to estimate the distribution of dose equivalent in the body as a first step toward routine risk assessment and an epidemiological study of carbon-ion radiotherapy at NIRS. This method has the advantage of being verifiable by the measurement.
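The Q(L)-L relationship mentioned above is the piecewise quality factor of the ICRP 60 recommendation, which can be written out directly. This is a sketch of the weighting step only, not the PHITS implementation, and the `dose_equivalent` helper is an illustrative assumption:

```python
import math

def q_factor_icrp60(L):
    """Quality factor Q(L) as a function of unrestricted LET L
    (keV/um), per the ICRP 60 recommendation cited above."""
    if L < 10.0:
        return 1.0
    if L <= 100.0:
        return 0.32 * L - 2.2
    return 300.0 / math.sqrt(L)

def dose_equivalent(contributions):
    """Sum absorbed-dose contributions weighted by Q(L):
    H = sum_i D_i * Q(L_i). Illustrative helper, not from PHITS."""
    return sum(d * q_factor_icrp60(L) for d, L in contributions)

print(q_factor_icrp60(5.0))    # 1.0 (low-LET region)
print(q_factor_icrp60(100.0))  # 0.32*100 - 2.2 = 29.8
```

Dividing such a dose equivalent by the total absorbed dose gives the dose-averaged quality factor the abstract refers to.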

  2. An examination of land use impacts of flooding induced by sea level rise

    NASA Astrophysics Data System (ADS)

    Song, Jie; Fu, Xinyu; Gu, Yue; Deng, Yujun; Peng, Zhong-Ren

    2017-03-01

    Coastal regions are increasingly vulnerable to coastal hazards associated with sea level rise. The purpose of this paper is therefore to simulate prospective urban exposure to changing sea levels. The article first applied the cellular-automaton-based SLEUTH model (Project Gigalopolis, 2016) to calibrate historical urban dynamics in Bay County, Florida (USA), a region greatly threatened by rising sea levels. Five urban growth parameters were estimated by multiple-calibration procedures that used different Monte Carlo iterations to account for modeling uncertainties. The calibrated model was then employed to predict three scenarios of urban growth up to 2080: historical trend, urban sprawl, and compact development. We also assessed the land use impacts of four policies: no regulations; flood mitigation plans based on the whole study region and on those areas prone to growth; and the protection of conservation lands. This study lastly overlaid projected urban areas in 2030 and 2080 with 500-year flooding maps developed under 0, 0.2, and 0.9 m of sea level rise. The calibration results show that a substantial number of built-up regions extend from established coastal settlements. The predictions suggest that, if urbanization proceeds with few policy interventions, the total flooded area of newly urbanized regions in 2080 would be more than 25 times that under the flood mitigation policy. The joint model generates new knowledge at the intersection of land use modeling and sea level rise research. It contributes to coastal spatial planning by helping develop hazard mitigation schemes and can be employed in other communities worldwide that face the combined pressures of urban growth and climate change.
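    SLEUTH's calibrated growth rules are considerably richer, but the core idea above, averaging stochastic cellular-automaton growth runs over Monte Carlo iterations to obtain a per-cell urbanization probability, can be sketched with a toy model. The grid, the single spread probability, and all parameter values below are illustrative assumptions, not SLEUTH's:

```python
import random

def grow_once(urban, spread_prob, steps, rng):
    """One stochastic run: each urban cell seeds its 4-neighbours with
    probability spread_prob per step; cells never revert."""
    n = len(urban)
    cur = [row[:] for row in urban]
    for _ in range(steps):
        nxt = [row[:] for row in cur]
        for i in range(n):
            for j in range(n):
                if not cur[i][j]:
                    continue
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < n and 0 <= nj < n and rng.random() < spread_prob:
                        nxt[ni][nj] = True
        cur = nxt
    return cur

def urbanization_probability(urban, spread_prob, steps, n_mc, seed=0):
    """Average n_mc Monte Carlo runs into a per-cell urbanization probability map."""
    rng = random.Random(seed)
    n = len(urban)
    acc = [[0.0] * n for _ in range(n)]
    for _ in range(n_mc):
        result = grow_once(urban, spread_prob, steps, rng)
        for i in range(n):
            for j in range(n):
                acc[i][j] += result[i][j]
    return [[v / n_mc for v in row] for row in acc]

seed_map = [[False] * 9 for _ in range(9)]
seed_map[4][4] = True  # one established settlement in the centre
prob_map = urbanization_probability(seed_map, spread_prob=0.3, steps=3, n_mc=50)
```

    The resulting probability map is what one would overlay with flood maps to estimate exposure under a given scenario.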

  3. Building robust functionality in synthetic circuits using engineered feedback regulation.

    PubMed

    Chen, Susan; Harrigan, Patrick; Heineike, Benjamin; Stewart-Ornstein, Jacob; El-Samad, Hana

    2013-08-01

    The ability to engineer novel functionality within cells, to quantitatively control cellular circuits, and to manipulate the behaviors of populations, has many important applications in biotechnology and biomedicine. These applications are only beginning to be explored. In this review, we advocate the use of feedback control as an essential strategy for the engineering of robust homeostatic control of biological circuits and cellular populations. We also describe recent works where feedback control, implemented in silico or with biological components, was successfully employed for this purpose. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Radiation doses in volume-of-interest breast computed tomography—A Monte Carlo simulation study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lai, Chao-Jen, E-mail: cjlai3711@gmail.com; Zhong, Yuncheng; Yi, Ying

    2015-06-15

    Purpose: Cone beam breast computed tomography (breast CT) with true three-dimensional, nearly isotropic spatial resolution has been developed and investigated over the past decade to overcome the problem of lesions overlapping with breast anatomical structures on two-dimensional mammographic images. However, the ability of breast CT to detect small objects, such as tissue structure edges and small calcifications, is limited. To resolve this problem, the authors proposed and developed a volume-of-interest (VOI) breast CT technique to image a small VOI using a higher radiation dose to improve that region’s visibility. In this study, the authors performed Monte Carlo simulations to estimate average breast dose and average glandular dose (AGD) for the VOI breast CT technique. Methods: Monte Carlo codes based on the Electron–Gamma-Shower system code were used to simulate breast CT. The Monte Carlo codes were validated using physical measurements of air kerma ratios and point doses in phantoms with an ion chamber and optically stimulated luminescence dosimeters. The validated full cone x-ray source was then collimated to simulate half cone beam x-rays to image digital pendant-geometry, hemi-ellipsoidal, homogeneous breast phantoms and to estimate breast doses with full field scans. Hemi-ellipsoidal homogeneous phantoms 13 cm in diameter and 10 cm long were used to simulate median breasts. Breast compositions of 25% and 50% volumetric glandular fraction (VGF) were used to investigate their influence on breast dose. The simulated half cone beam x-rays were then collimated to a narrow x-ray beam with a 2.5 × 2.5 cm{sup 2} field of view at the isocenter plane to perform VOI field scans. The Monte Carlo results for the full field scans and the VOI field scans were then used to estimate the AGD for the VOI breast CT technique.
Results: The ratios of the air kerma ratios and dose measurements from the Monte Carlo simulation to those from the physical measurements were 0.97 ± 0.03 and 1.10 ± 0.13, respectively, indicating that the accuracy of the Monte Carlo simulation was adequate. Compared with full field scans, the normalized AGD with VOI field scans was substantially reduced: by a factor of about 2 over the VOI region and by a factor of 18 over the entire breast for both the 25% and 50% VGF simulated breasts. The normalized AGD for the VOI breast CT technique can be kept the same as or lower than that for a full field scan even with the exposure level for the VOI field scan increased by a factor of as much as 12. Conclusions: The authors’ Monte Carlo estimates of normalized AGDs for the VOI breast CT technique show that this technique can be used to markedly increase the dose to the VOI region, and thus that region’s visibility, without increasing the dose to the breast as a whole. The results of this investigation should be helpful for those interested in using the VOI breast CT technique to image small calcifications where dose is a concern.

  5. Measured and Monte Carlo calculated k{sub Q} factors: Accuracy and comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muir, B. R.; McEwen, M. R.; Rogers, D. W. O.

    2011-08-15

    Purpose: The journal Medical Physics recently published two papers that determine beam quality conversion factors, k{sub Q}, for large sets of ion chambers. In the first paper [McEwen Med. Phys. 37, 2179-2193 (2010)], k{sub Q} was determined experimentally, while the second paper [Muir and Rogers Med. Phys. 37, 5939-5950 (2010)] provides k{sub Q} factors calculated using Monte Carlo simulations. This work investigates a variety of additional consistency checks to verify the accuracy of the k{sub Q} factors determined in each publication, together with a comparison of the two data sets. The uncertainty introduced in calculated k{sub Q} factors by the possible variation of W/e with beam energy is investigated further. Methods: The validity of the experimental set of k{sub Q} factors relies on the accuracy of the NE2571 reference chamber measurements, to which the k{sub Q} factors for all other ion chambers are correlated. The stability of NE2571 absorbed dose to water calibration coefficients is determined and a comparison to other experimental k{sub Q} factors is analyzed. The reliability of Monte Carlo calculated k{sub Q} factors is assessed through comparison to other publications that provide Monte Carlo calculations of k{sub Q}, as well as through an analysis of the sleeve effect, the effect of cavity length, and self-consistency between graphite-walled Farmer chambers. The comparison between the two data sets is given in terms of the percent difference between the k{sub Q} factors presented in the two publications. Results: Monitoring of the absorbed dose calibration coefficients for the NE2571 chambers over a period of more than 15 years exhibits consistency at a level better than 0.1%. Agreement of the NE2571 k{sub Q} factors with a quadratic fit to all other experimental data from standards labs for the same chamber is observed within 0.3%. Monte Carlo calculated k{sub Q} factors are in good agreement with most other Monte Carlo calculated k{sub Q} factors.
Expected results are observed for the sleeve effect and the effect of cavity length on k{sub Q}. The mean percent differences between experimental and Monte Carlo calculated k{sub Q} factors are -0.08, -0.07, and -0.23% for the Elekta 6, 10, and 25 MV nominal beam energies, respectively. An upper limit on the variation of W/e in photon beams from cobalt-60 to 25 MV is determined as 0.4% with 95% confidence. The combined uncertainty on Monte Carlo calculated k{sub Q} factors is reassessed and amounts to between 0.40 and 0.49%, depending on the wall material of the chamber. Conclusions: Excellent agreement (a mean percent difference of only 0.13% for the entire data set) between experimental and calculated k{sub Q} factors is observed. For some chamber types, k{sub Q} was measured for only one chamber; the level of agreement observed in this study suggests that the measured k{sub Q} values are nonetheless generally representative of the chamber type.

  6. Radiation doses in volume-of-interest breast computed tomography—A Monte Carlo simulation study

    PubMed Central

    Lai, Chao-Jen; Zhong, Yuncheng; Yi, Ying; Wang, Tianpeng; Shaw, Chris C.

    2015-01-01

    Purpose: Cone beam breast computed tomography (breast CT) with true three-dimensional, nearly isotropic spatial resolution has been developed and investigated over the past decade to overcome the problem of lesions overlapping with breast anatomical structures on two-dimensional mammographic images. However, the ability of breast CT to detect small objects, such as tissue structure edges and small calcifications, is limited. To resolve this problem, the authors proposed and developed a volume-of-interest (VOI) breast CT technique to image a small VOI using a higher radiation dose to improve that region’s visibility. In this study, the authors performed Monte Carlo simulations to estimate average breast dose and average glandular dose (AGD) for the VOI breast CT technique. Methods: Monte Carlo codes based on the Electron–Gamma-Shower system code were used to simulate breast CT. The Monte Carlo codes were validated using physical measurements of air kerma ratios and point doses in phantoms with an ion chamber and optically stimulated luminescence dosimeters. The validated full cone x-ray source was then collimated to simulate half cone beam x-rays to image digital pendant-geometry, hemi-ellipsoidal, homogeneous breast phantoms and to estimate breast doses with full field scans. Hemi-ellipsoidal homogeneous phantoms 13 cm in diameter and 10 cm long were used to simulate median breasts. Breast compositions of 25% and 50% volumetric glandular fraction (VGF) were used to investigate their influence on breast dose. The simulated half cone beam x-rays were then collimated to a narrow x-ray beam with a 2.5 × 2.5 cm2 field of view at the isocenter plane to perform VOI field scans. The Monte Carlo results for the full field scans and the VOI field scans were then used to estimate the AGD for the VOI breast CT technique.
Results: The ratios of the air kerma ratios and dose measurements from the Monte Carlo simulation to those from the physical measurements were 0.97 ± 0.03 and 1.10 ± 0.13, respectively, indicating that the accuracy of the Monte Carlo simulation was adequate. Compared with full field scans, the normalized AGD with VOI field scans was substantially reduced: by a factor of about 2 over the VOI region and by a factor of 18 over the entire breast for both the 25% and 50% VGF simulated breasts. The normalized AGD for the VOI breast CT technique can be kept the same as or lower than that for a full field scan even with the exposure level for the VOI field scan increased by a factor of as much as 12. Conclusions: The authors’ Monte Carlo estimates of normalized AGDs for the VOI breast CT technique show that this technique can be used to markedly increase the dose to the VOI region, and thus that region’s visibility, without increasing the dose to the breast as a whole. The results of this investigation should be helpful for those interested in using the VOI breast CT technique to image small calcifications where dose is a concern. PMID:26127058

  7. A hybrid multiscale Monte Carlo algorithm (HyMSMC) to cope with disparity in time scales and species populations in intracellular networks.

    PubMed

    Samant, Asawari; Ogunnaike, Babatunde A; Vlachos, Dionisios G

    2007-05-24

    The fundamental role that intrinsic stochasticity plays in cellular functions has been shown via numerous computational and experimental studies. In the face of such evidence, it is important that intracellular networks be simulated with stochastic algorithms that can capture molecular fluctuations. However, separation of time scales and disparity in species populations, two common features of intracellular networks, make stochastic simulation of such networks computationally prohibitive. While recent work has addressed each of these challenges separately, a generic algorithm that can simultaneously tackle disparity in time scales and in population scales in stochastic systems has been lacking. In this paper, we propose the hybrid, multiscale Monte Carlo (HyMSMC) method to fill this void. The proposed HyMSMC method blends stochastic singular perturbation concepts, to deal with potential stiffness, with a hybrid of exact and coarse-grained stochastic algorithms, to cope with the separation in population sizes. In addition, we introduce the computational singular perturbation (CSP) method as a means of systematically partitioning fast and slow networks and computing relaxation times for convergence. We also propose a new criterion for the convergence of fast networks to stochastic low-dimensional manifolds, which further accelerates the algorithm. We use several prototype and biological examples, including a gene expression model displaying bistability, to demonstrate the efficiency, accuracy, and applicability of the HyMSMC method. Bistable models serve as stringent tests for the success of multiscale MC methods and illustrate the limitations of some literature methods.
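    The exact stochastic algorithm that hybrid schemes such as HyMSMC build on is Gillespie's SSA. A minimal sketch for a single-species birth-death process; the model and rate constants are illustrative, not taken from the paper:

```python
import math
import random

def ssa_birth_death(k_birth, k_death, x0, t_end, seed=0):
    """Gillespie's exact SSA for constant production (rate k_birth) and
    first-order degradation (rate k_death * x) of one species."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    trajectory = [(t, x)]
    while t < t_end:
        a1 = k_birth              # propensity of the birth reaction
        a2 = k_death * x          # propensity of the death reaction
        a0 = a1 + a2
        if a0 == 0.0:
            break
        # exponential waiting time to the next reaction
        t += -math.log(1.0 - rng.random()) / a0
        # pick which reaction fired, proportionally to its propensity
        if rng.random() * a0 < a1:
            x += 1
        else:
            x -= 1
        trajectory.append((t, x))
    return trajectory

traj = ssa_birth_death(k_birth=10.0, k_death=1.0, x0=0, t_end=50.0)
```

    A coarse-grained or stiff-network treatment would replace the fast reactions in this loop; the exact kernel above is the accuracy reference.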

  8. The “Electrostatic-Switch” Mechanism: Monte Carlo Study of MARCKS-Membrane Interaction

    PubMed Central

    Tzlil, Shelly; Murray, Diana; Ben-Shaul, Avinoam

    2008-01-01

    The binding of the myristoylated alanine-rich C kinase substrate (MARCKS) to mixed, fluid, phospholipid membranes is modeled with a recently developed Monte Carlo simulation scheme. The central domain of MARCKS is both basic (ζ = +13) and hydrophobic (five Phe residues), and is flanked with two long chains, one ending with the myristoylated N-terminus. This natively unfolded protein is modeled as a flexible chain of “beads” representing the amino acid residues. The membranes contain neutral (ζ = 0), monovalent (ζ = −1), and tetravalent (ζ = −4) lipids, all of which are laterally mobile. MARCKS-membrane interaction is modeled by Debye-Hückel electrostatic potentials and semiempirical hydrophobic energies. In agreement with experiment, we find that membrane binding is mediated by electrostatic attraction of the basic domain to acidic lipids and membrane penetration of its hydrophobic moieties. The binding is opposed by configurational entropy losses and electrostatic membrane repulsion of the two long chains, and by lipid demixing upon adsorption. The simulations provide a physical model for how membrane-adsorbed MARCKS attracts several PIP2 lipids (ζ = −4) to its vicinity, and how phosphorylation of the central domain (ζ = +13 to ζ = +7) triggers an “electrostatic switch”, which weakens both the membrane interaction and PIP2 sequestration. This scheme captures the essence of “discreteness of charge” at membrane surfaces and can examine the formation of membrane-mediated multicomponent macromolecular complexes that function in many cellular processes. PMID:18502797
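    The two modeling ingredients named above, a screened Debye-Hückel pair potential and Monte Carlo moves accepted by a Metropolis test, can be sketched as follows. Charges are in units of e, lengths in nm, energies in kT; the Bjerrum and Debye lengths and all other values are illustrative assumptions, not the paper's parameters:

```python
import math
import random

def debye_huckel_energy(q1, q2, r, l_bjerrum=0.7, l_debye=1.0):
    """Screened electrostatic pair energy (in kT) between charges q1 and q2
    at separation r: U = l_B * q1 * q2 * exp(-r / l_D) / r."""
    return l_bjerrum * q1 * q2 * math.exp(-r / l_debye) / r

def metropolis_accept(delta_u, rng):
    """Accept a trial move with probability min(1, exp(-delta_U / kT))."""
    return delta_u <= 0.0 or rng.random() < math.exp(-delta_u)

rng = random.Random(1)
# A basic residue (+1) near an acidic lipid (-1): attractive (negative) energy.
u = debye_huckel_energy(+1, -1, r=0.5)
```

    In a full simulation, trial moves displace protein beads and laterally mobile lipids, and delta_u sums the screened pair energies plus the hydrophobic terms.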

  9. Evaluating 99mTc Auger electrons for targeted tumor radiotherapy by computational methods.

    PubMed

    Tavares, Adriana Alexandre S; Tavares, João Manuel R S

    2010-07-01

    Technetium-99m (99mTc) has been widely used as an imaging agent but only recently has been considered for therapeutic applications. This study aims to analyze the potential use of 99mTc Auger electrons for targeted tumor radiotherapy by evaluating the DNA damage and its probability of correct repair, and by studying the cellular kinetics following 99mTc Auger electron irradiation, in comparison to iodine-131 (131I) beta minus particle and astatine-211 (211At) alpha particle irradiation. Computational models were used to estimate the yield of DNA damage (fast Monte Carlo damage algorithm), the probability of correct repair (Monte Carlo excision repair algorithm), and cell kinetic effects (virtual cell radiobiology algorithm) after irradiation with the selected particles. The results obtained with these algorithms suggest that 99mTc CKMMX electrons (all M-shell Coster-Kronig (CK) and super-CK transitions) and Auger MXY electrons (all M-shell Auger transitions) have a therapeutic potential comparable to that of high linear energy transfer 211At alpha particles and higher than that of 131I beta minus particles. All the other 99mTc electrons had a therapeutic potential similar to 131I beta minus particles. 99mTc CKMMX electrons and Auger MXY electrons also presented a higher probability of inducing apoptosis than 131I beta minus particles and a probability similar to that of 211At alpha particles. Based on these results, 99mTc CKMMX and Auger MXY electrons are suitable candidates for targeted tumor radiotherapy.

  10. Microbiology Meets Archaeology: Soil Microbial Communities Reveal Different Human Activities at Archaic Monte Iato (Sixth Century BC).

    PubMed

    Margesin, Rosa; Siles, José A; Cajthaml, Tomas; Öhlinger, Birgit; Kistler, Erich

    2017-05-01

    Microbial ecology has been recognized as useful in archaeological studies. At Archaic Monte Iato in Western Sicily, a native (indigenous) building was discovered. The objective of this study was the first examination of soil microbial communities related to this building. Soil samples were collected from archaeological layers at a ritual deposit (food waste disposal) in the main room and above the fireplace in the annex. Microbial soil characterization included abundance (cellular phospholipid fatty acids (PLFA), viable bacterial counts), activity (physiological profiles, enzyme activities of viable bacteria), diversity, and community structure (bacterial and fungal Illumina amplicon sequencing, identification of viable bacteria). PLFA-derived microbial abundance was lower in soils from the fireplace than in soils from the deposit; the opposite was observed with culturable bacteria. Microbial communities in soils from the fireplace had a higher ability to metabolize carboxylic and acetic acids, while those in soils from the deposit metabolized preferentially carbohydrates. The lower deposit layer was characterized by higher total microbial and bacterial abundance and bacterial richness and by a different carbohydrate metabolization profile compared to the upper deposit layer. Microbial community structures in the fireplace were similar and could be distinguished from those in the two deposit layers, which had different microbial communities. Our data confirmed our hypothesis that human consumption habits left traces on microbiota in the archaeological evidence; therefore, microbiological residues as part of the so-called ecofacts are, like artifacts, key indicators of consumer behavior in the past.

  11. A trans-dimensional Bayesian Markov chain Monte Carlo algorithm for model assessment using frequency-domain electromagnetic data

    USGS Publications Warehouse

    Minsley, Burke J.

    2011-01-01

    A meaningful interpretation of geophysical measurements requires an assessment of the space of models that are consistent with the data, rather than just a single ‘best’ model, which does not convey information about parameter uncertainty. For this purpose, a trans-dimensional Bayesian Markov chain Monte Carlo (MCMC) algorithm is developed for assessing frequency-domain electromagnetic (FDEM) data acquired from airborne or ground-based systems. By sampling the distribution of models that are consistent with measured data and any prior knowledge, valuable inferences can be made about parameter values, such as the likely depth to an interface, the distribution of possible resistivity values as a function of depth, and non-unique relationships between parameters. The trans-dimensional aspect of the algorithm allows the number of layers to be a free parameter controlled by the data, where models with fewer layers are inherently favoured, which provides a natural measure of parsimony and a significant degree of flexibility in parametrization. The MCMC algorithm is used with synthetic examples to illustrate how the distribution of acceptable models is affected by the choice of prior information, the system geometry and configuration, and the uncertainty in the measured system elevation. An airborne FDEM data set that was acquired for the purpose of hydrogeological characterization is also studied. The results compare favourably with traditional least-squares analysis, borehole resistivity and lithology logs from the site, and also provide new information about parameter uncertainty that is necessary for model assessment.
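    The trans-dimensional bookkeeping (birth and death of layers) is beyond a short sketch, but the core Metropolis step that samples the distribution of models consistent with the data can be illustrated for a single fixed parameter. The forward model, observations, and noise level below are invented for illustration:

```python
import math
import random

def log_likelihood(param, data, sigma, forward):
    """Gaussian misfit between forward-model predictions and observations."""
    return -0.5 * sum(((forward(param) - d) / sigma) ** 2 for d in data)

def metropolis_sample(data, sigma, forward, n_steps=5000, step=0.5, seed=0):
    """Fixed-dimension Metropolis sampler; returns the chain of states."""
    rng = random.Random(seed)
    x = 0.0
    ll = log_likelihood(x, data, sigma, forward)
    samples = []
    for _ in range(n_steps):
        x_new = x + rng.gauss(0.0, step)
        ll_new = log_likelihood(x_new, data, sigma, forward)
        # accept with probability min(1, exp(ll_new - ll))
        if ll_new >= ll or rng.random() < math.exp(ll_new - ll):
            x, ll = x_new, ll_new
        samples.append(x)
    return samples

# Hypothetical data: noisy observations of a single model parameter.
obs = [2.1, 1.9, 2.05, 1.95]
chain = metropolis_sample(obs, sigma=0.1, forward=lambda p: p)
```

    The histogram of the chain (after discarding burn-in) approximates the posterior; the trans-dimensional version additionally proposes adding or removing a layer with the appropriate acceptance ratio.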

  12. The detective quantum efficiency of photon-counting x-ray detectors using cascaded-systems analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tanguay, Jesse; Yun, Seungman; School of Mechanical Engineering, Pusan National University, Jangjeon-dong, Geumjeong-gu, Busan 609-735

    Purpose: Single-photon counting (SPC) x-ray imaging has the potential to improve image quality and enable new advanced energy-dependent methods. The purpose of this study is to extend cascaded-systems analyses (CSA) to the description of image quality and the detective quantum efficiency (DQE) of SPC systems. Methods: Point-process theory is used to develop a method of propagating the mean signal and the Wiener noise-power spectrum through a thresholding stage (required to identify x-ray interaction events). The new transfer relationships are used to describe the zero-frequency DQE of a hypothetical SPC detector, including the effects of stochastic conversion of incident photons to secondary quanta, secondary quantum sinks, additive noise, and threshold level. Theoretical results are compared with Monte Carlo calculations assuming the same detector model. Results: Under certain conditions, the CSA approach can be applied to SPC systems, with the additional requirement of propagating the probability density function describing the total number of image-forming quanta through each stage of the cascaded model. Theoretical results, including the DQE, show excellent agreement with Monte Carlo calculations under all conditions considered. Conclusions: Application of the CSA method shows that false counts due to additive electronic noise result in both a nonlinear image signal and increased image noise. There is a window of allowable threshold values that achieves a high DQE, which depends on the conversion gain, secondary quantum sinks, and additive noise.
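    A zero-frequency DQE of the kind analyzed above can also be estimated numerically for a toy SPC detector model: Poisson photons, a Gaussian conversion gain per photon, additive electronic noise, and a counting threshold. All parameter values are illustrative assumptions, not taken from the paper:

```python
import math
import random

def poisson(lam, rng):
    """Knuth's method for a Poisson variate (adequate for modest lam)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def dqe_zero_freq(q, gain, gain_sd, noise_sd, threshold, n_pix=100,
                  n_frames=2000, seed=0):
    """Zero-frequency DQE of a toy photon-counting detector:
    DQE(0) = SNR_out^2 / SNR_in^2, with SNR_in^2 = q for Poisson input."""
    rng = random.Random(seed)
    counts = []
    for _ in range(n_frames):
        c = 0
        # true events: each photon yields a Gaussian number of secondary quanta
        for _ in range(poisson(q, rng)):
            if rng.gauss(gain, gain_sd) > threshold:
                c += 1
        # false events: additive electronic noise crossing the threshold
        for _ in range(n_pix):
            if rng.gauss(0.0, noise_sd) > threshold:
                c += 1
        counts.append(c)
    mean = sum(counts) / n_frames
    var = sum((c - mean) ** 2 for c in counts) / (n_frames - 1)
    return mean * mean / (var * q)

dqe = dqe_zero_freq(q=50, gain=1000, gain_sd=200, noise_sd=100, threshold=500)
```

    Raising the threshold well into the gain distribution discards true events and drives the DQE down, illustrating the threshold window described in the abstract.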

  13. Users guide to E859 phoswich analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Costales, J.B.

    1992-11-30

    In this memo the authors describe the analysis path used to transform the phoswich data from raw data banks into cross sections suitable for publication. The primary purpose of this memo is not to document each analysis step in great detail but rather to point the reader to the fortran code used and to point out the essential features of the analysis path. A flow chart summarizing the various steps performed to massage the data from beginning to end is given. In general, each step corresponds to a fortran program written to perform that particular task. The automation of the data analysis has been kept purposefully minimal in order to ensure the highest quality of the final product. However, tools have been developed which ease the non-automated steps. There are two major parallel routes for the data analysis: data reduction, and acceptance determination using detailed GEANT Monte Carlo simulations. In this memo, the authors first describe the data reduction up to the point where PHAD banks (Pass 1-like banks) are created. They then describe the steps taken in the GEANT Monte Carlo route. Note that a detailed memo describing the methodology of the acceptance corrections has already been written; therefore the discussion of the acceptance determination is kept to a minimum and the reader is referred to that memo for further details. Finally, they describe the cross section formation process and how the final spectra are extracted.

  14. Membrane-targeting liquid crystal nanoparticles (LCNPs) for drug delivery

    NASA Astrophysics Data System (ADS)

    Nag, Okhil K.; Naciri, Jawad; Spillmann, Christopher M.; Delehanty, James B.

    2016-03-01

    In addition to maintaining the structural integrity of the cell, the plasma membrane regulates multiple important cellular processes, such as endocytosis and trafficking, apoptotic pathways and drug transport. The modulation or tracking of such cellular processes by means of controlled delivery of drugs or imaging agents via nanoscale delivery systems is very attractive. Nanoparticle-mediated delivery systems that mediate long-term residence (e.g., days) and controlled release of the cargoes in the plasma membrane while simultaneously not interfering with regular cellular physiology would be ideal for this purpose. Our laboratory has developed a plasma membrane-targeted liquid crystal nanoparticle (LCNP) formulation that can be loaded with dyes or drugs which can be slowly released from the particle over time. Here we highlight the utility of these nanopreparations for membrane delivery and imaging.

  15. A dynamic Monte Carlo model for predicting radiant exposure distribution in dental composites: model development and verifications

    NASA Astrophysics Data System (ADS)

    Chen, Yin-Chu; Ferracane, Jack L.; Prahl, Scott A.

    2005-03-01

    Photo-cured dental composites are widely used in dental practice to restore teeth because of their esthetic appearance and their ability to cure in situ. However, their complex optical characteristics make it difficult to understand light transport within the composites and to predict the depth of cure. Our previous work showed that the absorption and scattering coefficients of a composite change after the composite is cured, and static Monte Carlo simulation showed that the penetration of radiant exposure differs significantly between cured and uncured optical properties. This means that a dynamic model is required for accurate prediction of radiant exposure in the composites. The purpose of this study was to develop and verify a dynamic Monte Carlo (DMC) model simulating light propagation in dental composites whose optical properties change as photons are absorbed. The composite was divided into many small cubes, each with its own scattering and absorption coefficients. As light passed through the composite, it was scattered and absorbed. The amount of light absorbed in each cube was calculated using Beer's law and was used to determine the next optical properties of that cube. Finally, the predicted total reflectance and transmittance, as well as the optical properties during curing, were verified numerically and experimentally. The model's predictions agreed with theoretical values to within 1%, and the DMC model results agreed with experimental results to within 5%.
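    The core dynamic step described above, absorbing light voxel by voxel via Beer's law and letting the absorbed energy update each voxel's optical properties, can be sketched in one dimension. Scattering is omitted and all coefficients are invented; the actual DMC model also tracks scattering:

```python
import math

def cure_column(mu_uncured, mu_cured, dz, n_voxels, i0, dt, k_cure, n_steps):
    """Propagate light down a 1-D column of voxels; the energy absorbed in
    each voxel advances its degree of cure, which interpolates its
    absorption coefficient between the uncured and cured values."""
    cure = [0.0] * n_voxels          # degree of conversion per voxel, 0..1
    for _ in range(n_steps):
        intensity = i0
        for v in range(n_voxels):
            # current absorption coefficient depends on local cure state
            mu = mu_uncured + cure[v] * (mu_cured - mu_uncured)
            # Beer-Lambert absorption over one voxel of thickness dz
            absorbed = intensity * (1.0 - math.exp(-mu * dz))
            cure[v] = min(1.0, cure[v] + k_cure * absorbed * dt)
            intensity -= absorbed
        # intensity leaving the loop is the light transmitted this step
    return cure

profile = cure_column(mu_uncured=2.0, mu_cured=0.5, dz=0.05,
                      n_voxels=40, i0=1.0, dt=1.0, k_cure=0.5, n_steps=200)
```

    Because the cured material here absorbs less than the uncured material, the surface cures first and the light front then penetrates deeper, which is the qualitative behaviour a static model cannot capture.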

  16. Validation of Direct Normal Irradiance from Meteosat Second Generation

    NASA Astrophysics Data System (ADS)

    Meyer, Angela; Stöckli, Reto; Vuilleumier, Laurent; Wilbert, Stefan; Zarzalejo, Luis

    2016-04-01

    We present a validation study of Direct Normal Irradiance (DNI) derived from MSG/SEVIRI radiance measurements over the Plataforma Solar de Almeria (PSA), a solar power plant in Southern Spain. The 1 km x 1 km site of PSA hosts about a dozen pyrheliometers operated by the German Aerospace Centre (DLR) and the Centre for Energy, Environment and Technological Research (CIEMAT). They provide high-quality long-term measurements of surface DNI on a site of the scale of the MSG/SEVIRI pixel resolution, which makes the PSA DNI measurements a dataset particularly well suited for satellite validation purposes. The satellite-based surface DNI was retrieved from MSG/SEVIRI radiances by the HelioMont algorithm (Stöckli 2013), which forms part of the Heliosat algorithm family (e.g. Müller et al., 2004). We assessed the accuracy of this DNI product for the PSA site by comparison with the in-situ measured DNI of June 2014 - July 2015. Despite a generally good agreement, the HelioMont DNI exhibits a significant low bias at the PSA site, which is most pronounced during clear-sky periods. We present a bias correction method and discuss (1) the role of circumsolar diffuse radiation and (2) the role of climatological vs. reanalysis-based aerosol optical properties therein. We also characterize and assess the temporal variability of the HelioMont DNI compared to the in-situ measured DNI, and discuss and quantify the uncertainties in both DNI datasets.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Han, S; Ji, Y; Kim, K

    Purpose: A diagnostic multileaf collimator (MLC) was designed for dose reduction in diagnostic radiography, and Monte Carlo simulation was used to evaluate the efficiency of the shielding material for producing the leaves of the collimator. Material & Methods: A general radiography unit (Rex-650R, Listem, Korea) was modeled with Monte Carlo simulation (MCNPX, LANL, USA), and the SRS-78 program was used to calculate the energy spectrum at each tube voltage (80, 100, and 120 kVp). The shielding material was SKD 11 alloy tool steel, composed of 1.6% carbon (C), 0.4% silicon (Si), 0.6% manganese (Mn), 5% chromium (Cr), 1% molybdenum (Mo), and vanadium (V); its density is 7.89 g/cm3. We simulated the leaves of the diagnostic MLC made of SKD 11 together with the general radiography unit and calculated the efficiency of the diagnostic MLC as a function of energy using the tally 6 card of MCNPX. Results: The diagnostic MLC consisted of 25 individual metal shielding leaves on each side, with dimensions of 10 × 0.5 × 0.5 cm3. The leaves were controlled by motors positioned on both sides of the MLC. Depending on the energy (tube voltage), the shielding efficiency of the MLC in the Monte Carlo simulation was 99% (80 kVp), 96% (100 kVp), and 93% (120 kVp). Conclusion: We verified the efficiency of a diagnostic MLC fabricated from SKD 11 alloy tool steel. Based on these results, the diagnostic MLC was designed, and we will build it for dose reduction in diagnostic radiography.

  18. Forward Neutron Production at the Fermilab Main Injector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nigmanov, T.S.; /Michigan U.; Rajaram, D.

    2010-10-01

    We have measured cross sections for forward neutron production from a variety of targets using proton beams from the Fermilab Main Injector. Measurements were performed at proton beam momenta of 58 GeV/c, 84 GeV/c, and 120 GeV/c. The dependence of the cross section on the atomic weight (A) of the targets was found to vary as A{sup a}, where a is 0.46 ± 0.06 for a beam momentum of 58 GeV/c and 0.54 ± 0.05 for 120 GeV/c. The cross sections show reasonable agreement with FLUKA and DPMJET Monte Carlo predictions; comparisons have also been made with the LAQGSM Monte Carlo code. The MIPP (Main Injector Particle Production) experiment (FNAL E907) [1] acquired data in the Meson Center beam line at Fermilab. The primary purposes of the experiment were to investigate scaling laws in hadron fragmentation [2], to obtain hadron production data for the NuMI (Neutrinos at the Main Injector [3]) target to be used for calculating neutrino fluxes, and to obtain inclusive pion, neutron, and photon production data to facilitate proton radiography [4]. While considerable data are available on inclusive charged particle production [5], there is little data on neutron production. In this article we present results for forward neutron production using proton beams of 58 GeV/c, 84 GeV/c, and 120 GeV/c on hydrogen, beryllium, carbon, bismuth, and uranium targets, and compare these data with predictions from Monte Carlo simulations.
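    The A{sup a} dependence quoted above is a power law, and the exponent can be recovered from measured cross sections by a least-squares fit in log-log space. The cross-section numbers below are invented to illustrate the fit, not MIPP data:

```python
import math

def fit_power_law(atomic_weights, cross_sections):
    """Least-squares fit of sigma = c * A**a in log-log space; returns (a, c)."""
    xs = [math.log(a) for a in atomic_weights]
    ys = [math.log(s) for s in cross_sections]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, math.exp(my - slope * mx)

# Hypothetical cross sections (arbitrary units) generated with a = 0.5
a_targets = [9.0, 12.0, 209.0, 238.0]   # approx. Be, C, Bi, U atomic weights
sigmas = [3.0 * a ** 0.5 for a in a_targets]
exponent, coeff = fit_power_law(a_targets, sigmas)
```

    With real data, the measurement uncertainties on each cross section would weight the fit and propagate into the quoted error on the exponent.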

  19. SU-F-T-193: Evaluation of a GPU-Based Fast Monte Carlo Code for Proton Therapy Biological Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taleei, R; Qin, N; Jiang, S

    2016-06-15

    Purpose: Biological treatment plan optimization is of great interest for proton therapy. It requires extensive Monte Carlo (MC) simulations to compute physical dose and biological quantities. Recently, a gPMC package was developed for rapid MC dose calculations on a GPU platform. This work investigated its suitability for proton therapy biological optimization in terms of accuracy and efficiency. Methods: We performed simulations of a proton pencil beam with energies of 75, 150, and 225 MeV in a homogeneous water phantom using gPMC and FLUKA. Physical dose and energy spectra for each ion type on the central beam axis were scored. Relative biological effectiveness (RBE) was calculated using the repair-misrepair-fixation model. Microdosimetry calculations were performed using the Monte Carlo Damage Simulation (MCDS) code. Results: Ranges computed by the two codes agreed within 1 mm. The physical dose difference was less than 2.5% at the Bragg peak, and the RBE-weighted dose agreed within 5% at the Bragg peak. Differences in microdosimetric quantities such as the dose-averaged lineal energy and specific energy were <10%. The simulation time per source particle with FLUKA was 0.0018 s, while gPMC was ∼600 times faster. Conclusion: Physical doses computed by FLUKA and gPMC were in good agreement. The RBE differences along the central axis were small, and the RBE-weighted dose difference was found to be acceptable. The combined accuracy and efficiency makes gPMC suitable for proton therapy biological optimization.

  20. Modeling radiation loads in the ILC main linac and a novel approach to treat dark current

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mokhov, Nikolai V.; Rakhno, Igor L.; Tropin, Igor S.

    Electromagnetic and hadron showers generated by dark current (DC) electrons can represent a significant radiation threat to the ILC linac equipment and personnel. In this study, a commissioning scenario is analysed which is considered the worst case for the main linac regarding the DC contribution to the radiation environment in the tunnel; a normal operation scenario is analysed as well. Emphasis is placed on the radiation load to sensitive electronic equipment, namely the cryogenic thermometers inside the cryomodules. Prompt and residual dose rates in the ILC main linac tunnels were also calculated in these new high-statistics runs. A novel approach was developed, as part of the general-purpose Monte Carlo code MARS15, to model the generation, acceleration, and transport of DC electrons in electromagnetic fields inside SRF cavities. Comparisons were made with a standard approach in which a set of pre-calculated DC electron trajectories is used, with proper normalization, as a source for Monte Carlo modelling. Results of MARS15 Monte Carlo calculations, performed for the current main linac tunnel design, reveal that the peak absorbed dose in the cryogenic thermometers in the main tunnel over 20 years of operation is about 0.8 MGy. The calculated contact residual dose on the cryomodules and tunnel walls in the main tunnel, for typical irradiation and cooling conditions, is 0.1 and 0.01 mSv/hr, respectively.

  1. SU-F-T-370: A Fast Monte Carlo Dose Engine for Gamma Knife

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, T; Zhou, L; Li, Y

    2016-06-15

    Purpose: To develop a fast Monte Carlo dose calculation algorithm for Gamma Knife. Methods: To make the simulation more efficient, we implemented the track-repeating technique on GPU. We first use EGSnrc to pre-calculate the photon and secondary electron tracks in water from the two mono-energetic photons of 60Co. The total photon mean free paths for different materials and energies are obtained from NIST. During simulation, each entire photon track is first loaded into shared memory for each block; the incident original photon is then split into Nthread sub-photons, with each thread transporting one sub-photon, and the Russian roulette technique is applied to scattered and bremsstrahlung photons. The resultant electrons from photon interactions are simulated by repeating the recorded electron tracks. The electron step length is stretched or shrunk proportionally based on the local density and the stopping power ratios of the local material. Energy deposition in a voxel is proportional to the fraction of the equivalent step length in that voxel. To evaluate its accuracy, dose deposition in a 300 mm × 300 mm × 300 mm water phantom was calculated and compared to EGSnrc results. Results: Both PDD and OAR showed excellent agreement (within 0.5%) between our dose engine and the EGSnrc result. Each simulation takes less than 1 min, a reduction of up to ∼40 times compared to EGSnrc simulations. Conclusion: We have successfully developed a fast Monte Carlo dose engine for Gamma Knife.
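The Russian roulette step mentioned above can be sketched as follows: particles below a weight cutoff are killed with some probability, and survivors are up-weighted so that the expected total weight is preserved (an unbiased variance-reduction trick). This is a generic sketch, not the gPMC implementation; the cutoff and survival probability are assumed values.

```python
import random

def russian_roulette(weights, weight_cutoff=0.1, survival_prob=0.5, rng=random):
    """Terminate low-weight particles without biasing the mean.

    Particles at or above the cutoff are kept unchanged; particles below it
    survive with probability survival_prob and have their weight divided by
    survival_prob, so the expected total weight is unchanged.
    """
    survivors = []
    for w in weights:
        if w >= weight_cutoff:
            survivors.append(w)
        elif rng.random() < survival_prob:
            survivors.append(w / survival_prob)
        # otherwise the particle is killed and its weight dropped
    return survivors
```

On GPU, the same test is applied per thread to each scattered or bremsstrahlung photon; the benefit is that the killed histories free threads for fresh primaries.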

  2. Actual geomorphological processes on steep hillslope vineyards. A comparison of Ruwertal (Germany) with the Montes de Málaga (Spain).

    NASA Astrophysics Data System (ADS)

    Rodrigo Comino, Jesús; Damián Ruiz Sinoga, José; María Senciales González, José; Guerra Merchán, Antonio; Seeger, Manuel; Ries, Johannes B.

    2016-04-01

    Nowadays, steep-hillslope viticulture areas are among the most complex agricultural eco-geomorphological systems in Europe; the vineyards of the Ruwer-Mosel valley (Germany) and the Montes de Málaga-Axarquía (Spain) are a clear example. Both regions are characterized by frequent heavy rainfall events, concentrated in summer (Germany) and in autumn-winter (Spain), and by intensive, non-conservative land use practices (vine training systems, herbicides, non-ecological amendments, and anthropic rills generated by wheel traffic and footsteps in Germany and built with hoes or shovels in Spain). The goals of this work were: i) to determine and quantify the hydrological and erosive phenomena in two traditional hillslope vineyards in Waldrach (Ruwer-Mosel valley, Germany) and Almáchar (Montes de Málaga-Axarquía, Spain); ii) to compare the geomorphological and hydrological dynamics of these study areas during different seasons and under different management conditions (Mediterranean and continental climatic contexts, use of machinery, traditional protection measures...). For this purpose, a combined methodology developed by the Trier and Málaga Universities, comprising soil analysis, sediment traps, rainfall simulations, and a Guelph permeameter, was applied. The main results showed high soil erosion and similar variations in the runoff and infiltration rates. In both study areas, the geomorphological and hydrological dynamics registered several spatiotemporal variations along the upper, middle, and foot slope, and during different seasons (before and after the vintage, and between the dry and humid periods).

  3. Field evaluation of three joint sealants.

    DOT National Transportation Integrated Search

    1987-01-01

    The purpose of the study reported here was to evaluate the performance of three joint sealants, compartmented (A) and closed cellular (B) preformed neoprene, and a two-component cold-mixed polysulfide (C), that were used in the interchanges for Inter...

  4. High-Throughput Method for Automated Colony and Cell Counting by Digital Image Analysis Based on Edge Detection

    PubMed Central

    Choudhry, Priya

    2016-01-01

    Counting cells and colonies is an integral part of high-throughput screens and quantitative cellular assays. Due to its subjective and time-intensive nature, manual counting has hindered the adoption of cellular assays such as tumor spheroid formation in high-throughput screens. The objective of this study was to develop an automated method for quick and reliable counting of cells and colonies from digital images. For this purpose, I developed an ImageJ macro, Cell Colony Edge, and a CellProfiler pipeline, Cell Colony Counting, and compared them to other open-source digital methods and to manual counts. The ImageJ macro Cell Colony Edge is valuable in counting cells and colonies, and in measuring their area, volume, morphology, and intensity. In this study, I demonstrate that Cell Colony Edge is superior to other open-source methods in speed, accuracy, and applicability to diverse cellular assays. It can fulfill the need to automate colony/cell counting in high-throughput screens, colony-forming assays, and cellular assays. PMID:26848849
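Whatever the segmentation front end (edge detection in the macro above, thresholding elsewhere), automated colony counting ultimately reduces to labeling connected components of foreground pixels. A minimal pure-Python sketch, assuming a simple intensity threshold and 4-connected flood fill with a speck filter; this illustrates the principle, not the Cell Colony Edge macro itself:

```python
from collections import deque

def count_colonies(grid, threshold=0.5, min_size=4):
    """Count 4-connected components of pixels brighter than threshold,
    ignoring components smaller than min_size pixels (noise specks)."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] > threshold and not seen[r][c]:
                # flood-fill this component, measuring its size
                size, queue = 0, deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] > threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if size >= min_size:
                    count += 1
    return count
```

Production tools additionally split touching colonies (e.g. by watershed) and report per-component area and intensity, which this sketch omits.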

  5. Covariance Analysis Tool (G-CAT) for Computing Ascent, Descent, and Landing Errors

    NASA Technical Reports Server (NTRS)

    Boussalis, Dhemetrios; Bayard, David S.

    2013-01-01

    G-CAT is a covariance analysis tool that enables fast and accurate computation of error ellipses for descent, landing, ascent, and rendezvous scenarios, and quantifies knowledge error contributions needed for error budgeting purposes. Because G-CAT supports hardware/system trade studies in spacecraft and mission design, it is useful in both early and late mission/proposal phases where Monte Carlo simulation capability is not mature, Monte Carlo simulation takes too long to run, and/or there is a need to perform multiple parametric system design trades that would require an unwieldy number of Monte Carlo runs. G-CAT is formulated as a variable-order square-root linearized Kalman filter (LKF), typically using over 120 filter states. An important property of G-CAT is that it is based on a 6-DOF (degrees of freedom) formulation that completely captures the combined effects of both attitude and translation errors on the propagated trajectories. This ensures its accuracy for guidance, navigation, and control (GN&C) analysis. G-CAT provides the fast-turnaround analysis needed for error budgeting in support of mission concept formulations, design trade studies, and proposal development efforts. The main usefulness of a covariance analysis tool such as G-CAT is its ability to calculate the performance envelope directly from a single run, in sharp contrast to running thousands of simulations to obtain similar information using Monte Carlo methods. It does this by propagating the statistics of the overall design, rather than simulating individual trajectories. G-CAT supports applications to lunar, planetary, and small-body missions. It characterizes onboard knowledge propagation errors associated with inertial measurement unit (IMU) errors (gyro and accelerometer), gravity errors/dispersions (spherical harmonics, mascons), and radar errors (multiple altimeter beams, multiple Doppler velocimeter beams). G-CAT is a standalone MATLAB-based tool intended to run on any engineer's desktop computer.
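The single-run advantage over Monte Carlo comes from propagating the covariance itself: for a linear(ized) model, one matrix recursion P' = F P Fᵀ + Q replaces thousands of sampled trajectories. A toy sketch for a 2-state (position, velocity) model; this illustrates the principle only, not the G-CAT formulation:

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

def propagate_covariance(F, P, Q):
    """One step of linear covariance propagation: P' = F P F^T + Q."""
    FPFt = matmul(matmul(F, P), transpose(F))
    n = len(P)
    return [[FPFt[i][j] + Q[i][j] for j in range(n)] for i in range(n)]

# Constant-velocity model over a time step dt (illustrative numbers)
dt = 2.0
F = [[1.0, dt], [0.0, 1.0]]
P = [[1.0, 0.0], [0.0, 0.25]]   # initial position/velocity variances
Q = [[0.0, 0.0], [0.0, 0.0]]    # no process noise in this sketch
P_next = propagate_covariance(F, P, Q)
```

The propagated position variance grows by dt²·var(velocity), exactly the envelope a Monte Carlo ensemble would only estimate statistically.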

  6. Monte Carlo simulations of adult and pediatric computed tomography exams: Validation studies of organ doses with physical phantoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, Daniel J.; Lee, Choonsik; Tien, Christopher

    2013-01-15

    Purpose: To validate the accuracy of a Monte Carlo source model of the Siemens SOMATOM Sensation 16 CT scanner using organ doses measured in physical anthropomorphic phantoms. Methods: The x-ray output of the Siemens SOMATOM Sensation 16 multidetector CT scanner was simulated within the Monte Carlo radiation transport code MCNPX version 2.6. The resulting source model was able to perform various simulated axial and helical computed tomographic (CT) scans of varying scan parameters, including beam energy, filtration, pitch, and beam collimation. Two custom-built anthropomorphic phantoms were used to take dose measurements on the CT scanner: an adult male and a 9-month-old. The adult male is a physical replica of the University of Florida reference adult male hybrid computational phantom, while the 9-month-old is a replica of the University of Florida Series B 9-month-old voxel computational phantom. Each phantom underwent a series of axial and helical CT scans, during which organ doses were measured using fiber-optic coupled plastic scintillator dosimeters developed at the University of Florida. The physical setup was reproduced and simulated in MCNPX using the CT source model and the computational phantoms upon which the anthropomorphic phantoms were constructed. Average organ doses were then calculated based upon these MCNPX results. Results: For all CT scans, good agreement was seen between measured and simulated organ doses. For the adult male, the percent differences were within 16% for axial scans and within 18% for helical scans. For the 9-month-old, the percent differences were all within 15% for both the axial and helical scans. These results are comparable to previously published validation studies using GE scanners and commercially available anthropomorphic phantoms. Conclusions: Overall, the results of this study show that the Monte Carlo source model can be used to accurately and reliably calculate organ doses for patients undergoing a variety of axial or helical CT examinations on the Siemens SOMATOM Sensation 16 scanner.

  7. SU-E-T-285: Dose Variation at Bone in Small-Animal Irradiation: A Monte Carlo Study Using Monoenergetic Photon Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vuong, A; Chow, J

    Purpose: The aim of this study is to investigate the variation of bone dose with photon beam energy (keV to MeV) in small-animal irradiation. Dosimetry of homogeneous and inhomogeneous phantoms based on the same mouse computed tomography image set was calculated using DOSCTP and DOSXYZnrc, based on the EGSnrc Monte Carlo code. Methods: Monte Carlo simulations for the homogeneous and inhomogeneous mouse phantoms irradiated by a 360-degree photon arc were carried out. Mean doses to the bone tissue in the irradiated volumes were calculated at various photon beam energies, ranging from 50 keV to 1.25 MeV. The effect of bone inhomogeneity was examined through the Inhomogeneous Correction Factor (ICF), the dose ratio of the inhomogeneous to the homogeneous medium. Results: From our Monte Carlo results, higher mean bone dose and ICF were found when using kilovoltage photon beams compared to megavoltage. For beam energies ranging from 50 keV to 200 keV, the bone dose was maximum at 50 keV and decreased significantly from 2.6 Gy to 0.55 Gy when 2 Gy was delivered at the center of the phantom (isocenter). Similarly, the ICF was found to decrease from 4.5 to 1 as the photon beam energy was increased from 50 keV to 200 keV. From 200 keV to 1.25 MeV, the mean bone dose and the ICF remained at about 0.5 Gy and 1, respectively, with insignificant variation. Conclusion: To avoid high bone dose in small-animal irradiation, photon beam energies higher than 200 keV should be used; in that range the ICF is close to one and the bone dose is comparable to that of megavoltage beams, where the photoelectric effect is not dominant.

  8. SU-E-T-556: Monte Carlo Generated Dose Distributions for Orbital Irradiation Using a Single Anterior-Posterior Electron Beam and a Hanging Lens Shield

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duwel, D; Lamba, M; Elson, H

    Purpose: Various cancers of the eye are successfully treated with radiotherapy utilizing one anterior-posterior (A/P) beam that encompasses the entire content of the orbit. In such cases, a hanging lens shield can be used to spare the radiosensitive lens of the eye and prevent cataracts. Methods: This research focused on Monte Carlo characterization of dose distributions resulting from a single A/P field to the orbit with a hanging shield in place. Monte Carlo codes were developed to calculate dose distributions for various electron beam energies, hanging lens shield radii, shield heights above the eye, and beam spoiler configurations. Film dosimetry was used to benchmark the coding and ensure it was calculating relative dose accurately. Results: The Monte Carlo dose calculations indicated that lateral and depth dose profiles are insensitive to changes in shield height and electron beam energy. Dose deposition was sensitive to shield radius and to beam spoiler composition and height above the eye. Conclusion: The use of a single A/P electron beam to treat cancers of the eye while maintaining adequate lens sparing is feasible. The shield radius should be customized to match the radius of the patient's lens. A beam spoiler should be used if it is desired to substantially dose the eye tissues lying posterior to the lens in the shadow of the lens shield. The compromise between lens sparing and dose to diseased tissues surrounding the lens can be modulated by varying the beam spoiler thickness, spoiler material composition, and spoiler height above the eye. The sparing ratio is a metric that can be used to evaluate this compromise: the higher the ratio, the more dose received by the tissues immediately posterior to the lens relative to the dose received by the lens.

  9. Patient-specific radiation dose and cancer risk estimation in CT: Part I. Development and validation of a Monte Carlo program

    PubMed Central

    Li, Xiang; Samei, Ehsan; Segars, W. Paul; Sturgeon, Gregory M.; Colsher, James G.; Toncheva, Greta; Yoshizumi, Terry T.; Frush, Donald P.

    2011-01-01

    Purpose: Radiation-dose awareness and optimization in CT can greatly benefit from a dose-reporting system that provides dose and risk estimates specific to each patient and each CT examination. As the first step toward patient-specific dose and risk estimation, this article aimed to develop a method for accurately assessing radiation dose from CT examinations. Methods: A Monte Carlo program was developed to model a CT system (LightSpeed VCT, GE Healthcare). The geometry of the system, the energy spectra of the x-ray source, the three-dimensional geometry of the bowtie filters, and the trajectories of source motions during axial and helical scans were explicitly modeled. To validate the accuracy of the program, a cylindrical phantom was built to enable dose measurements at seven different radial distances from its central axis. Simulated radial dose distributions in the cylindrical phantom were validated against ion chamber measurements for single axial scans at all combinations of tube potential and bowtie filter settings. The accuracy of the program was further validated using two anthropomorphic phantoms (a pediatric one-year-old phantom and an adult female phantom). Computer models of the two phantoms were created based on their CT data and were voxelized for input into the Monte Carlo program. Simulated dose at various organ locations was compared against measurements made with thermoluminescent dosimetry chips for both single axial and helical scans. Results: For the cylindrical phantom, simulations differed from measurements by −4.8% to 2.2%. For the two anthropomorphic phantoms, the discrepancies between simulations and measurements ranged between (−8.1%, 8.1%) and (−17.2%, 13.0%) for the single axial scans and the helical scans, respectively. Conclusions: The authors developed an accurate Monte Carlo program for assessing radiation dose from CT examinations. 
When combined with computer models of actual patients, the program can provide accurate dose estimates for specific patients. PMID:21361208

  10. SU-C-BRC-05: Monte Carlo Calculations to Establish a Simple Relation of Backscatter Dose Enhancement Around High-Z Dental Alloy to Its Atomic Number

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Utsunomiya, S; Kushima, N; Katsura, K

    Purpose: To establish a simple relation of backscatter dose enhancement around a high-Z dental alloy in head and neck radiation therapy to its average atomic number, based on Monte Carlo calculations. Methods: The PHITS Monte Carlo code was used to calculate dose enhancement, which is quantified by the backscatter dose factor (BSDF). The accuracy of the beam modeling with PHITS was verified by comparison with basic measured data, namely PDDs and dose profiles. In the simulation, a 1 cm cube of high-Z alloy was embedded in a Tough Water phantom irradiated by a 6-MV (nominal) X-ray beam of 10 cm × 10 cm field size from a Novalis TX (Brainlab). Ten different materials (Al, Ti, Cu, Ag, Au-Pd-Ag, I, Ba, W, Au, Pb) were considered. The accuracy of the calculated BSDF was verified by comparison with data measured by Gafchromic EBT3 films placed from 0 to 10 mm away from a high-Z alloy (Au-Pd-Ag). We derived an approximate equation for the relation of the BSDF and the range of backscatter to the average atomic number of the high-Z alloy. Results: The calculated BSDF showed excellent agreement with that measured by Gafchromic EBT3 films from 0 to 10 mm away from the high-Z alloy. We found a simple linear relation of the BSDF and the range of backscatter to the average atomic number of the dental alloys. The latter relation is explained by the fact that the energy spectrum of the backscattered electrons strongly depends on the average atomic number. Conclusion: We found a simple relation of backscatter dose enhancement around high-Z alloys to their average atomic number based on Monte Carlo calculations. This work provides a simple and useful method to estimate backscatter dose enhancement from dental alloys and the corresponding optimal thickness of a dental spacer to prevent mucositis effectively.

  11. SU-E-T-590: Optimizing Magnetic Field Strengths with Matlab for An Ion-Optic System in Particle Therapy Consisting of Two Quadrupole Magnets for Subsequent Simulations with the Monte-Carlo Code FLUKA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baumann, K; Weber, U; Simeonov, Y

    Purpose: The aim of this study was to optimize the magnetic field strengths of two quadrupole magnets in a particle therapy facility in order to obtain a beam quality suitable for spot beam scanning. Methods: The particle transport through an ion-optic system of a particle therapy facility, consisting of the beam tube, two quadrupole magnets, and a beam monitor system, was calculated in Matlab using matrices that solve the equation of motion of a charged particle in a magnetic field and in a field-free region, respectively. The magnetic field strengths were optimized to obtain a circular and thin beam spot at the iso-center of the therapy facility. These optimized field strengths were subsequently transferred to the Monte Carlo code FLUKA, and the transport of 80 MeV/u 12C ions through this ion-optic system was calculated using a user routine to implement the magnetic fields. The fluence along the beam axis and at the iso-center was evaluated. Results: The magnetic field strengths could be optimized using Matlab and transferred to the Monte Carlo code FLUKA; the implementation via a user routine was successful. Analysis of the fluence pattern along the beam axis reproduced the characteristic focusing and defocusing effects of the quadrupole magnets. Furthermore, the beam spot at the iso-center was circular and significantly thinner than an unfocused beam. Conclusion: In this study a Matlab tool was developed to optimize magnetic field strengths for an ion-optic system consisting of two quadrupole magnets as part of a particle therapy facility. These magnetic field strengths could subsequently be transferred to and implemented in the Monte Carlo code FLUKA to simulate the particle transport through the optimized ion-optic system.
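The matrix method referenced above maps the transverse state (x, x') through each element with a 2×2 transfer matrix: a drift, a focusing quadrupole, or a defocusing quadrupole (the same magnet seen in the other plane). A minimal Python sketch of these standard matrices (the study itself used Matlab; the strengths below are arbitrary):

```python
import math

def drift(L):
    """Transfer matrix of a field-free drift of length L (metres)."""
    return [[1.0, L], [0.0, 1.0]]

def quad_focusing(k, L):
    """Thick focusing quadrupole; k is the quadrupole strength in 1/m^2."""
    s = math.sqrt(k)
    return [[math.cos(s * L), math.sin(s * L) / s],
            [-s * math.sin(s * L), math.cos(s * L)]]

def quad_defocusing(k, L):
    """Thick defocusing quadrupole (same magnet, other transverse plane)."""
    s = math.sqrt(k)
    return [[math.cosh(s * L), math.sinh(s * L) / s],
            [s * math.sinh(s * L), math.cosh(s * L)]]

def apply(M, state):
    """Map the state (x, x') through transfer matrix M."""
    x, xp = state
    return (M[0][0] * x + M[0][1] * xp, M[1][0] * x + M[1][1] * xp)
```

Chaining `apply` over the beamline elements gives the spot size at the iso-center; an optimizer then varies the two k values until the spot is circular and thin. Each matrix has unit determinant, a useful sanity check.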

  12. SU-E-T-469: A Practical Approach for the Determination of Small Field Output Factors Using Published Monte Carlo Derived Correction Factors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calderon, E; Siergiej, D

    2014-06-01

    Purpose: Output factor determination for small fields (less than 20 mm) presents significant challenges due to ion chamber volume averaging and diode over-response. Measured output factor values between detectors are known to show large deviations as field size decreases, and no accepted standard exists to resolve this difference. We observed differences between measured output factors of up to 14% using two different detectors. Published Monte Carlo derived correction factors were used to address this challenge and decrease the output factor deviation between detectors. Methods: Output factors for Elekta's linac-based stereotactic cone system were measured using the EDGE detector (Sun Nuclear) and the A16 ion chamber (Standard Imaging). Measurement conditions were 100 cm SSD (source-to-surface distance) and 1.5 cm depth. Output factors were first normalized to a 10.4 cm × 10.4 cm field size using a daisy-chaining technique to minimize the dependence of detector response on field size. An equation expressing the published Monte Carlo correction factors as a function of field size for each detector was derived, and the measured output factors were then multiplied by the calculated correction factors. EBT3 Gafchromic film dosimetry was used to independently validate the corrected output factors. Results: Without correction, the deviation in output factors between the EDGE and A16 detectors ranged from 1.3% to 14.8%, depending on cone size. After applying the calculated correction factors, this deviation fell to 0 to 3.4%. Output factors determined with film agree within 3.5% of the corrected output factors. Conclusion: We present a practical approach to applying published Monte Carlo derived correction factors to measured small-field output factors for the EDGE and A16 detectors. Using this method, we decreased the deviation between the two detectors from 14.8% to 3.4%.
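The two steps above, daisy-chaining through an intermediate field and then applying a published correction factor, can be sketched as follows. The reading values in the test are hypothetical illustration numbers, not the paper's measurements:

```python
def daisy_chained_of(m_small_diode, m_int_diode, m_int_chamber, m_ref_chamber):
    """Daisy-chained small-field output factor.

    The small-field detector (e.g. a diode) relates the cone to an
    intermediate field where both detectors respond reliably; the reference
    detector (e.g. an ion chamber) relates that intermediate field to the
    10.4 cm x 10.4 cm reference field:

        OF = (M_diode(cone) / M_diode(int)) * (M_chamber(int) / M_chamber(ref))
    """
    return (m_small_diode / m_int_diode) * (m_int_chamber / m_ref_chamber)

def corrected_of(of_measured, k_mc):
    """Apply a published Monte Carlo derived detector correction factor
    k(field size) to a measured output factor."""
    return of_measured * k_mc
```

The daisy chain cancels the diode's field-size-dependent response at the intermediate field, while the Monte Carlo factor corrects its residual over-response in the smallest cones.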

  13. An Improved Method of Heterogeneity Compensation for the Convolution / Superposition Algorithm

    NASA Astrophysics Data System (ADS)

    Jacques, Robert; McNutt, Todd

    2014-03-01

    Purpose: To improve the accuracy of convolution/superposition (C/S) in heterogeneous material by developing a new algorithm: heterogeneity compensated superposition (HCS). Methods: C/S has proven to be a good estimator of the dose deposited in a homogeneous volume. However, near heterogeneities electron disequilibrium occurs, leading to faster fall-off and re-buildup of dose. We propose to filter the actual patient density in a position- and direction-sensitive manner, allowing the dose deposited near interfaces to be increased or decreased relative to C/S. We implemented the effective density function as a multivariate first-order recursive filter and incorporated it into a GPU-accelerated, multi-energetic C/S implementation. We compared HCS against C/S using the ICCR 2000 Monte Carlo accuracy benchmark, 23 similar accuracy benchmarks, and 5 patient cases. Results: Multi-energetic HCS increased the dosimetric accuracy for the vast majority of voxels; in many cases near-Monte-Carlo results were achieved. We defined the per-voxel error, %|mm, as the minimum of the distance to agreement in mm and the dosimetric percentage error relative to the maximum MC dose. HCS improved the average mean error by 0.79 %|mm for the patient volumes, reducing it from 1.93 %|mm to 1.14 %|mm. Very low densities (i.e., < 0.1 g/cm³) remained problematic, but may be addressable with a better filter function. Conclusions: HCS improved upon C/S's density-scaled heterogeneity correction with a position- and direction-sensitive density filter. This method significantly improved the accuracy of the GPU-based algorithm, reaching the accuracy levels of Monte Carlo based methods with performance of a few tenths of a second per beam. Acknowledgement: Funding for this research was provided by NSF Cooperative Agreement EEC9731748, Elekta / IMPAC Medical Systems, Inc., and the Johns Hopkins University. James Satterthwaite provided the Monte Carlo benchmark simulations.
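The idea of a position- and direction-sensitive density filter can be illustrated with a single-pole recursive (IIR) filter run along the beam direction. The paper's filter is multivariate, so this 1D version with a hypothetical smoothing parameter is only a sketch of the mechanism:

```python
def effective_density(densities, alpha=0.3):
    """Filter a ray's density samples with a first-order recursive filter:

        rho_eff[i] = alpha * rho[i] + (1 - alpha) * rho_eff[i - 1]

    The effective density near an interface relaxes toward the new material
    instead of jumping, mimicking the gradual loss and re-establishment of
    electron equilibrium. alpha is an assumed parameter, not a fitted value.
    """
    out = []
    prev = densities[0]
    for rho in densities:
        prev = alpha * rho + (1.0 - alpha) * prev
        out.append(prev)
    return out
```

Running the filter forward along the ray makes it direction-sensitive by construction: a water-to-lung transition is smoothed downstream of the interface, where the dose fall-off and re-buildup actually occur.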

  14. SU-F-T-270: A Technique for Modeling a Diode Array Into the TPS for Lung SBRT Patient Specific QA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curley, C; Leventouri, T; Ouhib, Z

    2016-06-15

    Purpose: To accurately match the treatment planning system (TPS) with the measurement environment, where quality assurance (QA) devices are used to collect data, for lung stereotactic body radiation therapy (SBRT) patient-specific QA; the incorporation of heterogeneities is also studied. Methods: Dual-energy computed tomography (DECT) and single-energy computed tomography (SECT) were used to model phantoms incorporating a 2-D diode array into the TPS. A water-equivalent and a heterogeneous phantom (simulating the thoracic region of a patient) were studied. Monte Carlo and pencil beam planar dose distributions were compared to measured distributions. Composite and individual fields were analyzed for normally incident and planned gantry angle deliveries. γ-analysis was used with criteria of 3%/3 mm, 2%/2 mm, and 1%/1 mm. Results: For the DECT, the Monte Carlo calculations resulted in improved agreement with the diode array for 46.4% of the fields at 3%/3 mm, 85.7% at 2%/2 mm, and 92.9% at 1%/1 mm. For the SECT, the Monte Carlo calculations showed no improvement for the same γ-analysis criteria. Pencil beam calculations resulted in lower agreement with the diode array in the TPS: for the DECT, improvements were seen for 14.3% of the fields at 3%/3 mm and 2%/2 mm, and 28.6% at 1%/1 mm; in the SECT comparisons, 7.1% of the fields at 3%/3 mm, 10.7% at 2%/2 mm, and 17.9% at 1%/1 mm showed improved agreement with the diode array. Conclusion: This study demonstrates that modeling the diode array in the TPS is viable using DECT with Monte Carlo for patient-specific lung SBRT QA. As recommended by task groups (e.g., TG 65, TG 101, TG 244) of the American Association of Physicists in Medicine (AAPM), pencil beam algorithms should be avoided in the presence of heterogeneous materials, including a diode array.
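The γ-analysis criteria quoted here (e.g. 3%/3 mm) combine a dose-difference tolerance and a distance-to-agreement tolerance into one pass/fail index per point. A minimal global 1D sketch with a brute-force search; clinical QA software works on 2D/3D grids with interpolation, so this only illustrates the metric:

```python
import math

def gamma_pass_rate(reference, evaluated, spacing_mm, dose_pct, dist_mm):
    """Global 1D gamma analysis.

    For each reference point, minimise the combined dose/distance metric
    over the evaluated profile; the point passes if the minimum gamma <= 1.
    Dose differences are normalised to the global reference maximum.
    """
    d_norm = dose_pct / 100.0 * max(reference)
    passed = 0
    for i, d_ref in enumerate(reference):
        best = min(
            math.hypot((d_eval - d_ref) / d_norm,
                       (j - i) * spacing_mm / dist_mm)
            for j, d_eval in enumerate(evaluated)
        )
        passed += best <= 1.0
    return 100.0 * passed / len(reference)
```

Tightening the criteria from 3%/3 mm to 1%/1 mm shrinks the acceptance ellipse around each reference point, which is why the pass-rate comparisons above become more discriminating at the stricter settings.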

  15. WE-DE-201-05: Evaluation of a Windowless Extrapolation Chamber Design and Monte Carlo Based Corrections for the Calibration of Ophthalmic Applicators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, J; Culberson, W; DeWerd, L

    Purpose: To test the validity of a windowless extrapolation chamber used to measure surface dose rate from planar ophthalmic applicators and to compare different Monte Carlo based codes for deriving correction factors. Methods: Dose rate measurements were performed using a windowless, planar extrapolation chamber with a {sup 90}Sr/{sup 90}Y Tracerlab RA-1 ophthalmic applicator previously calibrated at the National Institute of Standards and Technology (NIST). Capacitance measurements were performed to estimate the initial air gap width between the source face and collecting electrode. Current was measured as a function of air gap, and Bragg-Gray cavity theory was used to calculate themore » absorbed dose rate to water. To determine correction factors for backscatter, divergence, and attenuation from the Mylar entrance window found in the NIST extrapolation chamber, both EGSnrc Monte Carlo user code and Monte Carlo N-Particle Transport Code (MCNP) were utilized. Simulation results were compared with experimental current readings from the windowless extrapolation chamber as a function of air gap. Additionally, measured dose rate values were compared with the expected result from the NIST source calibration to test the validity of the windowless chamber design. Results: Better agreement was seen between EGSnrc simulated dose results and experimental current readings at very small air gaps (<100 µm) for the windowless extrapolation chamber, while MCNP results demonstrated divergence at these small gap widths. Three separate dose rate measurements were performed with the RA-1 applicator. The average observed difference from the expected result based on the NIST calibration was −1.88% with a statistical standard deviation of 0.39% (k=1). Conclusion: EGSnrc user code will be used during future work to derive correction factors for extrapolation chamber measurements. 
Additionally, experimental results suggest that an entrance window is not needed for an extrapolation chamber to provide accurate dose rate measurements for a planar ophthalmic applicator.
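The extrapolation method in this record can be sketched in a few lines: fit ionization current against air-gap width, then convert the slope dI/dx to an absorbed dose rate to water via Bragg-Gray cavity theory. This is an illustrative reconstruction, not the authors' code; the stopping-power ratio, air density, electrode area, and current readings are all placeholder assumptions.

```python
# Hypothetical sketch of the extrapolation-chamber analysis. Only W/e is a
# standard constant; the other parameters are assumed for illustration.

W_OVER_E = 33.97       # J/C, mean energy expended per unit charge in dry air
S_W_AIR = 1.112        # water-to-air mass collision stopping-power ratio (assumed)
RHO_AIR = 1.196e-3     # g/cm^3, air density at assumed chamber conditions
AREA = 0.7854          # cm^2, collecting-electrode area (assumed)

def dose_rate(gaps_cm, currents_a):
    """Least-squares slope of current vs. gap width, converted to Gy/s to water."""
    n = len(gaps_cm)
    mx = sum(gaps_cm) / n
    my = sum(currents_a) / n
    num = sum((x - mx) * (y - my) for x, y in zip(gaps_cm, currents_a))
    den = sum((x - mx) ** 2 for x in gaps_cm)
    slope = num / den                               # dI/dx in A/cm
    # Gy = J/kg, so convert rho from g/cm^3 to kg/cm^3
    return W_OVER_E * S_W_AIR * slope / (RHO_AIR * 1e-3 * AREA)

# Synthetic current readings, linear in the gap width (fabricated numbers)
gaps = [0.005, 0.010, 0.015, 0.020]                 # cm
currents = [2.0e-12 * g for g in gaps]              # A
print(f"{dose_rate(gaps, currents):.3e} Gy/s")
```

In a real analysis the correction factors for backscatter, divergence, and window attenuation discussed in the abstract would multiply the measured currents before the fit.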

  16. SU-G-TeP3-14: Three-Dimensional Cluster Model in Inhomogeneous Dose Distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei, J; Penagaricano, J; Narayanasamy, G

    2016-06-15

Purpose: We aim to investigate 3D cluster formation in inhomogeneous dose distributions in order to search for new models that predict radiation tissue damage, potentially leading to a new optimization paradigm for radiotherapy planning. Methods: The aggregation of voxels in the organ at risk (OAR) receiving doses higher than a preset threshold was chosen as the cluster, whose connectivity dictates the cluster structure. Upon selection of the dose threshold, the fractional density, defined as the fraction of voxels in the organ eligible to be part of the cluster, was determined from the dose volume histogram (DVH). A Monte Carlo method was implemented to establish a case pertinent to the corresponding DVH. Ones and zeros were randomly assigned to each OAR voxel with the sampling probability equal to the fractional density. Ten thousand samples were randomly generated to ensure a sufficient number of cluster sets. A recursive cluster-searching algorithm was developed to analyze the clusters under various connectivity choices (1-, 2-, and 3-connectivity). The mean size of the largest cluster (MSLC) from the Monte Carlo samples was taken as a function of the fractional density. Various OARs from clinical plans were included in the study. Results: The intensive Monte Carlo study demonstrated the anticipated inverse relationship between the MSLC and the cluster connectivity, and the cluster size did not change linearly with fractional density regardless of the connectivity type. A transition of the MSLC from initially slow increase to exponential growth was observed as the density increased. The cluster sizes were found to vary within a large range and were relatively independent of the OARs. Conclusion: The Monte Carlo study revealed that the cluster size could serve as a suitable index of tissue damage (percolation cluster) and that the clinical outcomes of plans with the same DVH might potentially differ.
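The voxel-sampling and cluster-search procedure described above can be sketched compactly. This is an illustrative reconstruction, not the authors' code: voxels are occupied with probability equal to the fractional density, and the largest 1-connected (face-adjacent) cluster is found by breadth-first search; the grid size, density, and sample count are arbitrary.

```python
import random
from collections import deque

def largest_cluster(shape, density, seed=0):
    """Occupy voxels of a 3D grid with probability `density` and return the
    size of the largest 1-connected (6-neighbor) cluster."""
    nx, ny, nz = shape
    rng = random.Random(seed)
    occ = {(i, j, k) for i in range(nx) for j in range(ny) for k in range(nz)
           if rng.random() < density}
    best, seen = 0, set()
    for start in occ:
        if start in seen:
            continue
        size, queue = 0, deque([start])
        seen.add(start)
        while queue:                      # breadth-first flood fill
            i, j, k = queue.popleft()
            size += 1
            for di, dj, dk in ((1,0,0),(-1,0,0),(0,1,0),(0,-1,0),(0,0,1),(0,0,-1)):
                nb = (i+di, j+dj, k+dk)
                if nb in occ and nb not in seen:
                    seen.add(nb)
                    queue.append(nb)
        best = max(best, size)
    return best

# Mean size of the largest cluster (MSLC) over Monte Carlo samples, as in the abstract
sizes = [largest_cluster((10, 10, 10), 0.4, seed=s) for s in range(20)]
print(sum(sizes) / len(sizes))
```

Repeating this over a sweep of densities reproduces the slow-then-exponential growth of the MSLC that the abstract reports; 2- and 3-connectivity would add edge- and corner-adjacent neighbors to the offset list.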

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas, D; O’Connell, D; Lamb, J

Purpose: To demonstrate real-time dose calculation of free-breathing MRI-guided Co-60 treatments, using a motion model and Monte-Carlo dose calculation to accurately account for the interplay between irregular breathing motion and an IMRT delivery. Methods: ViewRay Co-60 dose distributions were optimized on ITVs contoured from free-breathing CT images of lung cancer patients. Each treatment plan was separated into 0.25 s segments, accounting for the MLC positions and beam angles at each time point. A voxel-specific motion model derived from multiple fast-helical free-breathing CTs and deformable registration was calculated for each patient. 3D images for every 0.25 s of a simulated treatment were generated in real time, here using a bellows signal as a surrogate to accurately account for breathing irregularities. Monte-Carlo dose calculation was performed for every 0.25 s of the treatment, with the number of histories in each calculation scaled to give an overall 1% statistical uncertainty. Each dose calculation was deformed back to the reference image using the motion model and accumulated. The static and real-time dose calculations were compared. Results: Image generation was performed in real time at 4 frames per second (GPU). Monte-Carlo dose calculation was performed at approximately 1 frame per second (CPU), giving a total calculation time of approximately 30 minutes per treatment. Results show both cold- and hot-spots in and around the ITV, and increased dose to the contralateral lung as the tumor moves in and out of the beam during treatment. Conclusion: An accurate motion model combined with a fast Monte-Carlo dose calculation allows almost real-time dose calculation of a free-breathing treatment. 
When combined with sagittal 2D-cine-mode MRI during treatment to update the motion model in real time, this will allow the true delivered dose of a treatment to be calculated, providing a useful tool for adaptive planning and assessing the effectiveness of gated treatments.
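The history-scaling step in this record follows from Monte Carlo counting statistics: the relative uncertainty falls as 1/sqrt(N), so reaching a target uncertainty requires scaling the total histories by the squared ratio of uncertainties, then splitting them across delivery segments. A minimal sketch, with fabricated segment weights and reference values:

```python
# Illustrative history budgeting for per-segment Monte Carlo dose calculation.
# Assumption: per-segment histories are allocated in proportion to a fluence
# weight (e.g. segment beam-on time); sigma scales as 1/sqrt(N).

def histories_per_segment(weights, sigma_ref, n_ref, sigma_target):
    """Given that n_ref histories yield sigma_ref, return per-segment history
    counts so the accumulated dose reaches sigma_target overall."""
    n_total = n_ref * (sigma_ref / sigma_target) ** 2
    wsum = sum(weights)
    return [n_total * w / wsum for w in weights]

# Four 0.25 s delivery segments with unequal (fabricated) MU weights
weights = [1.0, 2.0, 4.0, 1.0]
ns = histories_per_segment(weights, sigma_ref=0.05, n_ref=1e5, sigma_target=0.01)
print([f"{n:.0f}" for n in ns])
```

Halving the target uncertainty quadruples the total history count, which is why the 1% goal dominates the ~30-minute CPU time quoted in the abstract.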

  18. Modeling Cell and Tumor-Metastasis Dosimetry with the Particle and Heavy Ion Transport Code System (PHITS) Software for Targeted Alpha-Particle Radionuclide Therapy.

    PubMed

    Lee, Dongyoul; Li, Mengshi; Bednarz, Bryan; Schultz, Michael K

    2018-06-26

The use of targeted radionuclide therapy for cancer is on the rise. While beta-particle-emitting radionuclides have been extensively explored for targeted radionuclide therapy, alpha-particle-emitting radionuclides are emerging as effective alternatives. In this context, a fundamental understanding of the interactions and dosimetry of these emitted particles with cells in the tumor microenvironment is critical to ascertaining the potential of alpha-particle-emitting radionuclides. One important parameter that can be used to assess these metrics is the S-value. In this study, we characterized several alpha-particle-emitting radionuclides (and their associated radionuclide progeny) regarding S-values in the cellular and tumor-metastasis environments. The Particle and Heavy Ion Transport code System (PHITS) was used to obtain S-values via Monte Carlo simulation for cells and tumor metastases resulting from interactions with the alpha-particle-emitting radionuclides lead-212 (212Pb), actinium-225 (225Ac), and bismuth-213 (213Bi); these values were compared to the beta-particle-emitting radionuclides yttrium-90 (90Y) and lutetium-177 (177Lu) and the Auger-electron-emitting radionuclide indium-111 (111In). The effect of cellular internalization on the S-value was explored at increasing degrees of internalization for each radionuclide. This aspect of S-value determination was further explored in a cell line-specific fashion for six different cancer cell lines based on the cell dimensions obtained by confocal microscopy. S-values from PHITS were in good agreement with MIRDcell S-values (cellular S-values) and the values found by Hindié et al. (tumor S-values). In the cellular model, the 212Pb and 213Bi decay series produced S-values that were 50- to 120-fold higher than 177Lu, while the 225Ac decay series analysis suggested S-values that were 240- to 520-fold higher than 177Lu. 
S-values arising with 100% cellular internalization were two- to sixfold higher for the nucleus when compared to 0% internalization. The tumor dosimetry model defines the relative merit of radionuclides and suggests alpha particles may be effective for large tumors as well as small tumor metastases. These results from PHITS modeling substantiate emerging evidence that alpha-particle-emitting radionuclides may be an effective alternative to beta-particle-emitting radionuclides for targeted radionuclide therapy due to preferred dose-deposition profiles in the cellular and tumor metastasis context. These results further suggest that internalization of alpha-particle-emitting radionuclides via radiolabeled ligands may increase the relative biological effectiveness of radiotherapeutics.
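An S-value is the mean absorbed dose to a target region per decay in a source region; in its simplest form it is the sum over emissions of energy times absorbed fraction, divided by the target mass. The toy calculation below is not a MIRD- or PHITS-validated computation: the nucleus is an idealized water sphere and the emission energies and absorbed fractions are fabricated, but it shows why alpha emitters yield S-values orders of magnitude above beta emitters at cellular scales.

```python
import math

# Illustrative cellular S-value, S(N<-N) = sum_i E_i * phi_i / m_N, for a
# radionuclide decaying inside the nucleus. All emission data are assumptions.

def sphere_mass_kg(radius_um, density=1000.0):
    """Mass of a water-density sphere of the given radius in micrometers."""
    r = radius_um * 1e-6
    return density * 4.0 / 3.0 * math.pi * r ** 3

def s_value(emissions, radius_um):
    """emissions: list of (energy_MeV, absorbed_fraction) per decay.
    Returns the S-value in Gy per decay (Gy/(Bq*s))."""
    joule_per_mev = 1.602e-13
    return sum(e * phi for e, phi in emissions) * joule_per_mev / sphere_mass_kg(radius_um)

# A single ~8 MeV alpha stops within microns, so it is taken as fully absorbed
# in a 5 um nucleus; a beta's range far exceeds the cell, so only a tiny
# (assumed) fraction of its mean energy is deposited locally.
alpha = [(8.0, 1.0)]
beta = [(0.33, 0.002)]
print(s_value(alpha, 5.0) / s_value(beta, 5.0))
```

The same mass and absorbed-fraction dependence explains the internalization effect in the abstract: moving the decay site from the cell surface into the nucleus raises the absorbed fraction and hence the S-value.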

  19. WE-DE-202-00: Connecting Radiation Physics with Computational Biology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

Radiation therapy for the treatment of cancer has been established as a highly precise and effective way to eradicate a localized region of diseased tissue. To achieve further significant gains in the therapeutic ratio, we need to move towards biologically optimized treatment planning. To achieve this goal, we need to understand how the radiation-type-dependent patterns of induced energy depositions within the cell (physics) connect via molecular, cellular and tissue reactions to treatment outcomes such as tumor control and undesirable effects on normal tissue. Several computational biology approaches have been developed connecting physics to biology. Monte Carlo simulations are the most accurate method to calculate physical dose distributions at the nanometer scale; however, simulations at the DNA scale are slow, and repair processes are generally not simulated. Alternative models that rely on the random formation of individual DNA lesions within one or two turns of the DNA have been shown to reproduce the clusters of DNA lesions, including single strand breaks (SSBs) and double strand breaks (DSBs), without the need for detailed track structure simulations. Efficient computational simulations of initial DNA damage induction facilitate computational modeling of DNA repair and other molecular and cellular processes. Mechanistic, multiscale models provide a useful conceptual framework to test biological hypotheses and help connect fundamental information about track structure and dosimetry at the sub-cellular level to dose-response effects on larger scales. In this symposium we will learn about the current state of the art of computational approaches for estimating radiation damage at the cellular and sub-cellular scale. How can understanding the physics interactions at the DNA level be used to predict biological outcome? 
We will discuss if and how such calculations are relevant to advance our understanding of radiation damage and its repair, or whether the underlying biological processes are too complex for a mechanistic approach. Can computer simulations be used to guide future biological research? We will debate the feasibility of explaining biology from a physicist's perspective. Learning Objectives: 1. Understand the potential applications and limitations of computational methods for dose-response modeling at the molecular, cellular and tissue levels. 2. Learn about the mechanisms of action underlying the induction, repair and biological processing of damage to DNA and other cellular constituents. 3. Understand how effects and processes at one biological scale impact biological processes and outcomes at other scales. Funding: J. Schuemann, NCI/NIH grants; S. McMahon, European Commission FP7 (grant EC FP7 MC-IOF-623630).

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McMahon, S.

Radiation therapy for the treatment of cancer has been established as a highly precise and effective way to eradicate a localized region of diseased tissue. To achieve further significant gains in the therapeutic ratio, we need to move towards biologically optimized treatment planning. To achieve this goal, we need to understand how the radiation-type-dependent patterns of induced energy depositions within the cell (physics) connect via molecular, cellular and tissue reactions to treatment outcomes such as tumor control and undesirable effects on normal tissue. Several computational biology approaches have been developed connecting physics to biology. Monte Carlo simulations are the most accurate method to calculate physical dose distributions at the nanometer scale; however, simulations at the DNA scale are slow, and repair processes are generally not simulated. Alternative models that rely on the random formation of individual DNA lesions within one or two turns of the DNA have been shown to reproduce the clusters of DNA lesions, including single strand breaks (SSBs) and double strand breaks (DSBs), without the need for detailed track structure simulations. Efficient computational simulations of initial DNA damage induction facilitate computational modeling of DNA repair and other molecular and cellular processes. Mechanistic, multiscale models provide a useful conceptual framework to test biological hypotheses and help connect fundamental information about track structure and dosimetry at the sub-cellular level to dose-response effects on larger scales. In this symposium we will learn about the current state of the art of computational approaches for estimating radiation damage at the cellular and sub-cellular scale. How can understanding the physics interactions at the DNA level be used to predict biological outcome? 
We will discuss if and how such calculations are relevant to advance our understanding of radiation damage and its repair, or whether the underlying biological processes are too complex for a mechanistic approach. Can computer simulations be used to guide future biological research? We will debate the feasibility of explaining biology from a physicist's perspective. Learning Objectives: 1. Understand the potential applications and limitations of computational methods for dose-response modeling at the molecular, cellular and tissue levels. 2. Learn about the mechanisms of action underlying the induction, repair and biological processing of damage to DNA and other cellular constituents. 3. Understand how effects and processes at one biological scale impact biological processes and outcomes at other scales. Funding: J. Schuemann, NCI/NIH grants; S. McMahon, European Commission FP7 (grant EC FP7 MC-IOF-623630).

  1. WE-DE-202-01: Connecting Nanoscale Physics to Initial DNA Damage Through Track Structure Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schuemann, J.

Radiation therapy for the treatment of cancer has been established as a highly precise and effective way to eradicate a localized region of diseased tissue. To achieve further significant gains in the therapeutic ratio, we need to move towards biologically optimized treatment planning. To achieve this goal, we need to understand how the radiation-type-dependent patterns of induced energy depositions within the cell (physics) connect via molecular, cellular and tissue reactions to treatment outcomes such as tumor control and undesirable effects on normal tissue. Several computational biology approaches have been developed connecting physics to biology. Monte Carlo simulations are the most accurate method to calculate physical dose distributions at the nanometer scale; however, simulations at the DNA scale are slow, and repair processes are generally not simulated. Alternative models that rely on the random formation of individual DNA lesions within one or two turns of the DNA have been shown to reproduce the clusters of DNA lesions, including single strand breaks (SSBs) and double strand breaks (DSBs), without the need for detailed track structure simulations. Efficient computational simulations of initial DNA damage induction facilitate computational modeling of DNA repair and other molecular and cellular processes. Mechanistic, multiscale models provide a useful conceptual framework to test biological hypotheses and help connect fundamental information about track structure and dosimetry at the sub-cellular level to dose-response effects on larger scales. In this symposium we will learn about the current state of the art of computational approaches for estimating radiation damage at the cellular and sub-cellular scale. How can understanding the physics interactions at the DNA level be used to predict biological outcome? 
We will discuss if and how such calculations are relevant to advance our understanding of radiation damage and its repair, or whether the underlying biological processes are too complex for a mechanistic approach. Can computer simulations be used to guide future biological research? We will debate the feasibility of explaining biology from a physicist's perspective. Learning Objectives: 1. Understand the potential applications and limitations of computational methods for dose-response modeling at the molecular, cellular and tissue levels. 2. Learn about the mechanisms of action underlying the induction, repair and biological processing of damage to DNA and other cellular constituents. 3. Understand how effects and processes at one biological scale impact biological processes and outcomes at other scales. Funding: J. Schuemann, NCI/NIH grants; S. McMahon, European Commission FP7 (grant EC FP7 MC-IOF-623630).

  2. Three-dimensional radiation transfer modeling in a dicotyledon leaf

    NASA Astrophysics Data System (ADS)

    Govaerts, Yves M.; Jacquemoud, Stéphane; Verstraete, Michel M.; Ustin, Susan L.

    1996-11-01

    The propagation of light in a typical dicotyledon leaf is investigated with a new Monte Carlo ray-tracing model. The three-dimensional internal cellular structure of the various leaf tissues, including the epidermis, the palisade parenchyma, and the spongy mesophyll, is explicitly described. Cells of different tissues are assigned appropriate morphologies and contain realistic amounts of water and chlorophyll. Each cell constituent is characterized by an index of refraction and an absorption coefficient. The objective of this study is to investigate how the internal three-dimensional structure of the tissues and the optical properties of cell constituents control the reflectance and transmittance of the leaf. Model results compare favorably with laboratory observations. The influence of the roughness of the epidermis on the reflection and absorption of light is investigated, and simulation results confirm that convex cells in the epidermis focus light on the palisade parenchyma and increase the absorption of radiation.
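The leaf model above assigns each cell constituent two optical properties: an index of refraction and an absorption coefficient. A much-simplified, non-ray-traced sketch of how those two ingredients combine is a Fresnel reflection loss at each air interface plus Beer-Lambert attenuation through each tissue layer; the layer indices, absorption coefficients, and thicknesses below are illustrative, not the model's calibrated inputs.

```python
import math

def fresnel_normal(n1, n2):
    """Reflectance at normal incidence between media of indices n1 and n2."""
    return ((n1 - n2) / (n1 + n2)) ** 2

def transmitted(layers, n_air=1.0):
    """layers: list of (refractive_index, absorption_per_um, thickness_um)
    crossed in order; returns the fraction of light transmitted."""
    t = 1.0 - fresnel_normal(n_air, layers[0][0])        # enter the adaxial epidermis
    for n, k, d in layers:
        t *= math.exp(-k * d)                             # Beer-Lambert attenuation
    t *= 1.0 - fresnel_normal(layers[-1][0], n_air)       # exit the abaxial surface
    return t

# Fabricated three-tissue stack loosely following the abstract's anatomy
leaf = [(1.45, 0.001, 20),    # epidermis: nearly transparent
        (1.40, 0.020, 60),    # palisade parenchyma: strong chlorophyll absorption
        (1.40, 0.010, 80)]    # spongy mesophyll
print(f"{transmitted(leaf):.3f}")
```

The Monte Carlo ray-tracing model goes much further, sampling ray directions at every curved cell-wall interface, which is how it captures the lens-like focusing by convex epidermal cells noted in the abstract.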

  3. Complex Geometric Models of Diffusion and Relaxation in Healthy and Damaged White Matter

    PubMed Central

    Farrell, Jonathan A.D.; Smith, Seth A.; Reich, Daniel S.; Calabresi, Peter A.; van Zijl, Peter C.M.

    2010-01-01

    Which aspects of tissue microstructure affect diffusion weighted MRI signals? Prior models, many of which use Monte-Carlo simulations, have focused on relatively simple models of the cellular microenvironment and have not considered important anatomic details. With the advent of higher-order analysis models for diffusion imaging, such as high-angular-resolution diffusion imaging (HARDI), more realistic models are necessary. This paper presents and evaluates the reproducibility of simulations of diffusion in complex geometries. Our framework is quantitative, does not require specialized hardware, is easily implemented with little programming experience, and is freely available as open-source software. Models may include compartments with different diffusivities, permeabilities, and T2 time constants using both parametric (e.g., spheres and cylinders) and arbitrary (e.g., mesh-based) geometries. Three-dimensional diffusion displacement-probability functions are mapped with high reproducibility, and thus can be readily used to assess reproducibility of diffusion-derived contrasts. PMID:19739233
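The core of such diffusion simulations is a Monte Carlo random walk inside a restricting geometry. A minimal sketch, not the published framework: walkers take fixed-length isotropic steps inside a reflecting sphere (a crude stand-in for a cell or axon), and their mean-squared displacement is compared with free diffusion; radius, step length, and walker counts are arbitrary.

```python
import math, random

def msd(radius_um, n_walkers=1000, n_steps=150, step=0.2, seed=1):
    """Mean-squared displacement (um^2) of random walkers confined to a sphere.
    Steps that would exit the sphere are rejected (walker stays put)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_walkers):
        x = y = z = 0.0
        for _ in range(n_steps):
            # isotropic direction, fixed step length
            th = math.acos(2 * rng.random() - 1)
            ph = 2 * math.pi * rng.random()
            nx = x + step * math.sin(th) * math.cos(ph)
            ny = y + step * math.sin(th) * math.sin(ph)
            nz = z + step * math.cos(th)
            if math.sqrt(nx * nx + ny * ny + nz * nz) <= radius_um:
                x, y, z = nx, ny, nz
        total += x * x + y * y + z * z
    return total / n_walkers

print(round(msd(2.0), 2))    # restricted: MSD plateaus near the cavity size
print(round(msd(1e9), 2))    # effectively free diffusion
```

The framework in the record generalizes this with permeable membranes, compartment-specific diffusivities and T2 values, and mesh-based geometries, then maps full 3D displacement-probability functions rather than a single MSD.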

  4. Multiscale Modeling of Virus Entry via Receptor-Mediated Endocytosis

    NASA Astrophysics Data System (ADS)

    Liu, Jin

    2012-11-01

Virus infections are ubiquitous and remain major threats to human health worldwide. Viruses are intracellular parasites and must enter host cells to initiate infection. Receptor-mediated endocytosis is the most common entry pathway taken by viruses; the whole process is highly complex and dictated by various events, such as virus motions, membrane deformations, receptor diffusion and ligand-receptor reactions, occurring at multiple length and time scales. We develop a multiscale model for virus entry through receptor-mediated endocytosis. The binding of the virus to the cell surface is based on a mesoscale three-dimensional stochastic adhesion model; the internalization (endocytosis) of the virus and the cellular membrane deformation are based on the discretization of the Helfrich Hamiltonian in a curvilinear space using a Monte Carlo method. The multiscale model combines these two models. We will implement this model to study herpes simplex virus entry into B78 cells and compare the model predictions with experimental measurements.

  5. Microscopic Spin Model for the STOCK Market with Attractor Bubbling on Regular and Small-World Lattices

    NASA Astrophysics Data System (ADS)

    Krawiecki, A.

A multi-agent spin model for changes of prices in the stock market, based on an Ising-like cellular automaton with interactions between traders randomly varying in time, is investigated by means of Monte Carlo simulations. The structure of interactions has the topology of a small-world network obtained from regular two-dimensional square lattices with various coordination numbers by randomly cutting and rewiring edges. Simulations of the model on regular lattices do not yield time series of logarithmic price returns with statistical properties comparable to the empirical ones. In contrast, in the case of networks with a certain degree of randomness, for a wide range of parameters the time series of logarithmic price returns exhibit intermittent bursting typical of volatility clustering. The tails of the distributions of returns also obey a power scaling law with exponents comparable to those obtained from empirical data.
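A toy version of such a spin market model can be written in a few lines. This is not the paper's model (which lives on a rewired lattice): here every trader couples to the mean field through a coupling that varies randomly in time, spins are updated by the Metropolis rule, and the log return is taken proportional to the change in magnetization per sweep. All parameters are illustrative.

```python
import math, random

def simulate(n=100, steps=300, beta=1.5, coupling=1.0, seed=7):
    """Ising-like trader model with annealed (time-varying) random couplings.
    Returns the per-sweep 'log returns' (change in magnetization / n)."""
    rng = random.Random(seed)
    spins = [rng.choice((-1, 1)) for _ in range(n)]   # +1 buy, -1 sell
    mag = sum(spins)
    returns = []
    for _ in range(steps):
        j = coupling * rng.gauss(0.0, 1.0)            # interaction redrawn each sweep
        prev = mag
        for i in range(n):
            field = j * (mag - spins[i]) / n          # mean field from other traders
            de = 2.0 * spins[i] * field               # energy cost of flipping spin i
            if de <= 0 or rng.random() < math.exp(-beta * de):
                mag -= 2 * spins[i]                   # Metropolis accept: flip
                spins[i] = -spins[i]
        returns.append((mag - prev) / n)
    return returns

r = simulate()
print(max(abs(x) for x in r))    # largest single-sweep return
```

On a rewired small-world lattice, with couplings varying per edge rather than globally, time series from this class of model show the intermittent volatility bursts and power-law return tails the abstract describes.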

  6. DS-CDMA satellite diversity reception for personal satellite communication: Downlink performance analysis

    NASA Technical Reports Server (NTRS)

    DeGaudenzi, Riccardo; Giannetti, Filippo

    1995-01-01

The downlink of a satellite-mobile personal communication system employing power-controlled Direct Sequence Code Division Multiple Access (DS-CDMA) and exploiting satellite diversity is analyzed, and its performance is compared with that of a more traditional communication system utilizing single-satellite reception. The analytical model developed has been thoroughly validated by means of extensive Monte Carlo computer simulations. It is shown how the capacity gain provided by diversity reception shrinks considerably in the presence of increasing traffic or under light shadowing conditions. Moreover, the quantitative results indicate that, to combat the system capacity reduction due to intra-system interference, no more than two satellites should be active over the same region. To achieve higher system capacity, unlike in terrestrial cellular systems, Multi-User Detection (MUD) techniques are likely to be required in the mobile user terminal, considerably increasing its complexity.

  7. Towards the prediction of essential genes by integration of network topology, cellular localization and biological process information

    PubMed Central

    2009-01-01

Background: The identification of essential genes is important for understanding the minimal requirements for cellular life and for practical purposes, such as drug design. However, the experimental techniques for essential-gene discovery are labor-intensive and time-consuming. Considering these experimental constraints, a computational approach capable of accurately predicting essential genes would be of great value. We therefore present here a machine learning-based computational approach relying on network topological features, cellular localization and biological process information for the prediction of essential genes. Results: We constructed a decision tree-based meta-classifier and trained it on datasets with individual and grouped attributes (network topological features, cellular compartments and biological processes) to generate various predictors of essential genes. We showed that the predictors with better performance are those generated by datasets with integrated attributes. Using the predictor with all attributes, i.e., network topological features, cellular compartments and biological processes, we obtained the best predictor of essential genes, which was then used to classify yeast genes with unknown essentiality status. Finally, we generated decision trees by training the J48 algorithm on datasets with all network topological features, cellular localization and biological process information to discover cellular rules for essentiality. We found that the number of protein physical interactions, the nuclear localization of proteins and the number of regulating transcription factors are the most important factors determining gene essentiality. Conclusion: We were able to demonstrate that network topological features, cellular localization and biological process information are reliable predictors of essential genes. Moreover, by constructing decision trees based on these data, we could discover cellular rules governing essentiality. PMID:19758426

  8. Production, properties, and applications of hydrocolloid cellular solids.

    PubMed

    Nussinovitch, Amos

    2005-02-01

Many common synthetic and edible materials are, in fact, cellular solids. When classifying the structure of cellular solids, a few variables are taken into account, such as open vs. closed cells, flexible vs. brittle cell walls, cell-size distribution, cell-wall thickness, cell shape, the uniformity of the structure of the cellular solid, and the different scales of length. Compressive stress-strain relationships of most cellular solids can be easily identified by their characteristic sigmoid shape, reflecting three deformation mechanisms: (i) elastic distortion under small strains, (ii) collapse and/or fracture of the cell walls, and (iii) densification. Various techniques are used to produce hydrocolloid (gum) cellular solids. These include (i) sponges, obtained when the drying gel contains occasionally produced gas bubbles; (ii) sponges produced by the immobilization of microorganisms; (iii) solid foams produced by drying foamed solutions or gels containing oils; and (iv) hydrocolloid sponges produced by enzymatic reactions. The porosity of the manufactured cellular solid is subject to change and depends on its composition and the processing technique. Porosity is controlled by a range of methods, and the resulting surface structures can be investigated by microscopy and analyzed using fractal methods. Models used to describe the stress-strain behavior of hydrocolloid cellular solids, as well as multilayered products and composites, are discussed in detail in this manuscript. Hydrocolloid cellular solids serve numerous purposes, simple and complex, ranging from dried texturized fruits to carriers of vitamins and other essential micronutrients. 
They can also be used to control the acoustic response of specific dry food products, and have a great potential for future use in countless different fields, from novel foods and packaging to medicine and medical care, daily commodities, farming and agriculture, and the environmental, chemical, and even electronic industries.

  9. Role of cellular communication in the pathways of radiation-induced biological damage

    NASA Astrophysics Data System (ADS)

    Ballarini, Francesca; Facoetti, Angelica; Mariotti, Luca; Nano, Rosanna; Ottolenghi, Andrea

During the last decade, a large number of experimental studies on the so-called "non-targeted effects", in particular bystander effects, outlined that cellular communication plays a significant role in the pathways leading to radiation-induced biological damage. This might imply a paradigm shift in (low-dose) radiobiology, according to which one has to consider the response of groups of cells behaving like a population rather than single cells behaving as individuals. Furthermore, bystander effects, which are observed both for lethal endpoints (e.g. clonogenic inactivation and apoptosis) and for non-lethal ones (e.g. mutations and neoplastic transformation), tend to show non-linear dose responses characterized by a sharp increase followed by a plateau. This might have significant consequences in terms of low-dose risk, which is generally calculated on the basis of the "Linear No Threshold" hypothesis. Although it is known that two types of cellular communication (i.e. via gap junctions and/or molecular messengers diffusing in the extra-cellular environment, such as cytokines) play a major role, it is of utmost importance to better understand the underlying mechanisms, and how such mechanisms can be modulated by ionizing radiation. Though the "final" goal is to elucidate the in vivo scenario, in the meantime in vitro studies can also provide useful insights. In the present paper we will discuss key issues on the mechanisms underlying non-targeted effects and, more generally, cell communication, with a focus on candidate molecular signals. Theoretical models and simulation codes can help elucidate such mechanisms. In this framework, we will present a model and Monte Carlo code, under development at the University of Pavia, simulating the release, diffusion and internalization of candidate signals (typically cytokines) travelling in the extra-cellular environment, both by unirradiated (i.e., control) cells and by irradiated cells. 
The focus will be on the role of critical parameters such as the cell number and density, the amount of culture medium, etc. Comparisons with ad hoc experimental data obtained in our laboratory will be presented, and possible implications in terms of low-dose risk assessment will be discussed. Work supported by the European Community (projects "RISC-RAD" and "NOTE") and the Italian Space Agency (project "MoMa/COUNT").

  10. Modeling of coupled differential equations for cellular chemical signaling pathways: Implications for assay protocols utilized in cellular engineering.

    PubMed

    O'Clock, George D

    2016-08-01

Cellular engineering involves modification and control of cell properties, and requires an understanding of fundamentals and mechanisms of action for cellular-derived product development. One of the keys to success in cellular engineering is the quality and validity of results obtained from cell chemical signaling pathway assays. The accuracy of the assay data cannot be verified or assured if the effects of positive feedback, nonlinearities, and interrelationships between cell chemical signaling pathway elements are not understood, modeled, and simulated. Nonlinearities and positive feedback in the cell chemical signaling pathway can produce significant aberrations in assay data collection. Simulating the pathway can reveal potential instability problems that will affect assay results. A simulation, using an electrical analog for the coupled differential equations representing each segment of the pathway, provides an excellent tool for assay validation purposes. With this approach, voltages represent pathway enzyme concentrations, and operational amplifier feedback resistance and input resistance values determine pathway gain and rate constants. The understanding provided by pathway modeling and simulation is strategically important in order to establish experimental controls for assay protocol structure, time frames specified between assays, and assay concentration variation limits, and to ensure accuracy and reproducibility of results.
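The instability the abstract warns about can be shown with a tiny coupled system of the kind the electrical analog represents (voltages as enzyme concentrations, amplifier gains as rate constants). The equations and parameters below are hypothetical, not taken from the paper: a two-stage pathway with positive feedback, integrated by forward Euler; past a critical feedback gain the steady state is lost and the trajectory runs away.

```python
# Illustrative two-stage pathway with positive feedback:
#   dA/dt = k1*S + f*B - d*A
#   dB/dt = k2*A - d*B
# with constant stimulus S = 1. All rate constants are fabricated.

def simulate(k1=1.0, k2=0.8, feedback=0.3, decay=1.0, dt=0.01, t_end=20.0):
    """Forward-Euler integration; returns final (A, B)."""
    a = b = 0.0
    s = 1.0
    for _ in range(int(t_end / dt)):
        da = k1 * s + feedback * b - decay * a
        db = k2 * a - decay * b
        a += dt * da
        b += dt * db
    return a, b

print(simulate())               # modest feedback: settles to a steady state
print(simulate(feedback=2.0))   # strong positive feedback: runaway growth
```

The stable case converges to A = k1/(1 - f*k2/decay^2) (about 1.32 here); once f*k2 exceeds decay^2 an eigenvalue of the system crosses zero and concentrations grow without bound, which is exactly the kind of aberration a simulated pathway can flag before an assay is run.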

  11. [Features of PHITS and its application to medical physics].

    PubMed

    Hashimoto, Shintaro; Niita, Koji; Matsuda, Norihiro; Iwamoto, Yosuke; Iwase, Hiroshi; Sato, Tatsuhiko; Noda, Shusaku; Ogawa, Tatsuhiko; Nakashima, Hiroshi; Fukahori, Tokio; Furuta, Takuya; Chiba, Satoshi

    2013-01-01

PHITS is a general-purpose Monte Carlo particle transport simulation code used to analyze the transport in three-dimensional phase space and the collisions of nearly all particles, including heavy ions, over a wide energy range up to 100 GeV/u. Various quantities, such as particle fluence and deposited energies in materials, can be deduced using estimator functions called "tallies". Recently, a microdosimetric tally function was also developed to apply PHITS to medical physics. Owing to these features, PHITS has been used for medical applications such as radiation therapy and protection.

  12. Experimental approach to measure thick target neutron yields induced by heavy ions for shielding

    NASA Astrophysics Data System (ADS)

    Trinh, N. D.; Fadil, M.; Lewitowicz, M.; Brouillard, C.; Clerc, T.; Damoy, S.; Desmezières, V.; Dessay, E.; Dupuis, M.; Grinyer, G. F.; Grinyer, J.; Jacquot, B.; Ledoux, X.; Madeline, A.; Menard, N.; Michel, M.; Morel, V.; Porée, F.; Rannou, B.; Savalle, A.

    2017-09-01

    Double-differential (angular and energy) neutron distributions were measured using an activation foil technique. Reactions were induced by impinging two low-energy heavy-ion beams, 36S (12 MeV/u) and 208Pb (6.25 MeV/u), accelerated with the GANIL CSS1 cyclotron, onto thick natCu targets. The results have been compared to Monte Carlo calculations from two codes (PHITS and FLUKA) for the purpose of benchmarking radiation protection and shielding requirements. The comparison suggests a disagreement between calculations and experiment, particularly for high-energy neutrons.

  13. Tectonic evolution of Western Ishtar Terra, Venus

    NASA Astrophysics Data System (ADS)

    Marinangeli, Lucia

    1997-03-01

    A detailed geological mapping based on Magellan data has been carried out in Western Ishtar Terra from 300-330 deg W to 65-75 deg N. The study area comprises three main physiographic provinces: Atropos Tessera, Akna Montes, and northwestern Lakshmi Planum. The purposes of this study are (1) to recognize the tectonism of this area and investigate its type, direction, intensity, distribution, and age relationships, and (2) to define the link between the formation of the Akna mountain belt and the tectonic deformation in the adjacent Tessera and Lakshmi Planum.

  14. Photons Revisited

    NASA Astrophysics Data System (ADS)

    Batic, Matej; Begalli, Marcia; Han, Min Cheol; Hauf, Steffen; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Han Sung; Grazia Pia, Maria; Saracco, Paolo; Weidenspointner, Georg

    2014-06-01

    A systematic review of methods and data for the Monte Carlo simulation of photon interactions is in progress: it concerns a wide set of theoretical modeling approaches and data libraries available for this purpose. Models and data libraries are assessed quantitatively with respect to an extensive collection of experimental measurements documented in the literature to determine their accuracy; this evaluation exploits rigorous statistical analysis methods. The computational performance of the associated modeling algorithms is evaluated as well. An overview of the assessment of photon interaction models and results of the experimental validation are presented.
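
    The quantitative assessment described here, comparing simulated values against experimental measurements with rigorous statistics, can be illustrated with a simple chi-squared comparison (all numbers below are invented for illustration; the review's actual data and methods are far more extensive):

```python
# Chi-squared comparison of a simulated photon-interaction quantity with
# experimental measurements, in the spirit of the statistical validation
# described above. All values are invented for illustration.

def chi_squared(simulated, measured, sigma):
    """Sum of squared, uncertainty-normalized residuals."""
    return sum(((s - m) / e) ** 2 for s, m, e in zip(simulated, measured, sigma))

# Hypothetical measured cross sections (barns) with 1-sigma uncertainties.
measured = [10.2, 7.9, 5.1, 3.3, 2.0]
sigma    = [0.30, 0.25, 0.20, 0.15, 0.10]
model_a  = [10.0, 8.0, 5.0, 3.4, 2.1]    # agrees within uncertainties
model_b  = [11.5, 9.0, 6.0, 4.0, 2.6]    # systematically high

dof = len(measured)                       # no parameters fitted here
reduced_a = chi_squared(model_a, measured, sigma) / dof
reduced_b = chi_squared(model_b, measured, sigma) / dof
# A reduced chi-squared near 1 indicates statistical compatibility;
# model_b's large value flags a significant discrepancy.
```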

  15. The X-43A Six Degree of Freedom Monte Carlo Analysis

    NASA Technical Reports Server (NTRS)

    Baumann, Ethan; Bahm, Catherine; Strovers, Brian; Beck, Roger

    2008-01-01

    This report provides an overview of the Hyper-X research vehicle Monte Carlo analysis conducted with the six-degree-of-freedom simulation. The methodology and model uncertainties used for the Monte Carlo analysis are presented as permitted. In addition, the process used to select hardware validation test cases from the Monte Carlo data is described. The preflight Monte Carlo analysis indicated that the X-43A control system was robust to the preflight uncertainties and provided the Hyper-X project an important indication that the vehicle would likely be successful in accomplishing the mission objectives. The X-43A in-flight performance is compared to the preflight Monte Carlo predictions and shown to exceed the Monte Carlo bounds in several instances. Possible modeling shortfalls are presented that may account for these discrepancies. The flight control laws and guidance algorithms were, as a result of the preflight Monte Carlo analysis, robust enough that the unexpected in-flight performance did not have undue consequences. Modeling and Monte Carlo analysis lessons learned are presented.

  16. The X-43A Six Degree of Freedom Monte Carlo Analysis

    NASA Technical Reports Server (NTRS)

    Baumann, Ethan; Bahm, Catherine; Strovers, Brian; Beck, Roger; Richard, Michael

    2007-01-01

    This report provides an overview of the Hyper-X research vehicle Monte Carlo analysis conducted with the six-degree-of-freedom simulation. The methodology and model uncertainties used for the Monte Carlo analysis are presented as permitted. In addition, the process used to select hardware validation test cases from the Monte Carlo data is described. The preflight Monte Carlo analysis indicated that the X-43A control system was robust to the preflight uncertainties and provided the Hyper-X project an important indication that the vehicle would likely be successful in accomplishing the mission objectives. The X-43A in-flight performance is compared to the preflight Monte Carlo predictions and shown to exceed the Monte Carlo bounds in several instances. Possible modeling shortfalls are presented that may account for these discrepancies. The flight control laws and guidance algorithms were robust enough as a result of the preflight Monte Carlo analysis that the unexpected in-flight performance did not have undue consequences. Modeling and Monte Carlo analysis lessons learned are presented.

  17. Architecture, religion, and tuberculosis in Sainte-Agathe-des-Monts, Quebec.

    PubMed

    Adams, Annmarie; Poutanen, Mary Anne

    2009-01-01

    This paper explores the architecture of the Mount Sinai Sanatorium in Sainte-Agathe-des-Monts (Qc) to disentangle the role of religion in the treatment of tuberculosis. In particular, we analyze the design of Mount Sinai, the jewel in the crown of Jewish philanthropy in Montreal, in relation to that of the nearby Laurentian Sanatorium. While Mount Sinai offered free treatment to the poor in a stunning, Art Deco building of 1930, the Protestant hospital had by then served paying patients for more than two decades in a purposefully home-like, Tudor-revival setting. Using architectural historian Bernard Herman's concept of embedded landscapes, we show how the two hospitals differed in terms of their relationship to site, access, and, most importantly, to city, knowledge, and community. Architects Scopes & Feustmann, who designed the Laurentian hospital, operated an office at Saranac Lake, New York, America's premier destination for consumptives. The qualifications of Mount Sinai architects Spence & Goodman, however, derived from their experience with Jewish institutions in Montreal. Following Herman's approach to architecture through movement and context, how did notions of medical therapy and Judaism intersect in the plans of Mount Sinai?

  18. Commissioning and initial acceptance tests for a commercial convolution dose calculation algorithm for radiotherapy treatment planning in comparison with Monte Carlo simulation and measurement

    PubMed Central

    Moradi, Farhad; Mahdavi, Seyed Rabi; Mostaar, Ahmad; Motamedi, Mohsen

    2012-01-01

    In this study, commissioning of a dose calculation algorithm in a currently used treatment planning system (TPS) was performed, and the calculation accuracy in tissue heterogeneities of the two methods available in the TPS, collapsed cone convolution (CCC) and equivalent tissue-air ratio (ETAR), was verified. For this purpose an inhomogeneous phantom (IMRT thorax phantom) was used, and dose curves obtained with the TPS were compared with experimental measurements and Monte Carlo (MCNP code) simulation. Dose measurements were performed using EDR2 radiographic films within the phantom, and the dose difference (DD) between the experimental results and the two calculation methods was obtained. The results indicate maximum differences between the two methods of 12% in the lung and 3% in the bone tissue of the phantom, with the CCC algorithm showing more accurate depth dose curves in tissue heterogeneities. The simulation results show accurate dose estimation by MCNP4C in the soft tissue region of the phantom and better agreement than the ETAR method in bone and lung tissues. PMID:22973081

  19. Monte Carlo calculation for the development of a BNCT neutron source (1 eV-10 keV) using MCNP code.

    PubMed

    El Moussaoui, F; El Bardouni, T; Azahra, M; Kamili, A; Boukhal, H

    2008-09-01

    Different materials have been studied in order to produce an epithermal neutron beam between 1 eV and 10 keV, which is extensively used to irradiate patients with brain tumors such as GBM. For this purpose, we studied three different neutron moderators (H2O, D2O, and BeO) and their combinations, four reflectors (Al2O3, C, Bi, and Pb), and two filters (Cd and Bi). The calculations showed that the best assembly configuration, corresponding to the combination of the three moderators H2O, BeO, and D2O together with an Al2O3 reflector and the two filters Cd+Bi, maximizes the epithermal fraction of the neutron spectrum at 72% and minimizes the thermal fraction to 4%, and thus can be used to treat deep brain tumors. The calculations were performed by means of the Monte Carlo N-Particle code (MCNP 5C). Our results strongly encourage further study of irradiation of the head with epithermal neutron fields.

  20. Building proteins from C alpha coordinates using the dihedral probability grid Monte Carlo method.

    PubMed Central

    Mathiowetz, A. M.; Goddard, W. A.

    1995-01-01

    Dihedral probability grid Monte Carlo (DPG-MC) is a general-purpose method of conformational sampling that can be applied to many problems in peptide and protein modeling. Here we present the DPG-MC method and apply it to predicting complete protein structures from C alpha coordinates. This is useful in such endeavors as homology modeling, protein structure prediction from lattice simulations, or fitting protein structures to X-ray crystallographic data. It also serves as an example of how DPG-MC can be applied to systems with geometric constraints. The conformational propensities for individual residues are used to guide conformational searches as the protein is built from the amino-terminus to the carboxyl-terminus. Results for a number of proteins show that both the backbone and side chain can be accurately modeled using DPG-MC. Backbone atoms are generally predicted with RMS errors of about 0.5 A (compared to X-ray crystal structure coordinates) and all atoms are predicted to an RMS error of 1.7 A or better. PMID:7549885
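
    As a rough illustration of the sampling idea (not the published method, force field, or propensity tables), a DPG-MC-style move draws a dihedral from a discrete probability grid and accepts it with a Metropolis test corrected for the non-uniform proposal:

```python
# Toy sketch of dihedral probability grid Monte Carlo (DPG-MC): each move
# replaces one dihedral with a value drawn from a discrete probability
# grid and applies a Metropolis test. The energy function and grid
# weights are illustrative stand-ins, not published propensities.
import math
import random

random.seed(1)

GRID = [i * 10.0 for i in range(36)]   # dihedral grid, 10-degree spacing

def circ_dist(a, b):
    """Smallest angular separation between two angles in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

# Hypothetical residue propensities peaked near 300 deg (about -60 deg).
WEIGHTS = [math.exp(-circ_dist(a, 300.0) ** 2 / 5000.0) for a in GRID]

def energy(dihedrals, target=300.0):
    """Toy energy: penalty for deviation from a preferred dihedral."""
    return sum(1.0 - math.cos(math.radians(d - target)) for d in dihedrals)

def dpg_mc(n_res=10, steps=2000, kT=0.5):
    conf = [random.choice(GRID) for _ in range(n_res)]
    e = energy(conf)
    for _ in range(steps):
        i = random.randrange(n_res)
        new = random.choices(GRID, weights=WEIGHTS)[0]
        trial = conf[:]
        trial[i] = new
        e_new = energy(trial)
        # Metropolis test, corrected for the non-uniform (independence)
        # proposal so that detailed balance holds on exp(-E/kT).
        q_ratio = WEIGHTS[GRID.index(conf[i])] / WEIGHTS[GRID.index(new)]
        if random.random() < min(1.0, math.exp(-(e_new - e) / kT) * q_ratio):
            conf, e = trial, e_new
    return conf, e

conf, final_e = dpg_mc()
```

    In the real method the grid probabilities come from residue-specific conformational propensities, and the energy is a molecular force field evaluated under the C-alpha geometric constraints.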

  1. Experimental validation of a direct simulation by Monte Carlo molecular gas flow model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shufflebotham, P.K.; Bartel, T.J.; Berney, B.

    1995-07-01

    The Sandia direct simulation Monte Carlo (DSMC) molecular/transition gas flow simulation code has significant potential as a computer-aided design tool for the design of vacuum systems in low pressure plasma processing equipment. The purpose of this work was to verify the accuracy of this code through direct comparison to experiment. To test the DSMC model, a fully instrumented, axisymmetric vacuum test cell was constructed, and spatially resolved pressure measurements were made in N2 at flows from 50 to 500 sccm. In a "blind" test, the DSMC code was used to model the experimental conditions directly, and the results were compared to the measurements. It was found that the model predicted all the experimental findings to a high degree of accuracy. Only one modeling issue was uncovered: the axisymmetric model showed localized low pressure spots along the axis next to surfaces. Although this artifact did not significantly alter the accuracy of the results, it did add noise to the axial data.

  2. Particle in cell/Monte Carlo collision analysis of the problem of identification of impurities in the gas by the plasma electron spectroscopy method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kusoglu Sarikaya, C.; Rafatov, I., E-mail: rafatov@metu.edu.tr; Kudryavtsev, A. A.

    2016-06-15

    The work deals with the Particle in Cell/Monte Carlo Collision (PIC/MCC) analysis of the problem of detection and identification of impurities in the nonlocal plasma of a gas discharge using the Plasma Electron Spectroscopy (PLES) method. For this purpose, a 1d3v PIC/MCC code for numerical simulation of a glow discharge with a nonlocal electron energy distribution function (EEDF) is developed. Elastic, excitation, and ionization collisions between electron-neutral pairs, isotropic scattering and charge exchange collisions between ion-neutral pairs, and Penning ionizations are taken into account. Applicability of the numerical code is verified under radio-frequency capacitively coupled discharge conditions. The efficiency of the code is increased by parallelization using the Open Message Passing Interface. As a demonstration of the PLES method, the parallel PIC/MCC code is applied to a direct current glow discharge in helium doped with a small amount of argon. Numerical results are consistent with the theoretical analysis of the formation of the nonlocal EEDF and with existing experimental data.

  3. Monte Carlo simulation of photon buildup factors for shielding materials in diagnostic x-ray facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kharrati, Hedi; Agrebi, Amel; Karoui, Mohamed Karim

    2012-10-15

    Purpose: A simulation of buildup factors for ordinary concrete, steel, lead, plate glass, lead glass, and gypsum wallboard in broad beam geometry, for photon energies from 10 keV to 150 keV at 5 keV intervals, is presented. Methods: The Monte Carlo N-particle radiation transport computer code was used to determine the buildup factors for the studied shielding materials. Results: An example is given illustrating the use of the obtained buildup factor data in computing broad beam transmission for tube potentials of 70, 100, 120, and 140 kVp. The half value layer, the tenth value layer, and the equilibrium tenth value layer are calculated from the broad beam transmission for these tube potentials. Conclusions: The obtained values, compared with those calculated from published data, show the ability of these data to predict shielding transmission curves. The buildup factor data can therefore be combined with primary, scatter, and leakage x-ray spectra to provide a computationally based solution to broad beam transmission for barriers in shielding x-ray facilities.
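
    Broad beam transmission computed from buildup factors follows T(x) = B(mu*x) * exp(-mu*x), and the half value layer is the thickness at which T falls to 0.5. A sketch with an invented buildup table (the paper's tabulated values would replace it):

```python
# Broad-beam transmission through a shield: T(x) = B(mu*x) * exp(-mu*x),
# where B is the buildup factor. B here is a piecewise-linear
# interpolation over illustrative (mu*x, B) points, not the paper's data.
import math

MU = 0.5  # linear attenuation coefficient, 1/cm (illustrative)
# Hypothetical buildup factor vs. number of mean free paths (mu*x).
MFP_PTS = [0.0, 1.0, 2.0, 4.0, 8.0]
B_PTS   = [1.0, 1.8, 2.6, 4.0, 6.5]

def buildup(mfp):
    """Piecewise-linear interpolation of the buildup factor."""
    if mfp <= MFP_PTS[0]:
        return B_PTS[0]
    for (x0, b0), (x1, b1) in zip(zip(MFP_PTS, B_PTS),
                                  zip(MFP_PTS[1:], B_PTS[1:])):
        if mfp <= x1:
            return b0 + (b1 - b0) * (mfp - x0) / (x1 - x0)
    return B_PTS[-1]

def transmission(x, mu=MU, with_buildup=True):
    b = buildup(mu * x) if with_buildup else 1.0
    return b * math.exp(-mu * x)

def hvl(mu=MU, with_buildup=True):
    """Thickness at which transmission drops to 0.5, by bisection."""
    lo, hi = 0.0, 200.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if transmission(mid, mu, with_buildup) > 0.5:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

    Because scatter buildup adds transmitted photons, the broad-beam HVL from this model exceeds the narrow-beam value ln(2)/mu.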

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koren, S; Bragilovski, D; Tafo, A Guemnie

    Purpose: To evaluate the clinical feasibility of the IntraBeam intraoperative kV irradiation device for ocular conjunctiva treatments. The IntraBeam system offers a 4.4 mm diameter needle applicator, which is not suitable for treatment of a large surface with limited access. We propose an adapter that answers this clinical need and provide initial dosimetry. Methods: The dose distribution of the needle applicator is nonuniform and hence not suitable for treatment of relatively large surfaces. We designed an adapter to the needle applicator that filters the x-rays and produces a conformal dose distribution over the treatment area while shielding surfaces to be spared. Dose distributions were simulated using FLUKA, a fully integrated particle physics Monte Carlo simulation package. Results: We designed a wedge applicator made of a Polythermide window with stainless steel for collimation. We compare its dose distribution to those of the known needle and surface applicators. Conclusion: Initial dosimetry shows the feasibility of this approach. While further refinements to the design may be warranted, the results support construction of a prototype and confirmation of the Monte Carlo dosimetry with measured data.

  5. MCNP Version 6.2 Release Notes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Werner, Christopher John; Bull, Jeffrey S.; Solomon, C. J.

    Monte Carlo N-Particle, or MCNP®, is a general-purpose Monte Carlo radiation-transport code designed to track many particle types over broad ranges of energies. MCNP Version 6.2 follows the MCNP6.1.1 beta version and has been released in order to provide the radiation transport community with the latest feature developments and bug fixes for MCNP. Since the last release of MCNP, major work has been conducted to improve the code base, add features, and provide tools that facilitate ease of use of MCNP version 6.2 as well as the analysis of results. These release notes serve as a general guide for the new and improved physics, sources, data, tallies, unstructured mesh, code enhancements, and tools. For more detailed information on each of the topics, please refer to the appropriate references or the user manual, which can be found at http://mcnp.lanl.gov. This release of MCNP version 6.2 contains 39 new features in addition to 172 bug fixes and code enhancements. There are still some 33 known issues with which users should familiarize themselves (see Appendix).

  6. Determination of the structural properties of the aqueous electrolyte LiCl6H 2 O at the supercooled state using the Reverse Monte Carlo (RMC) simulation

    NASA Astrophysics Data System (ADS)

    ZIANE, M.; HABCHI, M.; DEROUICHE, A.; MESLI, S. M.; BENZOUINE, F.; KOTBI, M.

    2017-03-01

    A structural study is presented of an aqueous electrolyte for which experimental results are available: a solution of LiCl·6H2O type at the supercooled state (162 K), contrasted with pure water at room temperature, by means of Partial Distribution Functions (PDF) derived from the neutron scattering technique. The aqueous electrolyte solution of lithium chloride (LiCl) presents interesting properties and has been studied by different methods at different concentrations and thermodynamical states; this system possesses the property of becoming a glass through a metastable supercooled state as the temperature decreases. Based on these partial functions, the Reverse Monte Carlo (RMC) method computes radial correlation functions that allow a number of structural features of the system to be explored. The purpose of the RMC is to produce a configuration consistent with the experimental data, usually to within the limits of the systematic errors (of unknown distribution).
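
    The RMC idea, perturbing a configuration and keeping moves that improve agreement with experimentally derived distribution functions, can be sketched in one dimension (ring geometry and invented target data; real RMC works in 3D against measured partial distribution functions):

```python
# Minimal Reverse Monte Carlo (RMC) sketch in one dimension: particle
# positions on a ring are perturbed at random, and a move is kept when it
# improves (or only mildly worsens, via a Metropolis-like test on the
# chi-squared) the match between the configuration's pair-distance
# histogram and a target histogram standing in for experimental PDF data.
import math
import random

random.seed(2)
L, N, NBINS = 10.0, 30, 20

def pair_histogram(xs):
    h = [0] * NBINS
    for i in range(len(xs)):
        for j in range(i + 1, len(xs)):
            d = abs(xs[i] - xs[j])
            d = min(d, L - d)                  # minimum image on the ring
            h[min(int(d / (0.5 * L) * NBINS), NBINS - 1)] += 1
    return h

def chi2(h, target):
    return sum((a - b) ** 2 for a, b in zip(h, target))

# "Experimental" target: the histogram of an evenly spaced structure.
target = pair_histogram([i * L / N for i in range(N)])

xs = [random.uniform(0, L) for _ in range(N)]  # disordered start
cost = chi2(pair_histogram(xs), target)
start_cost = best_cost = cost
for _ in range(3000):
    i = random.randrange(N)
    old = xs[i]
    xs[i] = (old + random.gauss(0.0, 0.3)) % L
    new_cost = chi2(pair_histogram(xs), target)
    if new_cost <= cost or random.random() < math.exp(-(new_cost - cost) / 2.0):
        cost = new_cost
        best_cost = min(best_cost, cost)
    else:
        xs[i] = old                            # reject: restore the position
```

    The run typically drives the chi-squared mismatch down sharply from its disordered starting value, yielding a configuration consistent with the target data.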

  7. Dynamics of relaxation to a stationary state for interacting molecular motors

    NASA Astrophysics Data System (ADS)

    Gomes, Luiza V. F.; Kolomeisky, Anatoly B.

    2018-01-01

    Motor proteins are active enzymatic molecules that drive a variety of biological processes, including transfer of genetic information, cellular transport, cell motility and muscle contraction. It is known that these biological molecular motors usually perform their cellular tasks by acting collectively, and there are interactions between individual motors that specify the overall collective behavior. One of the fundamental issues related to the collective dynamics of motor proteins is the question of whether they function at stationary-state conditions. To investigate this problem, we analyze a relaxation to the stationary state for the system of interacting molecular motors. Our approach utilizes a recently developed theoretical framework, which views the collective dynamics of motor proteins as a totally asymmetric simple exclusion process of interacting particles, where interactions are taken into account via a thermodynamically consistent approach. The dynamics of relaxation to the stationary state is analyzed using a domain-wall method that relies on a mean-field description, which takes into account some correlations. It is found that the system quickly relaxes for repulsive interactions, while attractive interactions always slow down reaching the stationary state. It is also predicted that for some range of parameters the fastest relaxation might be achieved for a weak repulsive interaction. Our theoretical predictions are tested with Monte Carlo computer simulations.
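
    A minimal sketch of the interacting-TASEP picture (a simplified discrete-time scheme with illustrative rates, not the authors' thermodynamically consistent framework in full):

```python
# Monte Carlo sketch of a TASEP with nearest-neighbour interactions on a
# ring. A particle hops one site to the right into an empty site; the
# acceptance probability depends on how the hop changes the number of
# particle-particle bonds, p = min(1, exp(-E * dbonds / 2)), where E > 0
# is repulsive and E < 0 attractive. All values are illustrative.
import math
import random

random.seed(3)

def sweep(lattice, E):
    """One random-sequential update sweep; returns the number of hops."""
    n = len(lattice)
    hops = 0
    for _ in range(n):
        i = random.randrange(n)
        j = (i + 1) % n
        if lattice[i] == 1 and lattice[j] == 0:
            bonds_before = lattice[(i - 1) % n]   # neighbour behind i
            bonds_after = lattice[(j + 1) % n]    # neighbour ahead of j
            p = min(1.0, math.exp(-E * (bonds_after - bonds_before) / 2.0))
            if random.random() < p:
                lattice[i], lattice[j] = 0, 1
                hops += 1
    return hops

def simulate(n=100, density=0.5, E=1.0, sweeps=500):
    lattice = [1] * int(n * density) + [0] * (n - int(n * density))
    random.shuffle(lattice)
    # Track the particle current (hops per site per sweep) as the system
    # relaxes toward its stationary state.
    current = [sweep(lattice, E) / n for _ in range(sweeps)]
    return lattice, current

lattice, current = simulate()
```

    Watching how quickly the current time series settles, for repulsive versus attractive E, is the discrete analogue of the relaxation question studied in the paper.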

  8. A model describing diffusion in prostate cancer.

    PubMed

    Gilani, Nima; Malcolm, Paul; Johnson, Glyn

    2017-07-01

    Quantitative diffusion MRI has frequently been studied as a means of grading prostate cancer. Interpretation of results is complicated by the nature of prostate tissue, which consists of four distinct compartments: vascular, ductal lumen, epithelium, and stroma. Current diffusion measurements are an ill-defined weighted average of these compartments. In this study, prostate diffusion is analyzed in terms of a model that takes explicit account of tissue compartmentalization, exchange effects, and the non-Gaussian behavior of tissue diffusion. The model assumes that exchange between the cellular (i.e., stromal plus epithelial) and the vascular and ductal compartments is slow. Ductal and cellular diffusion characteristics are estimated by Monte Carlo simulation and a two-compartment exchange model, respectively. Vascular pseudodiffusion is represented by an additional signal at b = 0. Most model parameters are obtained either from published data or by comparing model predictions with published results from 41 studies. Model prediction error is estimated using 10-fold cross-validation. Agreement between model predictions and published results is good. The model satisfactorily explains the variability of ADC estimates found in the literature. A reliable model that predicts the diffusion behavior of benign and cancerous prostate tissue of different Gleason scores has been developed. Magn Reson Med 78:316-326, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
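
    In the slow-exchange limit the model's signal is a volume-fraction-weighted sum of compartment signals, with vascular pseudodiffusion contributing extra signal only at b = 0. A sketch with illustrative fractions and diffusivities (not the study's fitted values, and omitting the exchange and non-Gaussian terms):

```python
# Sketch of a compartmentalized prostate diffusion signal in the
# slow-exchange limit: the measured signal is a volume-fraction-weighted
# sum of monoexponential compartment signals, with vascular
# pseudodiffusion contributing extra signal only at b = 0. Fractions and
# diffusivities are illustrative assumptions, not the study's values.
import math

def signal(b, f_vasc=0.05, f_duct=0.20, d_duct=2.2e-3,
           f_cell=0.75, d_cell=0.8e-3):
    """Normalized signal at b-value b (s/mm^2); diffusivities in mm^2/s."""
    vascular = f_vasc if b == 0 else 0.0   # pseudodiffusion: gone for b > 0
    return (vascular
            + f_duct * math.exp(-b * d_duct)
            + f_cell * math.exp(-b * d_cell))

def adc(b1=0.0, b2=800.0):
    """Apparent diffusion coefficient from a two-point log fit."""
    return math.log(signal(b1) / signal(b2)) / (b2 - b1)

# The measured ADC is a weighted blend of the compartments, which is why
# single-number ADC estimates vary with protocol and tissue composition.
blended_adc = adc()
```

    Changing the epithelial/stromal fraction shifts the blended ADC, illustrating why a compartment-aware model explains the ADC variability reported across studies.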

  9. [Risk-oriented model of the control of the level of electromagnetic fields of base stations of cellular communications].

    PubMed

    Lutsenko, L A; Tulakin, A V; Egorova, A M; Mikhailova, O M; Gvozdeva, L L; Chigryay, E K

    The purpose of this study was to describe the harmful effects of electromagnetic radiation from base stations of cellular communication, the most common sources of radio-frequency electromagnetic fields in the environment. The highest values of the energy flux density, more than 10 pW/cm, were measured on the roofs of houses where antennas are installed. The lowest values, 0.1-1 pW/cm, were recorded inside premises. A cumulative effect was observed where base stations of cellular communication were located close to the railway station. New safe hygienic approaches to monitoring the safe operation of base stations, as well as protective measures, are proposed.

  10. GNSS receiver use-case development GPS-ABC workshop VI RTCA Washington, DC March 30, 2017.

    DOT National Transportation Integrated Search

    2017-03-30

    The purpose of this workshop was to discuss the results from testing of various categories of GPS/Global Navigation Satellite System (GNSS) receivers to include aviation (non-certified), cellular, general location/navigation, high precision and netwo...

  11. Analysis of the track- and dose-averaged LET and LET spectra in proton therapy using the geant4 Monte Carlo code

    PubMed Central

    Guan, Fada; Peeler, Christopher; Bronk, Lawrence; Geng, Changran; Taleei, Reza; Randeniya, Sharmalee; Ge, Shuaiping; Mirkovic, Dragan; Grosshans, David; Mohan, Radhe; Titt, Uwe

    2015-01-01

    Purpose: The motivation of this study was to find and eliminate the cause of errors in dose-averaged linear energy transfer (LET) calculations from therapeutic protons in small targets, such as biological cell layers, calculated using the Geant4 Monte Carlo code. A further purpose was to provide a recommendation for selecting an appropriate LET quantity from Geant4 simulations to correlate with the biological effectiveness of therapeutic protons. Methods: The authors developed a particle tracking step based strategy to calculate the average LET quantities (track-averaged LET, LETt, and dose-averaged LET, LETd) using Geant4 for different tracking step size limits. A step size limit refers to the maximum allowable tracking step length. The authors investigated how the tracking step size limit influenced the calculated LETt and LETd of protons with six different step limits ranging from 1 to 500 μm in a water phantom irradiated by a 79.7-MeV clinical proton beam. In addition, the authors analyzed the detailed stochastic energy deposition information, including fluence spectra and dose spectra of the energy deposition per step of protons. As a reference, the authors also calculated the averaged LET and analyzed the LET spectra by combining the Monte Carlo method and the deterministic method. Relative biological effectiveness (RBE) calculations were performed to illustrate the impact of different LET calculation methods on the RBE-weighted dose. Results: Simulation results showed that the step limit effect was small for LETt but significant for LETd. This resulted from differences in the energy deposition per step between the fluence spectra and dose spectra at different depths in the phantom. Using the Monte Carlo particle tracking method in Geant4 can produce incorrect LETd calculation results in the dose plateau region for small step limits. The erroneous LETd results can be attributed to the algorithm that determines fluctuations in energy deposition along the tracking step in Geant4. The incorrect LETd values lead to substantial differences in the calculated RBE. Conclusions: When the Geant4 particle tracking method is used to calculate the average LET values within targets with a small step limit, such as smaller than 500 μm, the authors recommend the use of LETt in the dose plateau region and LETd around the Bragg peak. For a large step limit, i.e., 500 μm, LETd is recommended along the whole Bragg curve. The transition point depends on beam parameters and can be found by determining the location where the gradient of the ratio of LETd and LETt becomes positive. PMID:26520716
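
    The LETt/LETd distinction comes down to the weighting applied to per-step LET values. A minimal sketch with made-up step data:

```python
# Track-averaged vs dose-averaged LET from per-step Monte Carlo output.
# Each step deposits energy eps (keV) over length l (um); the step LET is
# eps / l. LETt weights step LETs by step length (fluence-like), while
# LETd weights them by deposited energy, emphasizing high-LET steps.
# The step data below are made up for illustration, not Geant4 output.

def let_track(steps):
    """LETt = sum(eps) / sum(l), the length-weighted mean of eps/l."""
    return sum(e for e, _ in steps) / sum(l for _, l in steps)

def let_dose(steps):
    """LETd = sum(eps * (eps/l)) / sum(eps), the energy-weighted mean."""
    return sum(e * (e / l) for e, l in steps) / sum(e for e, _ in steps)

# (energy deposited in keV, step length in um)
steps = [(1.0, 1.0), (1.0, 1.0), (8.0, 1.0)]

lt = let_track(steps)   # (1 + 1 + 8) / 3  = 3.33... keV/um
ld = let_dose(steps)    # (1 + 1 + 64) / 10 = 6.6 keV/um
```

    A single high-deposition step dominates LETd but barely moves LETt, which is why LETd is far more sensitive to step-size-dependent fluctuations in the per-step energy deposition.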

  12. TU-EF-304-07: Monte Carlo-Based Inverse Treatment Plan Optimization for Intensity Modulated Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Y; UT Southwestern Medical Center, Dallas, TX; Tian, Z

    2015-06-15

    Purpose: Intensity-modulated proton therapy (IMPT) is increasingly used in proton therapy. For IMPT optimization, Monte Carlo (MC) is desired for spot dose calculations because of its high accuracy, especially in cases with a high level of heterogeneity. It is also preferred in biological optimization problems because of its capability of computing quantities related to biological effects. However, MC simulation is typically too slow to be used for this purpose. Although GPU-based MC engines have become available, the achieved efficiency is still not ideal. The purpose of this work is to develop a new optimization scheme to include GPU-based MC in IMPT. Methods: A conventional approach using MC in IMPT simply calls the MC dose engine repeatedly for each spot dose calculation. However, this is not optimal, because of the unnecessary computations on spots that turn out to have very small weights after solving the optimization problem. GPU memory writing conflicts occurring at small beam sizes also reduce computational efficiency. To solve these problems, we developed a new framework that iteratively performs MC dose calculations and plan optimizations. At each dose calculation step, the particles were sampled from different spots altogether with a Metropolis algorithm, such that the particle number is proportional to the latest optimized spot intensity. Simultaneously transporting particles from multiple spots also mitigated the memory writing conflict problem. Results: We have validated the proposed MC-based optimization scheme in one prostate case. The total computation time of our method was ∼5-6 min on one NVIDIA GPU card, including both spot dose calculation and plan optimization, whereas a conventional method naively using the same GPU-based MC engine was ∼3 times slower. Conclusion: A fast GPU-based MC dose calculation method along with a novel optimization workflow has been developed. The high efficiency makes it attractive for clinical use.
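
    The iterative scheme, reallocating Monte Carlo histories in proportion to the latest optimized spot intensities, can be sketched with a toy dose-influence matrix and projected gradient descent (all values illustrative; the authors' GPU engine and Metropolis spot sampling are not reproduced here):

```python
# Toy sketch of the iterate-between-MC-and-optimization idea: spot weights
# are optimized against a small, fixed dose-influence matrix by projected
# gradient descent, and after each update the Monte Carlo history budget
# is reallocated in proportion to the new spot intensities, so little
# simulation effort goes to spots whose weights shrink toward zero.
# The matrix, prescription, and step size are illustrative assumptions.

# Dose-influence matrix: dose to voxel v per unit weight of spot s.
D = [[1.0, 0.2],
     [0.3, 1.0],
     [0.5, 0.5]]
d_presc = [1.0, 1.0, 0.8]                  # prescribed voxel doses

def objective(w):
    """Quadratic plan objective ||D w - d_presc||^2."""
    return sum((sum(D[v][s] * w[s] for s in range(len(w))) - d_presc[v]) ** 2
               for v in range(len(d_presc)))

def grad(w):
    r = [sum(D[v][s] * w[s] for s in range(len(w))) - d_presc[v]
         for v in range(len(d_presc))]
    return [2.0 * sum(D[v][s] * r[v] for v in range(len(d_presc)))
            for s in range(len(w))]

def allocate_histories(w, total=100000, floor=100):
    """Histories per spot, proportional to the current intensities."""
    wsum = sum(w) or 1.0
    return [max(floor, int(total * wi / wsum)) for wi in w]

w = [0.0, 0.0]
obj_start = objective(w)
for _ in range(200):
    g = grad(w)
    w = [max(0.0, wi - 0.1 * gi) for wi, gi in zip(w, g)]   # projected step
    histories = allocate_histories(w)      # would size the next MC batch
obj_end = objective(w)
```

    In the real workflow the dose-influence entries are themselves re-estimated by the GPU MC engine each iteration, so concentrating histories on high-weight spots is what saves the computation time.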

  13. A single-center retrospective clinicopathologic study of endomyocardial biopsies after heart transplant at Baskent University Hospital in Ankara, 1993-2014.

    PubMed

    Terzi, Ayşen; Sezgin, Atilla; Tunca, Zeynep; Deniz, Ebru; Ayva, Ebru Şebnem; Haberal Reyhan, Nihan; Müderrisoğlu, Haldun; Özdemir, Binnaz Handan

    2015-04-01

    The purpose of this study was to investigate the frequency and prognostic importance of acute cellular rejection after heart transplant. All 84 heart transplant patients at our center from January 1993 to January 2014, including all 576 endomyocardial biopsies, were evaluated with retrospective review of clinical records and endomyocardial biopsies. Routine and clinically indicated endomyocardial biopsies after heart transplant were graded for acute cellular rejection (2005 International Society for Heart and Lung Transplantation Working Formulation). Survival analysis was performed using the Kaplan-Meier method. There were 61 male (73%) and 23 female recipients. Median age at heart transplant was 29 years (range, 1-62 y). The posttransplant early mortality rate was 17.9% (15 patients). Of the other 69 patients, 23 died and 46 (66.7%) were alive at a mean of 69.3 ± 7.2 months after heart transplant. Mean follow-up was 35.4 ± 29.8 months (range, 0.07-117.5 mo). A mean of 8.4 ± 4.2 endomyocardial biopsies (range, 1-19) was performed per patient. Median first biopsy time was 7 days (range, 1-78 d). The frequency of posttransplant acute cellular rejection was 63.8% (44 of 69 patients) by histopathology; 86% of patients experienced their first episode of acute cellular rejection within 6 months after transplant. Of the 44 patients with acute cellular rejection, 18 had grade ≥ 2R on at least one endomyocardial biopsy. No significant difference was observed between survival rates of patients with grade 1R or ≥ grade 2R acute cellular rejection, or between survival rates of patients with or without a diagnosis of any grade of acute cellular rejection. Acute cellular rejection was not related to any prognostic risk factor. Acute cellular rejection had no negative effect on heart recipient long-term survival, but it was a frequent complication after heart transplant, especially within the first 6 months.

  14. Assessment of Molecular Modeling & Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2002-01-03

    This report reviews the development and applications of molecular and materials modeling in Europe and Japan in comparison to those in the United States. Topics covered include computational quantum chemistry, molecular simulations by molecular dynamics and Monte Carlo methods, mesoscale modeling of material domains, molecular-structure/macroscale property correlations like QSARs and QSPRs, and related information technologies like informatics and special-purpose molecular-modeling computers. The panel's findings include the following: The United States leads this field in many scientific areas. However, Canada has particular strengths in DFT methods and homogeneous catalysis; Europe in heterogeneous catalysis, mesoscale, and materials modeling; and Japan in materials modeling and special-purpose computing. Major government-industry initiatives are underway in Europe and Japan, notably in multi-scale materials modeling and in development of chemistry-capable ab-initio molecular dynamics codes.

  15. Proteomic Profiling of Rat Thyroarytenoid Muscle

    ERIC Educational Resources Information Center

    Welham, Nathan V.; Marriott, Gerard; Bless, Diane M.

    2006-01-01

    Purpose: Proteomic methodologies offer promise in elucidating the systemwide cellular and molecular processes that characterize normal and diseased thyroarytenoid (TA) muscle. This study examined methodological issues central to the application of 2-dimensional sodium dodecyl sulfate polyacrylamide gel electrophoresis (2D SDS-PAGE) to the study of…

  16. Brain Magnetic Resonance Spectroscopy in Tourette's Disorder

    ERIC Educational Resources Information Center

    DeVito, Timothy J.; Drost, Dick J.; Pavlosky, William; Neufeld, Richard W.J.; Rajakumar, Nagalingam; McKinlay, B. Duncan; Williamson, Peter C.; Nicolson, Rob

    2005-01-01

    Objective: Although abnormalities of neural circuits involving the cortex, striatum, and thalamus are hypothesized to underlie Tourette's disorder, the neuronal abnormalities within components of these circuits are unknown. The purpose of this study was to examine the cellular neurochemistry within these circuits in Tourette's disorder using…

  17. A trans-dimensional Bayesian Markov chain Monte Carlo algorithm for model assessment using frequency-domain electromagnetic data

    USGS Publications Warehouse

    Minsley, B.J.

    2011-01-01

    A meaningful interpretation of geophysical measurements requires an assessment of the space of models that are consistent with the data, rather than just a single, 'best' model which does not convey information about parameter uncertainty. For this purpose, a trans-dimensional Bayesian Markov chain Monte Carlo (MCMC) algorithm is developed for assessing frequency-domain electromagnetic (FDEM) data acquired from airborne or ground-based systems. By sampling the distribution of models that are consistent with measured data and any prior knowledge, valuable inferences can be made about parameter values such as the likely depth to an interface, the distribution of possible resistivity values as a function of depth and non-unique relationships between parameters. The trans-dimensional aspect of the algorithm allows the number of layers to be a free parameter that is controlled by the data, where models with fewer layers are inherently favoured, which provides a natural measure of parsimony and a significant degree of flexibility in parametrization. The MCMC algorithm is used with synthetic examples to illustrate how the distribution of acceptable models is affected by the choice of prior information, the system geometry and configuration and the uncertainty in the measured system elevation. An airborne FDEM data set that was acquired for the purpose of hydrogeological characterization is also studied. The results compare favourably with traditional least-squares analysis, borehole resistivity and lithology logs from the site, and also provide new information about parameter uncertainty necessary for model assessment. © 2011 Geophysical Journal International © 2011 RAS.
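
    The core of such a sampler is a Metropolis accept/reject step. The sketch below implements a minimal random-walk Metropolis sampler for a toy one-parameter posterior (a stand-in for a half-space resistivity, not derived from FDEM physics); the trans-dimensional birth/death moves of the actual algorithm are omitted for brevity:

```python
import math, random

def metropolis(log_post, x0, step, n_samples, seed=0):
    """Random-walk Metropolis: draws samples given an (unnormalized) log-posterior."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    chain = []
    for _ in range(n_samples):
        xp = x + rng.gauss(0.0, step)            # symmetric proposal
        lpp = log_post(xp)
        if math.log(rng.random()) < lpp - lp:    # accept with prob min(1, ratio)
            x, lp = xp, lpp
        chain.append(x)                          # rejected moves repeat x
    return chain

# Toy Gaussian posterior for log10 resistivity (ohm-m), mean 2.0, sd 0.3
# -- illustrative values, not fit to any survey data.
log_post = lambda m: -0.5 * ((m - 2.0) / 0.3) ** 2
chain = metropolis(log_post, x0=0.0, step=0.5, n_samples=20000)
burn = chain[5000:]                              # discard burn-in
mean = sum(burn) / len(burn)
```

    Histogramming `burn` recovers the posterior distribution; in the trans-dimensional version, additional moves propose adding or deleting a layer and the acceptance ratio includes the dimension-change terms.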

  18. Role of vanguard counter-potential in terahertz emission due to surface currents explicated by three-dimensional ensemble Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Cortie, D. L.; Lewis, R. A.

    2011-10-01

    The discovery that short pulses of near-infrared radiation striking a semiconductor may lead to emission of radiation at terahertz frequencies paved the way for terahertz time-domain spectroscopy. Previous modeling has allowed the physical mechanisms to be understood in general terms but has not fully explored the role of key physical parameters of the emitter material, nor fully revealed the competing nature of the surface-field and photo-Dember effects. In this context, our purpose has been to more fully explicate the mechanisms of terahertz emission from transient currents at semiconductor surfaces and to determine the criteria for efficient emission. To achieve this purpose we employ an ensemble Monte Carlo simulation in three dimensions. To ground the calculations, we focus on a specific emitter, InAs. We separately vary distinct physical parameters to determine their specific contribution. We find that scattering as a whole has relatively little impact on the terahertz emission. The emission is found to be remarkably resistant to alterations of the dark surface potential. Decreasing the band gap leads to a strong increase in terahertz emission, as does decreasing the electron mass. Increasing the absorption dramatically influences the peak-to-peak intensity and peak shape. We conclude that increasing absorption is the most direct path to improve surface-current semiconductor terahertz emitters. We find for longer pump pulses that the emission is limited by a newly identified vanguard counter-potential mechanism: Electrons at the leading edge of longer laser pulses repel subsequent electrons. This discovery is the main result of our work.

  19. [Comparison of Organ Dose Calculation Using Monte Carlo Simulation and In-phantom Dosimetry in CT Examination].

    PubMed

    Iriuchijima, Akiko; Fukushima, Yasuhiro; Ogura, Akio

    Direct measurement of each patient's organ dose from computed tomography (CT) is not possible. Most methods for estimating patient organ dose use Monte Carlo simulation with dedicated software; however, the relative differences between simulated and measured organ doses are unclear. The purpose of this study was to compare organ doses evaluated by Monte Carlo simulation with doses evaluated by in-phantom dosimetry. The simulation software Radimetrics (Bayer) was used for the calculation of organ dose. Measurement was performed with radio-photoluminescence glass dosimeters (RPLDs) set at various organ positions within a RANDO phantom. To evaluate differences between CT scanners, two different scanners were used in this study. Angular dependence of the RPLD and measurement of effective energy were performed for each scanner. Simulation and measurement were compared in terms of relative differences. The angular-dependence measurements at the two scanners gave 31.6±0.45 mGy for the SOMATOM Definition Flash and 29.2±0.18 mGy for the LightSpeed VCT. The organ dose was 42.2 mGy (range, 29.9-52.7 mGy) by measurement and 37.7 mGy (range, 27.9-48.1 mGy) by simulation. The relative difference in organ dose between measurement and simulation was 13%, excluding the breast (42%). We found that organ doses by simulation were lower than those by measurement. In conclusion, these relative differences will be useful when evaluating organ doses for individual patients with the simulation software Radimetrics.

  20. An investigation of accelerator head scatter and output factor in air.

    PubMed

    Ding, George X

    2004-09-01

    Our purpose in this study was to investigate whether Monte Carlo simulation can accurately predict output factors in air. Secondary goals were to study the head scatter components and investigate the collimator exchange effect. The Monte Carlo code BEAMnrc was used in the study. Photon beams of 6 and 18 MV were from a Varian Clinac 2100EX accelerator, and the measurements were performed using an ionization chamber in a mini-phantom. The Monte Carlo-calculated in-air output factors were within 1% of measured values. The simulation provided information on the origin and magnitude of the collimator exchange effect. It was shown that collimator backscatter to the beam monitor chamber played a significant role in the beam output factors. However, the magnitude of the scattered dose contribution from the collimator at the isocenter is negligible. The maximum scattered dose contribution from the collimators was about 0.15% and 0.4% of the total dose at the isocenter for the 6 and 18 MV beams, respectively. The scattered dose contributions from the flattening filter at the isocenter were about 0.9-3% and 0.2-6% of the total dose for field sizes of 4x4 cm2 to 40x40 cm2 for the 6 and 18 MV beams, respectively. The study suggests that measurements of head scatter factors be made at depths well beyond the depth of electron contamination. These insights may have implications for developing generalized empirical models to calculate head scatter.

  1. GATE Monte Carlo simulation of dose distribution using MapReduce in a cloud computing environment.

    PubMed

    Liu, Yangchuan; Tang, Yuguo; Gao, Xin

    2017-12-01

    The GATE Monte Carlo simulation platform has good application prospects for treatment planning and quality assurance. However, accurate dose calculation using GATE is time consuming. The purpose of this study is to implement a novel cloud computing method for accurate GATE Monte Carlo simulation of dose distribution using MapReduce. An Amazon Machine Image installed with Hadoop and GATE was created to set up Hadoop clusters on Amazon Elastic Compute Cloud (EC2). Macros, the input files for GATE, are split into a number of self-contained sub-macros. Through Hadoop Streaming, the sub-macros are executed by GATE in Map tasks and the sub-results are aggregated into final outputs in Reduce tasks. As an evaluation, GATE simulations were performed in a cubical water phantom for X-ray photons of 6 and 18 MeV. The parallel simulation on the cloud computing platform is as accurate as the single-threaded simulation on a local server. The cloud-based simulation time is approximately inversely proportional to the number of worker nodes. For the simulation of 10 million photons on a cluster with 64 worker nodes, speedup factors of 41 and 32 were achieved relative to the single-worker-node case and the single-threaded case, respectively. A test of Hadoop's fault tolerance showed that simulation correctness was not affected by the failure of some worker nodes. The results verify that the proposed method provides a feasible cloud computing solution for GATE.
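
    Independent of GATE's file formats, the split/aggregate pattern can be sketched as plain map and reduce functions; the toy attenuation model, bin layout, and seeds below are assumptions for illustration, not GATE's physics:

```python
import random

def simulate(args):
    """'Map' task: a toy Monte Carlo that tallies dose in a 1D water phantom.
    Stands in for one self-contained GATE sub-macro."""
    n_histories, seed = args
    rng = random.Random(seed)            # distinct seed per sub-job
    tally = [0.0] * 10                   # 10 depth bins
    for _ in range(n_histories):
        depth = min(int(rng.expovariate(0.5)), 9)  # crude exponential attenuation
        tally[depth] += 1.0
    return tally

def reduce_tallies(partials):
    """'Reduce' task: sum per-job tallies bin by bin into the final output."""
    return [sum(bins) for bins in zip(*partials)]

# Split 100,000 histories into 4 self-contained sub-jobs; on a Hadoop
# cluster, each map call would run on a separate worker node.
jobs = [(25000, seed) for seed in range(4)]
total = reduce_tallies(map(simulate, jobs))
```

    Because each sub-job carries its own seed and history count, any failed Map task can simply be re-executed, which is the property behind the fault tolerance reported above.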

  2. SU-F-T-507: Modeling Cerenkov Emissions From Medical Linear Accelerators: A Monte Carlo Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shrock, Z; Oldham, M; Adamson, J

    2016-06-15

    Purpose: Cerenkov emissions are a natural byproduct of MV radiotherapy but are typically ignored as inconsequential. However, Cerenkov photons may be useful for activation of drugs such as psoralen. Here, we investigate Cerenkov radiation from common radiotherapy beams using Monte Carlo simulations. Methods: GAMOS, a GEANT4-based framework for Monte Carlo simulations, was used to model 6 and 18MV photon beams from a Varian medical linac. Simulations were run to track Cerenkov production from these beams when irradiating a 50cm radius sphere of water. Electron contamination was neglected. 2 million primary photon histories were run for each energy, and scored values included integral dose and total track length of Cerenkov photons between 100 and 400 nm wavelength. By lowering process energy thresholds, simulations included low energy Bremsstrahlung photons to ensure comprehensive evaluation of UV production in the medium. Results: For the same number of primary photons, UV Cerenkov production for 18MV was greater than 6MV by a factor of 3.72 as determined by total track length. The total integral dose was a factor of 2.31 greater for the 18MV beam. Bremsstrahlung photons were a negligibly small component of photons in the wavelength range of interest, comprising 0.02% of such photons. Conclusion: Cerenkov emissions in water are 1.6x greater for 18MV than 6MV for the same integral dose. Future work will expand the analysis to include optical properties of tissues, and to investigate strategies to maximize Cerenkov emission per unit dose for MV radiotherapy.
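
    Cerenkov light is produced only by charged particles moving faster than the phase velocity c/n, and the kinetic-energy threshold for electrons in water (n ≈ 1.33) is a quick check on why both beam energies generate it readily:

```python
import math

def cerenkov_threshold_mev(n, rest_mass_mev=0.511):
    """Kinetic energy above which a charged particle emits Cerenkov light
    in a medium of refractive index n (relativistic threshold v = c/n)."""
    beta = 1.0 / n                            # threshold speed as fraction of c
    gamma = 1.0 / math.sqrt(1.0 - beta**2)    # Lorentz factor at threshold
    return rest_mass_mev * (gamma - 1.0)      # kinetic energy = (gamma - 1) m c^2

t = cerenkov_threshold_mev(1.33)   # ~0.26 MeV for electrons in water
```

    Secondary electrons set in motion by 6 and 18 MV beams far exceed this ~0.26 MeV threshold over most of their range, so Cerenkov production tracks the electron fluence closely.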

  3. Physical models, cross sections, and numerical approximations used in MCNP and GEANT4 Monte Carlo codes for photon and electron absorbed fraction calculation.

    PubMed

    Yoriyaz, Hélio; Moralles, Maurício; Siqueira, Paulo de Tarso Dalledone; Guimarães, Carla da Costa; Cintra, Felipe Belonsi; dos Santos, Adimir

    2009-11-01

    Radiopharmaceutical applications in nuclear medicine require a detailed dosimetry estimate of the radiation energy delivered to the human tissues. Over the past years, several publications addressed the problem of internal dose estimation in volumes of several sizes considering photon and electron sources. Most of them used Monte Carlo radiation transport codes. Despite the widespread use of these codes due to the variety of resources and potentials they offer to carry out dose calculations, several aspects like physical models, cross sections, and numerical approximations used in the simulations still remain objects of study. Accurate dose estimation depends on the correct selection of a set of simulation options that should be carefully chosen. This article presents an analysis of several simulation options provided by two of the most used codes worldwide: MCNP and GEANT4. For this purpose, comparisons of absorbed fraction estimates obtained with different physical models, cross sections, and numerical approximations are presented for spheres of several sizes composed of five different biological tissues. Considerable discrepancies have been found in some cases, not only between the different codes but also between different cross sections and algorithms in the same code. Maximum differences found between the two codes are 5.0% and 10% for photons and electrons, respectively. Even for problems as simple as spheres with uniform radiation sources, the set of parameters chosen by any Monte Carlo code significantly affects the final results of a simulation, demonstrating the importance of the correct choice of parameters in the simulation.

  4. Comparative dose levels between CT-scanner and slot-scanning device (EOS system) in pregnant women pelvimetry.

    PubMed

    Ben Abdennebi, A; Aubry, S; Ounalli, L; Fayache, M S; Delabrousse, E; Petegnief, Y

    2017-01-01

    To estimate fetal absorbed doses in pregnant women pelvimetry, a comparative study between the EOS imaging system and a low-dose spiral CT-scanner was carried out. For this purpose, three different studies were conducted: in vivo measurements, in vitro measurements, and Monte Carlo calculations. In vivo dosimetry was performed using OSL NanoDot dosimeters to determine the dose to the skin of twenty pregnant women. In vitro studies used a cubic water phantom to estimate out-of-field doses; OSLDs were placed at depths corresponding to the lowest, average, and highest positions of the uterus. Monte Carlo calculations of effective doses to highly radiosensitive organs were performed using the PCXMC and CTExpo software suites for the EOS imaging system and the CT-scanner, respectively. The EOS imaging system reduces radiation exposure 4 to 8 times compared to the CT-scanner. The entrance skin doses were 74% (p < 0.01) higher with the CT-scanner than with the EOS system. In the out-of-field region, the measured doses of the EOS system were reduced by 80% (p < 0.02). Monte Carlo calculations confirmed that effective doses to organs are lower for EOS than for CT pelvimetry. The EOS system delivers less radiation than the CT exam. The out-of-field dose, which is significant, is lower with the EOS system than with the CT-scanner and could be reduced even further by optimizing the image acquisition time. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  5. Angular dependence of the nanoDot OSL dosimeter.

    PubMed

    Kerns, James R; Kry, Stephen F; Sahoo, Narayan; Followill, David S; Ibbott, Geoffrey S

    2011-07-01

    Optically stimulated luminescent detectors (OSLDs) are quickly gaining popularity as passive dosimeters, with applications in medicine for linac output calibration verification, brachytherapy source verification, treatment plan quality assurance, and clinical dose measurements. With such wide applications, these dosimeters must be characterized for the numerous factors affecting their response. The most widely used commercial OSLD is the InLight/OSL system from Landauer, Inc. The purpose of this study was to examine the angular dependence of the nanoDot dosimeter, which is part of the InLight system. Relative dosimeter response data were taken at several angles in 6 and 18 MV photon beams, as well as a clinical proton beam. These measurements were done within a phantom at a depth beyond the build-up region. To verify the observed angular dependence, additional measurements were conducted, as well as Monte Carlo simulations in MCNPX. When irradiated with the incident photon beams parallel to the plane of the dosimeter, the nanoDot response was 4% lower at 6 MV and 3% lower at 18 MV than the response when irradiated with the incident beam normal to the plane of the dosimeter. Monte Carlo simulations at 6 MV showed similar results to the experimental values. Examination of the Monte Carlo results suggests partial-volume irradiation as the cause. In a clinical proton beam, no angular dependence was found. A nontrivial angular response of this OSLD was observed in photon beams. This factor may need to be accounted for when evaluating doses from photon beams incident from a variety of directions.

  6. Monte Carlo calculated TG-60 dosimetry parameters for the {beta}{sup -} emitter {sup 153}Sm brachytherapy source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadeghi, Mahdi; Taghdiri, Fatemeh; Hamed Hosseini, S.

    Purpose: The formalism recommended by Task Group 60 (TG-60) of the American Association of Physicists in Medicine (AAPM) is applicable to beta sources. The radioactive, biocompatible, and biodegradable ¹⁵³Sm glass seed without encapsulation is a β⁻-emitting radionuclide with a short half-life that delivers a high dose rate to the tumor in the millimeter range. This study presents the results of Monte Carlo calculations of the dosimetric parameters for the ¹⁵³Sm brachytherapy source. Methods: Version 5 of the MCNP Monte Carlo radiation transport code was used to calculate two-dimensional dose distributions around the source. The dosimetric parameters of the AAPM TG-60 recommendations, including the reference dose rate, the radial dose function, the anisotropy function, and the one-dimensional anisotropy function, were obtained. Results: The dose rate at the reference point was estimated to be 9.21±0.6 cGy h⁻¹ µCi⁻¹. Due to the low-energy betas emitted from ¹⁵³Sm, the dose fall-off profile is sharper than for other beta-emitting sources. The dosimetric parameters calculated in this study are compared to those of several beta- and photon-emitting seeds. Conclusions: The results show the advantage of the ¹⁵³Sm source over other sources because of the rapid dose fall-off of beta rays and the high dose rate at short distances from the seed. The results should be helpful in the development of radioactive implants using ¹⁵³Sm seeds for brachytherapy treatment.
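
    The radial dose function reported above can be illustrated in the point-source approximation of the TG-43/TG-60 formalism, g(r) = D(r) r^2 / [D(r0) r0^2], normalized so that g(r0) = 1. The dose-rate curve below is a hypothetical exponential fall-off standing in for a beta source's sharp gradient, not the paper's 153Sm data:

```python
import math

def radial_dose_function(dose_rate, r, r0=1.0):
    """Radial dose function in the point-source approximation:
    g(r) = D(r) * r**2 / (D(r0) * r0**2); the r**2 factor removes
    the inverse-square geometric fall-off, isolating attenuation/scatter."""
    return dose_rate(r) * r**2 / (dose_rate(r0) * r0**2)

# Hypothetical dose-rate profile (cm): inverse-square times exponential decay
dose_rate = lambda r: math.exp(-3.0 * r) / r**2
g_half = radial_dose_function(dose_rate, 0.5)   # > 1 inside the reference radius
g_ref = radial_dose_function(dose_rate, 1.0)    # exactly 1 at r0 by construction
```

    For a real seed, D(r) would come from the Monte Carlo dose table, and a line-source geometry function would replace the 1/r² factor near the seed.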

  7. The Color and Surface Composition of Mountains on Pluto

    NASA Astrophysics Data System (ADS)

    Olkin, Catherine B.; Reuter, D. C.; Stern, S. Alan; Young, Leslie; Weaver, Harold A.; Ennico, Kimberly; Binzel, Richard; Cook, Jason C.; Cruikshank, Dale P.; Dalle Ore, Cristina M.; Earle, Alissa M.; Grundy, W. M.; Howett, Carly; Parker, Alex; Protopapa, Silvia; Schmitt, Bernard; Singer, Kelsi N.; Spencer, John R.; Stansberry, John A.; Philippe, Sylvain; New Horizons Science Team

    2016-10-01

    The New Horizons mission revealed that there are mountains along the western edge of the large glacier that dominates Pluto's anti-Charon hemisphere. This talk will focus on the color and surface composition of the four large mountainous regions named Al Idrisi Montes, Bare Montes, Hillary Montes and Norgay Montes (all feature names are informal). The Al Idrisi Montes are large blocks up to 40 km across and 5 km high that appear to be broken off of the ice crust and transported into Sputnik Planum (Moore et al. 2016). The color of this region as a function of latitude will be presented, as well as the color differences between the blocks and the interstitial material between the blocks. Moving south along the edge of Sputnik Planum, the next mountainous region is Bare Montes. Part of the Bare Montes resembles Al Idrisi Montes with its chaotic blocky structure, but there is a significant difference in color between these regions. The Bare Montes are more red than Al Idrisi Montes, and this region's color more closely matches the nearby terrain of Cthulhu Regio. Continuing south to the Hillary and Norgay Montes regions, these topographic features become less red, with both red and neutral colors on their slopes. The Hillary Montes show both red and neutral colors in the ices surrounding the peaks. This work will provide a quantitative comparison of the color and composition across these 4 mountainous regions using data from the Ralph instrument. Ralph has 4 color filters: blue (400-550 nm), red (540-700 nm), near IR (780-975 nm) and methane (860-910 nm), and collects infrared imaging spectrometric data (from 1.25-2.5 microns). This work was supported by NASA's New Horizons project.

  8. Mountain Infantry - Is There a Need?

    DTIC Science & Technology

    1988-06-03

    Valley and was used extensively by the Germans. The objective of the mountaineers was the Monte Belvedere-Monte Della Torraccia Ridge network which...hopes of attaining surprise. The 85th conducted a frontal attack against Monte Belvedere and Monte Gorgolesco, while the 87th attacked up the western...fought a bloody battle the last 300 yards from the summit of both mountains. Monte Della Torraccia proved to be a tough fight. On the afternoon of the

  9. A comparative cellular and molecular biology of longevity database.

    PubMed

    Stuart, Jeffrey A; Liang, Ping; Luo, Xuemei; Page, Melissa M; Gallagher, Emily J; Christoff, Casey A; Robb, Ellen L

    2013-10-01

    Discovering key cellular and molecular traits that promote longevity is a major goal of aging and longevity research. One experimental strategy is to determine which traits have been selected during the evolution of longevity in naturally long-lived animal species. This comparative approach has been applied to lifespan research for nearly four decades, yielding hundreds of datasets describing aspects of cell and molecular biology hypothesized to relate to animal longevity. Here, we introduce a Comparative Cellular and Molecular Biology of Longevity Database, available at http://genomics.brocku.ca/ccmbl/, as a compendium of comparative cell and molecular data presented in the context of longevity. This open access database will facilitate the meta-analysis of amalgamated datasets using standardized maximum lifespan (MLSP) data (from AnAge). The first edition contains over 800 data records describing experimental measurements of cellular stress resistance, reactive oxygen species metabolism, membrane composition, protein homeostasis, and genome homeostasis as they relate to vertebrate species MLSP. The purpose of this review is to introduce the database and briefly demonstrate its use in the meta-analysis of combined datasets.

  10. Effect of light and heat on the stability of montelukast in solution and in its solid state.

    PubMed

    Al Omari, Mahmoud M; Zoubi, Rufaida M; Hasan, Enas I; Khader, Tariq Z; Badwan, Adnan A

    2007-11-05

    The chemical stability of montelukast (Monte) in solution and in its solid state was studied. Monte and its degradation products were measured simultaneously using a selective HPLC method. The HPLC system comprised a reversed-phase column (C18) as the stationary phase and a mixture of ammonium acetate buffer of pH 3.5 and methanol (15:85 v/v) as the mobile phase. UV detection was conducted at 254 nm. Monte in solution was unstable when exposed to light, forming its cis-isomer as the major photoproduct. The rate of photodegradation of Monte in solution exposed to various light sources increases in the order of; sodium

  11. Cellular stress induces a protective sleep-like state in C. elegans.

    PubMed

    Hill, Andrew J; Mansfield, Richard; Lopez, Jessie M N G; Raizen, David M; Van Buskirk, Cheryl

    2014-10-20

    Sleep is recognized to be ancient in origin, with vertebrates and invertebrates experiencing behaviorally quiescent states that are regulated by conserved genetic mechanisms. Despite its conservation throughout phylogeny, the function of sleep remains debated. Hypotheses for the purpose of sleep include nervous-system-specific functions such as modulation of synaptic strength and clearance of metabolites from the brain, as well as more generalized cellular functions such as energy conservation and macromolecule biosynthesis. These models are supported by the identification of synaptic and metabolic processes that are perturbed during prolonged wakefulness. It remains to be seen whether perturbations of cellular homeostasis in turn drive sleep. Here we show that under conditions of cellular stress, including noxious heat, cold, hypertonicity, and tissue damage, the nematode Caenorhabditis elegans engages a behavioral quiescence program. The stress-induced quiescent state displays properties of sleep and is dependent on the ALA neuron, which mediates the conserved soporific effect of epidermal growth factor (EGF) ligand overexpression. We characterize heat-induced quiescence in detail and show that it is indeed dependent on components of EGF signaling, providing physiological relevance to the behavioral effects of EGF family ligands. We find that after noxious heat exposure, quiescence-defective animals show elevated expression of cellular stress reporter genes and are impaired for survival, demonstrating the benefit of stress-induced behavioral quiescence. These data provide evidence that cellular stress can induce a protective sleep-like state in C. elegans and suggest that a deeply conserved function of sleep is to mitigate disruptions of cellular homeostasis. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. CAM: A high-performance cellular-automaton machine

    NASA Astrophysics Data System (ADS)

    Toffoli, Tommaso

    1984-01-01

    CAM is a high-performance machine dedicated to the simulation of cellular automata and other distributed dynamical systems. Its speed is about one-thousand times greater than that of a general-purpose computer programmed to do the same task; in practical terms, this means that CAM can show the evolution of cellular automata on a color monitor with an update rate, dynamic range, and spatial resolution comparable to those of a Super-8 movie, thus permitting intensive interactive experimentation. Machines of this kind can open up novel fields of research, and in this context it is important that results be easy to obtain, reproduce, and transmit. For these reasons, in designing CAM it was important to achieve functional simplicity, high flexibility, and moderate production cost. We expect that many research groups will be able to own their own copy of the machine to do research with.
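
    The update rule of a cellular automaton like those CAM accelerates is simple to state in software (just vastly slower than dedicated hardware); as a sketch, one synchronous step of Conway's Game of Life, the best-known such rule, on a toroidal grid:

```python
def life_step(grid):
    """One synchronous update of Conway's Game of Life with wraparound edges.
    A cell is alive next step iff it has 3 live neighbors, or is alive with 2."""
    rows, cols = len(grid), len(grid[0])
    nxt = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            live = sum(grid[(r + dr) % rows][(c + dc) % cols]
                       for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                       if (dr, dc) != (0, 0))
            nxt[r][c] = 1 if live == 3 or (grid[r][c] and live == 2) else 0
    return nxt

# A "blinker": three cells in a column, oscillating with period 2.
blinker = [[0, 0, 0, 0, 0],
           [0, 0, 1, 0, 0],
           [0, 0, 1, 0, 0],
           [0, 0, 1, 0, 0],
           [0, 0, 0, 0, 0]]
```

    Every cell is updated from the same snapshot of its neighborhood, which is exactly the lock-step parallelism a machine like CAM exploits in hardware.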

  13. Genetic networks and soft computing.

    PubMed

    Mitra, Sushmita; Das, Ranajit; Hayashi, Yoichi

    2011-01-01

    The analysis of gene regulatory networks provides enormous information on various fundamental cellular processes involving growth, development, hormone secretion, and cellular communication. Their extraction from available gene expression profiles is a challenging problem. Such reverse engineering of genetic networks offers insight into cellular activity toward prediction of adverse effects of new drugs or possible identification of new drug targets. Tasks such as classification, clustering, and feature selection enable efficient mining of knowledge about gene interactions in the form of networks. It is known that biological data is prone to different kinds of noise and ambiguity. Soft computing tools, such as fuzzy sets, evolutionary strategies, and neurocomputing, have been found to be helpful in providing low-cost, acceptable solutions in the presence of various types of uncertainties. In this paper, we survey the role of these soft methodologies and their hybridizations, for the purpose of generating genetic networks.

  14. Nuclear microprobe imaging of gallium nitrate in cancer cells

    NASA Astrophysics Data System (ADS)

    Ortega, Richard; Suda, Asami; Devès, Guillaume

    2003-09-01

    Gallium nitrate is used in clinical oncology as treatment for hypercalcemia and for cancer that has spread to the bone. Its mechanism of antitumor action has not been fully elucidated yet. The knowledge of the intracellular distribution of anticancer drugs is of particular interest in oncology to better understand their cellular pharmacology. In addition, most metal-based anticancer compounds interact with endogenous trace elements in cells, altering their metabolism. The purpose of this experiment was to examine, by use of nuclear microprobe analysis, the cellular distribution of gallium and endogenous trace elements within cancer cells exposed to gallium nitrate. In a majority of cellular analyses, gallium was found homogeneously distributed in cells following the distribution of carbon. In a smaller number of cells, however, gallium appeared concentrated together with P, Ca and Fe within round structures of about 2-5 μm diameter located in the perinuclear region. These intracellular structures are typical of lysosomial material.

  15. Lab-On-Chip Clinorotation System for Live-Cell Microscopy Under Simulated Microgravity

    NASA Technical Reports Server (NTRS)

    Yew, Alvin G.; Atencia, Javier; Chinn, Ben; Hsieh, Adam H.

    2013-01-01

    Cells in microgravity are subject to mechanical unloading and changes to the surrounding chemical environment. How these factors jointly influence cellular function is not well understood. We can investigate their role using ground-based analogues to spaceflight, where mechanical unloading is simulated through the time-averaged nullification of gravity. The prevailing method for cellular microgravity simulation is to use fluid-filled containers called clinostats. However, conventional clinostats are not designed for temporally tracking cell response, nor are they able to establish dynamic fluid environments. To address these needs, we developed a Clinorotation Time-lapse Microscopy (CTM) system that accommodates lab-on- chip cell culture devices for visualizing time-dependent alterations to cellular behavior. For the purpose of demonstrating CTM, we present preliminary results showing time-dependent differences in cell area between human mesenchymal stem cells (hMSCs) under modeled microgravity and normal gravity.

  17. Are microRNAs true sensors of ageing and cellular senescence?

    PubMed

    Williams, Justin; Smith, Flint; Kumar, Subodh; Vijayan, Murali; Reddy, P Hemachandra

    2017-05-01

    All living beings are programmed to death due to aging and age-related processes. Aging is a normal process of every living species. While all cells are inevitably progressing towards death, many disease processes accelerate the aging process, leading to senescence. Pathologies such as Alzheimer's disease, Parkinson's disease, multiple sclerosis, amyotrophic lateral sclerosis, Huntington's disease, cardiovascular disease, cancer, and skin diseases have been associated with deregulated aging. Healthy aging can delay the onset of all age-related diseases. Genetics and epigenetics are reported to play large roles in accelerating and/or delaying the onset of age-related diseases. Cellular mechanisms of aging and age-related diseases are not completely understood. However, recent molecular biology discoveries have revealed that microRNAs (miRNAs) are potential sensors of aging and cellular senescence. Due to miRNAs' capability to bind to the 3' untranslated region (UTR) of the mRNA of specific genes, miRNAs can prevent the translation of those genes. The purpose of our article is to highlight recent advancements in miRNAs and their involvement in cellular changes in aging and senescence. Our article discusses the current understanding of cellular senescence, its interplay with miRNA regulation, and how they both contribute to disease processes. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. SABRE: a bio-inspired fault-tolerant electronic architecture.

    PubMed

    Bremner, P; Liu, Y; Samie, M; Dragffy, G; Pipe, A G; Tempesti, G; Timmis, J; Tyrrell, A M

    2013-03-01

    As electronic devices become increasingly complex, ensuring their reliable, fault-free operation is becoming correspondingly more challenging. It can be observed that, in spite of their complexity, biological systems are highly reliable and fault tolerant. Hence, we are motivated to take inspiration from biological systems in the design of electronic ones. In SABRE (self-healing cellular architectures for biologically inspired highly reliable electronic systems), we have designed a bio-inspired fault-tolerant hierarchical architecture for this purpose. As in biology, the foundation for the whole system is cellular in nature, with each cell able to detect faults in its operation and trigger intra-cellular or extra-cellular repair as required. At the next level in the hierarchy, arrays of cells are configured and controlled as function units in a transport triggered architecture (TTA), which is able to perform partial-dynamic reconfiguration to rectify problems that cannot be solved at the cellular level. Each TTA is, in turn, part of a larger multi-processor system which employs coarser grain reconfiguration to tolerate faults that cause a processor to fail. In this paper, we describe the details of operation of each layer of the SABRE hierarchy, and how these layers interact to provide a high systemic level of fault tolerance.

  19. A comparative study on fluorescent cholesterol analogs as versatile cellular reporters

    PubMed Central

    Sezgin, Erdinc; Can, Fatma Betul; Schneider, Falk; Clausen, Mathias P.; Galiani, Silvia; Stanly, Tess A.; Waithe, Dominic; Colaco, Alexandria; Honigmann, Alf; Wüstner, Daniel; Platt, Frances; Eggeling, Christian

    2016-01-01

    Cholesterol (Chol) is a crucial component of cellular membranes, but knowledge of its intracellular dynamics is scarce. Thus, it is of utmost interest to develop tools for visualization of Chol organization and dynamics in cells and tissues. For this purpose, many studies make use of fluorescently labeled Chol analogs. Unfortunately, the introduction of the label may influence the characteristics of the analog, such as its localization, interaction, and trafficking in cells; hence, it is important to get knowledge of such bias. In this report, we compared different fluorescent lipid analogs for their performance in cellular assays: 1) plasma membrane incorporation, specifically the preference for more ordered membrane environments in phase-separated giant unilamellar vesicles and giant plasma membrane vesicles; 2) cellular trafficking, specifically subcellular localization in Niemann-Pick type C disease cells; and 3) applicability in fluorescence correlation spectroscopy (FCS)-based and super-resolution stimulated emission depletion-FCS-based measurements of membrane diffusion dynamics. The analogs exhibited strong differences, with some indicating positive performance in the membrane-based experiments and others in the intracellular trafficking assay. However, none showed positive performance in all assays. Our results constitute a concise guide for the careful use of fluorescent Chol analogs in visualizing cellular Chol dynamics. PMID:26701325

  20. Monte Carlo Simulation for Perusal and Practice.

    ERIC Educational Resources Information Center

    Brooks, Gordon P.; Barcikowski, Robert S.; Robey, Randall R.

    Many problems in statistics that are mathematically intractable can be meaningfully investigated through Monte Carlo methods, which analyze random samples from populations whose characteristics are known to the researcher. Using Monte Carlo simulation, the values of a statistic are…
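The core of such a Monte Carlo study can be sketched in a few lines: draw repeated random samples from a population with known characteristics and examine the empirical distribution of the statistic. A minimal illustration (the population, sample size, and replication count are assumed for the example, not taken from the cited report):

```python
import random
import statistics

random.seed(0)

N_REPS, N = 10_000, 5   # Monte Carlo replications and sample size (assumed)

means = []
for _ in range(N_REPS):
    # Draw a random sample from a population whose characteristics are known
    # to the researcher: exponential with mean 2 (rate 0.5).
    sample = [random.expovariate(0.5) for _ in range(N)]
    means.append(statistics.mean(sample))

# The empirical distribution of the statistic approximates its true sampling
# distribution: mean near 2, standard error near 2 / sqrt(5) ≈ 0.894.
print(round(statistics.mean(means), 3), round(statistics.stdev(means), 3))
```

The same pattern extends to coverage studies, robustness checks, and power analyses by swapping in the statistic and population of interest.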

  1. (U) Introduction to Monte Carlo Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hungerford, Aimee L.

    2017-03-20

    Monte Carlo methods are very valuable for representing solutions to particle transport problems. Here we describe a “cook book” approach to handling the terms in a transport equation using Monte Carlo methods. Focus is on the mechanics of a numerical Monte Carlo code, rather than the mathematical foundations of the method.
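The "cook book" mechanics described above, sample a free-flight distance, sample the collision outcome, tally the result, can be illustrated with a deliberately simplified 1D slab problem (all cross sections, probabilities, and the 1D scattering model below are assumed for illustration, not from the cited report):

```python
import math
import random

random.seed(1)

SIGMA_T = 1.0         # total macroscopic cross section (1/cm), assumed
ABSORB_PROB = 0.3     # probability that a collision absorbs the particle, assumed
SLAB_THICKNESS = 3.0  # cm

def transmit_one():
    """Track one particle through the slab; True if it leaks out the far side."""
    x, direction = 0.0, 1.0
    while True:
        # Step 1: sample a free-flight distance from the exponential distribution.
        x += direction * (-math.log(1.0 - random.random()) / SIGMA_T)
        if x >= SLAB_THICKNESS:
            return True               # transmitted
        if x <= 0.0:
            return False              # reflected back out the entry face
        # Step 2: sample the collision outcome.
        if random.random() < ABSORB_PROB:
            return False              # absorbed
        direction = random.choice((-1.0, 1.0))   # isotropic scatter (1D model)

n = 100_000
transmission = sum(transmit_one() for _ in range(n)) / n
print(f"transmitted fraction ~ {transmission:.3f}")
```

Production codes add energy dependence, 3D geometry, and variance reduction, but the sampling loop retains this shape.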

  2. Diffusion Monte Carlo approach versus adiabatic computation for local Hamiltonians

    NASA Astrophysics Data System (ADS)

    Bringewatt, Jacob; Dorland, William; Jordan, Stephen P.; Mink, Alan

    2018-02-01

    Most research regarding quantum adiabatic optimization has focused on stoquastic Hamiltonians, whose ground states can be expressed with only real non-negative amplitudes and thus for which destructive interference is not manifest. This raises the question of whether classical Monte Carlo algorithms can efficiently simulate quantum adiabatic optimization with stoquastic Hamiltonians. Recent results have given counterexamples in which path-integral and diffusion Monte Carlo fail to do so. However, most adiabatic optimization algorithms, such as for solving MAX-k-SAT problems, use k-local Hamiltonians, whereas our previous counterexample for diffusion Monte Carlo involved n-body interactions. Here we present a 6-local counterexample which demonstrates that even for these local Hamiltonians there are cases where diffusion Monte Carlo cannot efficiently simulate quantum adiabatic optimization. Furthermore, we perform empirical testing of diffusion Monte Carlo on a standard well-studied class of permutation-symmetric tunneling problems and similarly find large advantages for quantum optimization over diffusion Monte Carlo.

  3. Diffusing-wave polarimetry for tissue diagnostics

    NASA Astrophysics Data System (ADS)

    Macdonald, Callum; Doronin, Alexander; Peña, Adrian F.; Eccles, Michael; Meglinski, Igor

    2014-03-01

    We exploit the directional awareness of circularly and/or elliptically polarized light propagating within media which exhibit high numbers of scattering events. By tracking the Stokes vector of the detected light on the Poincaré sphere, we demonstrate its applicability for characterization of anisotropy of scattering. A phenomenological model is shown to have an excellent agreement with the experimental data and with the results obtained by the polarization tracking Monte Carlo model developed in-house. By analogy to diffusing-wave spectroscopy we call this approach diffusing-wave polarimetry, and illustrate its utility in probing cancerous and non-cancerous tissue samples in vitro for diagnostic purposes.
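Tracking a Stokes vector on the Poincaré sphere amounts to normalizing its last three components by the intensity; a small sketch (the helper name and example state are hypothetical, not from the cited work):

```python
import math

def poincare_point(s0, s1, s2, s3):
    """Map a Stokes vector (S0, S1, S2, S3) to a point on the Poincaré sphere.

    Returns the normalized sphere coordinates, the degree of polarization,
    and the orientation and ellipticity angles of the polarization ellipse.
    """
    dop = math.sqrt(s1**2 + s2**2 + s3**2) / s0   # degree of polarization
    u = (s1 / s0, s2 / s0, s3 / s0)               # point inside/on unit sphere
    psi = 0.5 * math.atan2(s2, s1)                # orientation angle
    chi = 0.5 * math.asin(s3 / (dop * s0))        # ellipticity angle
    return u, dop, psi, chi

# Right-circularly polarized light sits at the north pole (0, 0, 1):
u, dop, psi, chi = poincare_point(1.0, 0.0, 0.0, 1.0)
print(u, dop)
```

Fully polarized states lie on the unit sphere (degree of polarization 1); multiple scattering pulls the detected state toward the depolarized interior.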

  4. A Markov Chain-based quantitative study of angular distribution of photons through turbid slabs via isotropic light scattering

    NASA Astrophysics Data System (ADS)

    Li, Xuesong; Northrop, William F.

    2016-04-01

    This paper describes a quantitative approach to approximate multiple scattering through an isotropic turbid slab based on Markov chain theory. There is an increasing need to utilize multiple scattering for optical diagnostic purposes; however, existing methods are either inaccurate or computationally expensive. Here, we develop a novel Markov Chain approximation approach to solve the multiple scattering angular distribution (AD) that can accurately calculate the AD while significantly reducing computational cost compared to Monte Carlo simulation. We expect this work to stimulate ongoing multiple scattering research and deterministic reconstruction algorithm development with AD measurements.
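The Markov chain idea can be illustrated with a toy discretization (the bin count, pencil-beam initial condition, and isotropic phase function are assumed; this is not the authors' implementation): bin the scattering angle, build a one-step transition matrix from the single-scattering phase function, and obtain the AD after k scattering events as an initial distribution times the k-th matrix power, instead of tracking individual photons.

```python
import math

N = 18                                    # angular bins over [0, pi], assumed
theta = [(i + 0.5) * math.pi / N for i in range(N)]

# Isotropic scattering: the landing bin depends only on the solid angle it
# subtends (sin(theta) weighting), independent of the incoming direction,
# so every row of the transition matrix is identical.
row = [math.sin(t) for t in theta]
norm = sum(row)
row = [r / norm for r in row]
P = [row[:] for _ in range(N)]

def step(p, P):
    """One scattering event: p' = p @ P."""
    return [sum(p[i] * P[i][j] for i in range(len(p))) for j in range(len(P[0]))]

p = [0.0] * N
p[0] = 1.0                                # pencil beam: all photons forward
for _ in range(3):                        # AD after 3 scattering events
    p = step(p, P)

print(round(sum(p), 6))                   # remains a probability distribution
```

For an anisotropic phase function the rows would differ, and repeated matrix powers (computable once) replace the per-photon sampling that makes Monte Carlo expensive.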

  5. Optimization techniques applied to passive measures for in-orbit spacecraft survivability

    NASA Technical Reports Server (NTRS)

    Mog, Robert A.; Helba, Michael J.; Hill, Janeil B.

    1992-01-01

    The purpose of this research is to provide Space Station Freedom protective structures design insight through the coupling of design/material requirements, hypervelocity impact phenomenology, meteoroid and space debris environment sensitivities, optimization techniques and operations research strategies, and mission scenarios. The goals of the research are: (1) to develop a Monte Carlo simulation tool which will provide top level insight for Space Station protective structures designers; (2) to develop advanced shielding concepts relevant to Space Station Freedom using unique multiple bumper approaches; and (3) to investigate projectile shape effects on protective structures design.

  6. Development of MMC Gamma Detectors for Nuclear Analysis

    NASA Astrophysics Data System (ADS)

    Bates, C. R.; Pies, C.; Kempf, S.; Gastaldo, L.; Fleischmann, A.; Enss, C.; Friedrich, S.

    2014-09-01

    Non-destructive assay (NDA) of nuclear materials would benefit from gamma detectors with improved energy resolution in cases where line overlap in current Ge detectors limits NDA accuracy. We are developing metallic magnetic calorimeter gamma-detectors for this purpose by electroplating 150 μm thick Au absorbers into microfabricated molds on top of Au:Er sensors. Initial tests under non-optimized conditions show an energy resolution of 200 eV FWHM at 60 keV. Monte Carlo simulations illustrate that this resolution is starting to be sufficient for direct detection of Pu in plutonium separated from spent nuclear fuel.

  7. LES, DNS and RANS for the analysis of high-speed turbulent reacting flows

    NASA Technical Reports Server (NTRS)

    Adumitroaie, V.; Colucci, P. J.; Taulbee, D. B.; Givi, P.

    1995-01-01

    The purpose of this research is to continue our efforts in advancing the state of knowledge in large eddy simulation (LES), direct numerical simulation (DNS), and Reynolds averaged Navier Stokes (RANS) methods for the computational analysis of high-speed reacting turbulent flows. In the second phase of this work, covering the period 1 Aug. 1994 - 31 Jul. 1995, we have focused our efforts on two programs: (1) developments of explicit algebraic moment closures for statistical descriptions of compressible reacting flows and (2) development of Monte Carlo numerical methods for LES of chemically reacting flows.

  8. Estimating a graphical intra-class correlation coefficient (GICC) using multivariate probit-linear mixed models.

    PubMed

    Yue, Chen; Chen, Shaojie; Sair, Haris I; Airan, Raag; Caffo, Brian S

    2015-09-01

    Data reproducibility is a critical issue in all scientific experiments. In this manuscript, the problem of quantifying the reproducibility of graphical measurements is considered. The image intra-class correlation coefficient (I2C2) is generalized and the graphical intra-class correlation coefficient (GICC) is proposed for such purpose. The concept for GICC is based on multivariate probit-linear mixed effect models. A Markov Chain Monte Carlo EM (MCMC-EM) algorithm is used for estimating the GICC. Simulation results with varied settings are demonstrated and our method is applied to the KIRBY21 test-retest dataset.

  9. Design of a setup for 252Cf neutron source for storage and analysis purpose

    NASA Astrophysics Data System (ADS)

    Hei, Daqian; Zhuang, Haocheng; Jia, Wenbao; Cheng, Can; Jiang, Zhou; Wang, Hongtao; Chen, Da

    2016-11-01

    252Cf is a reliable isotopic neutron source widely used in the prompt gamma-ray neutron activation analysis (PGNAA) technique. A cylindrical barrel made of polymethyl methacrylate and filled with boric acid solution was designed for the storage and application of a 5 μg 252Cf neutron source. The size of the setup was optimized with a Monte Carlo code. Experiments showed that the setup reduced the doses to below the allowable limit. The intensity and collimating radius of the neutron beam could also be adjusted through different collimators.

  10. Comparison between different adsorption-desorption kinetics schemes in two dimensional lattice gas

    NASA Astrophysics Data System (ADS)

    Huespe, V. J.; Belardinelli, R. E.; Pereyra, V. D.; Manzi, S. J.

    2017-12-01

    Monte Carlo simulation is used to study adsorption-desorption kinetics in the framework of the kinetic lattice-gas model. Three schemes of so-called hard dynamics and five schemes of so-called soft dynamics were used for this purpose. For the hard dynamics schemes, the equilibrium and non-equilibrium observables, such as adsorption isotherms, sticking coefficients, and thermal desorption spectra, show physically sustainable behavior, whereas for the soft dynamics schemes, with the exception of transition state theory, these observables exhibit several problems.
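For orientation, a minimal grand-canonical adsorption-desorption move for a 2D lattice gas with Metropolis acceptance (a member of the soft dynamics class) might look like the sketch below; all parameters, and the choice of Metropolis rates, are assumed for illustration and are not the authors' schemes:

```python
import math
import random

random.seed(2)

L, EPS, MU, KT = 16, -1.0, -2.0, 1.0   # lattice size, NN energy, chemical potential, kT

occ = [[0] * L for _ in range(L)]      # empty lattice to start

def neighbors_occupied(i, j):
    """Occupied nearest neighbors of site (i, j) with periodic boundaries."""
    return sum(occ[(i + di) % L][(j + dj) % L]
               for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))

for _ in range(200_000):
    i, j = random.randrange(L), random.randrange(L)
    dn = 1 - 2 * occ[i][j]                       # +1 adsorption, -1 desorption
    # Energy change for H = EPS * sum_<ij> n_i n_j - MU * sum_i n_i:
    dE = dn * (EPS * neighbors_occupied(i, j) - MU)
    if dE <= 0 or random.random() < math.exp(-dE / KT):
        occ[i][j] ^= 1                           # accept the flip

coverage = sum(map(sum, occ)) / L**2
print(f"equilibrium coverage ~ {coverage:.2f}")
```

Sweeping the chemical potential MU and recording the equilibrium coverage traces out an adsorption isotherm, one of the observables compared across dynamics schemes in the abstract.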

  11. Information dynamics in carcinogenesis and tumor growth.

    PubMed

    Gatenby, Robert A; Frieden, B Roy

    2004-12-21

    The storage and transmission of information is vital to the function of normal and transformed cells. We use methods from information theory and Monte Carlo theory to analyze the role of information in carcinogenesis. Our analysis demonstrates that, during somatic evolution of the malignant phenotype, the accumulation of genomic mutations degrades intracellular information. However, the degradation is constrained by the Darwinian somatic ecology in which mutant clones proliferate only when the mutation confers a selective growth advantage. In that environment, genes that normally decrease cellular proliferation, such as tumor suppressor or differentiation genes, suffer maximum information degradation. Conversely, those that increase proliferation, such as oncogenes, are conserved or exhibit only gain of function mutations. These constraints shield most cellular populations from catastrophic mutator-induced loss of the transmembrane entropy gradient and, therefore, cell death. The dynamics of constrained information degradation during carcinogenesis cause the tumor genome to asymptotically approach a minimum information state that is manifested clinically as dedifferentiation and unconstrained proliferation. Extreme physical information (EPI) theory demonstrates that altered information flow from cancer cells to their environment will manifest in vivo as power law tumor growth with an exponent of 1.62. This prediction is based only on the assumption that tumor cells are at an absolute information minimum and are capable of "free field" growth, that is, they are unconstrained by external biological parameters. The prediction agrees remarkably well with several studies demonstrating power law growth in small human breast cancers with an exponent of 1.72+/-0.24.
This successful derivation of an analytic expression for cancer growth from EPI alone supports the conceptual model that carcinogenesis is a process of constrained information degradation and that malignant cells are minimum information systems. EPI theory also predicts that the estimated age of a clinically observed tumor is subject to a root-mean-square error of about 30%. This is due to information loss and tissue disorganization and probably manifests as a randomly variable lag phase in the growth pattern that has been observed experimentally. This difference between tumor size and age may impose a fundamental limit on the efficacy of screening based on early detection of small tumors. Independent of the EPI analysis, Monte Carlo methods are applied to predict statistical tumor growth due to perturbed information flow from the environment into transformed cells. A "simplest" Monte Carlo model is suggested by the findings in the EPI approach that tumor growth arises out of a minimally complex mechanism. The outputs of large numbers of simulations show that (a) about 40% of the populations do not survive the first two generations due to mutations in critical gene segments; but (b) those that do survive will experience power law growth identical to the predicted rate obtained from the independent EPI approach. The agreement between these two very different approaches to the problem strongly supports the idea that tumor cells regress to a state of minimum information during carcinogenesis, and that information dynamics are integrally related to tumor development and growth.
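The "simplest" Monte Carlo population model described above can be caricatured as a Galton-Watson branching process: each cell divides and each daughter independently survives with some probability, so a fraction of lineages go extinct early while the rest grow. The survival probability, generation count, and population cap below are assumed for illustration, not the authors' values:

```python
import random

random.seed(4)

SURVIVE_P = 0.6   # assumed per-daughter survival probability (not the authors' value)

def run_population(generations=20, start=1, cap=10_000):
    """Return the final population size (0 means the lineage went extinct)."""
    n = start
    for _ in range(generations):
        # Each cell divides; each daughter survives mutation independently.
        n = sum(1 for _ in range(2 * n) if random.random() < SURVIVE_P)
        if n == 0 or n >= cap:
            break
    return n

trials = 2_000
extinct = sum(run_population() == 0 for _ in range(trials)) / trials
print(f"extinct fraction ~ {extinct:.2f}")
```

With these assumed numbers the offspring distribution is Binomial(2, 0.6), whose classical extinction probability is 4/9, so a substantial minority of simulated populations die out while survivors grow steadily, qualitatively matching the pattern reported above.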

  12. Effects of Teacher Controlled Segmented-Animation Presentation in Facilitating Learning

    ERIC Educational Resources Information Center

    Mohamad Ali, Ahmad Zamzuri

    2010-01-01

    The aim of this research was to study the effectiveness of teacher controlled segmented-animation presentation on learning achievement of students with lower level of prior knowledge. Segmented-animation and continuous-animation courseware showing cellular signal transmission process were developed for the research purpose. Pre-test and post-test…

  13. Profiling bioenergetics and metabolic stress in cells derived from commercially important fish species

    USDA-ARS?s Scientific Manuscript database

    As organisms intimately associated with their environment, fish are sensitive to numerous environmental insults which can negatively affect their cellular physiology. For our purposes, fish subject to intensive farming practices can experience a host of acute and chronic stressors such as changes in...

  14. Bioenergetic phenotypes and metabolic stress responses in cells derived from ecologically and commercially important fish species

    USDA-ARS?s Scientific Manuscript database

    As organisms intimately associated with their environment, fish are sensitive to numerous environmental insults which can negatively affect their cellular physiology. For our purposes, fish subject to intensive farming practices can experience a host of acute and chronic stressors such as changes in...

  15. SU-F-T-217: A Comprehensive Monte-Carlo Study of Out-Of-Field Secondary Neutron Spectra in a Scanned-Beam Proton Therapy Treatment Room

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Englbrecht, F; Parodi, K; Trinkl, S

    2016-06-15

    Purpose: To simulate secondary neutron radiation-fields produced at different positions during phantom irradiation inside a scanning proton therapy gantry treatment room. Further, to identify origin, energy distribution and angular emission as a function of proton beam energy. Methods: GEANT4 and FLUKA Monte-Carlo codes were used to model the relevant parts of the treatment room in a gantry-equipped pencil beam scanning proton therapy facility including walls, floor, metallic gantry-components, patient table and the homogeneous PMMA target. The proton beams were modeled based on experimental beam ranges in water and spot shapes in air. Neutron energy spectra were simulated at 0°, 45°, 90°, and 135° relative to the beam axis at 2 m distance from isocenter, as well as 11×11 cm² fields for 75 MeV, 140 MeV, 200 MeV and for 118 MeV with a 5 cm PMMA range-shifter. The total neutron energy distribution was recorded for these four positions and proton energies. Additionally, the room-components generating secondary neutrons in the room and their contributions to the total spectrum were identified and quantified. Results: FLUKA and GEANT4 simulated neutron spectra showed good general agreement in the whole energy range of 10⁻⁹ to 10² MeV. Comparison of measured spectra with the simulated contributions of the various room components helped to limit the complexity of the room model, by identifying the dominant contributions to the secondary neutron spectrum. The iron of the bending magnet and counterweight were identified as sources of secondary evaporation-neutrons, which were lacking in simplified room models. Conclusion: Thorough Monte-Carlo simulations have been performed to complement Bonner-sphere spectrometry measurements of secondary neutrons in a clinical proton therapy treatment room.
Such calculations helped disentangle the origin of secondary neutrons and their dominant contributions to measured spectra, besides providing a useful validation of widely used Monte-Carlo packages in comparison to experimental data. Cluster of Excellence of the German Research Foundation (DFG) "Munich-Centre for Advanced Photonics (MAP)".

  16. Fast Biological Modeling for Voxel-based Heavy Ion Treatment Planning Using the Mechanistic Repair-Misrepair-Fixation Model and Nuclear Fragment Spectra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamp, Florian (Department of Radiation Oncology, Technische Universität München, Klinikum Rechts der Isar, München; Physik-Department, Technische Universität München, Garching)

    2015-11-01

    Purpose: The physical and biological differences between heavy ions and photons have not been fully exploited and could improve treatment outcomes. In carbon ion therapy, treatment planning must account for physical properties, such as the absorbed dose and nuclear fragmentation, and for differences in the relative biological effectiveness (RBE) of ions compared with photons. We combined the mechanistic repair-misrepair-fixation (RMF) model with Monte Carlo-generated fragmentation spectra for biological optimization of carbon ion treatment plans. Methods and Materials: Relative changes in double-strand break yields and radiosensitivity parameters with particle type and energy were determined using the independently benchmarked Monte Carlo damage simulation and the RMF model to estimate the RBE values for primary carbon ions and secondary fragments. Depth-dependent energy spectra were generated with the Monte Carlo code FLUKA for clinically relevant initial carbon ion energies. The predicted trends in RBE were compared with the published experimental data. Biological optimization for carbon ions was implemented in a 3-dimensional research treatment planning tool. Results: We compared the RBE and RBE-weighted dose (RWD) distributions of different carbon ion treatment scenarios with and without nuclear fragments. The inclusion of fragments in the simulations led to smaller RBE predictions. A validation of RMF against measured cell survival data reported in published studies showed reasonable agreement. We calculated and optimized the RWD distributions on patient data and compared the RMF predictions with those from other biological models. The RBE values in an astrocytoma tumor ranged from 2.2 to 4.9 (mean 2.8) for a RWD of 3 Gy(RBE) assuming (α/β)_X = 2 Gy. Conclusions: These studies provide new information to quantify and assess uncertainties in the clinically relevant RBE values for carbon ion therapy based on biophysical mechanisms.
We present results from the first biological optimization of carbon ion radiation therapy beams on patient data using a combined RMF and Monte Carlo damage simulation modeling approach. The presented method is advantageous for fast biological optimization.

  17. Monte Carlo simulations of the dose from imaging with GE eXplore 120 micro-CT using GATE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bretin, Florian; Bahri, Mohamed Ali; Luxen, André

    Purpose: Small animals are increasingly used as translational models in preclinical imaging studies involving microCT, during which the subjects can be exposed to large amounts of radiation. While the radiation levels are generally sublethal, studies have shown that low-level radiation can change physiological parameters in mice. In order to rule out any influence of radiation on the outcome of such experiments, or resulting deterministic effects in the subjects, the levels of radiation involved need to be addressed. The aim of this study was to investigate the radiation dose delivered by the GE eXplore 120 microCT non-invasively using Monte Carlo simulations in GATE and to compare results to previously obtained experimental values. Methods: Tungsten X-ray spectra were simulated at 70, 80, and 97 kVp using an analytical tool and their half-value layers were simulated for spectra validation against experimentally measured values of the physical X-ray tube. A Monte Carlo model of the microCT system was set up and four protocols that are regularly applied to live animal scanning were implemented. The computed tomography dose index (CTDI) inside a PMMA phantom was derived and multiple field of view acquisitions were simulated using the PMMA phantom, a representative mouse and rat. Results: Simulated half-value layers agreed with experimentally obtained results within a 7% error window. The CTDI ranged from 20 to 56 mGy and closely matched experimental values. Derived organ doses in mice reached 459 mGy in bones and up to 200 mGy in soft tissue organs using the highest energy protocol. Dose levels in rats were lower due to the increased mass of the animal compared to mice. The uncertainty of all dose simulations was below 14%. Conclusions: Monte Carlo simulations proved a valuable tool to investigate the 3D dose distribution in animals from microCT.
Small animals, especially mice (due to their small volume), receive large amounts of radiation from the GE eXplore 120 microCT, which might alter physiological parameters in a longitudinal study setup.

  18. SU-E-T-297: Dosimetric Assessment of An Air-Filled Balloon Applicator in HDR Vaginal Cuff Brachytherapy Using the Monte Carlo Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, H; Lee, Y; Pokhrel, D

    2015-06-15

    Purpose: As an alternative to cylindrical applicators, air-inflated balloon applicators have been introduced into HDR vaginal cuff brachytherapy treatment to achieve sufficient dose to vagina mucosa as well as to spare rectum and bladder. In general, TG-43 formula-based treatment planning systems do not take into account tissue inhomogeneity, and air in the balloon applicator can cause a higher delivered dose to mucosa than the treatment plan reports. We investigated the dosimetric effect of air in the balloon applicator using the Monte Carlo method. Methods: The thirteen-catheter Capri applicator with a Nucletron Ir-192 seed was modeled for various balloon diameters (2 cm to 3.5 cm) using the MCNP Monte Carlo code. The Ir-192 seed was placed in both central and peripheral catheters to replicate real patient situations. Existence of charged particle equilibrium (CPE) with the air balloon was evaluated by comparing kerma and dose at various distances (1 mm to 70 mm) from the surface of the air-filled applicator. Also, mucosa dose from an air-filled applicator was compared with that from a water-filled applicator to evaluate the dosimetric accuracy of planning systems without tissue inhomogeneity correction. Results: Beyond 1 mm from the air/tissue interface, the difference between kerma and dose was within 2%. A CPE (or transient CPE) condition was deemed existent, and in this region no electron transport was necessary in Monte Carlo simulations. At 1 mm or less, the deviation of dose from kerma became more apparent. The increase of dose to mucosa depended on the diameter of the air balloon: the increment was 2.5% and 4.3% on average for the 2 cm and 3.5 cm applicators, respectively. Conclusion: After introduction of the air balloon applicator, CPE fails only in the proximity of the air/tissue interface. Although the dose to mucosa is increased, there is no significant dosimetric difference (<5%) between air- and water-filled applicators. Tissue inhomogeneity correction is not necessary for air-filled applicators.

  19. SU-C-201-06: Utility of Quantitative 3D SPECT/CT Imaging in Patient Specific Internal Dosimetry of 153-Samarium with GATE Monte Carlo Package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fallahpoor, M; Abbasi, M; Sen, A

    Purpose: Patient-specific 3-dimensional (3D) internal dosimetry in targeted radionuclide therapy is essential for efficient treatment. Two major steps to achieve reliable results are: 1) generating quantitative 3D images of radionuclide distribution and attenuation coefficients and 2) using a reliable method for dose calculation based on the activity and attenuation maps. In this research, internal dosimetry for 153-Samarium (153-Sm) was performed using SPECT/CT images coupled with the GATE Monte Carlo package. Methods: A 50-year-old woman with bone metastases from breast cancer was prescribed 153-Sm treatment (gamma: 103 keV; beta: 0.81 MeV). A SPECT/CT scan was performed with the Siemens Symbia-T scanner. SPECT and CT images were registered using default registration software. SPECT quantification was achieved by compensating for all image-degrading factors including body attenuation, Compton scattering and collimator-detector response (CDR). The triple energy window method was used to estimate and eliminate the scattered photons. Iterative ordered-subsets expectation maximization (OSEM) with correction for attenuation and distance-dependent CDR was used for image reconstruction. Bilinear energy mapping was used to convert Hounsfield units in the CT image to an attenuation map. Organ borders were defined by itk-SNAP toolkit segmentation on the CT image. GATE was then used for internal dose calculation. The Specific Absorbed Fractions (SAFs) and S-values were reported per the MIRD schema. Results: The results showed that the largest SAFs and S-values are in osseous organs, as expected. The S-value for lung is the highest after spine, which can be important in 153-Sm therapy. Conclusion: We presented the utility of SPECT/CT images and Monte Carlo for patient-specific dosimetry as a reliable and accurate method. It has several advantages over template-based methods or simplified dose estimation methods.
With the advent of high-speed computers, Monte Carlo can be used for treatment planning on a day-to-day basis.
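The triple energy window (TEW) scatter correction mentioned above estimates the scatter inside the photopeak window by trapezoidal interpolation from two narrow flanking windows; a sketch with hypothetical counts and window widths (the function name and all numbers are assumptions for illustration):

```python
def tew_scatter(c_lower, c_upper, w_lower, w_upper, w_main):
    """Scattered-count estimate inside the main window (trapezoid rule):
    average the scatter count densities of the two flanking windows and
    multiply by the main window width."""
    return (c_lower / w_lower + c_upper / w_upper) * w_main / 2.0

c_main = 10_000.0                          # total counts in the main window (assumed)
scatter = tew_scatter(c_lower=1200.0, c_upper=300.0,
                      w_lower=4.0, w_upper=4.0, w_main=20.0)   # widths in keV
primary = c_main - scatter                 # scatter-corrected photopeak counts
print(scatter, primary)
```

Subtracting the estimate pixel-by-pixel (rather than globally, as in this scalar sketch) is what makes the reconstructed SPECT activity quantitative.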

  20. TU-AB-BRC-10: Modeling of Radiotherapy Linac Source Terms Using ARCHER Monte Carlo Code: Performance Comparison of GPU and MIC Computing Accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, T; Lin, H; Xu, X

    Purpose: (1) To perform phase space (PS) based source modeling for Tomotherapy and Varian TrueBeam 6 MV Linacs, (2) to examine the accuracy and performance of the ARCHER Monte Carlo code on a heterogeneous computing platform with Many Integrated Core coprocessors (MIC, aka Xeon Phi) and GPUs, and (3) to explore the software micro-optimization methods. Methods: The patient-specific source of Tomotherapy and Varian TrueBeam Linacs was modeled using the PS approach. For the helical Tomotherapy case, the PS data were calculated in our previous study (Su et al. 2014 41(7) Medical Physics). For the single-view Varian TrueBeam case, we analytically derived them from the raw patient-independent PS data in IAEA’s database, partial geometry information of the jaw and MLC as well as the fluence map. The phantom was generated from DICOM images. The Monte Carlo simulation was performed by ARCHER-MIC and GPU codes, which were benchmarked against a modified parallel DPM code. Software micro-optimization was systematically conducted, and was focused on SIMD vectorization of tight for-loops and data prefetch, with the ultimate goal of increasing 512-bit register utilization and reducing memory access latency. Results: Dose calculation was performed for two clinical cases, a Tomotherapy-based prostate cancer treatment and a TrueBeam-based left breast treatment. ARCHER was verified against the DPM code. The statistical uncertainty of the dose to the PTV was less than 1%. Using double-precision, the total wall time of the multithreaded CPU code on a X5650 CPU was 339 seconds for the Tomotherapy case and 131 seconds for the TrueBeam, while on 3 5110P MICs it was reduced to 79 and 59 seconds, respectively. The single-precision GPU code on a K40 GPU took 45 seconds for the Tomotherapy dose calculation. Conclusion: We have extended ARCHER, the MIC and GPU-based Monte Carlo dose engine, to Tomotherapy and TrueBeam dose calculations.

  1. A GPU-accelerated Monte Carlo dose calculation platform and its application toward validating an MRI-guided radiation therapy beam model.

    PubMed

    Wang, Yuhe; Mazur, Thomas R; Green, Olga; Hu, Yanle; Li, Hua; Rodriguez, Vivian; Wooten, H Omar; Yang, Deshan; Zhao, Tianyu; Mutic, Sasa; Li, H Harold

    2016-07-01

    The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on penelope and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. penelope was first translated from fortran to c++ and the result was confirmed to produce equivalent results to the original code. The c++ code was then adapted to cuda in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gpenelope highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gpenelope as both an accurate and rapid dose validation system. A set of experimental measurements was performed on the MRIdian system to examine the accuracy of both the head model and gpenelope. Ultimately, gpenelope was applied toward independent validation of patient doses calculated by MRIdian's kmc. An acceleration factor of 152 was achieved in comparison to the original single-thread fortran implementation with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gpenelope with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). A Monte Carlo simulation platform was developed based on a GPU-accelerated version of penelope. This platform was used to validate that both the vendor-provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria.
Future applications of this platform will include dose validation and accumulation, IMRT optimization, and dosimetry system modeling for next generation MR-IGRT systems.
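    The Woodcock (delta) tracking mentioned above can be illustrated with a minimal sketch; this is not gpenelope's implementation, and `mu_of_x`/`mu_max` are hypothetical names introduced here for illustration.

```python
import math
import random

def woodcock_distance(mu_of_x, mu_max, rng, x0=0.0):
    """Sample a collision site by Woodcock (delta) tracking.

    Free paths are drawn against a constant majorant cross section mu_max,
    so voxel boundaries never need to be intersected; a candidate collision
    at x is accepted as real with probability mu(x)/mu_max, otherwise it is
    a virtual collision and the flight continues from x.
    """
    x = x0
    while True:
        x += -math.log(1.0 - rng.random()) / mu_max  # flight vs. majorant
        if rng.random() * mu_max < mu_of_x(x):       # real collision?
            return x
```

    In a uniform medium the accepted distances reduce to ordinary exponential path sampling with mean free path 1/mu, regardless of how large the majorant is.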

  2. SU-E-T-552: Monte Carlo Calculation of Correction Factors for a Free-Air Ionization Chamber in Support of a National Air-Kerma Standard for Electronic Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mille, M; Bergstrom, P

    2015-06-15

    Purpose: To use Monte Carlo radiation transport methods to calculate correction factors for a free-air ionization chamber in support of a national air-kerma standard for low-energy, miniature x-ray sources used for electronic brachytherapy (eBx). Methods: The NIST is establishing a calibration service for well-type ionization chambers used to characterize the strength of eBx sources prior to clinical use. The calibration approach involves establishing the well-chamber's response to an eBx source whose air-kerma rate at a 50 cm distance is determined through a primary measurement performed using the Lamperti free-air ionization chamber. However, the free-air chamber measurements of charge or current can only be related to the reference air-kerma standard after applying several corrections, some of which are best determined via Monte Carlo simulation. To this end, a detailed geometric model of the Lamperti chamber was developed in the EGSnrc code based on the engineering drawings of the instrument. The egs-fac user code in EGSnrc was then used to calculate energy-dependent correction factors which account for missing or undesired ionization arising from effects such as: (1) attenuation and scatter of the x-rays in air; (2) primary electrons escaping the charge collection region; (3) lack of charged particle equilibrium; (4) atomic fluorescence and bremsstrahlung radiation. Results: Energy-dependent correction factors were calculated assuming a monoenergetic point source with the photon energy ranging from 2 keV to 60 keV in 2 keV increments. Sufficient photon histories were simulated so that the Monte Carlo statistical uncertainty of the correction factors was less than 0.01%. The correction factors for a specific eBx source will be determined by integrating these tabulated results over its measured x-ray spectrum.
    Conclusion: The correction factors calculated in this work are important for establishing a national standard for eBx which will help ensure that dose is accurately and consistently delivered to patients.
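    The final step described above, folding tabulated monoenergetic correction factors over a measured spectrum, amounts to a weighted average. A minimal sketch follows; the fluence-only weighting and the names are assumptions for illustration (the actual standard may weight by air kerma rather than fluence).

```python
def spectrum_weighted_correction(spectrum, k_of_e):
    """Average tabulated correction factors k(E) over a measured spectrum.

    spectrum: iterable of (energy_keV, relative_fluence) pairs
    k_of_e:   tabulated monoenergetic correction factor, as a callable
    """
    num = sum(s * k_of_e(e) for e, s in spectrum)
    den = sum(s for _, s in spectrum)
    return num / den
```

    With an energy-independent correction the weighted average simply returns that constant, which is a quick sanity check on any implementation.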

  3. SU-E-J-100: Reconstruction of Prompt Gamma Ray Three Dimensional SPECT Image From Boron Neutron Capture Therapy(BNCT)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoon, D; Jung, J; Suh, T

    2014-06-01

    Purpose: To confirm the feasibility of acquiring a three-dimensional single photon emission computed tomography (SPECT) image from boron neutron capture therapy (BNCT) using Monte Carlo simulation. Methods: The pixelated SPECT detector, collimator, and phantom were simulated using the Monte Carlo N-Particle eXtended (MCNPX) simulation tool. A thermal neutron source (<1 eV) was used to react with the boron uptake regions (BURs) in the phantom. Each geometry had a spherical pattern, and three different BURs (A, B, and C regions, density: 2.08 g/cm3) were located in the middle of the brain phantom. The data from 128 projections for each sorting process were used to achieve image reconstruction. The ordered subset expectation maximization (OSEM) reconstruction algorithm was used to obtain a tomographic image with eight subsets and five iterations. Receiver operating characteristic (ROC) curve analysis was used to evaluate the geometric accuracy of the reconstructed image. Results: The OSEM image was compared with the original phantom pattern image. The area under the curve (AUC) was calculated as the gross area under each ROC curve. The three calculated AUC values were 0.738 (A region), 0.623 (B region), and 0.817 (C region). The differences between the distances separating the centers of two boron regions and the distances between the corresponding maximum count points were 0.3 cm, 1.6 cm, and 1.4 cm. Conclusion: The possibility of extracting a 3D BNCT SPECT image was confirmed using the Monte Carlo simulation and the OSEM algorithm. The prospects for obtaining an actual BNCT SPECT image were estimated from the quality of the simulated image and the simulation conditions. When multiple tumor regions are to be treated with BNCT, this approach could provide BNCT facilities with a reasonable model for determining how many useful images can be obtained from SPECT.
    This research was supported by the Leading Foreign Research Institute Recruitment Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, Information and Communication Technologies (ICT) and Future Planning (MSIP) (Grant No. 200900420) and the Radiation Technology Research and Development program (Grant No. 2013043498), Republic of Korea.
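    The OSEM update used above can be sketched in a few lines; the tiny system matrix and subset layout here are illustrative stand-ins, not the study's detector model.

```python
import numpy as np

def osem(A, y, n_iter=5, n_subsets=8):
    """Ordered-subsets EM: x_j <- x_j * sum_i A_ij (y_i / (Ax)_i) / sum_i A_ij,
    with the sums taken over one subset of projections per sub-iteration."""
    n_proj, n_vox = A.shape
    x = np.ones(n_vox)
    subsets = [np.arange(s, n_proj, n_subsets) for s in range(n_subsets)]
    for _ in range(n_iter):
        for idx in subsets:
            As = A[idx]
            ratio = y[idx] / np.maximum(As @ x, 1e-12)  # measured / estimated
            sens = As.sum(axis=0)                       # subset sensitivity
            x = np.where(sens > 0, x * (As.T @ ratio) / np.maximum(sens, 1e-12), x)
    return x
```

    Because every update is multiplicative against a non-negative start, the iterate stays non-negative, which is the property that makes EM-type algorithms attractive for emission tomography.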

  4. TU-EF-204-01: Accurate Prediction of CT Tube Current Modulation: Estimating Tube Current Modulation Schemes for Voxelized Patient Models Used in Monte Carlo Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McMillan, K; Bostani, M; McNitt-Gray, M

    2015-06-15

    Purpose: Most patient models used in Monte Carlo-based estimates of CT dose, including computational phantoms, do not have tube current modulation (TCM) data associated with them. While not a problem for fixed tube current simulations, this is a limitation when modeling the effects of TCM. Therefore, the purpose of this work was to develop and validate methods to estimate TCM schemes for any voxelized patient model. Methods: For 10 patients who received clinically-indicated chest (n=5) and abdomen/pelvis (n=5) scans on a Siemens CT scanner, both CT localizer radiograph (“topogram”) and image data were collected. Methods were devised to estimate the complete x-y-z TCM scheme using patient attenuation data: (a) available in the Siemens CT localizer radiograph/topogram itself (“actual-topo”) and (b) from a simulated topogram (“sim-topo”) derived from a projection of the image data. For comparison, the actual TCM scheme was extracted from the projection data of each patient. For validation, Monte Carlo simulations were performed using each TCM scheme to estimate dose to the lungs (chest scans) and liver (abdomen/pelvis scans). Organ doses from simulations using the actual TCM were compared to those using each of the estimated TCM methods (“actual-topo” and “sim-topo”). Results: For chest scans, the average differences between doses estimated using actual TCM schemes and estimated TCM schemes (“actual-topo” and “sim-topo”) were 3.70% and 4.98%, respectively. For abdomen/pelvis scans, the average differences were 5.55% and 6.97%, respectively. Conclusion: Strong agreement between doses estimated using actual and estimated TCM schemes validates the methods for simulating Siemens topograms and converting attenuation data into TCM schemes. This indicates that the methods developed in this work can be used to accurately estimate TCM schemes for any patient model or computational phantom, whether a CT localizer radiograph is available or not.
    Funding Support: NIH Grant R01-EB017095; Disclosures - Michael McNitt-Gray: Institutional Research Agreement, Siemens AG; Research Support, Siemens AG; Consultant, Flaherty Sensabaugh Bonasso PLLC; Consultant, Fulbright and Jaworski; Disclosures - Cynthia McCollough: Research Grant, Siemens Healthcare.
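    A toy version of converting per-slice attenuation into a z-axis TCM scheme might look as follows; the exponential weighting and normalisation are assumptions for illustration only, not the scanner vendor's actual modulation algorithm.

```python
import math

def tcm_from_attenuation(atten, ma_ref=100.0, strength=1.0):
    """Map per-slice attenuation (e.g. projected water-equivalent thickness
    from a topogram) to tube current, normalised so the mean equals ma_ref.

    strength controls how aggressively current follows attenuation
    (a hypothetical knob; vendors expose similar but proprietary settings).
    """
    mean_a = sum(atten) / len(atten)
    ma = [ma_ref * math.exp(strength * (a - mean_a)) for a in atten]
    scale = ma_ref * len(ma) / sum(ma)  # renormalise the mean current
    return [m * scale for m in ma]
```

    Thicker slices draw more current and thinner slices less, while the scan-average current (and hence a CTDIvol-like dose surrogate) is held fixed.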

  5. A surgical confocal microlaparoscope for real-time optical biopsies

    NASA Astrophysics Data System (ADS)

    Tanbakuchi, Anthony Amir

    The first real-time fluorescence confocal microlaparoscope has been developed that provides instant in vivo cellular images, comparable to those provided by histology, through a nondestructive procedure. The device includes an integrated contrast agent delivery mechanism and a computerized depth scan system. The instrument uses a fiber bundle to relay the image plane of a slit-scan confocal microlaparoscope into tissue. The confocal laparoscope was used to image the ovaries of twenty-one patients in vivo using fluorescein sodium and acridine orange as the fluorescent contrast agents. The results indicate that the device is safe and functions as designed. A Monte Carlo model was developed to characterize the system performance in a scattering media representative of human tissues. The results indicate that a slit aperture has limited ability to image below the surface of tissue. In contrast, the results show that multi-pinhole apertures such as a Nipkow disk or a linear pinhole array can achieve nearly the same depth performance as a single pinhole aperture. The model was used to determine the optimal aperture spacing for the multi-pinhole apertures. The confocal microlaparoscope represents a new type of in vivo imaging device. With its ability to image cellular details in real time, it has the potential to aid in the early diagnosis of cancer. Initially, the device may be used to locate unusual regions for guided biopsies. In the long term, the device may be able to supplant traditional biopsies and allow the surgeon to identify early stage cancer in vivo.

  6. A white-box model of S-shaped and double S-shaped single-species population growth

    PubMed Central

    Kalmykov, Lev V.

    2015-01-01

    Complex systems may be modelled mechanistically with white-box models built from logical deterministic individual-based cellular automata. Mathematical models of complex systems are of three types: black-box (phenomenological), white-box (mechanistic, based on first principles) and grey-box (mixtures of phenomenological and mechanistic models). Most basic ecological models are of the black-box type, including the Malthusian, Verhulst, and Lotka–Volterra models. In black-box models, the individual-based (mechanistic) mechanisms of population dynamics remain hidden. Here we mechanistically model the S-shaped and double S-shaped population growth of vegetatively propagated rhizomatous lawn grasses. Using purely logical deterministic individual-based cellular automata, we create a white-box model. From a general physical standpoint, the vegetative propagation of plants is an analogue of excitation propagation in excitable media. Using the Monte Carlo method, we investigate the role of different initial positionings of an individual in the habitat. We have investigated mechanisms of single-species population growth limited by habitat size, intraspecific competition, regeneration time and fecundity of individuals under two types of boundary conditions and two levels of fecundity. In addition, we have compared S-shaped and J-shaped population growth. We consider this white-box modeling approach a method of artificial intelligence that works as automatic hyper-logical inference from the first principles of the studied subject. This approach is promising for gaining direct mechanistic insight into the nature of complex systems. PMID:26038717
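    Habitat-limited vegetative growth of the kind described above can be caricatured with a small stochastic cellular automaton; the grid size, neighbourhood, and colonisation rule here are invented for illustration, not the paper's model.

```python
import random

def grow(n=20, steps=60, seed=1):
    """S-shaped growth of a vegetatively propagating population on an
    n-by-n grid with closed boundaries.

    Each time step, every occupied cell tries to colonise one randomly
    chosen vacant von Neumann neighbour. Returns occupancy per step.
    """
    rng = random.Random(seed)
    occ = {(n // 2, n // 2)}          # single founder in the habitat centre
    history = [len(occ)]
    for _ in range(steps):
        new = set()
        for (i, j) in occ:
            nbrs = [(i + di, j + dj)
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                    if 0 <= i + di < n and 0 <= j + dj < n
                    and (i + di, j + dj) not in occ]
            if nbrs:
                new.add(rng.choice(nbrs))
        occ |= new
        history.append(len(occ))
    return history
```

    Growth is frontier-limited: it accelerates while the colonisation front expands and then saturates as the habitat fills, producing the characteristic S-shaped curve.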

  7. Mechanism of Facilitated Diffusion during a DNA Search in Crowded Environments.

    PubMed

    Krepel, Dana; Gomez, David; Klumpp, Stefan; Levy, Yaakov

    2016-11-03

    The key feature explaining the rapid recognition of a DNA target site by its protein lies in the combination of one- and three-dimensional (1D and 3D) diffusion, which allows efficient scanning of the many alternative sites. This facilitated diffusion mechanism is expected to be affected by cellular conditions, particularly crowding, given that up to 40% of the total cellular volume may be occupied by macromolecules. Using coarse-grained molecular dynamics and Monte Carlo simulations, we show that the crowding particles can enhance facilitated diffusion and accelerate search kinetics. This effect originates from a trade-off between 3D and 1D diffusion. The 3D diffusion coefficient is lower under crowded conditions, but it has little influence because the excluded volume effect of molecular crowding restricts its use. Largely prevented from using 3D diffusion, the searching protein dramatically increases its use of the hopping search mode, which results in a higher linear diffusion coefficient. The coefficient of linear diffusion also increases under crowded conditions as a result of increased collisions between the crowding particles and the searching protein. Overall, less 3D diffusion, coupled with increased use of hopping and faster 1D diffusion, results in faster search kinetics under crowded conditions. Our study shows that the search kinetics and mechanism are modulated not only by the crowding occupancy but also by the properties of the crowding particles and the salt concentration.
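    The benefit of mixing local sliding with long-range moves can be seen in a toy 1D search model; the lattice size, hop rule, and probabilities below are arbitrary stand-ins, not the coarse-grained model of the study.

```python
import random

def search_time(n_sites, hop_prob, rng, max_steps=10**6):
    """Steps for a walker to find the target site 0 on a ring of n_sites.

    With probability hop_prob the walker relocates to a uniformly random
    site (a crude stand-in for a 3D excursion or hop); otherwise it slides
    one site left or right (1D sliding).
    """
    pos = rng.randrange(1, n_sites)
    for t in range(1, max_steps + 1):
        if rng.random() < hop_prob:
            pos = rng.randrange(n_sites)          # long-range relocation
        else:
            pos = (pos + rng.choice((-1, 1))) % n_sites  # local sliding
        if pos == 0:
            return t
    return max_steps
```

    Averaged over many searches, the mixed strategy finds the target far sooner than pure sliding on a large lattice, because sliding alone repeatedly rescans the same neighbourhood.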

  8. Spatiotemporal dynamics of landscape pattern and hydrologic process in watershed systems

    NASA Astrophysics Data System (ADS)

    Randhir, Timothy O.; Tsvetkova, Olga

    2011-06-01

    Land use change is influenced by spatial and temporal factors that interact with watershed resources. Modeling these changes is critical to evaluate emerging land use patterns and to predict variation in water quantity and quality. The objective of this study is to model the nature and emergence of spatial patterns in land use and water resource impacts using a spatially explicit and dynamic landscape simulation. Temporal changes are predicted using a probabilistic Markovian process and spatial interaction through cellular automata. The MCMC (Markov chain Monte Carlo) analysis with cellular automata is linked to hydrologic equations to simulate landscape patterns and processes. The spatiotemporal watershed dynamics (SWD) model is applied to a subwatershed in the Blackstone River watershed of Massachusetts to predict potential land use changes and expected runoff and sediment loading. Changes in watershed land use and water resources are evaluated over 100 years at a yearly time step. Results show a high potential for rapid urbanization that could lower groundwater recharge and increase storm water peaks. The watershed faces potential decreases in agricultural and forest area that affect the open space and pervious cover of the watershed system. Water quality deteriorated due to increased runoff, which can also affect stream morphology. While overland erosion decreased, instream erosion increased owing to greater runoff from urban areas. Use of urban best management practices (BMPs) in sensitive locations, preventive strategies, and long-term conservation planning will be useful in sustaining the watershed system.
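    A per-cell Markov transition coupled to a cellular-automaton neighbourhood effect, in the spirit of the model above, can be sketched as follows; the three states, the transition matrix, and the urbanisation boost are invented for illustration.

```python
import random

STATES = ("forest", "agriculture", "urban")
# Hypothetical yearly transition probabilities P[from][to]
P = {"forest":      {"forest": 0.95, "agriculture": 0.03, "urban": 0.02},
     "agriculture": {"forest": 0.01, "agriculture": 0.90, "urban": 0.09},
     "urban":       {"forest": 0.00, "agriculture": 0.00, "urban": 1.00}}

def step(grid, rng):
    """One yearly update: Markov transitions per cell, with the odds of
    urbanisation boosted by urban neighbours (the cellular-automaton part)."""
    n = len(grid)
    out = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            state = grid[i][j]
            probs = dict(P[state])
            if state != "urban":
                urb = sum(grid[(i + di) % n][(j + dj) % n] == "urban"
                          for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))
                boost = min(0.1 * urb, probs[state])  # shift mass toward urban
                probs["urban"] += boost
                probs[state] -= boost
            r, acc = rng.random(), 0.0
            for s in STATES:
                acc += probs[s]
                if r < acc:
                    out[i][j] = s
                    break
    return out
```

    Because urban is absorbing here, repeated Monte Carlo steps trace out the kind of irreversible urbanization trajectory the study reports.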

  9. Towards the estimation of the scattered energy spectra reaching the head of the medical staff during interventional radiology: A Monte Carlo simulation study

    NASA Astrophysics Data System (ADS)

    Zagorska, A.; Bliznakova, K.; Buchakliev, Z.

    2015-09-01

    In 2012, the International Commission on Radiological Protection recommended a reduction of the dose limits to the eye lens for occupational exposure. Recent studies have shown that these limits can be reached in interventional rooms, especially when protective equipment is not used. The aim of this study was to calculate the scattered energy spectra distribution at the level of the operator's head. For this purpose, an in-house developed Monte Carlo-based computer application was used to design computational phantoms (patient and operator) and the acquisition geometry, as well as to simulate photon transport through the designed system. The initial spectra for a 70 kV tube voltage and 8 different filtrations were calculated according to IPEM Report 78. An experimental study was carried out to verify the results of the simulations. The calculated scattered radiation distributions were compared to the initial spectra incident on the patient. Results showed no large difference between the effective energies of the scattered spectra registered in front of the operator's head obtained from simulations of all 8 incident spectra. The experimental results also agreed well with the simulations.

  10. Probabilistic Finite Elements (PFEM) for structural dynamics and fracture mechanics

    NASA Technical Reports Server (NTRS)

    Liu, Wing-Kam; Belytschko, Ted; Mani, A.; Besterfield, G.

    1989-01-01

    The purpose of this work is to develop computationally efficient methodologies for assessing the effects of randomness in loads, material properties, and other aspects of a problem by a finite element analysis. The resulting group of methods is called probabilistic finite elements (PFEM). The overall objective of this work is to develop methodologies whereby the lifetime of a component can be predicted, accounting for the variability in the material and geometry of the component, the loads, and other aspects of the environment; and the range of response expected in a particular scenario can be presented to the analyst in addition to the response itself. Emphasis has been placed on methods which are not statistical in character; that is, they do not involve Monte Carlo simulations. The reason for this choice of direction is that Monte Carlo simulations of complex nonlinear response require a tremendous amount of computation. The focus of efforts so far has been on nonlinear structural dynamics. However, in the continuation of this project, emphasis will be shifted to probabilistic fracture mechanics so that the effect of randomness in crack geometry and material properties can be studied interactively with the effect of random load and environment.

  11. CT-based MCNPX dose calculations for gynecology brachytherapy employing a Henschke applicator

    NASA Astrophysics Data System (ADS)

    Yu, Pei-Chieh; Nien, Hsin-Hua; Tung, Chuan-Jong; Lee, Hsing-Yi; Lee, Chung-Chi; Wu, Ching-Jung; Chao, Tsi-Chian

    2017-11-01

    The purpose of this study is to investigate the dose perturbation caused by the metal ovoid structures of a Henschke applicator using Monte Carlo simulation in a realistic phantom. The Henschke applicator has been widely used for gynecologic patients treated with brachytherapy in Taiwan. However, the commercial brachytherapy planning system (BPS) does not properly evaluate the dose perturbation caused by its metal ovoid structures. In this study, the Monte Carlo N-Particle Transport Code eXtended (MCNPX) was used to evaluate the brachytherapy dose distribution of a Henschke applicator embedded in a Plastic water phantom and in a heterogeneous patient computed tomography (CT) phantom. The MC simulations and film measurements for the Plastic water phantom with the Henschke applicator were in good agreement. However, MC doses with the Henschke applicator deviated significantly (-80.6% ± 7.5%) from those without the applicator. Furthermore, the doses in the heterogeneous patient CT phantom and the Plastic water phantom CT geometries with the Henschke applicator showed discrepancies of 0 to -26.7% (-8.9% ± 13.8%). This study demonstrates that the metal ovoid structures of the Henschke applicator cannot be disregarded in brachytherapy dose calculations.

  12. Monte Carlo Estimation of Absorbed Dose Distributions Obtained from Heterogeneous 106Ru Eye Plaques.

    PubMed

    Zaragoza, Francisco J; Eichmann, Marion; Flühs, Dirk; Sauerwein, Wolfgang; Brualla, Lorenzo

    2017-09-01

    The distribution of the emitter substance in 106Ru eye plaques is usually assumed to be homogeneous for treatment planning purposes. However, this distribution is never homogeneous, and it widely differs from plaque to plaque due to manufacturing factors. By Monte Carlo simulation of radiation transport, we study the absorbed dose distribution obtained from the specific CCA1364 and CCB1256 106Ru plaques, whose actual emitter distributions were measured. The idealized, homogeneous CCA and CCB plaques are also simulated. The largest discrepancy in depth dose distribution observed between the heterogeneous and the homogeneous plaques was 7.9 and 23.7% for the CCA and CCB plaques, respectively. In terms of isodose lines, the line referring to 100% of the reference dose penetrates 0.2 and 1.8 mm deeper in the case of heterogeneous CCA and CCB plaques, respectively, with respect to the homogeneous counterpart. The observed differences in absorbed dose distributions obtained from heterogeneous and homogeneous plaques are clinically irrelevant if the plaques are used with a lateral safety margin of at least 2 mm. However, these differences may be relevant if the plaques are used in eccentric positioning.

  13. Evaluation and optimization of sampling errors for the Monte Carlo Independent Column Approximation

    NASA Astrophysics Data System (ADS)

    Räisänen, Petri; Barker, W. Howard

    2004-07-01

    The Monte Carlo Independent Column Approximation (McICA) method for computing domain-average broadband radiative fluxes is unbiased with respect to the full ICA, but its flux estimates contain conditional random noise. McICA's sampling errors are evaluated here using a global climate model (GCM) dataset and a correlated-k distribution (CKD) radiation scheme. Two approaches to reducing McICA's sampling variance are discussed. The first is to simply restrict all of McICA's samples to cloudy regions. This avoids wasting precious few samples on essentially homogeneous clear skies. Clear-sky fluxes need to be computed separately for this approach, but this is usually done in GCMs for diagnostic purposes anyway. Second, accuracy can be improved by repeatedly sampling, and averaging, those CKD terms with large cloud radiative effects. Although this naturally increases computational costs over the standard CKD model, random errors for fluxes and heating rates are reduced by typically 50% to 60%, for the present radiation code, when the total number of samples is increased by 50%. When both variance reduction techniques are applied simultaneously, globally averaged flux and heating rate random errors are reduced by a factor of ~3.
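    The first variance-reduction idea, computing the clear-sky part deterministically and spending all Monte Carlo samples on cloudy columns, can be sketched schematically; the column data structure and names are invented for illustration, not the radiation scheme's interface.

```python
import random

def domain_flux(columns, clear_flux, rng, n_samples=100):
    """Domain-average flux with Monte Carlo samples restricted to cloudy
    columns; the (homogeneous) clear-sky contribution is computed exactly."""
    cloudy = [c for c in columns if c["cloud_fraction"] > 0]
    if not cloudy:
        return clear_flux
    w = len(cloudy) / len(columns)        # cloudy area fraction
    mc = sum(rng.choice(cloudy)["flux"] for _ in range(n_samples)) / n_samples
    return (1.0 - w) * clear_flux + w * mc
```

    All sampling noise now comes from the cloudy fraction alone; in the limiting case where every cloudy column has the same flux, the estimator becomes exact.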

  14. Distribution network design under demand uncertainty using genetic algorithm and Monte Carlo simulation approach: a case study in pharmaceutical industry

    NASA Astrophysics Data System (ADS)

    Izadi, Arman; Kimiagari, Ali mohammad

    2014-01-01

    Distribution network design, as a strategic decision, has a long-term effect on tactical and operational supply chain management. In this research, the location-allocation problem is studied under demand uncertainty. The purposes of this study were to specify the optimal number and location of distribution centers and to determine the allocation of customer demands to distribution centers. The main feature of this research is solving the model with an unknown demand function, which suits real-world problems. To account for the uncertainty, a set of possible scenarios for customer demands is created based on Monte Carlo simulation. The coefficient of variation of costs is used as a measure of risk, and the most stable structure for the firm's distribution network is defined based on the concept of robust optimization. The best structure is identified using genetic algorithms, and the outcome is a 14% reduction in total supply chain costs. Moreover, it imposes the least cost variation created by fluctuations in customer demands (such as epidemic disease outbreaks in some areas of the country) on the logistical system. It is noteworthy that this research was done in one of the largest pharmaceutical distribution firms in Iran.
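    The scenario-based risk measure described above can be sketched as follows; the cost model (fixed plus unit cost per distribution center) and the demand distribution are hypothetical stand-ins for the full location-allocation model.

```python
import random
import statistics

def scenario_costs(structure, n_scenarios, rng):
    """Total network cost under Monte Carlo demand scenarios.

    structure: list of (fixed_cost, unit_cost) per open distribution center
    (a hypothetical simplification of the full location-allocation model).
    """
    costs = []
    for _ in range(n_scenarios):
        demands = [max(0.0, rng.gauss(100.0, 20.0)) for _ in structure]
        costs.append(sum(f + u * d for (f, u), d in zip(structure, demands)))
    return costs

def cost_cv(costs):
    """Coefficient of variation of costs, used as the risk measure."""
    return statistics.stdev(costs) / statistics.mean(costs)
```

    Comparing two candidate structures with similar mean cost, the one whose cost is dominated by fixed charges shows a lower coefficient of variation, i.e. it is more robust to demand fluctuations.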

  16. Studies on muon tomography for archaeological internal structures scanning

    NASA Astrophysics Data System (ADS)

    Gómez, H.; Carloganu, C.; Gibert, D.; Jacquemier, J.; Karyotakis, Y.; Marteau, J.; Niess, V.; Katsanevas, S.; Tonazzo, A.

    2016-05-01

    Muon tomography is a potential non-invasive technique for scanning internal structures. It already has interesting applications in geophysics and can also be used for archaeological purposes. Muon tomography is based on the measurement of the muon flux after it crosses the structure under study. Differences in the mean density of these structures imply differences in the detected muon rate for a given direction. Based on this principle, Monte Carlo simulations are a useful tool to provide a model of the expected muon rate and angular distribution as a function of the composition of the studied object, helping to estimate the expected number of detected muons and to better understand the experimental results. These simulations depend mainly on the geometry and composition of the studied object and on the modelling of the initial muon flux at the surface. In this work, the potential of muon tomography in archaeology is presented and evaluated with Monte Carlo simulations by estimating the differences in the muon rate due to the presence of internal structures and their composition. The influence of the chosen surface muon model, in terms of energy and angular distributions, on the final result has also been studied.
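    The density-to-rate principle above can be caricatured with a toy transmission model: assume a power-law integral muon spectrum and a constant energy loss per unit opacity (all constants invented for illustration; real surface-flux models are far more detailed).

```python
def transmitted_muon_rate(opacity, k=1.0, gamma=2.0, a=0.5, e_min=1.0):
    """Transmitted muon rate for a toy integral spectrum N(>E) = k * E**-gamma.

    Crossing an opacity X (density times path length) costs roughly a*X of
    energy, so only muons arriving with E > e_min + a*X get through; denser
    or thicker structures therefore give a lower detected rate.
    """
    return k * (e_min + a * opacity) ** (-gamma)
```

    A hidden cavity lowers the opacity along some directions, so those directions show an excess of detected muons relative to the surrounding material, which is the contrast the technique images.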

  17. A systematic Monte Carlo simulation study of the primitive model planar electrical double layer over an extended range of concentrations, electrode charges, cation diameters and valences

    NASA Astrophysics Data System (ADS)

    Valiskó, Mónika; Kristóf, Tamás; Gillespie, Dirk; Boda, Dezső

    2018-02-01

    The purpose of this study is to provide data for the primitive model of the planar electrical double layer, where ions are modeled as charged hard spheres, the solvent as an implicit dielectric background (with dielectric constant ɛ = 78.5), and the electrode as a smooth, uniformly charged, hard wall. We use canonical and grand canonical Monte Carlo simulations to compute the concentration profiles, from which the electric field and electrostatic potential profiles are obtained by solving Poisson's equation. We report data for an extended range of parameters including 1:1, 2:1, and 3:1 electrolytes at concentrations c = 0.0001-1 M near electrodes carrying surface charges up to σ = ±0.5 C m-2. The anions are monovalent with a fixed diameter d- = 3 Å, while the charge and diameter of the cations are varied in the range z+ = 1, 2, 3 and d+ = 1.5, 3, 6, and 9 Å (the temperature is 298.15 K). We provide all the raw data in the supplementary material (ftp://ftp.aip.org/epaps/aip_advances/E-AAIDBI-8-084802).
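    Going from a simulated charge-density profile to field and potential in planar geometry, as described above, is a pair of one-dimensional integrations; this is a simple first-order sketch with assumed variable names, not the paper's numerical scheme.

```python
def field_and_potential(rho, dx, sigma, eps):
    """Integrate Poisson's equation for a planar double layer.

    dE/dx = rho/eps with E(0+) = sigma/eps at the charged wall (Gauss' law);
    the potential phi is then integrated backwards from the bulk, where
    phi = 0 and, by electroneutrality, E = 0.
    """
    E, e = [], sigma / eps
    for r in rho:
        e += r * dx / eps
        E.append(e)
    phi = [0.0] * len(E)
    for i in range(len(E) - 2, -1, -1):
        phi[i] = phi[i + 1] + E[i] * dx
    return E, phi
```

    A quick sanity check: if the diffuse countercharge integrates to exactly -sigma, the field vanishes in the bulk and the potential decays monotonically from the wall for this simple slab profile.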

  18. Validation of columnar CsI x-ray detector responses obtained with hybridMANTIS, a CPU-GPU Monte Carlo code for coupled x-ray, electron, and optical transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, Diksha; Badano, Aldo

    2013-03-15

    Purpose: hybridMANTIS is a Monte Carlo package for modeling indirect x-ray imagers using columnar geometry, based on a hybrid concept that maximizes the utilization of the available CPU and graphics processing unit processors in a workstation. Methods: The authors compare hybridMANTIS x-ray response simulations to previously published MANTIS and experimental data for four cesium iodide scintillator screens. These screens have a variety of reflective and absorptive surfaces with different thicknesses. The authors analyze hybridMANTIS results in terms of the modulation transfer function and calculate the root mean square difference and Swank factors from simulated and experimental results. Results: The comparison suggests that hybridMANTIS better matches the experimental data than MANTIS, especially at high spatial frequencies and for the thicker screens. hybridMANTIS simulations are much faster than MANTIS, with speed-ups of up to a factor of 5260. Conclusions: hybridMANTIS is a useful tool for improved description and optimization of image acquisition stages in medical imaging systems and for modeling the forward problem in iterative reconstruction algorithms.
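    The Swank factor mentioned above is computed from the moments of the simulated pulse-height distribution; a minimal sketch (the list-of-samples interface is an assumption for illustration):

```python
def swank_factor(pulse_heights):
    """Swank information factor I = m1^2 / (m0 * m2), where m_k is the k-th
    moment of the detected pulse-height (optical-photon count) distribution.

    I = 1 for a noiseless detector and decreases as the distribution broadens.
    """
    m0 = len(pulse_heights)
    m1 = sum(pulse_heights)
    m2 = sum(p * p for p in pulse_heights)
    return m1 * m1 / (m0 * m2)
```

    A detector that always emits the same pulse height has I = 1; any spread in the per-event light yield pushes I below 1 and degrades the zero-frequency DQE.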

  19. Inferring soil salinity in a drip irrigation system from multi-configuration EMI measurements using adaptive Markov chain Monte Carlo

    NASA Astrophysics Data System (ADS)

    Zaib Jadoon, Khan; Umer Altaf, Muhammad; McCabe, Matthew Francis; Hoteit, Ibrahim; Muhammad, Nisar; Moghadas, Davood; Weihermüller, Lutz

    2017-10-01

    A sound interpretation of electromagnetic induction (EMI) measurements requires quantifying the optimal model parameters and the uncertainty of a nonlinear inverse problem. For this purpose, an adaptive Bayesian Markov chain Monte Carlo (MCMC) algorithm is used to assess multi-orientation and multi-offset EMI measurements in an agricultural field with non-saline and saline soil. In MCMC the posterior distribution is computed using Bayes' rule. An electromagnetic forward model based on the full solution of Maxwell's equations was used to simulate the apparent electrical conductivity measured with the configurations of the EMI instrument, the CMD Mini-Explorer. Uncertainty in the parameters of a three-layered earth model was investigated using synthetic data. Our results show that in the non-saline scenario the data are less informative about the layer thicknesses than about the layer electrical conductivities, so the thicknesses are difficult to resolve. Application of the proposed MCMC-based inversion to field measurements in a drip irrigation system demonstrates that the model parameters can be estimated better for the saline soil than for the non-saline soil, and provides useful insight into parameter uncertainty for the assessment of the model outputs.
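
    The MCMC machinery can be sketched with a toy forward model standing in for the EMI response: random-walk Metropolis with a crude step-size adaptation during burn-in (the linear model, noise level, and all tuning constants are hypothetical, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy forward model standing in for the EMI response: y = a*x + b.
x = np.linspace(0, 1, 20)
true = np.array([2.0, -1.0])
data = true[0] * x + true[1] + rng.normal(0, 0.05, x.size)

def log_post(theta, sigma=0.05):
    resid = data - (theta[0] * x + theta[1])
    return -0.5 * np.sum(resid**2) / sigma**2      # Gaussian likelihood, flat prior

theta, lp = np.zeros(2), log_post(np.zeros(2))
step, chain, accepted = 0.1, [], 0
for i in range(20000):
    prop = theta + rng.normal(0, step, 2)          # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:       # Metropolis acceptance
        theta, lp = prop, lp_prop
        accepted += 1
    if i < 5000 and i % 200 == 199:                # adapt toward ~30% acceptance
        step *= 1.1 if accepted / (i + 1) > 0.3 else 0.9
    if i >= 5000:                                  # keep post-burn-in samples only
        chain.append(theta.copy())

post = np.array(chain)
print(post.mean(axis=0))    # posterior mean, close to [2.0, -1.0]
```

Freezing the adaptation before samples are collected keeps the kept portion of the chain a valid (non-adaptive) Metropolis chain; full adaptive schemes use diminishing adaptation instead.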

  20. Identification of oil residues in Roman amphorae (Monte Testaccio, Rome): a comparative FTIR spectroscopic study of archeological and artificially aged samples.

    PubMed

    Tarquini, Gabriele; Nunziante Cesaro, Stella; Campanella, Luigi

    2014-01-01

    The application of Fourier transform infrared (FTIR) spectroscopy to the analysis of oil residues in fragments of archeological amphorae (3rd century A.D.) from Monte Testaccio (Rome, Italy) is reported. To check the possibility of revealing the presence of oil residues in archeological pottery using microinvasive and/or non-invasive techniques, different approaches were followed: first, FTIR spectroscopy was used to study oil residues extracted from the Roman amphorae. Second, the presence of oil residues was ascertained by analyzing microamounts of archeological fragments with diffuse reflectance infrared spectroscopy (DRIFT). Finally, external reflection analysis of the ancient sherds was performed without preliminary treatment, showing that oil traces can be detected through the most intense features of the oil spectrum. Incidentally, the existence of carboxylate salts of fatty acids was also observed in the DRIFT and reflectance spectra of the archeological samples, supporting the Roman habit of spreading lime over the spoil heaps. The data collected at each step were compared with results obtained on purposely made replicas. © 2013 Elsevier B.V. All rights reserved.

  1. Comparison of Monte Carlo simulated and measured performance parameters of miniPET scanner

    NASA Astrophysics Data System (ADS)

    Kis, S. A.; Emri, M.; Opposits, G.; Bükki, T.; Valastyán, I.; Hegyesi, Gy.; Imrek, J.; Kalinka, G.; Molnár, J.; Novák, D.; Végh, J.; Kerek, A.; Trón, L.; Balkay, L.

    2007-02-01

    In vivo imaging of small laboratory animals is a valuable tool in the development of new drugs. For this purpose, miniPET, an easy-to-scale, modular small-animal PET camera, has been developed at our institutes. The system has four modules, which makes it possible to rotate the whole detector system around the axis of the field of view. Data collection and image reconstruction are performed using a data acquisition (DAQ) module with an Ethernet communication facility and a computer cluster of commercial PCs. Performance tests were carried out to determine system parameters such as energy resolution, sensitivity and noise equivalent count rate. A modified GEANT4-based GATE Monte Carlo software package was used to simulate PET data analogous to those of the performance measurements. GATE was run on a Linux cluster of 10 processors (64-bit, 3.0 GHz Xeon) controlled by a Sun Grid Engine. This computer cluster reduced the time necessary for the simulations by an order of magnitude. The simulated energy spectra, maximum rate of true coincidences and sensitivity of the camera were in good agreement with the measured parameters.

  2. SU-E-T-397: Evaluation of Planned Dose Distributions by Monte Carlo (0.5%) and Ray Tracing Algorithm for the Spinal Tumors with CyberKnife

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cho, H; Brindle, J; Hepel, J

    2015-06-15

    Purpose: To compare and evaluate dose distributions calculated with the Ray Tracing (RT) and Monte Carlo (MC, 0.5% uncertainty) algorithms for the spinal cord as a critical structure and for the gross target volume and planning target volume. Methods: Twenty-four spinal tumor patients were treated with stereotactic body radiotherapy (SBRT) by CyberKnife in 2013 and 2014. The MC algorithm with 0.5% uncertainty was used to recalculate the dose distribution for each patient's treatment plan using the same beams, beam directions, and monitor units (MUs). Results: The prescription doses are uniformly larger for the MC plans than for the RT plans in all but one case. Dose differences of up to a factor of 1.19 for the 0.25 cc threshold volume and 1.14 for the 1.2 cc threshold volume are observed for the spinal cord. Conclusion: The MC recalculated dose distributions are larger than the original RT calculations for the spinal tumor cases. Based on the accuracy of the MC calculations, more radiation dose than prescribed might be delivered to the tumor targets and spinal cords.

  3. Monte Carlo simulation of secondary neutron dose for scanning proton therapy using FLUKA

    PubMed Central

    Lee, Chaeyeong; Lee, Sangmin; Lee, Seung-Jae; Song, Hankyeol; Kim, Dae-Hyun; Cho, Sungkoo; Jo, Kwanghyun; Han, Youngyih; Chung, Yong Hyun

    2017-01-01

    Proton therapy is a rapidly progressing field for cancer treatment. Globally, many proton therapy facilities are being commissioned or are under construction. Secondary neutrons are an important issue during the commissioning process of a proton therapy facility. The purpose of this study is to model and validate the scanning nozzles of the proton therapy system at Samsung Medical Center (SMC) by Monte Carlo simulation for beam commissioning. After the commissioning, the secondary neutron ambient dose from a proton scanning nozzle (Gantry 1) was simulated and measured. The simulation was performed to evaluate beam properties such as the percent depth dose curve, Bragg peak, and distal fall-off, so that they could be verified against measured data. Using the validated beam nozzle, the secondary neutron ambient dose was simulated and then compared with the ambient dose measured from Gantry 1. We calculated the secondary neutron dose at several different points. We demonstrated the validity of modeling a proton scanning nozzle system to evaluate various parameters using FLUKA. The measured secondary neutron ambient dose showed a tendency similar to the simulation result. This work will add to the knowledge necessary for the development of radiation safety technology for medical particle accelerators. PMID:29045491

  4. A Monte Carlo simulation framework for electron beam dose calculations using Varian phase space files for TrueBeam Linacs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodrigues, Anna; Yin, Fang-Fang; Wu, Qiuwen, E-mail: Qiuwen.Wu@Duke.edu

    2015-05-15

    Purpose: To develop a framework for accurate electron Monte Carlo dose calculation. In this study, comprehensive validations of vendor-provided electron beam phase space files for Varian TrueBeam Linacs against measurement data are presented. Methods: In this framework, the Monte Carlo generated phase space files were provided by the vendor and used as input to the downstream plan-specific simulations including jaws, electron applicators, and water phantom, computed in the EGSnrc environment. The phase space files were generated based on open field commissioning data. A subset of electron energies of 6, 9, 12, 16, and 20 MeV and open and collimated field sizes of 3 × 3, 4 × 4, 5 × 5, 6 × 6, 10 × 10, 15 × 15, 20 × 20, and 25 × 25 cm² were evaluated. Measurements acquired with a CC13 cylindrical ionization chamber and an electron diode detector and simulations from this framework were compared for a water phantom geometry. The evaluation metrics include percent depth dose, orthogonal and diagonal profiles at depths R100, R50, Rp, and Rp+, for standard and extended source-to-surface distances (SSD), as well as cone and cut-out output factors. Results: Agreement for the percent depth dose and orthogonal profiles between measurement and Monte Carlo was generally within 2% or 1 mm. The largest discrepancies were observed within depths of 5 mm from the phantom surface. Differences in field size, penumbra, and flatness for the orthogonal profiles at depths R100, R50, and Rp were within 1 mm, 1 mm, and 2%, respectively. Orthogonal profiles at SSDs of 100 and 120 cm showed the same level of agreement. Cone and cut-out output factors agreed well, with maximum differences within 2.5% for 6 MeV and 1% for all other energies. Cone output factors at extended SSDs of 105, 110, 115, and 120 cm exhibited similar levels of agreement.
Conclusions: We have presented a Monte Carlo simulation framework for electron beam dose calculations for Varian TrueBeam Linacs. Electron beam energies of 6 to 20 MeV for open and collimated field sizes from 3 × 3 to 25 × 25 cm² were studied and the results were compared to the measurement data with excellent agreement. This framework can thus serve as the platform for treatment planning of dynamic electron arc radiotherapy and other advanced dynamic techniques with electron beams.
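
    The depth metrics R100 and R50 used in such validations can be read off a percent-depth-dose curve by interpolation; a sketch with hypothetical numbers (Rp, defined by the tangent through the inflection point of the falloff, is omitted for brevity):

```python
import numpy as np

def depth_dose_metrics(depth_cm, pdd):
    """R100: depth of dose maximum; R50: depth beyond the maximum where the
    dose falls to 50% of its maximum, found by linear interpolation."""
    pdd = np.asarray(pdd, float) / np.max(pdd) * 100.0   # normalise to 100%
    i_max = int(np.argmax(pdd))
    r100 = depth_cm[i_max]
    distal_d, distal_p = depth_cm[i_max:], pdd[i_max:]   # distal falloff only
    j = int(np.argmax(distal_p < 50.0))                  # first point below 50%
    f = (50.0 - distal_p[j - 1]) / (distal_p[j] - distal_p[j - 1])
    r50 = distal_d[j - 1] + f * (distal_d[j] - distal_d[j - 1])
    return r100, r50

# Toy 12 MeV-like electron PDD (hypothetical depths in cm, doses in %):
d = np.array([0, 1, 2, 2.8, 3.5, 4.2, 4.8, 5.4, 6.0])
p = np.array([85, 95, 100, 98, 80, 50, 20, 5, 1])
print(depth_dose_metrics(d, p))
```

Restricting the 50% search to the distal side of the maximum avoids picking up the low-dose build-up region near the surface.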

  5. Preliminary cellular-automata forecast of permit activity from 1998 to 2010, Idaho and Western Montana

    USGS Publications Warehouse

    Raines, G.L.; Zientek, M.L.; Causey, J.D.; Boleneus, D.E.

    2002-01-01

    For public land management in Idaho and western Montana, the U.S. Forest Service (USFS) has requested that the U.S. Geological Survey (USGS) predict where mineral-related activity will occur in the next decade. Cellular automata provide an approach to simulating this human activity. Cellular automata (CA) are defined by an array of cells that evolve by a simple transition rule, the automaton. Based on exploration trends, we assume that future exploration will focus on areas of past exploration. Spatial-temporal information about mineral-related activity (permits issued by the USFS and the Bureau of Land Management (BLM) in the last decade) and spatial information about undiscovered resources provide a basis to calibrate a CA. The CA implemented is a modified annealed voting rule that simulates mineral-related activity with a spatial and temporal resolution of 1 mi² and 1 year, based on activity from 1989 to 1998. For this CA, the state of the economy and exploration technology is assumed constant for the next decade. The calibrated CA reproduces the 1989-1998 permit activity with an agreement of 94%, which increases to 98% within one year. Analysis of the confusion matrix and kappa correlation statistics indicates that the CA underestimates high activity and overestimates low activity. Spatially, the major differences between the actual and calculated activity are that the calculated activity occurs in a slightly larger number of small patches and is slightly more uneven than the actual activity. Using the calibrated CA in a Monte Carlo simulation projecting from 1998 to 2010, an estimate of the probability of mineral activity shows high levels of activity in Boise, Caribou, Elmore, Lincoln, and western Valley counties in Idaho and Beaverhead, Madison, and Stillwater counties in Montana, and generally low activity elsewhere. © 2002 International Association for Mathematical Geology.
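
    The abstract does not spell out the modified rule itself, so as an illustration of the general class it names, here is a stochastic ("annealed") majority-voting CA on a Moore neighbourhood, with all parameters hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

def step(grid, p_flip=0.05):
    """One iteration of a stochastic voting rule: each cell adopts the majority
    state of its 3x3 Moore neighbourhood (self included), with a small
    probability of flipping anyway, the annealing-style noise."""
    # count active cells in each neighbourhood by summing shifted copies
    n = sum(np.roll(np.roll(grid, i, 0), j, 1)
            for i in (-1, 0, 1) for j in (-1, 0, 1))
    new = (n >= 5).astype(int)                      # majority of the 9 cells
    flip = rng.random(grid.shape) < p_flip
    return np.where(flip, 1 - new, new)

# Seed the lattice from hypothetical 'past activity' and iterate.
grid = (rng.random((50, 50)) < 0.4).astype(int)
for _ in range(20):
    grid = step(grid)
print(grid.mean())   # fraction of active cells after 20 iterations
```

Running many noisy realisations of such a rule and averaging the final states is exactly the kind of Monte Carlo projection the abstract describes for 1998-2010.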

  6. Sputnik Planum, Pluto: Composition, Geology, and Origin

    NASA Astrophysics Data System (ADS)

    McKinnon, William B.; Moore, Jeffrey M.; Spencer, John R.; Singer, Kelsi N.; Protopapa, Silvia; Grundy, Will; White, Oliver; Schenk, Paul M.; Olkin, Catherine B.; Young, Leslie; Ennico, Kimberly; Weaver, Harold A.; Stern, S. Alan; New Horizons Geology, Geophysics, and Imaging Theme Team, New Horizons Composition Theme Team

    2016-10-01

    Large-grained nitrogen ice dominates Sputnik Planum (SP, all names herein being informal), both spectroscopically and rheologically, but spectroscopic evidence also exists for a considerable volume fraction of methane ice (Protopapa et al., Icarus, submitted). If true, this considerably broadens the range of possible viscosity contrasts controlling cellular convection within SP (see McKinnon et al., Nature 2016), while potentially complicating buoyancy arguments regarding the numerous "icebergs," especially for those at the western margin where the Hillary and Norgay Montes sources must be predominantly water-ice owing to their great topographic heights (Moore et al., Science 2016). Bergs carried into SP by glacial flow from the Tombaugh Regio uplands to the east must themselves also be erodible at the downwelling margins of convection cells, for otherwise the entire planum surface would become choked, Sargasso-like, over geologic time. Within SP, the cellular pattern loses its distinctive trough-bounded topographic signature towards the northwest, which is apparently not simply a solar incidence angle effect; this transition coincides with a lower surface N2 and greater CH4 abundance. Towards the south, the cellular pattern ceases, presumably due to a shallowing of the nitrogen-rich layer (which decreases the Rayleigh number, or convective drive), and which is consistent with the water-ice basement topography expected from an oblique, basin-forming impact on a sphere. The "stability" of the southern SP surface apparently promotes development of pits by sublimation, but both relict cell boundaries and pit ensembles show evidence of shear flow to the south. Upwelling centers within cells also show photometric evidence for elongation to the south, meaning these cells are not simply plumes, but longitudinal convective rolls. Simple scaling arguments suggest surface velocities on the order of 1 cm/yr to the south. 
This suggests a surface age for southern SP in excess of 10 Myr, but likely consistent with an impactor population deficient in smaller crater-forming bodies (see talk by Singer et al., this meeting).

  7. Integrating Intracellular Dynamics Using CompuCell3D and Bionetsolver: Applications to Multiscale Modelling of Cancer Cell Growth and Invasion

    PubMed Central

    Andasari, Vivi; Roper, Ryan T.; Swat, Maciej H.; Chaplain, Mark A. J.

    2012-01-01

    In this paper we present a multiscale, individual-based simulation environment that integrates CompuCell3D for lattice-based modelling on the cellular level and Bionetsolver for intracellular modelling. CompuCell3D (CC3D) provides an implementation of the lattice-based Cellular Potts Model, or CPM (also known as the Glazier-Graner-Hogeweg, or GGH, model), and a Monte Carlo method based on the Metropolis algorithm for system evolution. The integration of CC3D for cellular systems with Bionetsolver for subcellular systems enables us to develop a multiscale mathematical model and to study the evolution of cell behaviour driven by the dynamics inside the cells, capturing aspects of cell behaviour and interaction that are not accessible to continuum approaches. We then apply this multiscale modelling technique to a model of cancer growth and invasion based on a previously published model of Ramis-Conde et al. (2008), in which individual cell behaviour is driven by a molecular network describing the dynamics of E-cadherin and β-catenin. In that model, which we refer to as the centre-based model, an alternative individual-based modelling technique was used, namely a lattice-free approach. In many respects, the GGH (CPM) methodology and the approach of the centre-based model have the same overall goal: to mimic the behaviours and interactions of biological cells. Although the mathematical foundations and computational implementations of the two approaches are very different, the results of the presented simulations are compatible with each other, suggesting that individual-based approaches offer a natural way of formulating complex multi-cell, multiscale models. The ability to easily reproduce results of one modelling approach using an alternative approach is also essential from a model cross-validation standpoint and helps to identify any modelling artefacts specific to a given computational approach. PMID:22461894
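
    A minimal sketch of the CPM/GGH Metropolis dynamics that CC3D implements: an energy made of boundary terms plus a volume constraint (an area, in this 2-D toy), with copy attempts accepted with probability min(1, exp(−ΔH/T)). The lattice, energies, and parameters are illustrative, not CC3D's API:

```python
import numpy as np

rng = np.random.default_rng(0)
L, T, lam, A_target = 30, 10.0, 1.0, 100

# Two cells (ids 1 and 2) in medium (id 0) on a periodic lattice.
grid = np.zeros((L, L), int)
grid[5:15, 5:15], grid[15:25, 15:25] = 1, 2

def J(a, b):
    """Boundary energy per mismatched neighbour pair (hypothetical value)."""
    return 0.0 if a == b else 16.0

def local_energy(g, i, j):
    s = g[i, j]
    return sum(J(s, g[(i + di) % L, (j + dj) % L])
               for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))

def mc_step(g):
    """One GGH/CPM copy attempt: a random site tries to copy a random
    neighbour's cell id; accepted with probability min(1, exp(-dH/T))."""
    i, j = rng.integers(L, size=2)
    di, dj = ((1, 0), (-1, 0), (0, 1), (0, -1))[rng.integers(4)]
    src, tgt = g[(i + di) % L, (j + dj) % L], g[i, j]
    if src == tgt:
        return
    dH = -local_energy(g, i, j)       # boundary-energy change of the 4 bonds
    g[i, j] = src
    dH += local_energy(g, i, j)
    g[i, j] = tgt                     # restore until the move is accepted
    # area-constraint change lam*(a - A_target)^2 for each non-medium cell
    for cell, delta in ((tgt, -1), (src, +1)):
        if cell != 0:
            a = np.sum(g == cell)
            dH += lam * ((a + delta - A_target) ** 2 - (a - A_target) ** 2)
    if dH <= 0 or rng.random() < np.exp(-dH / T):
        g[i, j] = src

for _ in range(20000):
    mc_step(grid)
print([int(np.sum(grid == c)) for c in (1, 2)])   # areas fluctuate near A_target
```

The quadratic area term is the 2-D analogue of CC3D's volume constraint; it is what keeps cells from dissolving under the boundary-energy dynamics.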

  8. A new multicompartmental reaction-diffusion modeling method links transient membrane attachment of E. coli MinE to E-ring formation.

    PubMed

    Arjunan, Satya Nanda Vel; Tomita, Masaru

    2010-03-01

    Many important cellular processes are regulated by reaction-diffusion (RD) of molecules that takes place both in the cytoplasm and on the membrane. To model and analyze such multicompartmental processes, we developed a lattice-based Monte Carlo method, Spatiocyte, that supports RD in volume and surface compartments at single-molecule resolution. Stochasticity in RD and the excluded volume effect brought about by intracellular molecular crowding, both of which can significantly affect RD and thus cellular processes, are also supported. We verified the method by comparing simulation results for diffusion and for irreversible and reversible reactions with the predicted analytical and best available numerical solutions. Moreover, to directly compare the localization patterns of molecules in fluorescence microscopy images with simulation, we devised a visualization method that mimics the microphotography process by showing the trajectory of simulated molecules averaged according to the camera exposure time. In the rod-shaped bacterium Escherichia coli, the division site is suppressed at the cell poles by periodic pole-to-pole oscillations of the Min proteins (MinC, MinD and MinE) arising from carefully orchestrated RD in both cytoplasm and membrane compartments. Using Spatiocyte we could model and reproduce the in vivo MinDE localization dynamics by accounting for the previously reported properties of MinE. Our results suggest that the MinE ring, which is essential in preventing polar septation, is largely composed of MinE that remains transiently attached to the membrane independently after being recruited by MinD. Overall, Spatiocyte allows simulation and visualization of complex spatial and reaction-diffusion-mediated cellular processes in volumes and on surfaces. As we showed, it can potentially provide mechanistic insights that are otherwise difficult to obtain experimentally.
The online version of this article (doi:10.1007/s11693-009-9047-2) contains supplementary material, which is available to authorized users.
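
    The diffusion verification mentioned above can be illustrated with the simplest case any lattice-based RD method must reproduce: a random walk on a cubic lattice, whose hop length l and hop interval τ fix the diffusion coefficient D = l²/(6τ) in 3-D. All parameters here are hypothetical, not Spatiocyte's:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical lattice parameters: voxel size l [m], hop interval tau [s].
l, tau, n_steps, n_mol = 10e-9, 1e-7, 1000, 2000

# Six nearest-neighbour moves on a cubic lattice.
moves = l * np.array([[1, 0, 0], [-1, 0, 0], [0, 1, 0],
                      [0, -1, 0], [0, 0, 1], [0, 0, -1]])
pos = np.zeros((n_mol, 3))
for _ in range(n_steps):
    pos += moves[rng.integers(6, size=n_mol)]     # one random hop per molecule

# Recover D from the mean squared displacement: MSD = 6*D*t in 3-D.
msd = np.mean(np.sum(pos**2, axis=1))
D_est = msd / (6 * n_steps * tau)
print(D_est, l**2 / (6 * tau))    # estimated vs. nominal diffusion coefficient
```

Agreement between the estimated and nominal D is the baseline check before adding reactions, crowding, or surface compartments.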

  9. Recent advances and future prospects for Monte Carlo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B

    2010-01-01

    The history of Monte Carlo methods is closely linked to that of computers: the first known Monte Carlo program was written in 1947 for the ENIAC; a pre-release of the first Fortran compiler was used for Monte Carlo in 1957; Monte Carlo codes were adapted to vector computers in the 1980s, clusters and parallel computers in the 1990s, and teraflop systems in the 2000s. Recent advances include hierarchical parallelism, combining threaded calculations on multicore processors with message-passing among different nodes. With the advances in computing, Monte Carlo codes have evolved with new capabilities and new ways of use. Production codes such as MCNP, MVP, MONK, TRIPOLI and SCALE are now 20-30 years old (or more) and are very rich in advanced features. The former 'method of last resort' has now become the first choice for many applications. Calculations are now routinely performed on office computers, not just on supercomputers. Current research and development efforts are investigating the use of Monte Carlo methods on FPGAs, GPUs, and many-core processors. Other far-reaching research is exploring ways to adapt Monte Carlo methods to future exaflop systems that may have 1M or more concurrent computational processes.

  10. Development of a Space Radiation Monte Carlo Computer Simulation

    NASA Technical Reports Server (NTRS)

    Pinsky, Lawrence S.

    1997-01-01

    The ultimate purpose of this effort is to undertake the development of a computer simulation of the radiation environment encountered in spacecraft which is based upon the Monte Carlo technique. The current plan is to adapt and modify a Monte Carlo calculation code known as FLUKA, which is presently used in high energy and heavy ion physics, to simulate the radiation environment present in spacecraft during missions. The initial effort would be directed towards modeling the MIR and Space Shuttle environments, but the long range goal is to develop a program for the accurate prediction of the radiation environment likely to be encountered on future planned endeavors such as the Space Station, a Lunar Return Mission, or a Mars Mission. The longer the mission, especially those which will not have the shielding protection of the earth's magnetic field, the more critical the radiation threat will be. The ultimate goal of this research is to produce a code that will be useful to mission planners and engineers who need to have detailed projections of radiation exposures at specified locations within the spacecraft and for either specific times during the mission or integrated over the entire mission. In concert with the development of the simulation, it is desired to integrate it with a state-of-the-art interactive 3-D graphics-capable analysis package known as ROOT, to allow easy investigation and visualization of the results. The efforts reported on here include the initial development of the program and the demonstration of the efficacy of the technique through a model simulation of the MIR environment. This information was used to write a proposal to obtain follow-on permanent funding for this project.

  11. Assessing the Clinical Impact of Approximations in Analytical Dose Calculations for Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schuemann, Jan, E-mail: jschuemann@mgh.harvard.edu; Giantsoudi, Drosoula; Grassberger, Clemens

    2015-08-01

    Purpose: To assess the impact of approximations in current analytical dose calculation methods (ADCs) on tumor control probability (TCP) in proton therapy. Methods: Dose distributions planned with ADC were compared with delivered dose distributions as determined by Monte Carlo simulations. A total of 50 patients were investigated in this analysis, with 10 patients per site for 5 treatment sites (head and neck, lung, breast, prostate, liver). Differences were evaluated using dosimetric indices based on a dose-volume histogram analysis, a γ-index analysis, and estimations of TCP. Results: We found that ADC overestimated the target doses on average by 1% to 2% for all patients considered. The mean dose, D95, D50, and D02 (the dose values covering 95%, 50%, and 2% of the target volume, respectively) were predicted within 5% of the delivered dose. The γ-index passing rate for target volumes was above 96% for a 3%/3 mm criterion. Differences in TCP were up to 2%, 2.5%, 6%, 6.5%, and 11% for liver, breast, prostate, head and neck, and lung patients, respectively. Differences in normal tissue complication probabilities for the bladder and anterior rectum of prostate patients were less than 3%. Conclusion: Our results indicate that current dose calculation algorithms lead to underdosage of the target by as much as 5%, resulting in differences in TCP of up to 11%. To ensure full target coverage, advanced dose calculation methods like Monte Carlo simulations may be necessary in proton therapy. Monte Carlo simulations may also be required to avoid biases resulting from systematic discrepancies in calculated dose distributions in clinical trials comparing proton therapy with conventional radiation therapy.
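
    The dosimetric indices D95, D50 and D02 are read off the cumulative dose-volume histogram: Dx is the minimum dose received by the hottest x% of the target volume. A sketch assuming equal voxel volumes and hypothetical doses:

```python
import numpy as np

def dvh_dose_at_volume(dose, volume_fraction):
    """Dx: the minimum dose received by the 'hottest' volume_fraction of the
    target (e.g. 0.95 gives D95), assuming equal voxel volumes."""
    d = np.sort(np.asarray(dose, float))[::-1]            # hottest voxels first
    k = int(np.ceil(volume_fraction * d.size)) - 1
    return d[k]

# Hypothetical target voxel doses (Gy), roughly normal around 70 Gy:
rng = np.random.default_rng(3)
dose = rng.normal(70.0, 1.5, 10000)
for frac, name in ((0.95, "D95"), (0.50, "D50"), (0.02, "D02")):
    print(name, round(dvh_dose_at_volume(dose, frac), 1))
```

By construction D95 ≤ D50 ≤ D02, which is a handy sanity check when validating a DVH implementation.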

  12. TU-H-CAMPUS-IeP1-01: Bias and Computational Efficiency of Variance Reduction Methods for the Monte Carlo Simulation of Imaging Detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, D; Badano, A; Sempau, J

    Purpose: Variance reduction techniques (VRTs) are employed in Monte Carlo simulations to obtain estimates with reduced statistical uncertainty for a given simulation time. In this work, we study the bias and efficiency of a VRT for estimating the response of imaging detectors. Methods: We implemented directed sampling (DS), preferentially directing a fraction of emitted optical photons directly towards the detector by altering the isotropic emission model. The weight of each optical photon is appropriately modified to keep the simulation estimates unbiased. We use a Monte Carlo tool called fastDETECT2 (part of the hybridMANTIS open-source package) for optical transport, modified for the VRT. The weight of each photon is calculated as the ratio of the original probability (no VRT) to the new probability for a particular direction. For our analysis of bias and efficiency, we use pulse-height spectra, point response functions, and Swank factors. We obtain results for a variety of cases including analog (no VRT, isotropic distribution) and DS with 0.2 and 0.8 of the optical photons directed towards the sensor plane. We used 10,000 25-keV primaries. Results: The Swank factor for all cases in our simplified model converged quickly (within the first 100 primaries) to a stable value of 0.9. The root mean square error per pixel of the point response function between the analog and DS VRT cases was approximately 5e-4. Conclusion: Our preliminary results suggest that DS VRT does not affect the estimate of the mean for the Swank factor. Our findings indicate that it may be possible to design VRTs for imaging detector simulations that increase computational efficiency without introducing bias.
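
    The weight bookkeeping described (w = p_analog/p_biased) can be sketched in one dimension of direction space, u = cos θ: with probability f the photon is forced into the cone u ≥ c0 facing the sensor, and dividing out the mixture density keeps the estimate unbiased. All parameters are hypothetical, not fastDETECT2's:

```python
import numpy as np

rng = np.random.default_rng(5)

# Bias a fraction f of emissions into the cone u = cos(theta) >= c0 and
# weight each photon by w = p_analog(u) / p_biased(u).
f, c0, n = 0.8, 0.9, 200000
biased = rng.random(n) < f
u = np.where(biased,
             c0 + (1.0 - c0) * rng.random(n),     # forced: uniform in [c0, 1]
             -1.0 + 2.0 * rng.random(n))          # analog: uniform in [-1, 1]

p_analog = 0.5                                    # isotropic pdf of u
p_biased = (1 - f) * 0.5 + np.where(u >= c0, f / (1.0 - c0), 0.0)
w = p_analog / p_biased

# Unbiased estimate of the probability that a photon enters the cone.
hit = (u >= c0).astype(float)
print(np.mean(w * hit), (1.0 - c0) / 2.0)   # weighted estimate vs. exact 0.05
```

Far more samples land inside the cone than in the analog case, but each carries a proportionally smaller weight, which is exactly the variance-vs-bias trade the abstract is testing.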

  13. SU-E-T-525: Ionization Chamber Perturbation in Flattening Filter Free Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Czarnecki, D; Voigts-Rhetz, P von; Zink, K

    2015-06-15

    Purpose: Changing the characteristics of a photon beam by mechanically removing the flattening filter may impact the dose response of ionization chambers. Thus, perturbation factors of cylindrical ionization chambers in conventional and flattening-filter-free photon beams were calculated by Monte Carlo simulations. Methods: The EGSnrc/BEAMnrc code system was used for all Monte Carlo calculations. BEAMnrc models of nine different linear accelerators with and without flattening filter were used to create realistic photon sources. Monte Carlo calculations to determine the fluence perturbations due to the presence of the chamber's components, the different material of the sensitive volume (air instead of water), and the volume effect were performed with the user code egs-chamber. Results: Stem, central electrode, wall, density and volume perturbation factors for linear accelerators with and without flattening filter were calculated as a function of the beam quality specifier TPR20/10. No bias between the perturbation factors as a function of TPR20/10 for flattening-filter-free beams and conventional linear accelerators could be observed for the perturbations caused by the components of the ionization chamber or the sensitive volume. Conclusion: The results indicate that the well-known small bias in the beam quality correction factor as a function of TPR20/10 between flattening-filter-free and conventional linear accelerators is not caused by the geometry of the detector but rather by the material of the sensitive volume. This suggests that the bias for flattening-filter-free photon fields is caused solely by the different material of the sensitive volume (air instead of water).

  14. Analysis of uncertainties in Monte Carlo simulated organ dose for chest CT

    NASA Astrophysics Data System (ADS)

    Muryn, John S.; Morgan, Ashraf G.; Segars, W. P.; Liptak, Chris L.; Dong, Frank F.; Primak, Andrew N.; Li, Xiang

    2015-03-01

    In Monte Carlo simulation of organ dose for a chest CT scan, many input parameters are required (e.g., the half-value layer of the x-ray energy spectrum, the effective beam width, and the anatomical coverage of the scan). The input parameter values are provided by the manufacturer, measured experimentally, or determined based on typical clinical practice. The goal of this study was to assess the uncertainties in Monte Carlo simulated organ dose that result from using input parameter values that deviate from the truth (clinical reality). Organ dose from a chest CT scan was simulated for a standard-size female phantom using a set of reference input parameter values (treated as the truth). To emulate the situation in which the input parameter values used by the researcher may deviate from the truth, additional simulations were performed in which errors were purposefully introduced into the input parameter values, and their effects on organ dose per CTDIvol were analyzed. Our study showed that when errors in half-value layer were within ±0.5 mm Al, the errors in organ dose per CTDIvol were less than 6%. Errors in effective beam width of up to 3 mm had a negligible effect (<2.5%) on organ dose. In contrast, when the assumed anatomical center of the patient deviated from the true anatomical center by 5 cm, organ dose errors of up to 20% were introduced. Lastly, when the assumed extra scan length was 4 cm longer than the true value, dose errors of up to 160% were found. The results answer the important question of how accurately each input parameter needs to be determined in order to obtain accurate organ dose results.

  15. Comparative Dosimetric Estimates of a 25 keV Electron Micro-beam with three Monte Carlo Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mainardi, Enrico; Donahue, Richard J.; Blakely, Eleanor A.

    2002-09-11

    The calculations presented compare the performance of the three Monte Carlo codes PENELOPE-1999, MCNP-4C and PITS for the evaluation of dose profiles from a 25 keV electron micro-beam traversing individual cells. The overall model of a cell is a water cylinder, equivalent for the three codes, but with a different internal scoring geometry: hollow cylinders for PENELOPE and MCNP, whereas spheres are used for the PITS code. A cylindrical cell geometry with scoring volumes shaped as hollow cylinders was initially selected for PENELOPE and MCNP because of its superior simulation of the actual shape and dimensions of a cell and its improved computer-time efficiency compared to spherical internal volumes. Some of the transfer points and energy transfers that constitute a radiation track may actually fall in the space between spheres, which would be outside the spherical scoring volume. This internal geometry, along with the PENELOPE algorithm, drastically reduced the computer time for this code compared with event-by-event Monte Carlo codes like PITS. This preliminary work has been important for addressing dosimetric estimates at low electron energies. It demonstrates that codes like PENELOPE can be used for dose evaluation even with such small geometries and low energies involved, which are far below the normal use for which the code was created. Further work (initiated in Summer 2002) is still needed, however, to create a user code for PENELOPE that allows uniform comparison of exact cell geometries, integral volumes and also microdosimetric scoring quantities, a field where track-structure codes like PITS, written for this purpose, are believed to be superior.

  16. SU-F-SPS-11: The Dosimetric Comparison of Truebeam 2.0 and Cyberknife M6 Treatment Plans for Brain SRS Treatment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mabhouti, H; Sanli, E; Cebe, M

    Purpose: Brain stereotactic radiosurgery involves the use of precisely directed, single-session radiation to create a desired radiobiologic response within the brain target with acceptably minimal effects on surrounding structures or tissues. In this study, a dosimetric comparison of Truebeam 2.0 and Cyberknife M6 treatment plans was made. Methods: For the Truebeam 2.0 machine, treatment planning was done using a 2 full-arc VMAT technique with a 6 FFF beam on a CT scan of a Rando phantom, simulating stereotactic treatment of one brain metastasis. The dose distribution was calculated using the Eclipse treatment planning system with the Acuros XB algorithm. Treatment planning for the same target was also done for the Cyberknife M6 machine with the Multiplan treatment planning system using a Monte Carlo algorithm. Using the same film batch, the net OD to dose calibration curve was obtained on both machines by delivering 0-800 cGy. Films were scanned 48 hours after irradiation using an Epson 1000XL flatbed scanner. Dose distributions were measured using EBT3 film dosimeters. The measured and calculated doses were compared. Results: The dose distributions in the target and 2 cm beyond the target edge were calculated on the TPSs and measured using EBT3 film. For the Cyberknife plans, the gamma analysis passing rates between measured and calculated dose distributions were 99.2% and 96.7% for the target and the peripheral region of the target, respectively. For the Truebeam plans, the gamma analysis passing rates were 99.1% and 95.5% for the target and the peripheral region of the target, respectively. Conclusion: Although the target dose distribution was calculated accurately by both the Acuros XB and Monte Carlo algorithms, the Monte Carlo algorithm predicts the dose distribution around the peripheral region of the target more accurately than the Acuros algorithm.

  17. Perfusion CT of the Brain and Liver and of Lung Tumors: Use of Monte Carlo Simulation for Patient Dose Estimation for Examinations With a Cone-Beam 320-MDCT Scanner.

    PubMed

    Cros, Maria; Geleijns, Jacob; Joemai, Raoul M S; Salvadó, Marçal

    2016-01-01

    The purpose of this study was to estimate the patient dose from perfusion CT examinations of the brain, lung tumors, and the liver on a cone-beam 320-MDCT scanner using a Monte Carlo simulation and the recommendations of the International Commission on Radiological Protection (ICRP). A Monte Carlo simulation based on the Electron Gamma Shower Version 4 package code was used to calculate organ doses and the effective dose in the reference computational phantoms for an adult man and adult woman as published by the ICRP. Three perfusion CT acquisition protocols--brain, lung tumor, and liver perfusion--were evaluated. Additionally, dose assessments were performed for the skin and for the eye lens. Conversion factors were obtained to estimate effective doses and organ doses from the volume CT dose index and dose-length product. The sex-averaged effective doses were approximately 4 mSv for perfusion CT of the brain and were between 23 and 26 mSv for the perfusion CT body protocols. The eye lens dose from the brain perfusion CT examination was approximately 153 mGy. The sex-averaged peak entrance skin dose (ESD) was 255 mGy for the brain perfusion CT studies, 157 mGy for the lung tumor perfusion CT studies, and 172 mGy for the liver perfusion CT studies. The perfusion CT protocols for imaging the brain, lung tumors, and the liver performed on a 320-MDCT scanner yielded patient doses that are safely below the threshold doses for deterministic effects. The eye lens dose, peak ESD, and effective doses can be estimated for other clinical perfusion CT examinations from the conversion factors that were derived in this study.
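    The conversion-factor idea in this record reduces to a one-line relation, E ≈ k · DLP (and analogously for organ doses versus CTDIvol). A minimal sketch, using a hypothetical k value rather than the factors derived in the study:

```python
def effective_dose_msv(dlp_mgy_cm, k_msv_per_mgy_cm):
    """Effective dose estimated from the dose-length product via a scanner-
    and protocol-specific conversion factor: E = k * DLP."""
    return k_msv_per_mgy_cm * dlp_mgy_cm

# Hypothetical numbers for illustration only (not the paper's values):
E = effective_dose_msv(dlp_mgy_cm=2000.0, k_msv_per_mgy_cm=0.002)  # ~4 mSv
```

    The value of such factors is that DLP is displayed on the scanner console, so a dose estimate for a new perfusion protocol needs no new Monte Carlo run once k has been tabulated.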

  18. Measuring and monitoring KIPT Neutron Source Facility Reactivity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cao, Yan; Gohar, Yousry; Zhong, Zhaopeng

    2015-08-01

    Argonne National Laboratory (ANL) of USA and Kharkov Institute of Physics and Technology (KIPT) of Ukraine have been collaborating on developing and constructing a neutron source facility at Kharkov, Ukraine. The facility consists of an accelerator-driven subcritical system. The accelerator has a 100 kW electron beam using 100 MeV electrons. The subcritical assembly has keff less than 0.98. To ensure the safe operation of this neutron source facility, the reactivity of the subcritical core has to be accurately determined and continuously monitored. A technique that combines the area-ratio method and the flux-to-current ratio method is proposed to determine the reactivity of the KIPT subcritical assembly at various conditions. In particular, the area-ratio method can determine the absolute reactivity of the subcritical assembly in units of dollars by performing pulsed-neutron experiments. It provides reference reactivities for the flux-to-current ratio method to track and monitor reactivity deviations from the reference state while the facility is in other operation modes. Monte Carlo simulations are performed to simulate both methods using the numerical model of the KIPT subcritical assembly. It is found that the reactivities obtained from both the area-ratio method and the flux-to-current ratio method depend spatially on the neutron detector locations and types. Numerical simulations also suggest optimal neutron detector locations to minimize the spatial effects in the flux-to-current ratio method. The spatial correction factors are calculated using Monte Carlo methods for both measuring methods at the selected neutron detector locations. Monte Carlo simulations are also performed to verify the accuracy of the flux-to-current ratio method in monitoring the reactivity swing during a fuel burnup cycle.
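    The area-ratio (Sjöstrand) method mentioned above infers reactivity in dollars from a pulsed-neutron die-away histogram: ρ($) ≈ −A_prompt / A_delayed, the ratio of the prompt-decay area above the delayed-neutron level to the area of that level. A minimal sketch on synthetic count data; the decay constant, background level, and bin width are invented, and no spatial correction is applied:

```python
import math

def reactivity_dollars(counts, dt, delayed_level):
    """Sjostrand area-ratio method: rho($) = -A_prompt / A_delayed, where
    A_delayed is the flat delayed-neutron level integrated over the window
    and A_prompt is the excess counts above that level."""
    total_area = sum(c * dt for c in counts)
    delayed_area = delayed_level * dt * len(counts)
    prompt_area = total_area - delayed_area
    return -prompt_area / delayed_area

# Synthetic histogram: exponentially decaying prompt burst on a flat
# delayed-neutron background of 10 counts per bin.
counts = [10 + 1000 * math.exp(-t / 5.0) for t in range(100)]
rho = reactivity_dollars(counts, dt=1.0, delayed_level=10.0)
```

    In practice the delayed level is estimated from the late part of the histogram, and the spatial correction factors computed in the study would multiply this raw ratio.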

  19. An approach to design a 90Sr radioisotope thermoelectric generator using analytical and Monte Carlo methods with ANSYS, COMSOL, and MCNP.

    PubMed

    Khajepour, Abolhasan; Rahmani, Faezeh

    2017-01-01

    In this study, a 90Sr radioisotope thermoelectric generator (RTG) with milliwatt-level power was designed to operate in a specified temperature range (300-312 K). For this purpose, a combination of analytical and Monte Carlo methods with ANSYS and COMSOL software as well as the MCNP code was used. The designed RTG contains 90Sr as a radioisotope heat source (RHS) and 127 coupled thermoelectric modules (TEMs) based on bismuth telluride. Kapton (2.45 mm in thickness) and Cryotherm sheets (0.78 mm in thickness) were selected as the thermal insulators of the RHS, and a stainless steel container was used as the generator chamber. The initial design of the RHS geometry was performed according to the amount of radioactive material (strontium titanate) as well as heat transfer calculations and mechanical strength considerations. According to the Monte Carlo simulation performed with the MCNP code, approximately 0.35 kCi of 90Sr is sufficient to generate the required heat power in the RHS. To determine the optimal design of the RTG, the temperature distribution as well as the dissipated heat and the input power to the modules were calculated in different parts of the generator using the ANSYS software. Output voltage according to the temperature distribution on the TEMs was calculated using COMSOL. The dimensions of the RHS and heat insulator were optimized to match the average temperature of the hot plate of the TEMs to the specified hot-side temperature. The designed RTG generates 8 mW of power with an efficiency of 1%. This combined-method approach can be used for the precise design of various types of RTGs. Copyright © 2016 Elsevier Ltd. All rights reserved.
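    The sizing step, activity → thermal power, can be checked independently of MCNP with the back-of-envelope relation P = A · Ē, where Ē is the mean energy deposited per decay. The sketch below assumes Ē ≈ 1.1 MeV per 90Sr decay (90Sr plus its equilibrium daughter 90Y, an approximate textbook value); it is a rough estimate that ignores bremsstrahlung escape and other losses, not the paper's MCNP result.

```python
CI_TO_BQ = 3.7e10      # becquerels per curie
MEV_TO_J = 1.602e-13   # joules per MeV

def decay_heat_watts(activity_ci, mean_energy_mev):
    """Thermal power of a radioisotope heat source: P = A * E_mean."""
    return activity_ci * CI_TO_BQ * mean_energy_mev * MEV_TO_J

# 0.35 kCi of Sr-90/Y-90, assumed ~1.1 MeV mean energy per decay.
p_thermal = decay_heat_watts(activity_ci=350.0, mean_energy_mev=1.1)  # watt-scale
```

    A transport code such as MCNP is still needed for the real design, since the fraction of decay energy actually deposited in the RHS depends on its geometry and materials.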

  20. A High-Resolution Spatially Explicit Monte-Carlo Simulation Approach to Commercial and Residential Electricity and Water Demand Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morton, April M; McManamay, Ryan A; Nagle, Nicholas N

    Abstract As urban areas continue to grow and evolve in a world of increasing environmental awareness, the need for high-resolution, spatially explicit estimates of energy and water demand has become increasingly important. Though current modeling efforts mark significant progress in the effort to better understand the spatial distribution of energy and water consumption, many are provided at a coarse spatial resolution or rely on techniques that depend on detailed region-specific data sources that are not publicly available for many parts of the U.S. Furthermore, many existing methods do not account for errors in input data sources and may therefore not accurately reflect inherent uncertainties in model outputs. We propose an alternative and more flexible Monte-Carlo simulation approach to high-resolution residential and commercial electricity and water consumption modeling that relies primarily on publicly available data sources. The method's flexible data requirements and statistical framework ensure that the model is both applicable to a wide range of regions and reflective of uncertainties in model results. Key words: Energy Modeling, Water Modeling, Monte-Carlo Simulation, Uncertainty Quantification Acknowledgment This manuscript has been authored by employees of UT-Battelle, LLC, under contract DE-AC05-00OR22725 with the U.S. Department of Energy. Accordingly, the United States Government retains and the publisher, by accepting the article for publication, acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for United States Government purposes.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Y M; Bush, K; Han, B

    Purpose: Accurate and fast dose calculation is a prerequisite of precision radiation therapy in modern photon and particle therapy. While Monte Carlo (MC) dose calculation provides high dosimetric accuracy, the drastically increased computational time hinders its routine use. Deterministic dose calculation methods are fast, but problematic in the presence of tissue density inhomogeneity. We leverage the useful features of deterministic methods and MC to develop a hybrid dose calculation platform with autonomous utilization of MC and deterministic calculation depending on the local geometry, for optimal accuracy and speed. Methods: Our platform utilizes a Geant4 based “localized Monte Carlo” (LMC) method that isolates MC dose calculations only to volumes that have potential for dosimetric inaccuracy. In our approach, additional structures are created encompassing heterogeneous volumes. Deterministic methods calculate dose and energy fluence up to the volume surfaces, where the energy fluence distribution is sampled into discrete histories and transported using MC. Histories exiting the volume are converted back into energy fluence and transported deterministically. By matching boundary conditions at both interfaces, the deterministic dose calculation accounts for dose perturbations “downstream” of localized heterogeneities. Hybrid dose calculation was performed for water and anthropomorphic phantoms. Results: We achieved <1% agreement between deterministic and MC calculations in the water benchmark for photon and proton beams, and dose differences of 2%–15% could be observed in heterogeneous phantoms. The saving in computational time (a factor of ∼4–7 compared to a full Monte Carlo dose calculation) was found to be approximately proportional to the volume of the heterogeneous region.
Conclusion: Our hybrid dose calculation approach takes advantage of the computational efficiency of the deterministic method and the accuracy of MC, providing a practical tool for high-performance dose calculation in modern RT. The approach is generalizable to all modalities where heterogeneities play a large role, notably particle therapy.
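    The boundary-matching idea can be illustrated in one dimension with an absorption-only toy model: attenuate the fluence deterministically up to the heterogeneous slab, transport discrete histories through the slab with Monte Carlo, and continue deterministically downstream. All coefficients and lengths below are invented; the real platform couples Geant4 to a deterministic solver in 3D, including scatter.

```python
import math
import random

def transmitted_fluence(mu_water, mu_slab, slab_len, pre_len, post_len,
                        n_histories=20_000, seed=7):
    """Hybrid 1D transport: deterministic attenuation outside the slab,
    Monte Carlo survival sampling inside it (absorption only, no scatter)."""
    rng = random.Random(seed)
    fluence_in = math.exp(-mu_water * pre_len)           # deterministic, upstream
    p_survive = math.exp(-mu_slab * slab_len)
    survivors = sum(1 for _ in range(n_histories) if rng.random() < p_survive)
    fluence_out = fluence_in * survivors / n_histories   # MC, inside the slab
    return fluence_out * math.exp(-mu_water * post_len)  # deterministic, downstream

hybrid = transmitted_fluence(mu_water=0.2, mu_slab=0.5, slab_len=2.0,
                             pre_len=3.0, post_len=3.0)
analytic = math.exp(-0.2 * 6.0 - 0.5 * 2.0)  # fully analytic reference
```

    In this simplified setting the hybrid result converges to the analytic answer, which is the 1D analogue of matching energy fluence at the volume surfaces so that downstream deterministic transport stays consistent.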

  2. Proton radiography and fluoroscopy of lung tumors: A Monte Carlo study using patient-specific 4DCT phantoms

    PubMed Central

    Han, Bin; Xu, X. George; Chen, George T. Y.

    2011-01-01

    Purpose: Monte Carlo methods are used to simulate and optimize a time-resolved proton range telescope (TRRT) for localization of intrafractional and interfractional motions of lung tumors and for quantification of proton range variations. Methods: The Monte Carlo N-Particle eXtended (MCNPX) code with a particle tracking feature was employed to evaluate the TRRT performance, especially in visualizing and quantifying proton range variations during respiration. Protons of 230 MeV were tracked one by one as they passed through position detectors, a patient 4DCT phantom, and finally scintillator detectors that measured residual ranges. The energy response of the scintillator telescope was investigated. Mass density and elemental composition of tissues were defined for the 4DCT data. Results: Proton water equivalent length (WEL) was deduced by a reconstruction algorithm that incorporates linear proton tracks and lateral spatial discrimination to improve the image quality. 4DCT data for three patients were used to visualize and measure tumor motion and WEL variations. The tumor trajectories extracted from the WEL map were found to be within ∼1 mm agreement with direct 4DCT measurement. Quantitative WEL variation studies showed that the proton radiograph is a good representation of WEL changes from the entrance to the distal side of the target. Conclusions: MCNPX simulation results showed that TRRT can accurately track the motion of the tumor and detect the WEL variations. Image quality was optimized by choosing the proton energy, testing parameters of the image reconstruction algorithm, and comparing to ground truth 4DCT. Future work will demonstrate the feasibility of using time-resolved proton radiography as an imaging tool for proton treatments of lung tumors. PMID:21626923
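    The WEL reconstruction in this record integrates relative stopping power along each proton track; for a straight track through voxels of step dx, WEL = Σ s_i · dx. A minimal sketch (the voxel values are hypothetical, and a real reconstruction also handles lateral discrimination and residual-range measurements):

```python
def water_equivalent_length(rel_stopping_powers, step_cm):
    """WEL of a straight track: sum of voxel stopping powers relative to
    water, times the step length (WEL = sum_i s_i * dx)."""
    return sum(rel_stopping_powers) * step_cm

# Hypothetical voxels along one ray through a 4DCT phantom: air ~0,
# lung ~0.3, soft tissue ~1.0, tumor ~1.05 (relative stopping powers).
wel = water_equivalent_length([0.0, 0.3, 0.3, 1.0, 1.05, 1.0], step_cm=0.2)
```

    Repeating this ray sum over a detector grid and over the 4DCT respiratory phases yields the WEL maps from which the tumor trajectory is extracted.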

  3. Quantitative microbial risk assessment for Escherichia coli O157:H7, salmonella, and Listeria monocytogenes in leafy green vegetables consumed at salad bars.

    PubMed

    Franz, E; Tromp, S O; Rijgersberg, H; van der Fels-Klerx, H J

    2010-02-01

    Fresh vegetables are increasingly recognized as a source of foodborne outbreaks in many parts of the world. The purpose of this study was to conduct a quantitative microbial risk assessment for Escherichia coli O157:H7, Salmonella, and Listeria monocytogenes infection from consumption of leafy green vegetables in salads from salad bars in The Netherlands. Pathogen growth was modeled in Aladin (Agro Logistics Analysis and Design Instrument) using time-temperature profiles in the chilled supply chain and one particular restaurant with a salad bar. A second-order Monte Carlo risk assessment model was constructed (using @Risk) to estimate the public health effects. The temperature in the studied cold chain was well controlled below 5 degrees C. Growth of E. coli O157:H7 and Salmonella was minimal (17 and 15%, respectively). Growth of L. monocytogenes was considerably greater (194%). Based on first-order Monte Carlo simulations, the average number of cases per year in The Netherlands associated with the consumption of leafy greens in salads from salad bars was 166, 187, and 0.3 for E. coli O157:H7, Salmonella, and L. monocytogenes, respectively. The ranges of the average number of annual cases as estimated by second-order Monte Carlo simulation (with prevalence and number of visitors as uncertain variables) were 42 to 551 for E. coli O157:H7, 81 to 281 for Salmonella, and 0.1 to 0.9 for L. monocytogenes. This study integrated modeling of pathogen growth in the supply chain of fresh leafy vegetables destined for restaurant salad bars, using software designed for logistics modeling and design, with modeling of the public health effects using probabilistic risk assessment software.
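    The distinction between first- and second-order Monte Carlo above is the nesting: an inner loop propagates variability (which serving is contaminated, what dose is ingested), and an outer loop re-runs that inner estimate for each draw of the uncertain parameters (here, prevalence and visitor numbers), yielding a range of mean annual case estimates. A minimal sketch with an invented exponential dose-response model; none of the parameter values are from the study:

```python
import math
import random

def annual_cases(prevalence, visitors, rng, n_servings=2000):
    """First-order (variability) loop: expected illnesses per simulated serving,
    scaled to the annual number of visitors."""
    cases = 0.0
    for _ in range(n_servings):
        if rng.random() < prevalence:              # serving is contaminated
            dose = rng.lognormvariate(2.0, 1.0)    # cells ingested (toy)
            cases += 1 - math.exp(-1e-3 * dose)    # exponential dose-response
    return cases / n_servings * visitors

def second_order(n_outer=200, seed=3):
    """Second-order (uncertainty) loop: range of the annual-case estimate
    across draws of the uncertain prevalence and visitor count."""
    rng = random.Random(seed)
    estimates = [annual_cases(rng.uniform(0.001, 0.01),
                              rng.uniform(1e5, 5e5), rng)
                 for _ in range(n_outer)]
    return min(estimates), max(estimates)

low, high = second_order()
```

    Reporting the spread of the outer-loop estimates, rather than a single mean, is what produces ranges like the 42 to 551 annual cases quoted for E. coli O157:H7.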

  4. Dynamic Event Tree advancements and control logic improvements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alfonsi, Andrea; Rabiti, Cristian; Mandelli, Diego

    The RAVEN code has been under development at the Idaho National Laboratory since 2012. Its main goal is to create a multi-purpose platform for deploying all the capabilities needed for probabilistic risk assessment, uncertainty quantification, data mining analysis, and optimization studies. RAVEN is currently equipped with three different sampling categories: Forward samplers (Monte Carlo, Latin Hyper Cube, Stratified, Grid Sampler, Factorials, etc.), Adaptive Samplers (Limit Surface search, Adaptive Polynomial Chaos, etc.), and Dynamic Event Tree (DET) samplers (Deterministic and Adaptive Dynamic Event Trees). The main subject of this document is to report the activities that have been done in order to: start the migration of the RAVEN/RELAP-7 control logic system into MOOSE, and develop advanced dynamic sampling capabilities based on the Dynamic Event Tree approach. In order to provide a control logic capability to all MOOSE-based applications, an initial migration activity was begun this fiscal year, moving the control logic system designed for RELAP-7 by the RAVEN team into the MOOSE framework. This document briefly explains what has been done. The second and most important subject of this report is the development of a Dynamic Event Tree (DET) sampler named “Hybrid Dynamic Event Tree” (HDET) and its adaptive variant, “Adaptive Hybrid Dynamic Event Tree” (AHDET). As other authors have already reported, among the different types of uncertainties, it is possible to discern two principal types: aleatory and epistemic uncertainties. The classical Dynamic Event Tree is in charge of treating the first class (aleatory) of uncertainties; the dependence of the probabilistic risk assessment and analysis on the epistemic uncertainties is treated by an initial Monte Carlo sampling (MCDET). From each Monte Carlo sample, a DET analysis is run (in total, N trees).
The Monte Carlo employs a pre-sampling of the input space characterized by epistemic uncertainties. The consequent Dynamic Event Tree performs the exploration of the aleatory space. In the RAVEN code, a more general approach has been developed, not limiting the exploration of the epistemic space to a Monte Carlo method but using all the forward sampling strategies RAVEN currently employs. The user can combine Latin Hyper Cube, Grid, Stratified, and Monte Carlo sampling in order to explore the epistemic space, without any limitation. From this pre-sampling, the Dynamic Event Tree sampler starts its aleatory space exploration. As reported by the authors, the Dynamic Event Tree is a good fit for developing a goal-oriented sampling strategy. The DET is used to drive a Limit Surface search. The methodology developed by the authors last year performs a Limit Surface search in the aleatory space only. This report documents how this approach has been extended in order to consider the epistemic space interacting with the Hybrid Dynamic Event Tree methodology.
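    The hybrid scheme can be caricatured in a few lines: a forward sampler draws the epistemic parameters, and from each epistemic sample a deterministic event tree enumerates every combination of aleatory branch outcomes. The failure-rate range, branching times, and branching rule below are invented for illustration, and RAVEN's actual trees branch on simulated system dynamics rather than a closed-form probability:

```python
import itertools
import math
import random

def hybrid_event_tree(n_epistemic=4, branch_times=(1.0, 2.0), seed=11):
    """Epistemic pre-sampling (plain Monte Carlo here, though any forward
    sampler could be substituted) followed by a deterministic event tree
    that branches on component success/failure at each branching time."""
    rng = random.Random(seed)
    trees = []
    for _ in range(n_epistemic):
        failure_rate = rng.uniform(0.01, 0.1)       # epistemic parameter
        branches = []
        for outcome in itertools.product((True, False), repeat=len(branch_times)):
            prob = 1.0
            for t, works in zip(branch_times, outcome):
                p_fail = 1 - math.exp(-failure_rate * t)
                prob *= (1 - p_fail) if works else p_fail
            branches.append((outcome, prob))        # one aleatory tree branch
        trees.append((failure_rate, branches))
    return trees

trees = hybrid_event_tree()
```

    Each epistemic sample owns a complete tree whose branch probabilities sum to one, which is the structure the Limit Surface search then explores branch by branch.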

  5. Introducing DeBRa: a detailed breast model for radiological studies

    NASA Astrophysics Data System (ADS)

    Ma, Andy K. W.; Gunn, Spencer; Darambara, Dimitra G.

    2009-07-01

    Currently, x-ray mammography is the method of choice in breast cancer screening programmes. As the mammography technology moves from 2D imaging modalities to 3D, conventional computational phantoms do not have sufficient detail to support the studies of these advanced imaging systems. Studies of these 3D imaging systems call for a realistic and sophisticated computational model of the breast. DeBRa (Detailed Breast model for Radiological studies) is the most advanced, detailed, 3D computational model of the breast developed recently for breast imaging studies. A DeBRa phantom can be constructed to model a compressed breast, as in film/screen, digital mammography and digital breast tomosynthesis studies, or a non-compressed breast as in positron emission mammography and breast CT studies. Both the cranial-caudal and mediolateral oblique views can be modelled. The anatomical details inside the phantom include the lactiferous duct system, the Cooper ligaments and the pectoral muscle. The fibroglandular tissues are also modelled realistically. In addition, abnormalities such as microcalcifications, irregular tumours and spiculated tumours are inserted into the phantom. Existing sophisticated breast models require specialized simulation codes. Unlike its predecessors, DeBRa has elemental compositions and densities incorporated into its voxels including those of the explicitly modelled anatomical structures and the noise-like fibroglandular tissues. The voxel dimensions are specified as needed by any study and the microcalcifications are embedded into the voxels so that the microcalcification sizes are not limited by the voxel dimensions. Therefore, DeBRa works with general-purpose Monte Carlo codes. Furthermore, general-purpose Monte Carlo codes allow different types of imaging modalities and detector characteristics to be simulated with ease. DeBRa is a versatile and multipurpose model specifically designed for both x-ray and γ-ray imaging studies.

  6. The effect of tandem-ovoid titanium applicator on points A, B, bladder, and rectum doses in gynecological brachytherapy using 192Ir

    PubMed Central

    Sadeghi, Mohammad Hosein; Mehdizadeh, Amir; Faghihi, Reza; Moharramzadeh, Vahed; Meigooni, Ali Soleimani

    2018-01-01

    Purpose The dosimetry procedure by simple superposition accounts only for the self-shielding of the source and does not take into account the attenuation of photons by the applicators. The purpose of this investigation is an estimation of the effects of the tandem and ovoid applicator on the dose distribution inside the phantom by MCNP5 Monte Carlo simulations. Material and methods In this study, the superposition method is used for obtaining the dose distribution in the phantom without the applicator for a typical gynecological brachytherapy treatment (superposition-1). Then, the sources are simulated inside the tandem and ovoid applicator to identify the effect of applicator attenuation (superposition-2), and the doses at points A, B, bladder, and rectum were compared with the results of superposition. The exact dwell positions and times of the source and the positions of the dosimetry points were determined from the images and treatment data of an adult woman patient from a cancer center. The MCNP5 Monte Carlo (MC) code was used for simulation of the phantoms, applicators, and sources. Results The results of this study showed no significant differences between the results of the superposition method and the MC simulations for different dosimetry points. The difference at all important dosimetry points was found to be less than 5%. Conclusions According to the results, applicator attenuation has no significant effect on the calculated point doses, and the superposition method, adding the dose of each source obtained by MC simulation, can estimate the dose to points A, B, bladder, and rectum with good accuracy. PMID:29619061

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Q; Read, P

    Purpose: Multiple error pathways can lead to delivery errors during the treatment course that cannot be caught with pre-treatment QA. While in vivo solutions are being developed for linacs, no such solution exists for tomotherapy. The purpose of this study is to develop a near real-time system for tomotherapy that can monitor the delivery and dose accumulation process during treatment delivery, enabling the user to assess the impact of delivery variations and/or errors and to interrupt the treatment if necessary. Methods: A program running on a tomotherapy planning station fetches the raw DAS data during treatment. Exit detector data are extracted, as well as output, gantry angle, and other machine parameters. For each sample, the MLC open-close state is determined. The delivered plan is compared with the original plan via a Monte Carlo dose engine that transports fluence deviations from a pre-treatment Monte Carlo run. A report containing the difference in fluence, dose, and DVH statistics is created in HTML format. This process is repeated until the treatment is completed. Results: Since we only need to compute the dose for the difference in fluence for a few projections each time, dose with 2% statistical uncertainty can be computed in less than 1 second on a 4-core CPU. However, the current bottleneck in this near real-time system is the repeated fetching and processing of the growing DAS data file throughout the delivery. The frame rate drops from 10 Hz at the beginning of treatment to 5 Hz after 3 minutes and to 2 Hz after 10 minutes. Conclusion: A during-treatment delivery monitoring system has been built to monitor tomotherapy treatments. The system improves patient safety by allowing operators to assess delivery variations and errors during treatment delivery and to take appropriate actions.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kieselmann, J; Bartzsch, S; Oelfke, U

    Purpose: Microbeam Radiation Therapy is a preclinical method in radiation oncology that modulates radiation fields on a micrometre scale. Dose calculation is challenging due to the arising dose gradients and therapeutically important dose ranges. Monte Carlo (MC) simulations, often used as the gold standard, are computationally expensive and hence too slow for the optimisation of treatment parameters in future clinical applications. On the other hand, conventional kernel-based dose calculation leads to inaccurate results close to material interfaces. The purpose of this work is to overcome these inaccuracies while keeping computation times low. Methods: A point kernel superposition algorithm is modified to account for tissue inhomogeneities. Instead of conventional ray tracing approaches, methods from differential geometry are applied and the space around the primary photon interaction is locally warped. The performance of this approach is compared to MC simulations and a simple convolution algorithm (CA) for two different phantoms and photon spectra. Results: While the peak doses of all dose calculation methods agreed within less than 4% deviation, the proposed approach surpassed the simple convolution algorithm in accuracy by a factor of up to 3 in the scatter dose. In a treatment geometry similar to possible future clinical situations, differences between Monte Carlo and the differential geometry algorithm were less than 3%. At the same time, the calculation time did not exceed 15 minutes. Conclusion: With the developed method it was possible to improve the dose calculation based on the CA method with respect to accuracy, especially at sharp tissue boundaries. While the calculation is more extensive than for the CA method and depends on field size, the typical calculation time for a 20×20 mm² field on a 3.4 GHz processor with 8 GB of RAM remained below 15 minutes.
Parallelisation and optimisation of the algorithm could lead to further significant reductions in calculation time.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stathakis, S; Defoor, D; Saenz, D

    Purpose: Stereotactic radiosurgery (SRS) outcomes are related to the delivered dose to the target and to surrounding tissue. We have commissioned a Monte Carlo based dose calculation algorithm to recalculate delivered doses originally planned with a pencil beam dose calculation engine. Methods: Twenty consecutive previously treated patients were selected for this study. All plans were generated using the iPlan treatment planning system (TPS) and calculated using the pencil beam algorithm. Each patient plan consisted of 1 to 3 targets and was treated using dynamically conformal arcs or intensity modulated beams. Multi-target treatments were delivered using multiple isocenters, one for each target. These plans were recalculated for the purpose of this study using a single isocenter. The CT image sets, along with the plans, doses, and structures, were DICOM exported to the Monaco TPS and the dose was recalculated using the same voxel resolution and monitor units. Benchmark data were also generated prior to the patient calculations to assess the accuracy of the two TPSs against measurements using a micro ionization chamber in solid water. Results: Good agreement, within −0.4% for Monaco and +2.2% for iPlan, was observed for measurements in the water phantom. Doses in patient geometry revealed differences of up to 9.6% for single-target plans and 9.3% for multiple-target, multiple-isocenter plans. The average dose differences for multi-target, single-isocenter plans were approximately 1.4%. Similar differences were observed for the OARs and integral dose. Conclusion: Accuracy of the beam model is crucial for dose calculation, especially in the case of small fields such as those used in SRS treatments. A superior dose calculation algorithm such as Monte Carlo, with properly commissioned beam models, which is unaffected by the lack of electronic equilibrium, should be preferred for the calculation of small fields to improve accuracy.

  10. Dual-Modality, Dual-Functional Nanoprobes for Cellular and Molecular Imaging

    PubMed Central

    Menon, Jyothi U.; Gulaka, Praveen K.; McKay, Madalyn A.; Geethanath, Sairam; Liu, Li; Kodibagkar, Vikram D.

    2012-01-01

    An emerging need for evaluation of promising cellular therapies is a non-invasive method to image the movement and health of cells following transplantation. However, the use of a single modality to serve this purpose may not be advantageous as it may convey inaccurate or insufficient information. Multi-modal imaging strategies are becoming more popular for in vivo cellular and molecular imaging because of their improved sensitivity, higher resolution and structural/functional visualization. This study aims at formulating Nile Red doped hexamethyldisiloxane (HMDSO) nanoemulsions as dual modality (Magnetic Resonance Imaging/Fluorescence), dual-functional (oximetry/detection) nanoprobes for cellular and molecular imaging. HMDSO nanoprobes were prepared using a HS15-lecithin combination as surfactant and showed an average radius of 71±39 nm by dynamic light scattering and in vitro particle stability in human plasma over 24 hrs. They were found to readily localize in the cytosol of MCF7-GFP cells within 18 minutes of incubation. As proof of principle, these nanoprobes were successfully used for fluorescence imaging and for measuring pO2 changes in cells by magnetic resonance imaging, in vitro, thus showing potential for in vivo applications. PMID:23382776

  11. Harnessing Drug Resistance: Using ABC Transporter Proteins To Target Cancer Cells

    PubMed Central

    Leitner, Heather M.; Kachadourian, Remy; Day, Brian J.

    2007-01-01

    The ATP-binding cassette (ABC) class of proteins is one of the most functionally diverse transporter families found in biological systems. Although the abundance of ABC proteins varies between species, they are highly conserved in sequence and often demonstrate similar functions across prokaryotic and eukaryotic organisms. Beginning with a brief summary of the events leading to our present-day knowledge of ABC transporters, the purpose of this review is to discuss the potential for utilizing ABC transporters as a means for cellular glutathione (GSH) modulation. GSH is one of the most abundant thiol antioxidants in cells. It is involved in cellular division, protein and DNA synthesis, maintenance of cellular redox status, and xenobiotic metabolism. Cellular GSH levels are often altered in many disease states, including cancer. Over the past two decades there has been considerable emphasis on methods to sensitize cancer cells to chemotherapeutics and ionizing radiation therapy by GSH depletion. We contend that ABC transporters, particularly multi-drug resistance proteins (MRPs), may be used as therapeutic targets for applications aimed at modulation of GSH levels. This review will emphasize MRP-mediated modulation of intracellular GSH levels as a potential alternative and adjunctive approach for cancer therapy. PMID:17585883

  12. Two-photon excited autofluorescence imaging of freshly isolated frog retinas.

    PubMed

    Lu, Rong-Wen; Li, Yi-Chao; Ye, Tong; Strang, Christianne; Keyser, Kent; Curcio, Christine A; Yao, Xin-Cheng

    2011-06-01

    The purpose of this study was to investigate cellular sources of autofluorescence signals in freshly isolated frog (Rana pipiens) retinas. Equipped with an ultrafast laser, a laser scanning two-photon excitation fluorescence microscope was employed for sub-cellular resolution examination of both sliced and flat-mounted retinas. Two-photon imaging of retinal slices revealed autofluorescence signals over multiple functional layers, including the photoreceptor layer (PRL), outer nuclear layer (ONL), outer plexiform layer (OPL), inner nuclear layer (INL), inner plexiform layer (IPL), and ganglion cell layer (GCL). Using flat-mounted retinas, depth-resolved imaging of individual retinal layers further confirmed multiple sources of autofluorescence signals. Cellular structures were clearly observed at the PRL, ONL, INL, and GCL. At the PRL, the autofluorescence was predominantly recorded from the intracellular compartment of the photoreceptors, while mixed intracellular and extracellular autofluorescence signals were observed at the ONL, INL, and GCL. High-resolution autofluorescence imaging clearly revealed the mosaic organization of rod and cone photoreceptors, and sub-cellular bright autofluorescence spots, which might correspond to the connecting cilium, were observed only in the cone photoreceptors. Moreover, single-cone and double-cone outer segments could be directly differentiated.

  13. Monte Carlo reference data sets for imaging research: Executive summary of the report of AAPM Research Committee Task Group 195.

    PubMed

    Sechopoulos, Ioannis; Ali, Elsayed S M; Badal, Andreu; Badano, Aldo; Boone, John M; Kyprianou, Iacovos S; Mainegra-Hing, Ernesto; McMillan, Kyle L; McNitt-Gray, Michael F; Rogers, D W O; Samei, Ehsan; Turner, Adam C

    2015-10-01

    The use of Monte Carlo simulations in diagnostic medical imaging research is widespread due to its flexibility and ability to estimate quantities that are challenging to measure empirically. However, any new Monte Carlo simulation code needs to be validated before it can be used reliably. The type and degree of validation required depends on the goals of the research project, but, typically, such validation involves either comparison of simulation results to physical measurements or to previously published results obtained with established Monte Carlo codes. The former is complicated due to nuances of experimental conditions and uncertainty, while the latter is challenging due to typical graphical presentation and lack of simulation details in previous publications. In addition, entering the field of Monte Carlo simulations in general involves a steep learning curve. It is not a simple task to learn how to program and interpret a Monte Carlo simulation, even when using one of the publicly available code packages. This Task Group report provides a common reference for benchmarking Monte Carlo simulations across a range of Monte Carlo codes and simulation scenarios. In the report, all simulation conditions are provided for six different Monte Carlo simulation cases that involve common x-ray based imaging research areas. The results obtained for the six cases using four publicly available Monte Carlo software packages are included in tabular form. In addition to a full description of all simulation conditions and results, a discussion and comparison of results among the Monte Carlo packages and the lessons learned during the compilation of these results are included. This abridged version of the report includes only an introductory description of the six cases and a brief example of the results of one of the cases. 
This work provides an investigator the necessary information to benchmark his/her Monte Carlo simulation software against the reference cases included here before performing his/her own novel research. In addition, an investigator entering the field of Monte Carlo simulations can use these descriptions and results as a self-teaching tool to ensure that he/she is able to perform a specific simulation correctly. Finally, educators can assign these cases as learning projects as part of course objectives or training programs.
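    Benchmarking in the spirit described above starts from cases whose answer is known in closed form. A deliberately minimal, hypothetical sketch (not one of the Task Group 195 cases): estimating uncollided photon transmission through a homogeneous slab by sampling exponential free paths, checked against the analytic Beer-Lambert value exp(-mu * thickness).

```python
import math
import random

def transmitted_fraction(mu, thickness, n_photons=100_000, rng=None):
    """Estimate the fraction of photons that cross a homogeneous slab
    without interacting, by sampling exponential free path lengths
    s = -ln(1 - u) / mu.  The analytic benchmark is exp(-mu * thickness)."""
    rng = rng or random.Random(42)
    transmitted = 0
    for _ in range(n_photons):
        # Sample a free path length from the exponential distribution.
        if -math.log(1.0 - rng.random()) / mu > thickness:
            transmitted += 1
    return transmitted / n_photons

mu, thickness = 0.2, 5.0   # 1/cm and cm; the values here are hypothetical
mc = transmitted_fraction(mu, thickness)
print(f"MC: {mc:.4f}  analytic: {math.exp(-mu * thickness):.4f}")
```

    A simulation code that cannot reproduce such an analytic case within statistical uncertainty has a bug; agreement here is necessary but, as the report stresses, not sufficient for more complex geometries.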

  14. Analysis of the track- and dose-averaged LET and LET spectra in proton therapy using the GEANT4 Monte Carlo code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guan, Fada; Peeler, Christopher; Taleei, Reza

    Purpose: The motivation of this study was to find and eliminate the cause of errors in dose-averaged linear energy transfer (LET) calculations from therapeutic protons in small targets, such as biological cell layers, calculated using the GEANT4 Monte Carlo code. Furthermore, the purpose was also to provide a recommendation for selecting an appropriate LET quantity from GEANT4 simulations to correlate with the biological effectiveness of therapeutic protons. Methods: The authors developed a particle-tracking-step-based strategy to calculate the average LET quantities (track-averaged LET, LET_t, and dose-averaged LET, LET_d) using GEANT4 for different tracking step size limits. A step size limit refers to the maximally allowable tracking step length. The authors investigated how the tracking step size limit influenced the calculated LET_t and LET_d of protons with six different step limits ranging from 1 to 500 μm in a water phantom irradiated by a 79.7-MeV clinical proton beam. In addition, the authors analyzed the detailed stochastic energy deposition information, including fluence spectra and dose spectra of the energy-deposition-per-step of protons. As a reference, the authors also calculated the averaged LET and analyzed the LET spectra by combining the Monte Carlo method and the deterministic method. Relative biological effectiveness (RBE) calculations were performed to illustrate the impact of different LET calculation methods on the RBE-weighted dose. Results: Simulation results showed that the step limit effect was small for LET_t but significant for LET_d. This resulted from differences in the energy-deposition-per-step between the fluence spectra and dose spectra at different depths in the phantom. Using the Monte Carlo particle tracking method in GEANT4 can result in incorrect LET_d calculation results in the dose plateau region for small step limits.
    The erroneous LET_d results can be attributed to the algorithm used to determine fluctuations in energy deposition along the tracking step in GEANT4. The incorrect LET_d values lead to substantial differences in the calculated RBE. Conclusions: When the GEANT4 particle tracking method is used to calculate the average LET values within targets with a small step limit, such as smaller than 500 μm, the authors recommend the use of LET_t in the dose plateau region and LET_d around the Bragg peak. For a large step limit, i.e., 500 μm, LET_d is recommended along the whole Bragg curve. The transition point depends on beam parameters and can be found by determining the location where the gradient of the ratio of LET_d and LET_t becomes positive.
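    The two averages compared in this record have simple step-based definitions: the track-averaged LET is the plain (fluence-weighted) mean of dE/dx over tracking steps, while the dose-averaged LET weights each step's dE/dx by the energy that step deposits. A minimal sketch of both estimators, assuming per-step energy deposits and step lengths have already been extracted from a simulation (the sample numbers below are hypothetical):

```python
def track_averaged_let(deposits, lengths):
    """Track-averaged LET: unweighted (fluence-weighted) mean of dE/dx."""
    lets = [e / l for e, l in zip(deposits, lengths)]
    return sum(lets) / len(lets)

def dose_averaged_let(deposits, lengths):
    """Dose-averaged LET: each step's dE/dx weighted by its energy deposit."""
    num = sum(e * (e / l) for e, l in zip(deposits, lengths))
    return num / sum(deposits)

# Hypothetical per-step energy deposits (keV) and step lengths (um):
deposits = [1.0, 2.5, 4.0, 0.5]
lengths = [1.0, 1.0, 0.8, 0.5]
print(track_averaged_let(deposits, lengths))  # fluence-weighted mean
print(dose_averaged_let(deposits, lengths))   # weighted toward large deposits
```

    Because the dose-weighted average emphasizes the steps with the largest energy deposits, it is the quantity most sensitive to how the code apportions energy-deposition fluctuations among steps, which is exactly where the step-limit artifact described above appears.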

  15. Cellular automata with object-oriented features for parallel molecular network modeling.

    PubMed

    Zhu, Hao; Wu, Yinghui; Huang, Sui; Sun, Yan; Dhar, Pawan

    2005-06-01

    Cellular automata are an important modeling paradigm for studying the dynamics of large, parallel systems composed of multiple, interacting components. However, to model biological systems, cellular automata need to be extended beyond large-scale parallelism and intensive communication in order to capture two fundamental properties characteristic of complex biological systems: hierarchy and heterogeneity. This paper proposes extensions to a cellular automata language, Cellang, to meet this purpose. The extended language, with object-oriented features, can be used to describe the structure and activity of parallel molecular networks within cells. Capabilities of this new programming language include object structure to define molecular programs within a cell, floating-point data type and mathematical functions to perform quantitative computation, message passing capability to describe molecular interactions, as well as new operators, statements, and built-in functions. We discuss relevant programming issues of these features, including the object-oriented description of molecular interactions with molecule encapsulation, message passing, and the description of heterogeneity and anisotropy at the cell and molecule levels. By enabling the integration of modeling at the molecular level with system behavior at cell, tissue, organ, or even organism levels, the program will help improve our understanding of how complex and dynamic biological activities are generated and controlled by parallel functioning of molecular networks. Index Terms: cellular automata, modeling, molecular network, object-oriented.
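    The synchronous-update semantics that cellular automata languages such as Cellang build on can be shown in a few lines. A minimal sketch (a plain elementary automaton, rule 110 chosen arbitrarily; none of Cellang's object-oriented or message-passing extensions) in which every cell reads its neighborhood from the previous generation, so all cells update "in parallel":

```python
def step(grid, rule):
    """One synchronous update of a 1D binary cellular automaton with
    periodic boundaries: every cell reads its neighborhood from the old
    grid, so the whole generation updates simultaneously."""
    n = len(grid)
    return [rule(grid[(i - 1) % n], grid[i], grid[(i + 1) % n]) for i in range(n)]

# Elementary rule 110, written as a lookup table on the 3-cell neighborhood.
RULE_110 = {(1, 1, 1): 0, (1, 1, 0): 1, (1, 0, 1): 1, (1, 0, 0): 0,
            (0, 1, 1): 1, (0, 1, 0): 1, (0, 0, 1): 1, (0, 0, 0): 0}
rule = lambda l, c, r: RULE_110[(l, c, r)]

grid = [0] * 10 + [1] + [0] * 10   # single live cell in the middle
for _ in range(5):
    grid = step(grid, rule)
    print("".join("#" if c else "." for c in grid))
```

    The extensions the paper proposes replace the single bit of state per cell with an object holding molecular state, and the fixed lookup rule with message-passing interactions, while keeping this same synchronous update discipline.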

  16. BROMOC suite: Monte Carlo/Brownian dynamics suite for studies of ion permeation and DNA transport in biological and artificial pores with effective potentials.

    PubMed

    De Biase, Pablo M; Markosyan, Suren; Noskov, Sergei

    2015-02-05

    The transport of ions and solutes by biological pores is central for cellular processes and has a variety of applications in modern biotechnology. The time scale involved in the polymer transport across a nanopore is beyond the accessibility of conventional MD simulations. Moreover, experimental studies lack sufficient resolution to provide details on the molecular underpinning of the transport mechanisms. BROMOC, the code presented herein, performs Brownian dynamics simulations, both serial and parallel, up to several milliseconds long. BROMOC can be used to model large biological systems. IMC-MACRO software allows for the development of effective potentials for solute-ion interactions based on radial distribution function from all-atom MD. BROMOC Suite also provides a versatile set of tools to do a wide variety of preprocessing and postsimulation analysis. We illustrate a potential application with ion and ssDNA transport in MspA nanopore. © 2014 Wiley Periodicals, Inc.
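    Underlying engines like BROMOC is the overdamped (Brownian) dynamics propagation step, in which a deterministic drift from the force field is combined with a random displacement whose variance is set by the diffusion coefficient. A one-dimensional sketch of the standard scheme (an illustration of the method, not BROMOC's implementation; parameters are in reduced units):

```python
import math
import random

def brownian_trajectory(n_steps, dt, diffusion, force=lambda x: 0.0,
                        kT=1.0, x0=0.0, rng=None):
    """Overdamped Brownian dynamics in 1D:
        x += (D/kT) * F(x) * dt + sqrt(2*D*dt) * N(0, 1)
    Returns the final position after n_steps."""
    rng = rng or random.Random(0)
    x = x0
    sigma = math.sqrt(2.0 * diffusion * dt)   # std. dev. of one random kick
    mobility = diffusion / kT                 # Einstein relation D = mu * kT
    for _ in range(n_steps):
        x += mobility * force(x) * dt + sigma * rng.gauss(0.0, 1.0)
    return x

# Sanity check: for free diffusion the mean-square displacement is 2*D*t.
rng = random.Random(1)
finals = [brownian_trajectory(100, 0.01, 1.0, rng=rng) for _ in range(4000)]
msd = sum(x * x for x in finals) / len(finals)
print(f"MSD ~ {msd:.2f} (theory: 2*D*t = 2.00)")
```

    Because each step integrates over the solvent's fast degrees of freedom, time steps can be orders of magnitude larger than in all-atom MD, which is what makes millisecond-scale permeation simulations tractable.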

  17. Molecular Simulations of Sequence-Specific Association of Transmembrane Proteins in Lipid Bilayers

    NASA Astrophysics Data System (ADS)

    Doxastakis, Manolis; Prakash, Anupam; Janosi, Lorant

    2011-03-01

    Association of membrane proteins is central to material and information flow across cellular membranes. Amino-acid sequence and the membrane environment are two critical factors controlling association; however, quantitative knowledge of their contributions is limited. In this work, we study the dimerization of helices in lipid bilayers using extensive parallel Monte Carlo simulations with recently developed algorithms. The dimerization of Glycophorin A is examined employing a coarse-grain model that retains a level of amino-acid specificity, in three different phospholipid bilayers. Association is driven by a balance of protein-protein and lipid-induced interactions, with the latter playing a major role at short separations. Following a different approach, the effect of amino-acid sequence is studied using the four transmembrane domains of the epidermal growth factor receptor family in identical lipid environments. Detailed characterization of dimer formation and estimates of the free energy of association reveal that these helices have a significant propensity to self-associate, with certain dimers forming non-specific interfaces.

  18. Strongly correlated superconductivity and quantum criticality

    NASA Astrophysics Data System (ADS)

    Tremblay, A.-M. S.

    Doped Mott insulators and doped charge-transfer insulators describe classes of materials that can exhibit unconventional superconducting ground states. Examples include the cuprates and the layered organic superconductors of the BEDT family. I present results obtained from plaquette cellular dynamical mean-field theory. Continuous-time quantum Monte Carlo evaluation of the hybridization expansion allows one to study the models in the large interaction limit where quasiparticles can disappear. The normal state which is unstable to the superconducting state exhibits a first-order transition between a pseudogap and a correlated metal phase. That transition is the finite-doping extension of the metal-insulator transition obtained at half-filling. This transition serves as an organizing principle for the normal and superconducting states of both cuprates and doped organic superconductors. In the less strongly correlated limit, these methods also describe the more conventional case where the superconducting dome surrounds an antiferromagnetic quantum critical point. Sponsored by NSERC RGPIN-2014-04584, CIFAR, Research Chair in the Theory of Quantum Materials.

  19. The Fysics of Filopodia (or The Physics of Philopodia)

    NASA Astrophysics Data System (ADS)

    Schwarz, Jen; Gopinathan, Ajay; Lee, Kun-Chun; Liu, Andrea; Yang, Louise

    2006-03-01

    Cell motility is driven by the dynamic reorganization of the cellular cytoskeleton which is composed of actin. Monomeric actin assembles into filaments that grow, shrink, branch and bundle. Branching generates new filaments that form a mesh-like structure that protrudes outward allowing the cell to move somewhere. But how does it know where to move? It has been proposed that filopodia serve as scouts for the cell. Filopodia are bundles of actin filaments that extend out ahead of the rest of the cell to probe its upcoming environment. Recent in vitro experiments [Vignjevic et al., J. Cell Biol. 160, 951 (2003)] determine the minimal ingredients required for such a process. We model these experiments analytically and via Monte Carlo simulations to estimate the typical bundle size and length. We also estimate the size of the mesh-like structure from which the filopodia emerge and explain the observed nonmonotonicity of this size as a function of capping protein concentration, which inhibits filament growth.
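    The role of capping protein in limiting filament length can be illustrated with a deliberately minimal stochastic model (a toy sketch, not the authors' model; the rates are hypothetical): in each step a filament either adds one monomer or is capped for good, so lengths are geometrically distributed and fall off as roughly the inverse of the capping probability.

```python
import random

def mean_filament_length(p_cap, n_filaments=50_000, rng=None):
    """Toy kinetic model of capping: per step, a filament adds one monomer
    with probability 1 - p_cap or is capped permanently with probability
    p_cap.  Lengths are geometric with mean (1 - p_cap) / p_cap, so a
    higher capping-protein concentration yields shorter filaments."""
    rng = rng or random.Random(7)
    total = 0
    for _ in range(n_filaments):
        length = 0
        while rng.random() >= p_cap:   # survived capping: add a monomer
            length += 1
        total += length
    return total / n_filaments

# Mean length falls roughly as 1/p_cap as capping becomes more likely.
for p in (0.05, 0.1, 0.2):
    print(p, mean_filament_length(p))
```

    The nonmonotonic mesh-size dependence reported in the abstract requires the competing branching and bundling processes as well, which this single-filament sketch deliberately omits.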

  20. Quantum magnetic phase transition in square-octagon lattice.

    PubMed

    Bao, An; Tao, Hong-Shuai; Liu, Hai-Di; Zhang, XiaoZhong; Liu, Wu-Ming

    2014-11-05

    The quantum magnetic phase transition in the square-octagon lattice was investigated by cellular dynamical mean-field theory combined with a continuous-time quantum Monte Carlo algorithm. Based on systematic calculations of the density of states, the double occupancy, and the Fermi surface evolution of the square-octagon lattice, we present the phase diagrams of this many-particle system. The competition between temperature and the on-site repulsive interaction in the isotropic square-octagon lattice shows that both antiferromagnetic and paramagnetic order can be found not only in the metallic phase but also in the insulating phase. The antiferromagnetic metal phase disappears from the phase diagram spanned by the anisotropy parameter λ and the on-site repulsive interaction U, while the other phases can still be detected at T = 0.17. The results of this work may contribute to a better understanding of the properties of systems with square-octagon or quasi-square-octagon structure, such as ZnO.

Top